This technology has existed since 2012.

Micro LED
So, after they burned through OLED, AMOLED, QLED, LED-LCD and everything else they had, they're finally revealing the real thing.
Actually, the real endgame would be micro LETs (light-emitting transistors), but whatever.
The thing is also modular, which puts multi-monitor setups on suicide watch.
And finally, this kills off the curved-screen gimmick as well.

Probably response time. Unless switching across the visible light spectrum is instantaneous; maybe it has to warm up first.


It's not like it'll stay expensive forever.

This is all shit compared to laser TVs / displays though.

I am hyped for that. Now the only question is when it's going to be affordable. If it's expensive and dies within 10 years, then I'm not going to sacrifice my whole piggy bank when I can get a good enough IPS panel for cheap.

I want an OLED because I need a new monitor but I don't want to RMA over backlight issues.

I hate OLEDs because of burn-in.

I think modularity will be the kingmaker. No need to worry about dead pixels: once the technology has advanced enough, they can simply reject individual modules that have a bad pixel.

Technically speaking, there's no reason it couldn't have been done immediately after the introduction of economical blue LEDs circa 1996.
Sauce? The best tradeoff I'm aware of is to either put a triad behind a filter, or stack them, to eliminate RGB misregistration.


Everything I hear about OLED reliability brings back strong memories of the retarded and increasingly baseless FUD throughout plasma's history.


Really, the biggest thing nobody seems to be doing right is fabrication. The way all these current LED prototypes are done is to fab a wafer of LEDs, dice it, and mount them one at a time, which is inherently going to cost too much and result in unacceptable dot pitch.

The only approach viable for mass production would be to fab a wafer with LEDs and addressing circuits as a fully functional segment/panel (for smaller devices, depending on wafer size), dice out the biggest rectangular cross section you can, and (in the case of larger devices) tile up the dice. This is already sort of how large LCDs are made, except they're fabbed on glass panels rather than silicon wafers.
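
To make the tiling idea concrete, here's a purely illustrative sketch of how a controller could map a global pixel address onto a grid of identical self-contained dice. The tile resolution and function name are made up for the example, not taken from any real product:

```python
from typing import Tuple

TILE_W, TILE_H = 960, 540   # assumed resolution of one diced segment (hypothetical)

def route_pixel(x: int, y: int) -> Tuple[Tuple[int, int], Tuple[int, int]]:
    """Map a global pixel coordinate to (tile grid position, local coordinate).

    A panel tiled from identical segments only needs this kind of trivial
    arithmetic in its timing controller; a segment with a defect can be
    swapped without touching its neighbours.
    """
    tile = (x // TILE_W, y // TILE_H)
    local = (x % TILE_W, y % TILE_H)
    return tile, local

# Example: pixel (2500, 700) on a 4x2 grid of 960x540 tiles
print(route_pixel(2500, 700))   # ((2, 1), (580, 160))
```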

It will not produce burn-in, and it will have a maximum brightness close to that of the sun (I'm not joking: Sony made one with 10,000 nits, and the tech can even reach 1,000,000 nits).
It also supports extreme scaling; they've made a 6,000 PPI screen with it.

It literally has no downsides: it's AMOLED without the faults, plus super brightness.
Now, I just wish they used transistors instead of diodes.

About the adoption curve, it will become standard pretty fast, as it uses exactly the same manufacturing base as OLED.

youtu.be/gY6fKIrQvGA

Really, it's time to do away with pixel geometry and move to single, full-range multi-colour emitters per pixel.

The advantage of laser lies in its unbeatable colour gamut (almost every colour the human eye can see), its durability, and its close-to-instantaneous response times.
MicroLED seems to have achieved the same colour gamut as lasers, it has surpassed laser durability, and with a TFT matrix (or just by using transistors to emit light instead of diodes) it could get the fastest response times of any technology except laser.

Now, laser can't be aimed directly at the eyes, so they use projectors these days (direct-view laser screens caused eye damage), and it can't be used in small devices such as smartphones.
So it seems it's the end of the line for laser as a widespread future consumer good. It will probably be used in movie theaters, though, and in every other application that needs a projector.

Sony even made a fully functional TV using microLED in 2012. They branded it Crystal LED at the time, but never sold the TV.
These guys just wanna play planned obsolescence.

Whatever, I'm more interested in whether these displays will be able to work without flicker (PWM).
For some reason AMOLED couldn't, and plenty of manufacturers also manage to put a flickering backlight behind an otherwise normal IPS array.

Something please replace CRTs already.

MicroLED with beam-scan firmware could be it.

MicroLED is still shit compared to LCD in terms of lifespan

When an LED dies, it's a dead fucking pixel forever. When an LCD backlight dies, at least the LCD is still usable. That's my main issue with both OLED and this new mLED.

Just get a low-hours plasma while you can.

just buy non-bargain-bin monitors

LEDs never die if you don't overdrive them. And the funny thing is, for all the lost lifespan from overdrive, you only get maybe 10% brightness increase.

Just install emacspeak

i've had more issues with stuck pixels on lcds than with led backlights

That's almost certainly a triple-LED module with an onboard microcontroller:
pololu.com/product/1074
Remember that what is typically described as "an LED" (a tinted plastic dome, a heatsink mount, a couple pins sticking out the bottom) is different from an actual, individual LED "die" that has been diced out of its wafer, usually just a few hundred microns wide.


If you want something that bulky instead of a PC monitor, why not skip direct-view in favor of a video projector? Unlike the direct-view market, there is thriving competition between a dozen diverse technologies, producing continuous large increases in sophistication, and it has basically none of the disadvantages of LCDs, aside from lagginess which plasma also has, due to its reuse of LCD controller chipsets, in spite of the PDP itself being less laggy than CRT pixels.


My parents have an LED alarm clock (one of the first ever sold) they got as a wedding present back in the early 70s from an uncle, and it's been running 24/7/365 by their bedside everywhere they've ever lived. I also had networking equipment with dozens of rapidly flickering indicator LEDs, running continuously from the early 90s into the late 2000s when I finally had to let go of my 10BASE2 & LocalTalk stuff.

LEDs are among the most reliable electronic components I can think of, especially given their tremendous technological maturity.

Indicator LEDs I agree, but high luminance LEDs still have pathetic lifespans these days

If you're referring to luminaires used to replace incandescent bulbs and fluorescent tubes, those are run much brighter, for much longer, in much worse-cooled installations than would ever be the case with a PC monitor.

It's shocking how much heat those piece of shit Chinese LED bulbs give off. You'd be better off wiring your house with some DC lines that use a proper quality converter somewhere than relying on the crap power supplies in those bulbs.

Yeah, but the reliability, efficiency, and heat production are still hundreds of times better than (conspiracies aside) incandescents and fluorescents, and the prices are incredibly low now. I'll admit the early adopter years of LED lamps were pretty pathetic, back in 2008, but all my "die in under a month" defectives from then were still under warranty, so I didn't lose a penny.

Even handheld LED flashlights have what, 20k hours of expected life? Those suckers are pasted to aluminum heatsinks, and even then they don't come close to the ~100k hours of indicator LEDs.

Got two smartphones with OLEDs, and I assure you the reliability troubles are real.

MicroLED is not Organic.

You need to stop thinking OLED and MicroLED are the same thing; they're made of completely different materials.

Like all hardware manufacturers.

I can confirm that screen burn-in is a real issue. After a few seconds of some bright UI element being on screen, it seems to persist way longer than it should. And this is with an IPS LCD on a G5.

Maybe. At my previous job, almost all of them were flickering, even the expensive ones.

Samsung makes them flicker on purpose.
LG panels (now) don't have this problem.
Check out this video, at the 18-minute mark.
youtu.be/XWKnwnJGErQ

And yes, this is to cut costs, it's laziness and it has a real solution.

Those are still much brighter (20k nits at the lens) than a properly calibrated PC monitor in a room at normal lighting levels (100 nits), except maybe during brief bursts of HDR.


OLEDs have been improving very rapidly. For instance, in 2008, they were rated for 14k hour lifespan to half brightness, then 36k hours in 2013, and now 100k hours for the latest models since 2016. By comparison, plasmas were rated at 30k hours in 1998, 60k in 2005, 100k in 2008.


Lights are hardly the only things dimmed with gratuitous PWM. Most electric oven and cooktop elements use it too, which makes some exceptionally delicate dishes impossible.

I wonder how surface-conduction electron-emitter displays (SED) would have fared in that regard.

What about real lifespan, i.e. with out-of-the-box performance?

Well, duh, of course it deteriorates. Normal LEDs are literal rocks encased in plastic; nothing's gonna happen to them. The only thing that can damage them is overheating, both external from ambient heat and internal from electrical heating at the junction.


Normal incandescent light bulbs flicker at 50/60 Hz. That's about the threshold of flicker detection in the foveal region. 90 Hz is the upper threshold for flicker detection in peripheral vision. Anything above that is not seen as flickering.
HORSESHIT
PWM switches on and off at such a rate that the difference between the upper and lower temperature peaks is 0.0001 degrees or thereabouts.

It would be a combination of bad things about OLED with bad things about CRT: burn-in, lag, power consumption and X-ray irradiation.

Which is to say, you get three orders of magnitude greater temperature variation just from turbulent air whirling over the dish.


People don't care about technology as long as it works. Manufacturers don't care about technology if it can't be shown to actually work. Considering how shit the early consumer products are, the bar is pretty low, and whatever tech hasn't hit the shelves yet can't rake in even those few brownie points needed to win over a manufacturing line.

I have no idea what I'm talking about: the post

How long until some dumb nigger tries to set the brightness to max, is blinded by his own stupidity, then campaigns to get the technology banned forever?

Garbage. Just use IPS.

Either that or sets fire to his house.

I have an AMOLED phone that's 6 years old. It probably spent 4 years of its life with no use at all (I don't call anybody and nobody calls me). I started reading on it 2 years ago. There's some rune-y looking wear all over the screen: different text characters burnt in over each other. The top Android bar is also burnt into the screen.

that chart looks exactly like a sales prediction of disco records from 1978, and that's exactly how much i'd trust it

slim teevees are jewish go buy a sony crt.

why do people care about muh infinite contrast on a 5 inch screen?

nice try, professional ebay seller

Why have a screen capable of color, or even grayscale, if it looks like absolute garbage? Or like a black square in direct sunlight.

not really a problem today, but i guess im alone with the durability/longevity concerns


The industry is assuming that Apple will use OLED screens in every iDevice from 2018 on, and therefore use up all of the screens, which is why Samsung is looking into Micro LED already for the S9.

It's well known that LEDs act as photodiodes when unpowered, and some novel projects read the light intensity value between PWM cycles.
The screen is the camera.
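
For anyone curious, the trick being referred to (popularized by Dietz et al.'s "bidirectional LED" work) is to reverse-bias the LED briefly so its junction capacitance charges, then time how long the photocurrent takes to discharge it. Here's a minimal sketch of the idea for a Raspberry Pi, assuming a bare indicator LED wired across two GPIO pins; the pin numbers and timings are made up for illustration:

```python
import time
import RPi.GPIO as GPIO

ANODE, CATHODE = 23, 24   # hypothetical BCM pin numbers

GPIO.setmode(GPIO.BCM)

def read_led_light_level() -> float:
    """Use an ordinary LED as a crude photodiode.

    1. Reverse-bias the LED to charge its junction capacitance.
    2. Float the cathode and time how long the photocurrent takes to pull it
       below the input threshold. Brighter light -> faster discharge -> smaller
       reading.
    """
    GPIO.setup(ANODE, GPIO.OUT)
    GPIO.setup(CATHODE, GPIO.OUT)
    GPIO.output(ANODE, GPIO.LOW)
    GPIO.output(CATHODE, GPIO.HIGH)      # reverse bias: charge the junction
    time.sleep(0.0001)

    GPIO.setup(CATHODE, GPIO.IN)         # let the photocurrent discharge it
    start = time.monotonic()
    while GPIO.input(CATHODE) and time.monotonic() - start < 0.1:
        pass
    return time.monotonic() - start      # seconds until it read "low"

try:
    while True:
        print(f"discharge time: {read_led_light_level() * 1000:.2f} ms")
        time.sleep(0.5)
finally:
    GPIO.cleanup()
```

Doing the same thing across a whole matrix of display pixels during their PWM off-time is the same principle, just heavily multiplexed.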

This
also
>what is a2xghamFrlbGdoYWaxraGdhsc2dosYWxr.c2RncaGFzabGtmkamFerbGpmraGFsaa2ZkaGEK

shiggynig

That host is awful.

OLED on a phone is so stupid. Plasma and OLED both aren't meant to be used as a display that constantly shows the same image in the same region of the screen.

At least I'll have something that's better than my old plasma now.

Horse shit. I clearly see flickering up to 300 Hz, maybe higher too, if I move my eyes, and it looks like complete crap and causes eye strain.

They do, but it's not as bad, because they have enormous thermal inertia, so the amplitude (the difference between the brightest and darkest states) is something like 15% instead of 100%. LEDs, on the other hand, turn off instantly, so an LED on AC or PWM is kill.
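
The inertia claim is easy to sanity-check with a toy first-order thermal model. Every constant below (the ~100 ms thermal time constant, the T^4 light approximation) is an assumption for illustration, not a measured value:

```python
import numpy as np

# Toy model of a filament on 50 Hz mains: heat balance dT/dt = (p(t) - T) / tau,
# where p(t) is the normalized electrical power (which pulses at 100 Hz) and
# tau is the filament's assumed thermal time constant.
f_mains = 50.0
tau = 0.100
dt = 1e-5
t = np.arange(0.0, 1.0, dt)
p = 2.0 * np.sin(2 * np.pi * f_mains * t) ** 2     # mean power normalized to 1

T = np.ones_like(t)                                # start near steady state
for i in range(1, len(t)):
    T[i] = T[i - 1] + dt * (p[i] - T[i - 1]) / tau

light = T ** 4                                     # crude: emission ~ T^4
steady = light[len(t) // 2:]                       # skip the start-up transient
depth = (steady.max() - steady.min()) / steady.max()
print(f"filament modulation depth: {depth:.0%}")   # ~10-15% with these numbers
print("an LED fed the same waveform tracks p(t) directly: ~100% modulation")
```

So the bulb does pulse at twice the mains frequency, but with these assumptions the depth comes out roughly an order of magnitude shallower than an LED driven from the same waveform.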

bullshit. have you even tried to recover any information from a "photo" taken by a camera without any lens attached?

I have no idea what I'm talking about: the post 2: the retardening

But does it do true black?

Okay, fine, it's 100/120 Hz. Happy now?

You're more knowledgeable now so yes

I never said they flicker at 50/60; that was another guy.

back to 4ddit, rakesh

Then you wasted his opportunity to learn on his own, so fuck you

Please go to a doctor, for the love of god.

Just buy flicker-free; didn't you guys know such a thing existed?

Yes.

Flicker free = full persistence
Full persistence = motion blur

Flicker = short persistence
Short persistence = sharp picture

VR screens have had a grip on this basic fact for, what, 5 years already? Yet desktop monitors are going in the complete opposite direction? What the fuck is wrong with you people?
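
The persistence/blur tradeoff above can be put in numbers: the perceived smear on a sample-and-hold display is roughly eye-tracking speed multiplied by how long each frame stays lit, which is the rule of thumb Blur Busters uses. A quick sketch, with arbitrary example speeds:

```python
def perceived_blur_px(tracking_speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate motion blur, in pixels, smeared across the retina while the
    eye tracks a moving object on a display that keeps each frame lit for
    `persistence_ms` milliseconds."""
    return tracking_speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # px/s, e.g. an object crossing a 1920 px screen in 2 seconds

# Full persistence at 60 Hz: each frame is lit for ~16.7 ms.
print(perceived_blur_px(speed, 16.7))   # ~16 px of smear
# Strobed / short persistence (CRT-like ~2 ms flash): same content, far sharper.
print(perceived_blur_px(speed, 2.0))    # ~2 px of smear
```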

Must suck to look at anything that emits light, other than the sun

Guess what, dipshit: vision is continuous. Flashing at any rate can be seen as discontinuous if, in the time between flashes, the object moves a significant fraction of its own size. Flickering is when you can see individual flashes of a stationary object.

Actually, let's explain it visually to you.

Actually I guess there IS an upper limit, defined by your retinal resolution and the speed of light. If a flashing object passes in a straight line directly in front of the cornea at the speed of light, then it would be about 3 cm of travel over 1/100 000 000th of a second, over entire field of vision. Peak retinal resolution is 0.6 arc minutes per line and falls off sharply immediately off center, so let's give it 1 arc minute per line over 5 degrees, average. Total field of vision of an eye is about 160 degrees, so that's about 3.15% or about 1 millimeter of travel over 1/3 000 000 000th of a second. With average resolution taken as 1 arc minute, over 5 degrees that's 300 lines. Assuming the source of light is a point and it's perfectly focused on the retina, this means that you can distinguish at most 150 flashes over 1/3 000 000 000th of a second, or 45 GHz flashing.

You think I was somehow not aware of it? You think basic PWM is some hot shit knowledge?

Wait my original fraction is 2 orders of magnitude smaller than it should be, final value is 4.5 THz - just a little 100 times shy of seeing individual photons.
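
Since the powers of ten are easy to fumble here, a quick script that redoes the arithmetic under the post's own assumptions (3 cm of travel at light speed spanning a ~160° field, 1 arcmin per line over the central 5°, and at most half of the 300 lines resolvable as distinct flashes); every constant is taken from the post, none are measurements:

```python
c = 3.0e8                 # m/s, speed of light
field_deg = 160.0         # assumed total field of view
span_m = 0.03             # 3 cm of travel covers the whole field (post's premise)

central_deg = 5.0
lines = central_deg * 60.0 / 1.0          # 1 arcmin per line -> 300 lines
max_flashes = lines / 2.0                 # Nyquist-style factor from the post

travel_m = span_m * central_deg / field_deg   # ~0.94 mm across the central 5 deg
travel_s = travel_m / c                       # ~3.1e-12 s at light speed

print(f"travel across 5 deg: {travel_m * 1000:.2f} mm in {travel_s:.2e} s")
print(f"max distinguishable flash rate: {max_flashes / travel_s:.2e} Hz")
```

With these inputs it lands in the tens-of-THz range, so the exact figure depends heavily on which factors you keep; either way the qualitative point stands, since the ceiling is absurdly far above anything a display actually does.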

Flicker is good for low-FPS motion, but causes eyestrain for stuff like office work. The two applications are mutually incompatible.

There is one higher-level alternative, synthetic motion interpolation (software like SVP, for instance), but it causes lag, so it's unsuited to interactive applications like games.
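
For reference, this is the general shape of what motion-compensated interpolation does. This is not SVP's actual algorithm, just a minimal sketch using OpenCV's Farneback dense flow; the blending is deliberately crude and will show exactly the kind of artifacts being discussed:

```python
import cv2
import numpy as np

def interpolate_midframe(frame0, frame1, t=0.5):
    """Crude motion-compensated in-between frame via dense optical flow."""
    g0 = cv2.cvtColor(frame0, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)
    # Dense Farneback flow: per-pixel displacement from frame0 to frame1.
    flow = cv2.calcOpticalFlowFarneback(g0, g1, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = g0.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # Sample frame0 a fraction t "back along" the flow, frame1 the rest of the way.
    map0_x = (grid_x - t * flow[..., 0]).astype(np.float32)
    map0_y = (grid_y - t * flow[..., 1]).astype(np.float32)
    map1_x = (grid_x + (1 - t) * flow[..., 0]).astype(np.float32)
    map1_y = (grid_y + (1 - t) * flow[..., 1]).astype(np.float32)
    warped0 = cv2.remap(frame0, map0_x, map0_y, cv2.INTER_LINEAR)
    warped1 = cv2.remap(frame1, map1_x, map1_y, cv2.INTER_LINEAR)
    # Blend the two warps; occlusions and fast motion will still leave artifacts.
    return cv2.addWeighted(warped0, 1 - t, warped1, t, 0)

# Usage (hypothetical file names):
# mid = interpolate_midframe(cv2.imread("frame_a.png"), cv2.imread("frame_b.png"))
```

Note that it needs the *next* frame before it can emit the in-between one, so it adds at least a frame of latency on top of the compute time, which is why this kind of processing suits video playback and not games.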

So what's their trick this time? Unreliable PSUs that shut off after a year for "electrical safety reasons"? Subscriptions and "screen-as-a-service"? Loads of money?

I applaud you: none of what you said was correct or even made sense. Just like flickering incandescent light bulbs don't cause any eye strain, flickering monitors don't either, because they flicker too fast for you to see it; effectively they're continuously lit.

A short-persistence display is good for high-framerate motion, not low. That's because a low-persistence display flashes an image for a very short duration and uses your natural persistence of vision to fill in the blank, resulting in very sharp and clear individual frames - this is why CRT animation looks better than LCD. But if you display the same frame for more than one flash in a row, you just induce double vision (triple, quadruple, etc., depending on how long you show the same frame).

Meanwhile, full-persistence displays induce ghosting by design: because the image is continuously lit, a movement of the eye across the screen continuously smears the image across the retina. The whole point of using short persistence is to avoid this smear and ghosting.

Motion interpolation has absolutely jack shit to do with any of it; if you're not going to flash the screen rapidly, you're going to get the same ghosting, because that's how your eyes work. Front-end motion interpolation is at best a shitty gimmick that fails in any non-trivial scenario. It should be done on the GPU during rendering, and even then only if the GPU couldn't render enough frames, because just like completely synthetic interpolation, it fails in any non-trivial scenario.

found the jew

More horse-shit-tier posting.
Flickering does cause eye strain; it has been proven plenty of times, and the only reason to keep it is to save money on production, because doing things properly is more expensive and most normies think flickering is okay, or can easily be brainwashed into thinking so.
Given the same frame rate, it doesn't.

Last time they had a technological leg up on the competition like that, they patented the shit out of it and proceeded to dominate the market for decades. Why would they purposely avoid releasing a successor to the Trinitron?

(((TalmudVision)))

Ever used a pinhole camera?
Ever heard of a compound lens?
It's easy to focus a light sensor to measure points of light by simply putting it behind a tube.

Sony had made its plans beforehand. It wasn't just Samsung that unveiled a microLED TV at CES, but Sony as well.
Actually, I don't know if you remember, but about a decade ago Sony laid off all of its senior engineers, the ones responsible for their top-notch tech, because they were "expensive" - and Samsung hired them all. Since then, Sony products have been decaying steadily, from absolute leadership to a small market share.
Due to Sony's financial collapse this past decade, they lost too much, and I don't even think they manufacture their own screens anymore; they outsource them either to China or to Samsung.
That may be how Samsung got hold of this microLED technology in the first place, either by industrial espionage or by blackmailing its way into a deal.

But here it is, Sony's gorgeous microLED screen:
youtu.be/B9S5WGwwhuA

There is no pinhole between your body and a fucking monitor. Of course I've heard of them, so what?
Yes, and?
That user tried to imply that a bare LED monitor may serve as a surveillance device, and this is horse shit.
Do you know any people who look at their monitor through a narrow hole or a tube?

Wrong. Even if flicker is fast enough to pass the perceptual fusion threshold, flicker at frequencies up to hundreds of Hertz still causes eyestrain, even if higher frequencies do cause less eyestrain:
ccohs.ca/oshanswers/ergonomics/lighting_flicker.html
Wrong. Motion interpolation allows you to run a monitor at higher refresh rates, reducing eyestrain due to flicker, reducing perceptual motion blur artifacts, and reducing discomfort caused by low-FPS motion.
Not the cheap ASIC garbage in a typical HDTV or set-top BD player, but good software like SVP. Watch the video on this page to see what I mean, the quality is very impressive:
svp-team.com/w/index.php?title=Main_Page

Flicker does cause eyestrain, but it IS good for smooth motion. See here for a more thorough overview of the subject:
blurbusters.com/faq/motion-blur-reduction/


PJs are great, but not really suited to some applications like everyday desktop workstation use.


Other poster is a drooling paranoid retard high on memes, but his spastic flailing reminded me of a really cool technology called "lightfield imaging", that uses a pinhole camera for every pixel. It allows you to take volumetric "images" with HDR, 3D, and infinite multifocus in each individual exposure:
en.wikipedia.org/wiki/Light-field_camera

Looks like a nocebo effect to me. I'd like to see some blind testing where the subjects cannot identify whether the light source was continuous or flickering.

There wasn't a huge stream of headache complaints from several decades of use of incandescent light bulbs, but with the introduction of fluorescent lamps that flicker at exactly the same rate, suddenly that became a problem - a basic case of calling horseshit 101. Fluorescent lights, however, differ from incandescent lights in their radiation spectrum: fluorescent lights have a narrow band with a lot of very sharp spikes, while incandescent bulbs have the smooth and very wide spectrum of black-body radiation, same as natural sunlight except at lower intensity.

And what do you know, it turns out that compensating for low average intensity with very strong, sharp spikes in the spectrum causes eyestrain, which is why laser TVs never made it to market. So I don't believe this has anything to do with flickering; I think it has everything to do with cheap, crappy fluorescent lamps having a poor emission spectrum.

There is no good software for this and there never will be, period. That's because it inevitably has to produce data out of nothing, and that will inevitably look wrong.
Yeah, it can do linear interpolation of a motion field; I could write software that does this in a span of a few minutes. What it can't do is make up for a lack of frames: whenever there's actual animation (as opposed to basic tweening), the intermediate frames are simply missing, like they are in the original anime. I don't find that impressive in any way, and if you were trying to argue that good interpolation is possible using this as an example, you only made my argument stronger.

There's a concept you need to understand with incandescent bulbs compared to fluorescent tubes and CRT, usually called "inertia". Even for devices of all three types that flicker at the same rate, the difference between the "top" and "bottom" of their emissive cycle differs. Observe:
Incandescent 120Hz bulb (shot at 300-1200FPS):
youtube.com/watch?v=eUprJS9sXYU
Fluorescent 120Hz tube (shot at 400FPS):
youtube.com/watch?v=wOLjvx7-C6E
CRT 60Hz display (shot at 240-1000FPS):
youtube.com/watch?v=8sThyWQC4RY
Notice how the incandescent barely dims each cycle, while the fluorescent nearly turns off, and the CRT is pitch-black aside from the scanline.
Because most people are too gay to use projectors, and RPTVs suck. True laser projectors are slowly trickling into the market, and share some of the advantages (and disadvantages) of CRT projectors over panel-based DLP/LCD projectors.
It can do a lot more than that, and the results look excellent to my eye. Obviously not as good as if anime and movies were filmed at 120FPS instead of 24FPS, but the massive improvement in motion quality far outstrips distraction from tiny spatial artifacts IMHO. Here's a side-by-side comparison to make the effect clearer:
youtube.com/watch?v=1MuFWCgDhgE

Just eliminate the flicker; screens can work without PWM.
ViewSonic actually makes screens with 2 ms response and a constant drive level, no alternating.

Again, read Blur Busters. Even though flicker is generally undesirable for most applications (office work, reading, etc.), it is desirable for motion applications like gaming and video, to the extent that it can be added artificially on zero-flicker displays using "black frame insertion".

Again, they've fixed that, they even have gaming monitors.
viewsonic.com/us/monitors/shop.html

Absolutely, flicker-free displays, with optional flicker for some applications, and dynamic sync, are ideal. Now we just have to wait for them to transition away from LCD garbage.

They're using LCD (TN and IPS) because an OLED monitor is idiotic - monitors are meant to last and OLED can't provide that reliability.
Now with microLED, though, it's a different story.
I'll probably buy from them when these start selling; no other manufacturer seems to have properly fixed the blur and flicker stuff apart from them.

OLED lifespans have improved very, very rapidly, from 14k hours to half-brightness in 2008, to 36k in 2013, to 100k in 2016:
flatpanelshd.com/news.php?subaction=showfull&id=1465304750
This compares favorably with CRTs or fluorescently-backlit LCDs, which were usually rated for 80k hours.

I'd still like silicon pixel technology, even if only because of the greater headroom for HDR, but OLED already outperforms every other technology on the market right now across the board, and better manufacturing choices (along the lines of Philips' LEP inkjet-style tech) could make them insanely cheap.
That's certainly really annoying, but my biggest peeve with most modern monitor integrators is their chipsets' lagginess, low FPS caps, and unwillingness to use high host bandwidth (nearly every mid-tier GPU since 2010 comes with 6 DP/HDMI outputs, and all current-gen GPUs have DP 1.3+, yet nearly every monitor to this day is stuck with un-ganged input over DP 1.2/HDMI 1.3).

horse shit

Read it:
blurbusters.com/faq/oled-motion-blur/

We've seen that the blur problem got fixed; there's no basis to advocate for a gimmick (flicker) to prevent it anymore.

You can't "fix" the blur problem without:
a) flicker
or
b) extremely high framerate

You can; you're just stuck in frequency-based thinking.
If you eliminate the disparity between the screen and the content (which is what the ghosting is), leaving only one variable (the content), then you get the pure experience, instead of a gimmick that tries to match the two frequencies.

The motion ghosting comes from your own eyes, dipshit. The only way to combat it is to use the wagon wheel effect, i.e. rapid flicker.

You know the viewsonic monitors have flicker, right?

They don't, read their literature. Neither LCDs nor LED backlights have any inherent technical reason to flicker.

They couldn't control brightness using methods other than PWM. That, and they can't spare an extra $3.50 for a proper PSU rather than a simple rectifier bridge without even a fucking ballast cap.

Lol, what a retard.

They don't.
viewsonic.com/me/products/lcd/flicker-free/

Not the case anymore. There are many monitors that use true analog DC amplitude control, here's a list of such displays from numerous manufacturers:
tftcentral.co.uk/articles/flicker_free_database.htm
And here's your choice of an article discussing it, or a video series (especially the 2nd video) demonstrating it with an oscilloscope and special camera:
tftcentral.co.uk/articles/pulse_width_modulation.htm
youtube.com/playlist?list=PL_Lv84oYYLR-_iExO6t3N37yeVM_EkEQD
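
The difference those articles describe boils down to this: PWM dims by chopping full-brightness pulses (100% modulation depth at any duty cycle below 100%), while DC/amplitude dimming just lowers the drive level (no modulation at all) for the same average brightness. A small illustrative numpy sketch; the 1 kHz PWM frequency and function names are arbitrary:

```python
import numpy as np

def backlight_waveforms(brightness: float, f_pwm: float = 1000.0, fs: float = 1e6):
    """One second of normalized luminance for PWM vs DC (amplitude) dimming
    at the same average brightness (0..1)."""
    t = np.arange(0.0, 1.0, 1.0 / fs)
    pwm = ((t * f_pwm) % 1.0 < brightness).astype(float)  # full-on/full-off pulses
    dc = np.full_like(t, brightness)                       # constant lower drive
    return pwm, dc

def modulation_depth(x: np.ndarray) -> float:
    return (x.max() - x.min()) / max(x.max(), 1e-9)

for level in (0.8, 0.3):
    pwm, dc = backlight_waveforms(level)
    print(f"{level:.0%} brightness: same mean ({pwm.mean():.2f} vs {dc.mean():.2f}), "
          f"but modulation depth {modulation_depth(pwm):.0%} vs {modulation_depth(dc):.0%}")
```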


Yes, it is indeed an inherent artifact of human visual perception, called "flicker fusion". The fact that people can be tricked into perceiving motion from sequences of still images at all is itself another quirk of human physiology called "persistence of vision", but it produces perceptual artifacts such as motion blur in "sample & hold" displays like LCDs or OLEDs (or, indeed, film projectors) that can only be reduced by reimplementing CRT/plasma-style flicker (which causes eyestrain), or by increasing the framerate (whether through computer synthesis of fake frames, or by originating and storing content at higher FPS). For a good overview of the subject, either read the following article:
tftcentral.co.uk/articles/motion_blur.htm
Or watch the above linked video series (especially the 1st video) in Youtube's 60FPS mode to see a demonstration of the effect with your own eyes.


You and that guy missed the point: I know it's human vision, but you need conditions to trigger it, and they must be in the monitor/content, so to fix it, fix them.

Rereading your posts, you seem to be referring to one of two things unrelated to what I'm talking about:
>b) Mismatch between playback framerate and display framerate with commonly divisible factors, on a flickering display (film/CRT/plasma). For instance, 24FPS or 60FPS playback being displayed at 120FPS, which results in perceptual ghosting, but also decreases motion blur and flicker (also, note this ghosting caused by displaying the same frame repeatedly is a perceptual illusion occurring in the person watching, and isn't the same as physical "crosstalk" artifacts where the display actually shows simultaneous ghost images). Ghosting and motion blur are not the same thing, though very large amounts of ghosting can appear similar to motion blur.
I'm talking about a third, totally unrelated effect, whereby "sample & hold" flicker-free or low-flicker displays cause you to perceive motion blur.

Ok, you finally understand me and I got your point.
Now, my solution is different from yours: to me, all monitors should become flicker-free (like ViewSonic's) and the content should adapt to it by increasing the frame rate.
Also, display persistence is hardly a problem - 7 ms is too much, but you can get top-class IPS today with 2 ms (I've only seen 1 ms in TN and never saw 0).

Like I said before, there's only two ways to do this: Either synthesize fake frames using a computer (software like SVP), which will of course result in artifacts and latency, or originate and store content at a higher framerate (MUCH higher, like hundreds of FPS). If you have low-FPS content and don't want artifacts or lag, flicker is the only solution.
I don't think LCD manufacturer figures are credible. For instance, while this was back in 2009, an LCD rated as "8ms response" yielded 2x-4x that much lag, depending on GtG conditions, when actually tested:
displaymate.com/LCD_Response_Time_ShootOut.htm
And attempting to get around this by switching from slow laggy tech like IPS to something faster (less slow, really) like VA, TN, or BPM, means sacrificing even more of LCD's already inadequate color/tone performance. LCD is just absolute garbage in every way.

I don't think it would be all that impossible. You see, there are 3 cases:
The OS and programs run native, so no problem there.
Rendering would just require new APIs (or not even that) to remove whatever FPS caps things might have and boost it all; 120 FPS is enough for that, and achieving it at 1080p today is easy. If it's static, like Autodesk stuff, there's no problem here either. Hardware power could solve the rest - for very old stuff, like 90s games and such, either update the cards to override their limitations or rely on future patches or revised versions of the game.
For video, I don't think there's any solution apart from that horrible "motion flow" stuff present in TVs.

Yeah, native content or other CGI with source files available obviously wouldn't be a problem, except as you noted for some legacy (especially emulated) or badly written stuff that's too hard to reverse-engineer. "MotionFlow" is Sony's consumer branding for the computationally interpolated postprocessing I referred to. Even on the matter of future content, interframe compression means the storage/distribution penalty for higher FPS video isn't necessarily that bad given well-optimized CODECs. Keep in mind, however, that going totally flicker-free would require very high-FPS displays, faster than any LCD on the market right now, and rendering stuff like games at typical resolutions (2k+) in realtime would max out a $1k PC.

Without such a hundreds-of-FPS setup though, you're right back at the binary choice of "flicker or motion blur".

Oh, and I noticed a moment too late that I didn't read the article I linked quite closely enough: the "8ms" figure was for rise/fall response, while the test was just for a single transition, so the real response was actually 4x-16x slower than rated.