Interference between the crystals and the polarizers is what causes poor viewing angles in such a thin display. Plain and simple.
CRT (and every other technology aside from direct-view LCD) has no noticeable distortion until you're within a few degrees of parallel to the screen, at which point it rapidly degrades. LCDs, on the other hand, show noticeable color and gray distortion even a few degrees from dead center (especially vertically).
Is it? Aside from the often questionable judgment of professionals at the industry level (like the transition from film to video for even the highest-budget movies, and my own personal experience of prepress being force-marched onto LCDs in the mid-2000s), do they really have a choice? The rapid shutdown of general-market CRT and plasma factories and laboratories eliminated those as an option for other markets; the supply of OLEDs is still trickle-thin (the OLED version of the display you linked still costs almost double!); and comparatively tight research budgets for OLED mean lifespan and brightness are still far lower than they should be.
bpm-media.de/en/Post-Production/Video-Monitors/27-47-Monitors/Sony-PVM-X300::357299.html
bpm-media.de/en/Post-Production/Video-Monitors/27-47-Monitors/Sony-BVM-X300::368862.html
Same reason TN still competes against IPS in different markets today, just like always.
Ah, I understand what you're saying now. In other words, instead of fake 8-bit (dithered up from 6-bit), it's fake 10-bit (dithered up from 8-bit). As opposed to the true >8-bit output that everything other than LCD has always been capable of.
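For anyone unfamiliar with how that "fake" bit depth works: it's frame-rate control (FRC), temporal dithering that flips a pixel between two adjacent native codes over successive frames so the average lands in between. A minimal sketch, assuming a hypothetical 6-bit panel faking an 8-bit level (the function name and 4-frame cycle are illustrative, not any vendor's actual scheme):

```python
# Sketch of FRC: approximate an 8-bit level on a 6-bit panel by
# alternating adjacent 6-bit codes so the temporal average lands between them.

def frc_frames(target_8bit: int, num_frames: int = 4) -> list[int]:
    """Return per-frame 6-bit codes whose average approximates target_8bit / 4."""
    base, frac = divmod(target_8bit, 4)   # 8-bit value -> 6-bit code + 2-bit remainder
    base = min(base, 63)
    # Spend `frac` out of every 4 frames on the next-higher code.
    return [min(base + 1, 63) if (i % 4) < frac else base
            for i in range(num_frames)]

frames = frc_frames(130)                  # 8-bit 130 = 6-bit 32, remainder 2
avg_as_8bit = sum(frames) / len(frames) * 4
print(frames, avg_as_8bit)               # [33, 33, 32, 32] averages back to 130.0
```

Your eye does the averaging; the cost is a faint flicker/crawl on near-threshold levels, which is exactly why it's "fake" rather than native depth.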
That still shows 2.1ms of controller latency before the panel even twitches. That's very good, in fact many times better than 99% of PC LCDs (which is terribly shameful in and of itself, but anyway…), but it's still orders of magnitude slower than the nanoseconds it takes a GPU's RAMDAC to command a CRT. Why? The same is true of panel controllers for plasma, OLED, DLP, LCoS, and all the rest, yet the task they're performing is simpler in principle than a RAMDAC's. Somebody will have to unfuck panel controllers, one way or another.
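To put that 2.1ms in perspective, here's a back-of-envelope comparison against standard 1080p60 timing (the pixel clock and line count are the common CTA-861 figures, assumed for illustration):

```python
# Rough scale comparison: a RAMDAC-driven CRT responds within roughly one
# pixel period, while 2.1 ms of controller lag is a big chunk of a frame.

PIXEL_CLOCK_HZ = 148.5e6   # assumed: standard 1080p60 pixel clock
FRAME_TIME_S = 1 / 60      # 60 Hz refresh
TOTAL_LINES = 1125         # 1080p total lines including blanking

pixel_period_ns = 1e9 / PIXEL_CLOCK_HZ               # ~6.7 ns per pixel
controller_lag_s = 2.1e-3

lag_in_pixels = controller_lag_s / (pixel_period_ns * 1e-9)
lag_in_lines = controller_lag_s / (FRAME_TIME_S / TOTAL_LINES)
lag_in_frames = controller_lag_s / FRAME_TIME_S

print(f"{pixel_period_ns:.1f} ns per pixel")
print(f"2.1 ms ≈ {lag_in_pixels:,.0f} pixel periods")
print(f"      ≈ {lag_in_lines:.0f} scanlines, or {lag_in_frames:.0%} of a frame")
```

By the time the controller has finished thinking, a CRT would already be ~140 scanlines into drawing the image.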
You realize scaling in software, or even on your GPU, will look FAR better than the worthless scaler in your LCD, right?
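The simplest case of doing it yourself is integer nearest-neighbor upscaling, which is trivial in software and produces zero ringing or blur, yet most panel scalers won't even offer it. A pure-Python sketch (the function name is mine, and real pipelines would do this on the GPU):

```python
# Integer nearest-neighbor upscale: repeat each pixel `factor` times in
# both dimensions. No filtering, no artifacts; each source pixel becomes
# a crisp factor x factor block.

def integer_upscale(img: list[list[int]], factor: int) -> list[list[int]]:
    out = []
    for row in img:
        wide = [px for px in row for _ in range(factor)]  # stretch horizontally
        out.extend([wide[:] for _ in range(factor)])      # repeat vertically
    return out

src = [[1, 2],
       [3, 4]]
print(integer_upscale(src, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Anything fancier (lanczos, bicubic, shader-based) is still cheap on a GPU and still beats the panel's built-in scaler.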
!?!?!?!? I've never heard of SED/FED leaving the handmade prototype stage. Anyway, my ideal tech is synthetic LED, since it's far more mature than anything else. I've never understood why it's treated as so difficult to manufacture (by, for instance, Sony's Crystal LED project, or mLED projects) even though the emitters are made on wafers, much as LCDs are made on glass.