This pisses me off.


It's cheaper and easier to take four already-manufactured LCD screens and stick them together with a logic chip/GPU than it is to manufacture new, truly-4K screens.
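
Roughly what that logic chip is doing, sketched here in Python/numpy purely as an illustration (real hardware splits the stream scanline-by-scanline, and the four-panel geometry is just the obvious assumption):
[code]
import numpy as np

# Hypothetical 4K framebuffer: rows x columns x RGB
frame = np.zeros((2160, 3840, 3), dtype=np.uint8)

def quadrants(fb):
    """Split a 4K frame into four 1080p tiles, one per glued-together panel."""
    h, w = fb.shape[0] // 2, fb.shape[1] // 2
    return {
        "top_left":     fb[:h, :w],
        "top_right":    fb[:h, w:],
        "bottom_left":  fb[h:, :w],
        "bottom_right": fb[h:, w:],
    }

tiles = quadrants(frame)
assert all(t.shape[:2] == (1080, 1920) for t in tiles.values())
[/code]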

filterfags caused this.

You deserve all the shit, 1080p is going to reign for a long time now, especially because codecs are hitting unmanageable complexity walls and chips are reaching their physical limits.
We're very likely to maintain this same level of technology we have today for a couple decades now.

Technology never stalls; if anything, the plateau we're on is due to mobile advancements.

>it just sometimes exhibits all of the characteristics of stalling for

You can play the pessimist all you want but at the end of the day things will keep moving forward. We're not at the end of microprocessor architecture. There are numerous ways of avoiding the "transistors being too small to be reliable" issue.

Also, just in case
samefag

nigger what? You can add more decoders and more LCDs easily. Worst case, you reach a point where mp4 needs to change to support more parallel decoding, but that's also easy.

This pisses me off.

Kek'd at the second figure.

plz stahp making lcds


Even without native content, high-end scalers like NNEDI3, NGU, or waifu2x can produce substantially greater detail.
Things are still advancing apace: VP9 and H.265 are permeating the market rapidly, AV1 and JVET's efforts should bear fruit sometime next year, and as always, the prospect of wavelet or fractal encoders holds even greater potential.
This is the one area where you have a bit of a point, and that's only because display manufacturers are being stingy about external interfaces. HDMI is substantially behind DP, and even for video modes exceeding DP 1.4, they could easily make displays that gang together multiple DP links, as the first generation of >2K LCDs did using multiple DVI connections (in particular the IBM T220, which supported up to four to reach full speed). With all but the cheapest GPUs now supporting 6 simultaneous full-speed DP connections, this is a perfectly practical solution nowadays.
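
To put rough numbers on the multi-link idea, a back-of-the-envelope sketch; the DP 1.4 payload figure is the nominal four-lane HBR3 rate after 8b/10b coding, and blanking/DSC are ignored, so real modes need somewhat more:
[code]
# Uncompressed pixel-data rate vs. DisplayPort 1.4 payload capacity.
DP14_PAYLOAD_GBPS = 4 * 8.1 * 0.8  # 4 lanes x 8.1 Gbit/s, 8b/10b coded = ~25.92

def video_gbps(width, height, fps, bits_per_px=24):
    """Raw data rate in Gbit/s for an uncompressed 8-bit RGB stream."""
    return width * height * fps * bits_per_px / 1e9

for name, mode in {"1080p60": (1920, 1080, 60),
                   "4K60":    (3840, 2160, 60),
                   "4K120":   (3840, 2160, 120),
                   "8K60":    (7680, 4320, 60)}.items():
    rate = video_gbps(*mode)
    links = -(-rate // DP14_PAYLOAD_GBPS)  # ceiling division: DP links needed
    print(f"{name:8s} {rate:6.2f} Gbit/s -> {int(links)} DP 1.4 link(s)")
[/code]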


Ancient Chinese secret: if you want more vertical space, use multiple widescreens in portrait orientation; 16:9 (≈1.778) is fairly close to the golden ratio (≈1.618) long favored by printers.

The golden ratio is spooky Confucian bullshit; the most fitting aspect ratio for human video is closer to 5:4, or even 1:1.

Only if you're a popeyed pirate

There is no value to peripheral vision on a computer where you can move the camera instead of your focus; you get almost no information from it despite paying the cost of rendering a billion pixels. It's why the Oculus's per-eye resolution is 1080×1200 and not some ultrawide meme.

Dispensable
Do you know how much computational complexity AV1 demands? 88 cores.
Kys, cry me a river for your useless 4K screen.
Yep, you're retarded and don't know what you're talking about.

Same thing happened in the 4:3 to 16:9 transition.
The solution is to buy a much bigger monitor to compensate.
In the end, though, ultrawide is for retards.

The value of peripheral vision is that you can be aware of multiple things happening even if you can't see all of them clearly, since even inside your peripheral vision you can't actually see very well. Look closely at the diagram in my previous post, keeping in mind that it assumes you have nominally 20/20 vision.
In case you're wondering why you seem to be able to read 6-point type across an entire page, it's because your eyes are subconsciously making tiny, extremely fast movements at all times, called saccades (the extraocular muscles are the fastest in the body), and your brain averages these together to fake a larger field of view. That's also how the "blind spot" offset 20° from center in each eye is made invisible. As a result, the priority when designing a monitor is to have as much information as possible available across the entire field of view (constrained both by eye-socket rotation and by obstruction from the brow, cheeks, and nose), while having enough resolution for the viewer to focus their attention on any part of it with eye movement alone.
en.wikipedia.org/wiki/Peripheral_vision
en.wikipedia.org/wiki/Visual_acuity

As for the current state of VR, that is a terrible example. The real reason is that they're recycling last-gen cellphone panels behind corrective optics, so the grainy 20-25 µm dot pitch is spread very linearly across their tunnel-vision-riffic 90-100° horizontal FoV (compared with the 220° human horizontal FoV), producing 10-arcminute-wide pixels at the center of each eye's view (for comparison, 20/20 acuity resolves parallel lines to within 1 arcminute, and vernier angles to within 8 arcseconds):
roadtovr.com/understanding-pixel-density-retinal-resolution-and-why-its-important-for-vr-and-ar-headsets/
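
If you want to check the angular figures yourself, a quick sketch; the monitor size, viewing distance, and headset numbers are illustrative assumptions, not measurements of any particular device:
[code]
import math

ARCMIN_PER_DEG = 60

def monitor_arcmin_per_px(width_px, width_mm, distance_mm):
    """Angular width of one pixel on a flat monitor viewed head-on."""
    px_mm = width_mm / width_px
    return math.degrees(2 * math.atan(px_mm / (2 * distance_mm))) * ARCMIN_PER_DEG

def hmd_arcmin_per_px(fov_deg, px_across):
    """Crude HMD figure, assuming the optics spread pixels linearly over the FoV."""
    return fov_deg * ARCMIN_PER_DEG / px_across

# 24" 1080p monitor (~531 mm wide) at 60 cm: ~1.6 arcmin/px
print(monitor_arcmin_per_px(1920, 531, 600))
# Hypothetical headset, 1080 px across a 100 deg horizontal FoV: ~5.6 arcmin/px
print(hmd_arcmin_per_px(100, 1080))
# 20/20 acuity resolves about 1 arcmin, so both fall well short at the fovea.
[/code]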


It's an alternative to multiple monitors, or at least to quite so many of them, especially the curved ones. The real "for idiots" option is trying to be productive on a single monitor.

Upscaling 1080p to 4K gives worse detail than viewing that same 1080p on a 1080p screen. Yes, you're retarded, using gimmicks to fix a problem you yourself created.
LOL. Did you think I said that? This means increasing encoding complexity, which affects WHOLE INFRASTRUCTURES. They're hitting a wall, especially on the processor side.
Yep, you're REALLY retarded. Go shitpost elsewhere.

If you can't see a major improvement, see a doctor.
GPGPU has advanced to the point where the entire codec can be run on the shader pipeline, and GPU power is dirt cheap. As for lower-quality realtime encodes, AV1 ASICs will be ready about a year after the spec is frozen.

One day processing power will be great enough to run waifu2x as a real-time video upscaler.

github.com/nagadomi/waifu2x/issues/117
Maybe for old 2D vidya on a barely-affordable screaming quad-SLI rig, but barring major optimization, probably not for 1080p-to-4K upscales.
For console games, probably the best approach is offline-upscaled spritepacks using special emulators:
>>emulation.gametechwiki.com/index.php/Texture_Packs#Sprite_Replacement_(2D)**
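
To make the "probably not real-time" point concrete, a trivial sketch; the seconds-per-frame figures are made-up placeholders, so plug in whatever the linked issue reports for your hardware:
[code]
# How per-frame inference time translates into upscaling throughput.
def fps(seconds_per_frame, gpus=1):
    """Throughput if frames are farmed out across GPUs (latency ignored)."""
    return gpus / seconds_per_frame

for spf in (2.0, 0.5, 0.05):  # hypothetical s/frame for one 1080p->4K pass
    gpus_for_60 = spf * 60    # parallel GPUs needed to sustain 60 fps
    print(f"{spf:5.2f} s/frame -> {fps(spf):5.1f} fps on one GPU, "
          f"~{gpus_for_60:.0f} GPUs for 60 fps")
[/code]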

There's nothing I hate more than these sentiments.

Isn't there an OpenCL port of waifu2x? And what about the CUDA acceleration it has? Who knows, it could be feasible for 1080p in a few years.

Does it use double or single precision? I wonder if using half precision would make a big difference, since it's twice as fast on newer GPUs. It might still look better than NNEDI3.
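
Easy enough to test, though. A minimal sketch in PyTorch (not waifu2x's actual Torch7/Lua codebase) timing the same toy conv stack in float32 vs. float16, assuming a CUDA GPU; whether fp16 really doubles throughput depends on the card's native half-precision rate:
[code]
import time
import torch

assert torch.cuda.is_available()

# Toy stand-in for an upscaler's conv layers, not waifu2x's real architecture.
net = torch.nn.Sequential(
    torch.nn.Conv2d(3, 32, 3, padding=1), torch.nn.ReLU(),
    torch.nn.Conv2d(32, 32, 3, padding=1), torch.nn.ReLU(),
    torch.nn.Conv2d(32, 3, 3, padding=1),
).cuda()
x = torch.randn(1, 3, 1080, 1920, device="cuda")

for dtype in (torch.float32, torch.float16):
    m, inp = net.to(dtype), x.to(dtype)
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    with torch.no_grad():
        for _ in range(10):
            m(inp)
    torch.cuda.synchronize()
    print(dtype, (time.perf_counter() - t0) / 10, "s/frame")
[/code]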

fug, screwed up pasting and spoilering url:
emulation.gametechwiki.com/index.php/Texture_Packs#Sprite_Replacement_(2D)

Waifu2x actually looks fucking ugly on anything that isn't very specific moeshit anime.

It's almost like it's been specifically trained on that dataset.

This is retarded. A wider screen with the same height, undeniably, has MORE space.

Does it really matter?

If you don't mind watching videos like you're looking out of a prison cell, sure.

He's just grasping for affirmation of his poor choices. He fell for the meme.

I have a CRT and projector for that, and only use LCDs for text and lineart.

Good grief. Not only are you burning your eyes out, you can also easily get neck pain from having to look up like that.

3/10.

Get your head out of the gutter young man

LARPer (or moron) detected
LOL

What are color gamuts?

Just don't listen to the guy; he's baiting. He can't be that retarded.

Ergonomically speaking, the top edge of your screen should be level with your eyes, which, unless you're a midget or you slump in your seat, puts it about 2' above your desktop.

o