This pisses me off

...

Upscaling 1080p to 4K gives you less detail than just watching that 1080p video on a 1080p screen. Yes, you're retarded, using gimmicks to fix a problem you yourself created.
LOL. Did you think that's what I said? This means increasing encoding complexity, which affects WHOLE INFRASTRUCTURES. They're hitting a wall, especially on the processor side.
Yep, you're REALLY retarded. Go shitpost elsewhere.

If you can't see a major improvement, see a doctor.
GPGPU has advanced to the point where the entire codec can be run on the shader pipeline, and GPU power is dirt cheap. As for lower-quality realtime encodes, AV1 ASICs will be ready about a year after the spec is frozen.

One day there will be enough processing power to run waifu2x as a real-time video upscaler.
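If you want to gauge how far off that is, something like this would be the test: pull decoded frames out of ffmpeg, push each one through the upscaler, and check whether you keep up with the source framerate. The upscale step here is just a nearest-neighbour placeholder standing in for the actual waifu2x call, and the filename/resolution are made up:

import subprocess, time
import numpy as np

W, H, FPS = 1920, 1080, 24              # assumed source format
FRAME_BYTES = W * H * 3                 # rgb24

def upscale_2x(frame):
    # Placeholder for the real model: naive 2x nearest-neighbour.
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

dec = subprocess.Popen(
    ["ffmpeg", "-v", "error", "-i", "input.mkv",
     "-f", "rawvideo", "-pix_fmt", "rgb24", "-"],
    stdout=subprocess.PIPE)

frames, start = 0, time.time()
while True:
    buf = dec.stdout.read(FRAME_BYTES)
    if len(buf) < FRAME_BYTES:
        break
    frame = np.frombuffer(buf, np.uint8).reshape(H, W, 3)
    upscale_2x(frame)
    frames += 1

elapsed = time.time() - start
print(f"{frames / elapsed:.1f} fps achieved, need {FPS} for real time")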

github.com/nagadomi/waifu2x/issues/117
Maybe for old 2D vidya on a barely affordable screaming quad-SLI rig, but barring major optimization, probably not for 1080p→4K upscales (rough numbers at the end of this post).
For console games, probably the best approach is offline-upscaled spritepacks using special emulators:
>>emulation.gametechwiki.com/index.php/Texture_Packs#Sprite_Replacement_(2D)**
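Rough numbers for why 1080p→4K looks out of reach without big optimizations, assuming the original ~7-layer vgg-style waifu2x net with 3x3 convs and 32/32/64/64/128/128 feature maps (layer widths quoted from memory, so treat them as an assumption):

# Back-of-envelope: FLOPs to run a vgg_7-style waifu2x pass over every
# pixel of a 4K frame at video framerates. Layer widths are assumed.
layers = [(3, 32), (32, 32), (32, 64), (64, 64), (64, 128), (128, 128), (128, 3)]
flops_per_pixel = sum(2 * 3 * 3 * cin * cout for cin, cout in layers)   # 3x3 convs

out_pixels = 3840 * 2160            # 4K output
fps = 24
total = flops_per_pixel * out_pixels * fps

print(f"{flops_per_pixel / 1e3:.0f} kFLOPs per output pixel")
print(f"{total / 1e12:.0f} TFLOPs/s for 4K @ {fps} fps")
# A high-end consumer GPU of the era does on the order of 10 TFLOPs fp32,
# so even a quad-GPU rig falls well short before any optimization.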

There's nothing I hate more than these sentiments.

Isn't there an OpenCL port of waifu2x? And what about the CUDA acceleration it has? Who knows, it could be feasible in a few years for 1080p.

Does it use double or single precision? I wonder if using half precision would make a big difference, since it's twice as fast on newer GPUs. It might still look better than NNEDI3.
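It's almost certainly single precision; fp32 is the default in Torch/Chainer, nobody runs these nets in double. If you want to see what fp16 actually buys on your card, timing the same kind of conv stack both ways is enough. This sketch uses PyTorch with made-up layer widths and a 720p tile, so it's a ballpark, not the real waifu2x model:

import time
import torch
import torch.nn as nn

def make_net(channels=(3, 32, 32, 64, 64, 128, 128, 3)):
    # waifu2x-ish stack of 3x3 convs; widths are an assumption, not the real model
    layers = []
    for cin, cout in zip(channels[:-1], channels[1:]):
        layers += [nn.Conv2d(cin, cout, 3, padding=1), nn.LeakyReLU(0.1)]
    return nn.Sequential(*layers[:-1])      # no activation after the last conv

def bench(dtype, runs=20):
    net = make_net().to("cuda", dtype).eval()
    x = torch.randn(1, 3, 720, 1280, device="cuda", dtype=dtype)  # one 720p tile
    with torch.no_grad():
        net(x)                              # warm-up
        torch.cuda.synchronize()
        t0 = time.time()
        for _ in range(runs):
            net(x)
        torch.cuda.synchronize()
    return (time.time() - t0) / runs

if torch.cuda.is_available():
    print(f"fp32: {bench(torch.float32) * 1000:.1f} ms/tile")
    print(f"fp16: {bench(torch.float16) * 1000:.1f} ms/tile")

Whether fp16 hurts the output is a separate question from speed, but for an inference-only upscaler it's usually worth trying.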

fug, screwed up pasting and spoilering url:
emulation.gametechwiki.com/index.php/Texture_Packs#Sprite_Replacement_(2D)

Waifu2x actually looks fucking ugly on anything that isn't very specific moeshit anime.

It's almost like it's been specifically trained on that dataset.