With monitor resolutions getting higher and higher, do we even NEED Anti Aliasing any more?

yes.

Do we need consoles?

I don't think we ever needed AA in the first place. My eyes have always perceived it as blurring, and constantly try to focus the image. I prefer my images clean and sharp.

Also 4K monitors are a meme.

>works on my computer fucked up brain

For now, yes. We'll need it until a single black pixel lit up in a white field is small enough to be almost indiscernible. It's not as important now, but as long as individual pixels are visible (yes, even at "retina" resolutions), we need it.

just use supersampling to render at a higher resolution than what your screen can actually fit

That depends on angular resolution. Your retina maxes out at about 0.6 arcminutes per line, i.e. 100 lines per degree, give or take. If, at your viewing distance, screen dimensions and resolution, you get more than 100 pixels per degree, then no, you don't need anti-aliasing, because your eyes can't resolve any finer features than that.

This is usually a topic to bring up in VR headset threads, not desktop monitors.

Same here, but last time I started a thread saying something of this nature I got called a consolefag among other things, and a lot of people seemed to think I had no idea what AA was or did.

Also, to give a simple example:

If your full HD monitor takes up 20 degrees of your field of vision, or your 4K monitor takes up 40 degrees, then that is the pixel density that maxes out your eye's perceptual capability. (There's a quick calculator below.)
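If you want to check your own setup, the arithmetic is a one-liner; here's a quick Python sketch (numbers are illustrative; a ~24" 16:9 panel is about 20.9" wide):

import math

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    # Horizontal field of view the screen covers, then pixels per degree
    fov = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov

print(pixels_per_degree(1920, 20.9, 59))   # 1080p ~5 ft away: ~96 px/deg
print(pixels_per_degree(3840, 20.9, 24))   # 4K at 2 ft:       ~82 px/deg

At roughly 100 px/deg or more, per the post above, individual pixels stop being resolvable.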

You got memed on. Consolefriends love blurred shit. 720p upscaled to 1080p? Game covered in hideous chromatic aberration, layering blur on top of blur? LOOKIN' GOOD!

They probably don't disable hdr and bloom even if they're able to.
Disgusting!

For a 24" monitor to take up 20 degrees of your vision, it would have to be sitting more than five feet away.

HDR I go on a game-by-game basis but I turn off bloom if it's ever an option because I have literally never enjoyed it at all.

Sounds like about adequate distance for such a large monitor.

Are you retarded?

lol nevermind

But hey, it's your fault anyway. If you didn't have such an ass-backwards retarded measurement system, I wouldn't have fucked it up.

Let me know when the metric system has put a man on the Moon.

I'm running 1440p and still need it but it's much better than 1080p. I haven't spent enough time on 4k to tell but I suspect you could get away without it on a sub 30 inch monitor.


Problem is, eyes aren't digital, and we are hardwired to spot patterns as well as anything that looks off. If you have a horizontal line with a single-pixel offset in the middle, you are going to notice, even if your eye can't clearly resolve the square edges perfectly.

1961 was it?

Metric system put a man in America.
A terrible mistake.

No, you're not going to. Draw such a line in Paint, then back away from your monitor to max out angular resolution.

Eyes aren't perfect and neither are our brains. They can be tricked.

Maybe yours, poorfag.

1969, done with slide rules and analog radio and unmatched by anyone else in the world in the nearly two generations since. I don't know if the rest of you are even trying.


But Spain didn't adopt the metric system until the 1850s.

That is a shitty comparison of unrelated effects; it's like using vid related to argue audio quality doesn't matter because visuals affect what we hear.

I have a 1440p monitor and the difference with AA isn't really noticeable to me in most games. Text and other 2D vector graphics, on the other hand, are absolutely noticeable; not that I don't like
(´・ω・) non-AA fonts
from time to time.

New technologies just mean that devs will get even lazier. "lol who needs sharp textures when you run the game at 4k?"

Better question
Is the significant performance loss for the incredibly insignificant quality increase between Ultra and Medium at 4K worth the investment, and should it be the benchmark standard?

I mean, not that I am the best judge when it comes to this kind of thing.
I still play most games in 480p.

4K is a bad choice for gaming. It's a downright retarded choice for console gaming, where you are sitting too far away from the TV to have any hope of noticing the difference.
1440p is a good middle ground for workstations, with better resolution and slightly increased pixel density allowing for larger displays with more usable real estate. 4K monitors are tempting but have software problems with DPI scaling, and limited refresh rates for games. Whereas 144hz 1080p monitors are cheap, and 144hz 1440p monitors are affordable.

I see you are a man of taste.
                    /:::::::::::::::::::::::::::::::::::::::::::::::\                   /::::::::::::::::::::::/::::::::::::::::::::::::::::::::::ヽ                /::::::::::::::::/ /::::::/:::::::::::::::::::::::::::::ハ             /:/::::::::::::/_ l:l::::::l::::ハ::l:::l:::::}::::}::::::::i                ;::/::::::::::/i{  _ l l:::::l:l::i-}:l:::!l:::l::::;:::::::::}                 {/l::::l::::/,ィ芍ミ`  ̄ "-‐Lノ !;j:::/:::::::::!              { i::::i:::{ `{沙j     仍芯

I wanna fuck that font

I think you have no idea what AA does.

Without it, you get all kinds of flickering pixels on small/distant objects and high-contrast edges; pixels and edges that would fade out smoothly and suggest sub-pixel detail will instead just pop in and out of existence. Things can and probably will actually look less comprehensible without AA. People who spout this "muh blur" shit are probably using some meme aliasing like that old Nvidia thing that literally just blurs the screen, or a shitty Game Boy resolution where half the screen is antialiased pixels.

No. While anti-aliasing does add additional detail, and that's less necessary at higher resolutions, it also removes various aliasing artifacts that persist regardless of resolution.

Like the anon above said, moire, undersampling, shimmer, jaggies, and numerous other irritating artifacts are greatly reduced by even the simplest anti-aliasing.

Note that while the same is true of texture filtering and the similar artifacts it removes, texture filtering doesn't fix artifacts generated by geometry. Contrariwise, unfiltered textures can produce entirely new artifacts when passed through frame-level AA, so both texture filtering and AA working together are necessary to eliminate all image artifacts.


Aren't the only things that matter to gamers from a purely mechanical standpoint spatial resolution, framerate, and latency? Leaving aside the fact that LCD's garbage response times mean caring about latency or framerate is near-pointless until somebody makes an OLED PC monitor, DP 1.3/1.4 means 4k/120Hz monitors are coming very soon. That's a good compromise between precise visuals and fast action.


Pleb-tier

These pictures mean less than they should when they're covered in jpeg artifacts anyway.

Click them, Holla Forums recompresses thumbnails itself. Either way, you get my point, right?

...

You're fucking blind if you can't tell the difference between a 144hz LCD and a 60hz LCD, though I also eagerly await the day OLED becomes a viable PC monitor tech.
And there's a difference between looking at a regular font with no antialiasing and horrible curves, and a font that has been specifically designed and hinted (or even hand pixeled) for low-res non-AA display. Jap fonts in particular rely on this because there was no way to display kanji fonts at 90s/2000s resolution without human intervention. They have charm for the same reason that pixel art does when done right.

 (((;;;:: ;: ;;          ;; ;:;::)) ::)   ( ::: (;;   ∧_,∧   );:;;;)) )::: :; :))    ((:: :;;  (´・ω・)っ ;;;; ; :))     ((;;;  (っ ,r patpatpatpatpat・・・・・          i_ノ┘ ((;;;;゜;;:::(;;:  ∧__,∧ '';:;;;):;:::))゜))  ::))) (((; ;;:: ;:::;;⊂(´・ω・`)  ;:;;;,,))...)))))) ::::)  ((;;;:;;;:,,,." ヽ ⊂ ) ;:;;))):...,),)):;:::::))))   ("((;:;;;  (⌒) |patpatpatpatpat・・・・・         三 `J         .∧__,,∧       ⊂(´・ω・`)⊃    ☆   ノ   丿 screee      ヽ .ノ  (⌒) 彡       と_丿=.⌒            (⌒⌒)      ∧_ ∧ ( (You) )     ( ・ω・` ) ノノ~′       (⊃⌒*⌒⊂)       /ノωヽ

Jagged edges on a sloped line aren't a big deal. The problem comes when there is a series of lines with the same slope but varying offsets. Then the jagged edges create the optical illusion of a wave like the one in the image.
So until the digital resolution is greater than twice the maximum spatial frequency, anti-aliasing may be needed to remove things like the moiré patterns that cause flat surfaces to look wavy.

This for fuck's sake, I can't believe so many people don't get it. It's all about how many pixels there are spread over your field of view, not overall resolution. Not much point in going from 21" 1080p to 36" 1440p, unless you only wanted a larger monitor rather than better resolution.

AA always looks like shit. What's needed is better filtering to get rid of shit like moire patterns.

AA doesn't help with that; look into 3D rendering filters for that instead.

The lack of AA is always shit. Only nostalgia goggle sporting hipster fags think no-AA is good.


I think you mean texture filtering.

Specifically, Mipmaps and Anisotropic filtering.
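For anyone wondering what the mipmap half of that actually computes: the hardware keeps prefiltered copies of the texture, each half the resolution of the last, and picks one based on how many texels land on a single screen pixel. A minimal sketch of the level selection (real GPUs derive the footprint from per-pixel UV derivatives and blend between two adjacent levels; anisotropic filtering additionally takes several samples along the squashed axis):

import math

def mip_level(texels_per_pixel):
    # Each mip level halves resolution, so the level is just the log2
    # of the texel-to-pixel footprint, clamped at the base level.
    return max(0.0, math.log2(max(texels_per_pixel, 1e-6)))

print(mip_level(1.0))   # 0.0: texture at native size, use the base level
print(mip_level(4.0))   # 2.0: 4 texels per pixel -> quarter-res mip
print(mip_level(16.0))  # 4.0: far away / steep angle -> tiny mip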

False, you mong; you could just sit slightly farther away with the 36'' monitor.


I second this, 60hz vs 144hz is dramatic. I have run 60 and 144 next to each other in dual screen for over a year. You can easily see it by moving a mouse around your desktop or dragging a window around.

Like I said upthread, texture filtering is basically AA for textures, meaning it doesn't work on geometry, which will only become more important with the increasing prevalence of tessellation shaders.


High FPS LCDs are better, but the smearing and ghosting is terrible compared to CRT/plasma/OLED/etc, and it only gets worse as FPS increase. Not to mention that every method of mitigating LCD's refresh problems trades off worsening the already awful black/gray/white/color performance of LCDs.

What's the point of buying a larger screen if you sit at a distance that effectively makes it smaller, apart from wasting money? Also since we're talking monitors, your distance to it is mostly constant, unless you want to not utilize your desk and sit with the keyboard on your lap.

I like the first fence image more than the second

sue me

...

...

Your shitty genes save you bandwidth. Kinda neat.

I need to fix my eyes friendos

yes. I play at 4K on a 24" and it's still ugly w/out AA

I never thought of it like that. It's 2017 and my internet speeds will only allow me to watch videos in 360p.

anti-aliasing is really nice for very high resolution stills. For gameplay in motion, a subtle motion blur will get the job done and be less demanding. A CRT has a natural blurring that maintains both fluidity and image quality superbly, far better than any LCD.

Since I play games, not screenshots, I don't care much for anti-aliasing.

Higher resolutions don't suddenly eliminate the finite nature of monitors. And we're still far from resolutions where aliasing is beyond human ability to spot; even 4K phones still benefit from anti-aliasing.

But you don't.

Alright, you don't know what you're talking about and just want to sound clever.

Depends on the AA: if it's MSAA, then yeah, it won't. If it's supersampling, yes it will.

Not to mention higher refresh rates help with temporal aliasing.

That's because your brain fills in the gaps when something is missing detail. You can notice it when you look at an old pic of someone and suddenly you're caught imagining colors or detail that just isn't there. Things like that are also why people see Jesus in clouds, and they're the reason why graphical comparisons on heavily lossy videos are irrelevant.

Yes. If you think otherwise, you may need corrective lenses.

...

Anti-aliasing is still needed at our current resolutions, as it's still possible for things to line up just right so that you notice the pixels, but running it past 2x, or with any of those graphics-card-destroying special algorithms, is useless.

Texture fidelity that matches your resolution when your camera is pressed up against something is more important than model jaggies anyway

Get the fuck out, you autistic faggot.

...

Antialiasing has always been and will always be pleb trash for morons. Muh graphfix. Idiots.

I see you

...

Not for long
Fucking consolefags

I've heard 4K doesn't require AA, and I sure as hell don't want to overwork anything with AA on top of AA

4K on top of AA*

it's a shitty joke because it's not funny

I know people who play emulated 2D SNES games and layer on so many fucking filters that it looks like a blurry mess and all detail from the pixel art is lost, so well-designed sprites that use single pixels to hint at details just look like fucking coloured blobs.

I hope these types of people kill themselves.

I'm not switching to 4k until 2020 at the earliest.

...

You've probably never seen or used a 4K screen.

I've been playing in 4K since 2015 and it still needs AA. Playing without AA is less disgusting than at lower resolutions, but it's noticeable.

You can live without it more easily, but you'll use it if you can. You want TAA, TXAA, or SMAA if you play in 4K.

...

This angers me. I know a faggot who does that shit and says it looks just like a CRT, all the while he could've just plugged his computer into the fucking CRT TV that sits beside him.

I don't mind filters like that too much; I was more talking about the fucking smoothing filters some retards like. I've been told they use them because they "don't like pixels", but you lose so much of the fucking detail that it triggers my autism hard.

...

I've been fortunate enough to do a side by side comparison of the P2415Q to my U2414H. Do you know what was the only situation where I found the pixel density resulted in a big difference in fidelity? Fucking emulation with CRT shaders. How sad is that? A $500 meme monitor to replicate a $40 CRT.

Ayy. I know people that are actually like that.

That's why I sit with my monitor on the front rim of my desk (using a pull-out mouse/keyboard tray) with the screen like 18 inches from my face.

Stop lying.

Specifically with the Kurozumi fork of crt-royale.

That's almost certainly because you're referring to shit AA like FXAA or TAA.

...

The image on the right looks horrible. Who would want to play something like that?

Yeah, that one. Give me some AA that doesn't rape my poor GPU, and doesn't blur-fuck everything, fam.

Here have some more.

I can only assume they must all be underaged.

PC Faggot Race are never satisfied. Sit AWAY from the monitor, idiot. That's why you faggots need glasses: you sit too close to the screen, to the point where you impair your vision.

And to answer the other faggot's question: yes, we need consoles, because it's more fun to sit on my couch and play games than to sit in my computer chair, extra close, checking if there are any flaws in the game's graphics instead of just enjoying the game and not giving a flying fuck about flaws.

I'm sitting on my couch, my PC on my big screen TV, typing to you right now. Checkmate consolefag.

Pretty shit bait honestly

I think it's mostly a holdover from the ZSNES days where quality CRT filters weren't a thing yet, and we associated blocky pixels with "old" shit, so in came those gross scaling and interlacing filters that made the games look "new", at least compared to their "old" look. I've got friends in their 30s who still think that way.

HTPC is the true mustard race.

Don't talk to me about fun kiddo.

I'd tap it

fug

Haven't used AA in games for almost 20 years. I don't care if a game looks like shit, I care if it plays like shit. muh grafix fags are the worst, and a big reason consoles went to shit.

...

This is wrong.

You need a minimum of two pixels per line you want to resolve, since in practice the lines will never line up perfectly with the grid of pixels, and you'll get horrible moiré patterns and other kinds of aliasing. It's the same reason you need a 44 kHz sampling rate to resolve 22 kHz sound (in practice a little less, about 20 kHz, since you want some headroom for the low-pass filter), and all properly managed video is low-pass filtered to less than 0.5 lines per pixel for these reasons.

Another way to put it is that 100 pixels can resolve exactly 100 evenly spaced lines, but not 99 lines, 98 lines or any number down to 51 lines. From 50 lines and below all values can be properly resolved, so this is your usable range. Look up the Nyquist–Shannon sampling theorem.
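If you want to see the folding happen, here's a tiny numpy sketch (sample rate and frequencies are arbitrary): treat 100 samples as 100 pixels across one degree, and watch any pattern above 50 cycles come back as a counterfeit low frequency.

import numpy as np

fs = 100.0                      # samples ("pixels") per degree
t = np.arange(0, 1, 1 / fs)     # one degree's worth of samples

for f in (10, 49, 51, 90):      # line patterns, in cycles per degree
    x = np.sin(2 * np.pi * f * t)
    # Estimate the apparent frequency from sign changes (zero crossings)
    crossings = np.sum(np.signbit(x[:-1]) != np.signbit(x[1:]))
    print(f"true {f} cycles/deg -> looks like ~{crossings / 2:.0f}")

Everything above fs/2 = 50 folds back down: 51 reads as ~49 and 90 as ~10. That counterfeit low frequency is exactly what a moiré pattern is.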

Either that or Idort
Hard to beat HTPC though

Gentoo is the true master race; however, HTPCs running free-as-in-freedom operating systems are nobles.

You must have played that shit on emulators then you underaged faggot.

It doesn't appear like pixels on an actual TV of the era. If you weren't underaged then you'd know this.


Most anti-aliasing only affects edges. Some also apply to edges of shapes on textures.

Your image is exaggerated FXAA, which applies Vaseline to the entire buffer, and nobody uses that AA.


Do you guys seriously not see the gritty boundaries on every shape?

Get your eyes checked. I don't mind aliasing too much but it's extremely noticeable regardless.

AA is nothing more than blurry bullshit. I'd rather play without it than with it.

*FXAA is nothing more than blurry bullshit

I appreciate good ol' scanlines and CRT filters, but then a shitter comes in with SuperEagle and claims it is the way developers intended the game to look

Shitty edges. It's worse on some games than others.


Most scalers look awful on most games. Some are okay though.

shit tv might be on to something

Underrated post; this is why optical resolution is usually given in "lines per inch" rather than dots.

On a related note, the eye's ability to detect patterns such as the angle of lines is about ten times better than its ability to distinguish discrete lines or dots:
en.wikipedia.org/wiki/Visual_acuity#Other_measures

shit tv filter is the best filter, 2nd favorite is scanlines.

Yeah, I laughed harder than I should at the titles for some reason.

I know that the imperial system has put a probe in Mars's atmosphere, where it violently disintegrated.

en.wikipedia.org/wiki/Mars_Climate_Orbiter#Cause_of_failure

...

On every shape? No, but sometimes I do notice it with transparencies, and it bugs me, but bugs me less than the full screen blur I end up with otherwise. School me fam, which AA isn't shit? My only experience with AA turned on over the decades is typically during first-time startups before I get a chance to turn it off because it's raping my eyes.
Unsurprisingly, I actually have super shitty vision. Everything outside of a ~3 foot radius is pretty heavily blurred which is why I keep my monitor ~18 inches from my face.


The Shit TV filter is actually an NTSC S-Video shader (ntsc-320px-svideo, or the 256px variant if using it on the SNES). Meme Lines and Bloom is Hyllian's glow shader, crt-hyllian-glow

Polite sage since I've derailed this thread too much already.

What is this meme lines thing? I've seriously never seen it. Is it some shit hipsters do to make their game look retro?

b8

It's how TVs worked back in the day when you connected your console via the antenna input because RCA didn't exist yet.

But I first played Sonic 3 on PC back in '99.

Well that explains it.
Pretty much anything that isn't FXAA. However, keep in mind that if you force anti-aliasing through your GPU's control panel, there's a chance it'll indeed blur your image, but that's 100% the fault of the driver forcing something incompatible into the game, not of the AA technique itself.
There aren't actually that many kinds of anti-aliasing techniques, and a big part of them are just new flavors of one another. Quick rundown ahead; I'll separate it into post-processing anti-aliasing and old-school anti-aliasing, because they're two fundamentally different things.
POST-PROCESSING AA METHODS
Technically, just because you're applying a filter to an already complete image doesn't mean it'll end up blurry, but nobody has ever made a post-processing AA method that doesn't blur the image. Known for their extremely small performance impact, post-processing methods have the side effect of adding a little latency (around 1-3 ms). On a basic level they all work very similarly, though I'm not going to get into it: they take a complete image and run an edge detection algorithm over it (there's a toy sketch of the idea at the end of this section).
FXAA: Name means "Fast Approximate Anti-Aliasing". It's an extremely fast method of anti-aliasing that is done by processing an already complete image, but it's known for its extreme blurring of the picture.
Edge AA: You could in a way call this FXAA's father; it was first used in Crysis. You won't find any modern games outside of PlanetSide 2 that have it, and it tends to be slightly worse than FXAA overall. It works much the same, and the name is self-explanatory.
MLAA: AMD's take on post-processing anti-aliasing. It doesn't blur a lot, but it doesn't get rid of jaggies as well as FXAA does. I personally find this one acceptable in terms of blur. Name means "Morphological Anti-Aliasing", and the performance cost is very slightly higher than FXAA's.
SMAA: Name means "Enhanced Subpixel Morphological Anti-Aliasing". A take on morphological post-processing anti-aliasing that can optionally use a depth buffer to improve the edge detection at a small extra performance cost. It's very innovative and blurs even less than MLAA, though it tends to be nearly twice as heavy as FXAA.
OLD-SCHOOL ANTI-ALIASING (A.K.A. TRUE ANTI-ALIASING)
These are all extremely heavy, but they have no blur at all and offer the best image quality and aliasing removal.
FSAA/SSAA: Names mean "Full-Scene Anti-Aliasing" and "Supersampling Anti-Aliasing", neither of which actually tells you what it does.
It's quite simple: render whatever you're rendering at a multiple of the current resolution, then downscale it in real time with a filter (see the sketch after this rundown). You can use different filters, render samples in different patterns, and render the scene at different multiples of your resolution for different (and desirable) results. The variants usually get bigger names; for instance, FSAA using a sparse sample grid instead of the usual ordered grid would be called SGFSAA, which stands for "Sparse Grid Full-Scene Anti-Aliasing".
MSAA: Name means "Multisampling Anti-Aliasing". This one also has a stupid name; MSAA would be a good name for FSAA, but not for itself.
It's an improvement upon FSAA, but not in image quality. It looks worse than FSAA, but it's much lighter on your GPU (it has a better performance-to-quality ratio); the tradeoff is that it doesn't help with shader-based aliasing.
CSAA/EQAA: Different names for the same thing: CSAA means "Coverage Sampled Anti-Aliasing" and EQAA means "Enhanced Quality Anti-Aliasing". CSAA was made by Nvidia, while EQAA is the exact same thing made by AMD. They're an enhanced version of MSAA that is slightly heavier than it (although much less than FSAA) but looks better. Recommended if your GPU is good enough for MSAA, but not good enough for FSAA.
————————-
That's pretty much it. t. autist who works with that stuff.
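And since FSAA/SSAA is the one everyone agrees looks best, here's the promised sketch of the render-big-then-filter-down idea, with a toy "renderer" and a plain box filter (real implementations offer fancier filters and sample grids):

import numpy as np

def supersample_box(render, width, height, factor=2):
    # Render at factor x the resolution, then average each
    # factor x factor block down to one output pixel
    hi = render(width * factor, height * factor)
    hi = hi.reshape(height, factor, width, factor)
    return hi.mean(axis=(1, 3))

def toy_scene(w, h):
    # A hard-edged diagonal: the classic jaggy case
    y, x = np.mgrid[0:h, 0:w]
    return (x * h > y * w).astype(float)

print(toy_scene(8, 8))                             # pure 0/1 staircase
print(supersample_box(toy_scene, 8, 8, factor=4))  # edge pixels become
                                                   # fractional coverage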

You like fucking fonts, eh?

Does FSAA take a huge performance hit at higher resolutions, since you need to render at at least double the display resolution?
I have run FSAA on my 1070 @ 1440p, and it gets better frame rates than I would expect it to get at 2880p.

TFW CRT PC Monitor

Get on my fucking level you casual faggot babies. If you have to emulate for scanlines, just fucking kill yourself.

You're right, but for all your "underaged" insulting, you forget that 18-year-olds right now didn't even grow up with the PS1. I'm 19; my first experiences with these retro games were on emulators with only bilinear filtering, and I liked it, but I've since been using the shit TV filter. Playing without the filter is fine, but anyone who uses that HQ2x shit or upscales PS1 to 1080p and doesn't think it looks hideous needs to have their brain checked.

Thanks for the quick rundown, much appreciated. Which out of SMAA and MSAA offers the best IQ/perf ratio?

Have a rare filterman for your troubles.

If you are proud of still having scanlines in the current year I don't even know what to say; maybe throw out that heap of shit and get a graphic-design-quality CRT instead of thinking any old CRT is better than modern LCDs.

MSAA is better, with more perf cost. SMAA has a low perf cost; better than nothing if MSAA is too costly. Never use FXAA or EdgeAA.

Millenials played it when it was new. You're thinking of Gen Z, retard.

Well what do you know, started EYE, turned on 8x MSAA, and it actually looks really good, and not at all blurry. Why the fuck is that disgusting blur-mess FXAA even a thing, and why is that the standard?


There is no agreed-upon ending date for millennials. You can be born up to 2004 and still be considered a millennial by many demographers.

The term meant the generation was "coming of age at the turn of the millennium" so no.

Really, the way I like to put it is that if you remember what the world was like before 9/11 you are a Millennial, and if you don't you are Gen Z.

We're looking at quadruple the display resolution with 2x FSAA, because your horizontal pixel count is being multiplied by 2, and so is the vertical pixel count. Judging the performance impact is complicated, because 4x the resolution doesn't mean 1/4 of the frame rate. Technically, running a game on your 1440p monitor with 2x FSAA is heavier than running the game on a 2880p monitor, because on the 1440p monitor the image goes through an extra filtering step to downscale it to your monitor's resolution, whereas on the 2880p one it is just pushed out.
The reason 4x the resolution doesn't have 4x the performance impact is that not all things scale with resolution. Example:
Let's imagine you have a game scene and part of it is a 128x128 shadow. Now let's imagine a simple shadow algorithm that makes use of the depth buffer (part of the rendering process of a 3D image is having a buffer with data relating to depth): a single point is considered a light source, then your GPU works out the light source against an obstacle and calculates the shadow it would produce, in 128x128 pixels. Generating that shadow has the same cost at whatever resolution you're rendering your game at. Generally, 4x the resolution will drop you to slightly more than 1/4 of the performance. And we're not even looking at CPU bottlenecks: your GPU could be running at 30% overall usage while your CPU is maxed out, so if you suddenly double the GPU load without touching the CPU load, your frame rate would stay exactly the same.
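To put toy numbers on that reasoning (all made up, purely to show the shape of it):

def frame_time_ms(width, height, per_pixel_ns=2.0, fixed_ms=4.0):
    # Some GPU work scales with pixel count; some (fixed-size shadow
    # maps, geometry, CPU-side work) does not
    return fixed_ms + width * height * per_pixel_ns / 1e6

base = frame_time_ms(2560, 1440)   # native 1440p
ssaa = frame_time_ms(5120, 2880)   # 2x2 supersampling: 4x the pixels
print(f"{base:.1f} ms vs {ssaa:.1f} ms, ratio {ssaa / base:.2f}x")

With these numbers the ratio comes out near 3x rather than 4x, because the fixed cost doesn't scale; with a CPU bottleneck it can be far less.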

FXAA costs very little, does smooth out jaggies, and when you play on a console, seated some feet from the TV, the blurring isn't noticeable (or if it is, barely so). So it suits the needs of the console master race :)

I personally believe one generation is every 10 years, because after 10 years every kid is a teenager and every teenager is an adult.

It looks ugly, but the performance cost is minimal and some people find the blur acceptable. It's alright to have it when you can turn it off if you don't like it, but a few retarded game developers won't let you (mostly in console games).

If it means coming of age at the turn of the millennium, I guess I'm not one, because I was born in 97. I think the generation after millennial is being called the Homeland generation, because it's so sheltered. I prefer millennial to that.

That's because the media turned the term into a meaningless catch-all for "young people stuff I personally don't like". It means people who were reaching puberty around 2000.

It's not really that simple, since few CRTs accept the 31 kHz signal PCs send out. Unless you have a video card with S-Video out or something, you're probably going to need a CRT monitor, or a more complicated setup that involves sync stripping and signal conversion.

MSAA doesn't affect aliased textures (shit like chain-link fences where they use a texture with transparency instead of wasting fucktons of polygons) or shader aliasing but SMAA and other post-processing anti-aliasing methods do, so it is objectively better in certain kinds of games.
There's also the different kinds of temporal anti-aliasing, which eliminate jaggies, crawling artifacts (mostly seen in newer games using Physically Based Rendering), and temporal aliasing. It's difficult to explain unless you see an example in motion, but I've included a GDC presentation on it just in case.


Sadly, MSAA doesn't play nice with deferred rendering and certain modern forms of forward rendering, leading to cases like Deus Ex: Mankind Divided where enabling even 2x or 4x MSAA completely tanks your framerate and eats up fuckloads of VRAM.

Why do you know so much about AA

What? Are you talking horizontal refresh?
You can just create a custom resolution with the specifications of an old game console, run the emulator fullscreen and disable deinterlacing.

Are you the ASCII user from the /sudo/ threads after the hack? You were a ray of hope in the encroaching darkness.

PCs have a hard limit on what they are capable of in horizontal refresh: unless the hardware is designed for 15 kHz output, it will not go below 640x480p. You'll want to take an analog signal, so DVI-A at best to get the signal from, but you have to have a way to combine the RGBHV signal into something your display can use, likely composite RCA, S-Video, or component RCA.

This, for example, is what you'll get when you send a 31 kHz signal to a 15 kHz set. It'll likely also be scrolling, flashing, etc.

(checked)
I'm 19 and I grew up with a PS1.
I've been playing video games since before I could form coherent thoughts.
not joking

That'd be what, 86-90? I'd be fine with that definition of Millennial, but today I still see fags being born after 9/11 getting lumped in with those of us born in the fucking 80s.


Should have seen that one coming.

One-size fits all AA solution when?

I grew up with Spyro and some other assorted PS1 games but I'm assuming most people my age would have just started off with PS2 or GC


Never

They tried that with shit like edge AA and FXAA, and it kinda works, but not really. Multisampling is the best way to do AA. Weirdly, a lot of games haven't been supporting MSAA lately, but I just think we need a lot of options. Quincunx is a nice sort of middle ground that isn't really noticeably worse in motion. AA really is a necessity for screenshots, but in motion you don't need much to get by.

What if you double the vertical refresh? That means you'd need a CRT that can do high refresh rates, but it's something.

Doubling the vertical refresh wouldn't solve the problem; it's still the same incompatible signal. There are hard limits here, and unless you're into the scene of multiformat monitors, you won't get a very desirable result. You can use a PC CRT with blank frame insertion at 640x480 120 Hz and get a true scanline effect for 240p content, or go down the road of converters, but that's another mess. The best way to do it with a CRT is original hardware, RGBS, and a multiformat pro monitor if you really want that 6th-gen 480p support.

I don't know nearly as much about AA as I'd like to, but I'm probably going to write a shitty /agdg/ project from scratch soonish and I'd like to give players some decent AA choices.

Supersampling and temporal anti-aliasing (if the implementation is good) are objectively the best in terms of image quality, but supersampling can apparently add some latency in some games and temporal AA adds a little motion blur to get rid of temporal aliasing, making it not the best choice for arena shooters. Otherwise, I'd always suggest going for MSAA (the default and only option in most older games) or SMAA depending on the game.

vid related is a demonstration of the shimmering effect temporal anti-aliasing gets rid of, since a picture or description usually isn't good enough
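For the curious, the heart of temporal AA is just accumulating samples across frames; here's a bare-bones sketch of the accumulation step (real implementations also reproject the history with motion vectors and clamp it against the current frame, both skipped here):

import numpy as np

def taa_accumulate(history, current, alpha=0.1):
    # Exponential blend: each frame contributes a little, so noise and
    # shimmer that flip frame-to-frame average out over time
    return (1.0 - alpha) * history + alpha * current

# Toy example: a sub-pixel detail that flickers between two states
frames = [np.array([1.0, 0.0]), np.array([0.0, 1.0])] * 20
history = frames[0].copy()
for f in frames[1:]:
    history = taa_accumulate(history, f)
print(history)   # settles near [0.5, 0.5]: the shimmer is averaged away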

i bet you thought that delta force looked like literal potatoes

Most people our age started with Halo and ended with CoD
Back when I was a kid all my friends played Halo, so I tried to convince my Dad to get me an Xbox. He said Halo was too violent and got me Star Wars Battlefront instead. I can't thank him enough.

...

The shimmering is pure eye-rape in the video, but I don't recall playing any vidya recently that looks like that. Is that a side effect of some new lighting method or some shit that's only in bleeding edge games?

Very nice.

I was wrong and you were right, but my idea still kind of stands. I didn't know some CRTs could only do certain horizontal refreshes, and I didn't know about those limitations. However, after googling some stuff, my idea at least means one could play those games on VGA CRTs.
What if, with a VGA CRT, you double your frame rate so you're at least close to the right signal (31 kHz, 320x240), then start decreasing the resolution while increasing the front and back porches and the blanking interval? You'd get a lower resolution with the 31 kHz horizontal refresh, without making the vertical refresh incompatible with the game (see the arithmetic below). It can potentially be dangerous, but if your screen's manufacturer gives detailed specifications, or the CRT rejects signals it's incompatible with instead of blowing up, it can be safe. And it would actually increase image quality, because the CRT would take half the time to draw each frame (and then just draw the same image again over itself).
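The arithmetic behind that idea, for anyone who wants to sanity-check a custom mode before risking their tube (illustrative numbers only; check your monitor's actual limits):

# Horizontal scan rate = vertical refresh x total lines per frame,
# where total lines = visible lines + vertical blanking (incl. porches)

def h_freq_khz(v_refresh_hz, visible_lines, blanking_lines):
    return v_refresh_hz * (visible_lines + blanking_lines) / 1000.0

print(h_freq_khz(60, 480, 45))    # 31.5 kHz: standard 640x480 @ 60 Hz VGA
print(h_freq_khz(120, 240, 22))   # ~31.4 kHz: 240 visible lines @ 120 Hz,
                                  # with blanking padded to stay in range

So padding the blanking interval is exactly what keeps a 240-line mode inside a VGA monitor's 31 kHz horizontal range at 120 Hz.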

My computer's too shitty to handle those graphics while playing though. Here's how I usually play.

If we had high-DPI monitors we could do away with it. Sadly, DPI never fucking changes, since consumers want a huge McDisplay instead; they can't understand "sit closer and it's larger".

nuDoom had a lot of shimmering with temporal AA disabled, especially on weapon reflections and blood, but the temporal AA blurred the shit out of things and destroyed detail, so it was preferable to just live with the shimmering. It wasn't as bad as the completely unacceptable blurring of Unreal's temporal AA solution, but it was still very bad.

I'm thinking it's shader aliasing on steroids, but I have no idea what causes the shimmering.

Is it even possible to make a specifically temporal AA that won't blur the image? All the examples I've seen are, in some way or another, basically "FXAA but comparing frames". SSAA does help greatly with temporal aliasing, at least.

It's a common side effect of physically based rendering, which a lot of newer engines are switching to because it makes creating textures and materials that look good under any lighting condition way easier, see pic related. Of course, Bethesda somehow still fucked it up in Fallout 4, giving players all of the performance overhead and none of the visual benefits.
I'd recommend reading archive.is/Duzjs and archive.is/fWyVu if you want a good understanding of how it works.

Probably: with the move to PBR there's a lot more developers working on temporal AA and eventually they'll come up with something good. SIGGRAPH 2016 had some presentations on it from Naughty Dog and Activision that I haven't read yet (supposedly Uncharted 4's temporal AA was really good, but I'm not wasting my money on a PS4 and that turd just to check out the AA), and on the freetard side, I remember Tesseract having really good temporal AA that didn't add way too much motion blur when you whipped the camera around.
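For reference, the usual trick modern temporal AA leans on to keep the accumulated history from smearing is clamping it against the current frame's local neighborhood; a minimal sketch of that one step:

import numpy as np

def clamp_history(history_px, current_3x3):
    # If the reprojected history falls outside what the current frame's
    # 3x3 neighborhood could plausibly produce, pull it back in; this
    # is what kills most ghosting and excess blur
    lo = current_3x3.min(axis=(0, 1))
    hi = current_3x3.max(axis=(0, 1))
    return np.clip(history_px, lo, hi)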

Interesting, thanks for taking the time to spell it out to a pleb end user.

I have a 24" 4k freesync monitor sitting my closet that I bought for $200.

Bigger is better to a point and I wish my monitor was bigger than 27". Given my desk and seating I would happily go to about 35" before it was too big to fit in my vision.

PBR is just a marketing trick used to describe what was the logical evolution of graphics, and it had already happened before the authors of the book even wrote it. I have no idea why they got an award for doing literally nothing and inventing literally nothing. Most of the techniques described in PBR already existed or were already in use by the time it was written, some even in video games.
It's 300x worse than claiming Get Smart was revolutionary because it featured flat/thin screens and mobile phones a lifetime before they existed, even though at the time everyone was obsessed with thinning/flattening CRTs and sticking phones on everything because they wanted something hand-carriable and the tech of the time wouldn't allow it, and soldiers had been using portable (as in carried by a horse and deployable) mobile radios since WW1.

Even if the ideas didn't originate from the book, they're still really useful for rendering and asset production.

the 640x480 120hz with blank frame insertion is a technique that has mixed results.

The thing is, I don't think it's about right or wrong. The idea is that the PC is sending a data payload with a minimum size it must meet, which fits the 31 kHz standard. Essentially, that standard must be met, so a PC CRT would be ideal. You want that classic aesthetic provided by the blank lines on a CRT, as it gives a far truer representation of the game's look. This is very much an analog discussion, and we're in a world where the majority of us - myself included - grew up as digital was already overtaking the analog foundation of things like displays. I'm a huge hobbyist when it comes to this stuff, so I spent a lot of time studying CRTs and how they work, maintenance and everything - I don't expect anyone to know all this, but thanks for listening and looking it up yourself.

Pretty much, it's not worth playing with the resolution once you hit 640x480. If you refresh at 120 Hz, the VGA standard will be at its firm limit. You are also at a manageable multiple of the 240p 60 Hz standard, so with blank frame insertion you get a perfect result.

You are right about it being dangerous for the hardware; it's a hard limitation you are targeting, and it's not a trick that always works - more one that should work in theory than anything.

Since SSAA/FSAA is the best AA, I want to see a very low resolution downsampled. Really curious.
My GPU is too shit to do that.

And you see that's where they went wrong :'^)

I can still see aliasing even at 32x, which proves that eventually resolutions will be so high that a) it would be pretty much pointless and b) at that resolution it probably wouldn't be worth the performance loss anyway…

get on my level fam…

Is that sprite really in foreground?

Wew boy

I'm 29 and you're retarded. Just because the shitty cheap CRT you grew up with looked like shit doesn't mean that was the only way to play games at the time; there are these things called Professional Video Monitors, or PVMs, maybe you should look into them. The pixel look is how the developers originally saw it (only with scanlines).

Some of the most beautiful games in the world are 100% pixel art, so I really don't understand why you'd be against it.

NEVER FUCKING EVER

See the AA rundown above.
FXAA was a nice gimmick when it first came out, but there are lots of drawbacks like UI degradation. SMAA is a pretty decent substitute for real AA
Also
Consolefag confirmed


They did exist; my family never had antenna-only TVs because I'm not a poorfag

That gif is like a vaseline-coated shitshow; re-do it.

Play in downscaled 4k on a 12 inch monitor and you won't need AA.

Only gameplay matters to me, I don't care about graphic enhancements.
t. poorfag

Never needed it, embrace the jaggies.