So, I've finally got a non toaster PC and I've finally seen 60 FPS old games and 60 FPS ppppu

So, why don't more people ask for 60 FPS? Why do some people claim the eye can't see past 60 FPS?

Why do we have to settle for mediocrity?

Other urls found in this thread:

downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP169.pdf
web.archive.org/web/20060801000000/http://www.au.af.mil/au/2025/volume4/chap03/b5_6.htm
youtube.com/watch?v=kPW7ffUr81g
christiedigital.com/en-us/product-support/discontinued-products/christie-MirageS4K-digital-projector
twitter.com/NSFWRedditGif

console limitations

Wait till you see 120+ fps.
Wait till you get free sync.

I used to hate fps drops till I got a free sync setup, and now it's not noticeable.

...

...

Limitations that are starting to become a thing of the past. Welcome to the modern age; now get out of here.

didn't think i'd be fapping tonight.

framepacing is the next thing we have to deal with.

Because people are fucking retarded.

Also this.

Last year I upgraded to 144fps and became interested in the upper limit of the average eye, which led me to this paper by the BBC: downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP169.pdf
Basically 300fps is good for most people but young males seem to see the biggest difference and notice increases up to 500fps.

As for why people accept shit: devs choose between sexy textures/effects and FPS, and they go with textures/effects because it's easier to market the improvements to people without faster displays (i.e. the majority of people).

Because of jews. They sell people garbage and make up lies that it isn't garbage so gullible idiot normalfags will think they're getting a quality product when they shell out $80 for the newest TripleA turd.

>So, why people don't ask more for 60 FPS,
>why some people claim the eye can't see 60 FPS?


Maybe


Those aren't people


Man you ever thought about that


That you don't become a person just because you can throw some shit opinion into the air or internet


Have you ever thought about that you fucking cuck?

Take your meds.

How do you figure? You can still see res increases at 8k, and as I said here, even your mum can see fps improvements up to 300fps. Assuming res increases beyond 8k can't be noticed, that still means we need 8k @ 300fps before we're close to removing limitations.

subtle…
:'^)

Is it gay if I find her jawline more attractive than tits?
Has sexual tolerance gone too far?

waste of quads

I know what they're trying to get at, but you can get the same results with 30fps if the thing is programmed for it from the beginning.

If the sixth gen could have 60fps games, then it must be the result of lazy programming today.

...

Haven't played anything at 60 fps for a long time thanks to having a toaster. For some reason, even older games tend to stick to around 30 FPS, probably due to compatibility issues. I play modern games at about 20 fps.

We are eternally deprived.


It's all because of muh grafix faggotry today and it pisses me off to no end.

Do you have v-sync on? That would lock older games to 30 fps if they can't maintain 60.

Oh look, another dubs thread.

because 60fps feels too gamey I like 30 because it feels more heavy like a movie xD

epic reference fellow gamer :,,-D

Not falling for it, Schlomo.

So motherfucking pissed off. I have a Titan video card, all set up, I had the motherfucker all set up. Now I need to get my manual out and just test one out.

diminishing returns, 60 v 120 isn't major compared to 30 v 60.
you are being trolled

balance between framerate and graphics leans towards graphics

thread is already in console war territory

Feminine necks and jawlines are also nice

also collarbones

60 vs 90 already is a major difference.
Even at 120 vs 144 it's about one being 20% faster.

wtf kind of a clown are you? 60 fps is outdated, you have to aim for 120+ now.

Typical jap programming.

t. nodev

she does asmr, what the heck is she doin there?

Because, in all honesty, 30fps is not bad for most genres. FPSes and 2D platformers are obvious examples where 60fps is just fucking necessary.

Colors aren't technically necessary for most games either.
Would you like it if games started coming out without colors for no reason other than "it's not bad"?
(I'm not even talking about aesthetically monochrome games like MadWorld; I'm talking about games like Mario 64 going full Game Boy.)

What a stupid loser comparison that is

lol you got me there XDDD
It's an exaggerated analogy, but the point is that if people think that it's fine without it, they won't do it, and it's clearly much better with it even when it's not really necessary.

Even if the game is timed and scaled appropriately for 120 fps, the difference in smoothness is negligible.

-t. Someone who uses a 200hz iiyama display

that doesn't really exist?

also, 5wankzillion hz is useless if it doesn't at least display 4k resolution

I run 144 fps and this is what I see (disclaimer: I'm a 33 year old male, which matters when talking about human vision): 60 to 90 is very noticeable, 90 to 120 is noticeable, 120 to 144 is barely noticeable.

This is not because "the eye can only see 144 fps" but because as the rates get higher you need a bigger spread for the difference to stand out. Using extremes as an example: going from 15 fps to 20 fps is a huge difference, but going from 90 to 120, while the same percentage change, is much less noticeable.
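One way to sanity-check this diminishing-returns argument (my arithmetic, not something from the BBC paper): equal percentage jumps in frame rate save very different amounts of frame time.

```python
def frame_time_ms(fps):
    # duration of one frame in milliseconds
    return 1000.0 / fps

def relative_change(old_fps, new_fps):
    # fractional increase in frame rate
    return (new_fps - old_fps) / old_fps

def frame_time_saving_ms(old_fps, new_fps):
    # how much shorter each frame gets; a rough proxy for perceived gain
    return frame_time_ms(old_fps) - frame_time_ms(new_fps)
```

15→20 fps and 90→120 fps are both a +33% jump, but the first shaves about 16.7 ms off every frame while the second shaves only about 2.8 ms, which is one plausible reason the high-end jump is harder to notice.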

If you actually want to read up on the ultimate limits of human vision and diminishing returns, read the paper I linked here

No, I'm tired of people saying this. muh grafix faggotry held back games in the 7th gen because the consoles were underpowered pieces of shit that couldn't handle anything actually pretty and run it well

Games are being held back today because, despite hardware being exponentially more powerful, these shit game companies only do diversity hires, retards fresh out of college (which is taught by diversity hires), and low cost imported street shitters from india.
None of these people actually know how to code worth a fucking shit.

Look at Retro City Rampage. There's a dev documentary where he talks about how he made the game; that guy is a fucking genius and even made a custom debugger for the NES just to port his game to it. RCR runs exactly as well as it should for how it looks, aka it runs smooth as butter on literally everything including the NES.

Take The Binding of Isaac, for example. Despite having the graphics of a shitty flash game, it's so horribly programmed that it requires mid-end modern hardware to run. It was denied a 3DS port because the 3DS couldn't run it (though the n3DS can, poorly). That is the current average in the games industry: poorly programmed shit that requires hardware several orders of magnitude stronger than it should, because nobody cares about quality programming anymore and everyone just relies on the consumer brute-forcing it with powerful hardware. That's why graphics have stagnated, and that's why games released today run worse than games released 10 years ago while also looking worse graphically.

Compare Ghost Recon Wildlands and Crysis. The latter is significantly better in terms of graphical fidelity, yet a card that can easily run Crysis at 1080p60fps can't even run Wildlands.

Assassin's Creed Syndicate is such a poorly coded piece of dogshit that your performance will be bad regardless of hardware. It makes more than 5x the DX11 calls per frame that the API supports, which bottlenecks the game and causes poor performance no matter what. GTX 560, GTX 1080 Ti, NASA supercomputer, it doesn't matter what kind of hardware you throw at it; it will always run like shit because it's designed that way.

Expecting graphics to actually fucking improve over the course of a 10 year period isn't muh grafix faggotry. Publishers/developers take advantage of every improvement in hardware not by improving their games, but by increasing the amount of gnarly bullshit code they can get away with thanks to brute force. Games today should look like the original pre-downgrade Witcher 3; instead we get fucking Fallout 4.

|
|>
|3
|

It's not only consoles. I think a big share of the PC market has toasters or mid-tier PCs at best, yet they want to run the latest AAA game at a decent framerate, which puts them in pretty much the same place as console users.
If developers stopped trying to make games compatible with hardware that is 5 years old or more, then maybe optimization wouldn't be such a mess.

whos the tranny

t. retard
You can calculate all the physics in the background without it being tied to the framerate.

That's exactly what happens with cloth and hair physics in the PC port of Dragon's Dogma, but in reverse.
Try running it first at 30 and then at 120 fps and look at a cape; you'll be amazed how cloth transforms into wood.

...

How new are you?

Quite.

What digits?


get the fuck out

Buyer's remorse mixed with sunk cost makes them so deep in denial of reality that all they see is their loyalty to their system.

Shit optimization is what they use to push new hardware these days.

Consoles.

That's exactly the problem though: not a single one of these piece-of-shit games even remotely justifies the hardware it demands. GTX 770s were running the Witcher 3 pre-rape build at a locked 60fps, yet a GTX 760 2GB can't even run Bayonetta. Developers aren't trying to make games compatible with 5 year old hardware; they're not trying at all. We're talking about a market where even a mid-end card from 5 years ago shits all over the current gen consoles, yet none of those games ported to PC run well even on top of the line hardware.

See as well.

Speaking of which, this little bastard is a tile for tile port of the original GTA with EVERYTHING in it!!!

There is NO excuse for games being in the shitty state they are.

Will I find it on SWFchan?

I used to think that it was likely in the next decade or two we'd run up against the limits of Moore's law which would force developers to optimize their games if they want to keep increasing graphical fidelity on the same hardware, but I can't see Intel/Nvidia/AMD ever letting that happen as that would be catastrophic for sales and the eternal jew never sleeps.

I suspect they've been deliberately drip-feeding us incremental upgrades for years, when they already had the capability of producing faster and more efficient hardware. I think we're dealing with a Phoebus Cartel in the hardware market.

The smallest possible transistor has already been shown to the public (just 7 atoms). It's supposed to hit the market around 2020 and it's expected to be the limit of Moore's Law, because anything smaller than that has uncontrolled quantum fuckery going on.

And that's where quantum computing comes in and (allegedly) the government already has that.

IIRC, thanks to some Schrödinger's-cat-level fuckery, a qubit can be both 0 and 1 at the same time until "measured", so in a sense it has all results active at once. Kinda.
Let's say you search for a file: rather than going through each file one at a time checking your condition, a quantum computer explores the possibilities together and plops out the search result in far fewer steps (not literally instantly, but far faster than checking every file).
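For scale (my numbers, a standard textbook comparison, not something from the post): a classical unstructured search needs on the order of N checks, while Grover's quantum search algorithm needs on the order of the square root of N queries. A huge speedup, but not "instant".

```python
import math

def classical_checks(n):
    # worst case for unstructured search: inspect every one of the n items
    return n

def grover_queries(n):
    # Grover's algorithm: roughly sqrt(n) oracle queries for unstructured search
    return math.ceil(math.sqrt(n))
```

Searching a million files: a million classical checks versus about a thousand Grover queries.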

(You can skip to 1:57)

Personally, from a hacking point of view, it sounds like you could change a file just by looking at it, but by then you'll need to be a physicist and a hacker in one.

...

You were going to be fucked anyway

...

maybe so, but fuck you.

it's your own damn fault, you're lucky they didn't catch you looking at an image about exterminating the jews.

Because the framerate you want is really 120FPS. The proof of this is that 60FPS still causes motion sickness in VR, and it only stops making you sick at 120FPS.

Sorry you got fired dude.

Congrats for being a retard who uses a work computer to go on an imageboard. Just be glad it wasn't loli or something advocating the total extermination of kikes.

Wow I just got a promotion, thanks user!

...

Facepalm

the reddit club is two links down

Guess I'm already here!

She's pretty fucking terrible tbh

Doesn't make me sick because I'm not a weak fucking jew stereotype nerd melvin. VR is shit, but not for this dumbass reason.

Motion sickness is caused by the inner ear sending different signals than the eyes and other senses. You could have infinite FPS and it wouldn't matter. People who get sick on a boat will still get sick with VR.

You want trans-cranial movement sensation excitation. This exists. Kind of weird though. It came out of brainwave studies on sleep paralysis: some people hallucinate the sense of floating or falling while they're consciously awake, and researchers were able to isolate the waveform. Now RF pulses can recreate that waveform in the region of the brain – but alphabet soup folks don't want contractors to reveal their MKULTRA power level because it would spoil the decades-long PSYOPs of using such tech on civilians already.

Additionally, with VR your eyes are fixed at a certain focal length. This leads to headaches from trying to maintain that fixed focus, rather than constantly changing focal length as you do in reality. Can't fix that unless you've got holograms.

USAF has had 3D projection "holograms" for decades.
web.archive.org/web/20060801000000/http://www.au.af.mil/au/2025/volume4/chap03/b5_6.htm

Now the commercial sector is getting them (had to crowd fund because govs would buy up private contractor tech to keep it hushed up – cover for ongoing PSYOPs).
youtube.com/watch?v=kPW7ffUr81g
Oh, look, it's a "UFO"…

Don't buy the Oculus hype. FPS doesn't fix shit. Some people are just not affected by motion sickness as much as others. If you stick with it you can get your "sea legs" for VR, to overcome motion sickness. This is true for old VR and new rigs. I've had VR since years before Doom came out (vid related).

I write my physics code with fixed 30fps ticks. This is so that toasters can keep up. Without a fixed time step the game will desync; a computation with variable elapsed time is not consistent for network games or replays.

.update( 0.016666667 ); // sec; 60fps
is not the same as
.update( 0.033333333 ); // sec; 30fps
Only for the simplest updates is this the same, because I could have collided with something and already started bouncing the other way between the top two calls. The resolution of time has to be consistent to get a deterministic replay / recreate the same results from the same inputs across machines; otherwise I have to send far more frequent snapshots to sync the world. Networks are slow as fuck – latency is shit due to buffer bloat and oversold bandwidth. That's another reason for shitty physics and games with slow movement (to mask network lag / reduce desync).

However, my graphics code interpolates between physics frames so it can run at infinite FPS if you have it available. To reduce visual jitter I keep track of the current visual position of things, then I have the physics version of things, which may change due to different physics results vs interpolation based on speed and direction vector in the rendering code. The server also corrects the past, sometimes invalidating the client's physics prediction of other players / objects. The renderer may get ahead of physics very slightly so I interpolate from visual position to actual position, and that same code smooths out jitter due to network updates. The result can sometimes be slushy feeling if network latency is high or physics code is chugging, but normally the result is a very smooth render.

The shitty way a lot of engines work is to have the rendering code only drawing the exact result from the physics engine. So, to increase FPS they have to increase physics tick FPS too. Shitty code locks the FPS because you'd just be re-rendering the same exact scene over and over again until the next physics tick.
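A minimal sketch of the loop described above: fixed physics ticks fed from a time accumulator, with the renderer interpolating between the last two ticks. All names here (`World`, `run_frame`) are hypothetical, not from the poster's engine; the 30fps tick matches the post.

```python
PHYSICS_DT = 1.0 / 30.0  # fixed 30fps physics tick, as in the post

class World:
    def __init__(self):
        self.prev_x = 0.0   # position at the previous physics tick
        self.curr_x = 0.0   # position at the current physics tick
        self.vx = 10.0      # velocity in units per second

    def update(self, dt):
        # physics always advances by the same fixed dt -> deterministic
        self.prev_x = self.curr_x
        self.curr_x += self.vx * dt

def run_frame(world, accumulator, frame_time):
    # accumulate real elapsed time, then step physics in fixed increments
    accumulator += frame_time
    while accumulator >= PHYSICS_DT:
        world.update(PHYSICS_DT)
        accumulator -= PHYSICS_DT
    # interpolate for rendering: alpha in [0, 1) between the two ticks
    alpha = accumulator / PHYSICS_DT
    render_x = world.prev_x * (1.0 - alpha) + world.curr_x * alpha
    return accumulator, render_x
```

The while loop keeps physics deterministic regardless of frame rate, while alpha blends the two most recent physics states so rendering can run as fast as the display allows instead of being locked to the physics tick.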

My god, that was the cringiest fucking thing I've seen and heard this year.

...

I recommend reading the manufacturer's spec page for the projector as proof of some of the claims I make myself:
christiedigital.com/en-us/product-support/discontinued-products/christie-MirageS4K-digital-projector

That paper is bullshit and made by people who don't know what they're doing. First off, it doesn't even include the specifications of the projector beyond "it can do 100fps" (the manufacturer's spec page says it can do 150Hz at lower resolutions, but they only tested 100Hz and below). Second, they claim to have 300Hz material, but every 3 frames are averaged and turned into 100Hz (and lower). I also don't know where you took your 300Hz and 500Hz claims from, because the paper only talks about testing at 100Hz and below.
Another thing to note is pixel response times: if the worst-case response time is longer than the interval between frames at a given frame rate, you might be feeding the display X Hz material but it won't actually be displaying it fast enough, so you won't really be seeing X Hz. This was a big issue with early LCDs that accepted 60Hz input (a frame every ~16.67 milliseconds) but whose pixels could take longer than that to shift to the right color, which means you wouldn't necessarily be getting 60fps on a 60Hz screen even with 60fps material. The manufacturer's specification page doesn't even mention that, and you'll have to believe me on this one thing: it normally means the response times are fucking horrible.
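The response-time arithmetic here can be sketched as follows (illustrative helper names, not from any spec sheet): a panel only truly resolves every frame if its worst-case pixel response fits within one frame interval.

```python
def frame_interval_ms(refresh_hz):
    # time between successive frames at a given refresh rate
    return 1000.0 / refresh_hz

def panel_keeps_up(refresh_hz, worst_response_ms):
    # True only if pixels can settle within one frame interval
    return worst_response_ms <= frame_interval_ms(refresh_hz)
```

At 60Hz the interval is about 16.67 ms, so an early LCD with a 25 ms worst-case response can't actually show 60 distinct frames per second even when fed 60fps material.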

I wouldn't go that far. All he did was standard procedure. One of the reasons people are generally unable to do things like that is that your average 2017 programmer knows nothing beyond what his 3-year course in C++ taught him.

I'm assuming you don't know much about the subject, but a specific resolution doesn't mean anything on its own. Try a 4K phone: those have ridiculous pixel densities and they couldn't be further from "we wouldn't spot a difference with denser pixels than this", and you could probably stack 4-5 of them horizontally and vertically on a desktop-sized monitor. Don't worry, a few years from now you'll see it yourself.

yes plz.

Is she bald?

Tojo detected.

Why do you? 165+Hz g-sync IPS master race
4k

Let's just all set our screens to 320x240x16 @30i , unplug our graphics cards and use the integrated ones, remove memory sticks until there's only one, format our hard drives in FAT12 and go back to commandlines then. No games need all the other jazz anyway.
honestly this reeks of pretending to be retarded but I don't even fucking know anymore with the amount of consoletards running freely

I'm feeling pretty lucky.

Any action oriented game requires 60fps or more you dumb fucking console shit.

...

...

...

So is there a threshold where the FPS gets so high that it's just as bad as low FPS, disregarding the GPU blowing up before reaching it and any FPS-tied programming quirks?

Wow, thank God we have user here, ready to dump his collection of shitty screenshots nobody wants to read and kill any good discussion.
What a great user. Thanks for existing and making this place so much better, truly a hero of our times.

Once the refresh rate exceeds the rate at which your optic nerve can carry information, further increases just aren't improvements; that limit is ballparked at around 380fps.

Try 144, fucking nothing supports it.
Why do these monitors even exist?

...

What sort of shit games are you playing?

A lot of devs are just plain locking their games at 60.

It's really shitty to do that, but at least they aren't locking at 30 fps. That shit is god-awful.

Her tits aren't that good looking. The animation's pretty jank overall too. As a demonstration video it's not much good either.

Ancient meme that some have undoubtedly taken seriously.

Let's see, the only games I know of that I've played are Path of Exile, Dota 2, and CSGO, and they all supported 144Hz.

If you are stupid enough to browse imageboards at work I hope they find thumbnails CP in the cache.

Programming physics to be tied to framerate is one of the hallmarks of a lazy dev who doesn't think very far ahead into the future.

I expected to rage, but all I got was keks instead.

...

Consoles and devs no longer having to optimise their games since hardware can just brute force shit.

no, it's not gay, questionable but not gay.

I unironically can't spot the difference between 30 and 60 fps unless you do a back-and-forth comparison.

If only because Sony will go out of business before we get a Yakuza game on PC.

If the business is not having games I see how they could be making those claims.

I hope you don't post here from work then

Seriously? I can't even look at my roommates playing console games because sub-60fps makes me want to vomit.

Play Pokemon X/Y in 3D. I dare you nigga

nah i hate jrpgs kiddo

Because fanboys like pic related still exist.

...

WHAT!?!?!?!

It depends on how good your eyes are.

I, personally, can't perceive differences in frame rate above ~58FPS whereas someone with far better eyes than mine might have to get all the way up to ~90FPS before they can't perceive drops in frame rate.

Black-and-white games (with a splash of colour) are aesthetic as fuck. Though I wouldn't want it to become the norm.

Also, I hope OP is an English-second-language person. Anyway, it's because 30 is passable for a game, not ideal. It's just that people don't know how much better 60FPS is for action (i.e. fucking games being played).
Also, people saying the eye can't see 60 FPS are just trying to justify their cocksucking. The eye can apparently see up to 1000FPS; the real limit to seeing FPS is your screen, not your eyes.
Honestly, the only time frame rate is an issue for me is when it fucks with the physics or something gameplay-related (why the fuck devs are still tying shit to the FPS is beyond me) or when there's a noticeable drop (modded NV is a son of a bitch).

This shit pisses me off I fucking hate the nigger who invented reeee mainly because r9k was my homeboard and now all its memes are fucking ruined by autistic normalfags and wannabe social outcasts its fucking annoying

just tell them you're checking to see if you're still gay, and they're oppressing your lesbian tendencies, as you identify as a womyn of colour

Damn that's pretty fuckin neato

Yeah, but then you have your average drone today saying "it's your PC bro, you have to upgrade" despite the fact that I have modern hardware. I truly fucking hate this new PC culture.