Would you rather have

A.) Ultra settings at a limited 30 fps

Or

B.) High to medium settings at a full 60 fps

Consoles need not reply since they usually run games at both medium settings and 30 fps

High to medium at 60, been doing it for years and couldn't give a fuck

...

60fps
Always
60 fuckin fps minimum
Anything else is pure garbage

A lot of PC games still run physics on the GPU, so anything beyond a 60 fps limiter breaks them. So yes, effectively 60 is the max unless otherwise stated
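
The usual fix, when physics is tied to the render rate at all, is a fixed timestep: step the simulation at a constant rate and let rendering run as fast as it wants. A minimal sketch, not any particular engine's loop (world and render are placeholders):

import time

PHYSICS_DT = 1.0 / 60.0  # simulate at a fixed 60 Hz regardless of render fps

def run(world, render):
    accumulator = 0.0
    previous = time.perf_counter()
    while world.running:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Step physics zero or more times per frame; stepping is decoupled
        # from fps, so a 144 Hz render loop no longer speeds the game up.
        while accumulator >= PHYSICS_DT:
            world.step(PHYSICS_DT)
            accumulator -= PHYSICS_DT
        render(world)  # draw as often as the GPU allows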

I like muh graphix so I try as much as I can to run at ultra settings. 30 fps honestly looks like shit unless you have motion blur hiding the shit framerates. Without motion blur 30 fps is absolutely out of the question for me

Motion blur makes me sick. So does 30 fps. I don't care about quality so I would just run at low or medium for 60 fps.

I'd rather just play the game and not worry about that shit. So long as it's not dipping below 20 frequently, I'm golden.

Would you ever consider running TF2 at pic related's settings for the sake of getting thousands of fps?

If that's the only way to get 60fps, I would.

TF2 is for retards, though.

And for me it entirely depends on the game. For some games the framerate doesn't matter much.

it all depends on frame pacing tbh fam.

60fps with shit frame pacing

but better frame rate and pacing is more important than graphical fidelity. Aesthetics are important but not integral. Good gameplay trumps visuals every time.

On my old computer, I would. My new computer runs TF2 Vintage at 60 on low. My monitor's refresh rate is 60 anyway, so more fps would just be a waste.

Would you rather

A.) poo in the street

Or

B.) poo in the loo

street shitters need not reply since they always shit in the street

/thread
Doesn't matter how pretty the grafucks are if they are behind 30 fps.

The absolute bare minimum I would ever consider acceptable personally is 25 with motion blur. Without motion blur I cannot go below 45. On my old PC I could get TF2 on high settings at around 45 fps, which I could tolerate. I didn't want to lower my settings to push for 60fps. Framerate is important, but I do believe graphical fidelity adds a lot to the experience as well. You need a good balance and you need to set a reasonable threshold for yourself and what you can tolerate.

Why not have the gameplay and visuals complement one another? I admire Okami for that kinda shit.

...

Higher FPS would help with reaction times and smoother mouse response, which could give you an edge in competitive online gameplay, but the average human reaction time is about 0.25 seconds, so higher frames hit diminishing returns beyond a certain threshold.
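
Rough numbers for that argument, just frame time against the ~0.25 s reaction figure above, ignoring input and display latency:

REACTION_TIME_MS = 250  # average human reaction time cited above

for fps in (30, 60, 144, 240):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms/frame "
          f"({frame_ms / REACTION_TIME_MS:.0%} of reaction time)")

# 30 fps -> 33.3 ms (13%), 60 -> 16.7 ms (7%), 144 -> 6.9 ms (3%), 240 -> 4.2 ms (2%)
# Going 30 -> 60 shaves ~17 ms; going 144 -> 240 shaves less than 3 ms.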

I agree, good aesthetics are nice but I can do without the extra post processing effects if they hurt framerate.

With online games you ALSO have to consider the server's tick rate, i.e. how often the server polls for user input. That varies widely by game, but unless it's a particularly shitty game the tick rate should be at or close to 60
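
For reference, a bare-bones fixed-tick server loop looks roughly like this (toy sketch, not any real game's netcode; world, poll_inputs and broadcast_state are placeholders):

import time

TICK_RATE = 60                # simulations per second the server commits to
TICK_DT = 1.0 / TICK_RATE

def server_loop(world, poll_inputs, broadcast_state):
    next_tick = time.perf_counter()
    while world.running:
        world.apply_inputs(poll_inputs())  # everything received since last tick
        world.step(TICK_DT)                # advance the authoritative simulation
        broadcast_state(world)             # clients interpolate between snapshots
        next_tick += TICK_DT
        time.sleep(max(0.0, next_tick - time.perf_counter()))

# However many fps your client renders, the server only samples you TICK_RATE
# times per second, which is why a 20-tick server feels mushy even at 144 fps.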

???

If the devs are not shit, they can find a way to optimize. Of course the art of a well optimized game is dying out the more we go into the future. Buy more ram goyum. Get a better gpu goyum. Your games take up 60+ gb goyum.

Isn't the new Forza going to be like 100+ gigs on launch?

The difference between max settings and medium settings in games has shrunk significantly since the early 2010s.

I know what you mean, but….

As shitty as nuDoom was I was pleasantly surprised at how fantastically optimized it was. I ran that game on ultra like a fucking dream

I just wish game devs would fucking STOP with the ugly as fuck object dithering when you come close to distant objects. That shit looks plain ugly. Surely there's a better way to transition between geometry resolutions as you approach them

The problem there is texture resolution. As you increase textures from 1080p-class to 4K, the filesizes grow quadratically, not linearly: doubling the resolution in each dimension quadruples the texel count, so the data balloons much faster than the resolution number suggests.
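
For a feel of the numbers, uncompressed RGBA at 8 bits per channel (shipped sizes depend on the compression format, but the scaling is the same):

BYTES_PER_TEXEL = 4  # RGBA8, uncompressed

for side in (1024, 2048, 4096):
    size_mib = side * side * BYTES_PER_TEXEL / 2**20
    print(f"{side}x{side}: {size_mib:.0f} MiB")

# 1024x1024:  4 MiB
# 2048x2048: 16 MiB
# 4096x4096: 64 MiB -> doubling each dimension quadruples the data,
#                      so 1K -> 4K is a 16x jump per texture.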

My problem with Nu Doom is the color scheme. It seems a lot of devs have been making games with one color scheme in mind. Fallout 3's green. Nu Doom is brown-orange. Deus Ex is orange. To me that's really off-putting.


If devs used file compression or archive files holding the textures, wouldn't that help with file sizes? I use the same argument for sound files, because apparently to devs, having uncompressed shit runs faster.

Having uncompressed shit does run faster; it just takes up more space. With compressed textures it's more CPU demanding because the CPU has to perform some level of decompression, unless it's lossy, which would look like shit.

1) Depends on the type of game. For a 4X game, 30 fps is perfectly fine.

2) Is the framerate constant? If it dips to 27 from 30, that is a big problem. If it dips to 54 from 60, that is far less bad.

As someone whose PC is slowly becoming a toaster, I am not looking forward to the forced obsolescence of modern games. I scratch my head when I can run NuDoom at over 60fps but can't run Nier Autotomato, nor will I be able to run half the shit slated for CY+++, just because they keep pushing system specs despite (resolution) games looking worse than they did in the 00's. The last big leap in graphics was the id Tech 4 engine. Crysis 2, a 6-year-old game, was the high-water mark. But devs still remarkably struggle to optimize their games.

I meant to say *other than resolution

Okay, that makes a lot of sense actually. Thank you.

Tbh games don't really need to look any better. Games already looked good enough a decade ago. More polygons and textures only bloat system specs. MGS4 is still one of the best-looking games ever made.

A lot of developers just aren't as disciplined as others. A big problem is failure to commit to FPS budgets. The CoD series for example commits to a strict 60 fps budget on consoles and I think their hard work does pay off, while a rival like the Battlefield series could barely keep 30 fps even on consoles. Games SHOULD be more optimized than ever since consoles now use the same hardware as PCs. But even then you get unoptimized shit even on consoles. You can't help lazy developers.

Ultra at 30 fps. That's how I usually play my games; I prefer them looking good to running at high FPS.

I always scale my graphics for 60 fps minimum. Honestly if you're running under that you should turn down your settings because your shit simply isn't strong enough.

even minimum settings are better if they're at 60

Hello, Todd.

That color scheme you're talking about is very common in modern film as well; it's basically that some hack made a science out of "this color scheme looks extra cool", and while filmmakers manage to pull it off without it being too noticeable, when retards apply it to gaming it creates a very negative effect.

It depends, really: if you have a shit hard drive you could be I/O bottlenecked, and compression could actually speed things up. Or if the hardware has optimized decoders inside, like with H264 video, it can also be faster to have compressed assets since they take up less memory and are more cache friendly.
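
Back-of-the-envelope version of that tradeoff; the throughput numbers below are made-up assumptions, plug in your own:

ASSET_MB    = 512   # uncompressed asset size (hypothetical)
RATIO       = 2.0   # assumed compression ratio
DISK_MB_S   = 120   # slow HDD sequential read (assumed)
DECODE_MB_S = 800   # CPU decompression throughput (assumed)

uncompressed_load = ASSET_MB / DISK_MB_S
compressed_load   = (ASSET_MB / RATIO) / DISK_MB_S + ASSET_MB / DECODE_MB_S

print(f"uncompressed: {uncompressed_load:.2f} s")  # ~4.27 s
print(f"compressed:   {compressed_load:.2f} s")    # ~2.77 s, compression wins
# On a fast NVMe drive (say 3000 MB/s) the same math flips and raw reads win.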

...

The Id Tech team is completely separate from the Gamebryo team. IIRC the guy who was head of CryEngine is now heading Id Tech in Carmack's absence.

60fps or more.

Fucking this

12 fps on lowest.

I know you're joking, but there's something satisfying about beating a game under terrible conditions.

High to medium settings at a full 60 fps is the objectively superior choice, since you're playing games for the gameplay and not for the cutscenes.

That actually looks pretty neat

You can't call NuDoom optimized when MGSV exists.

The game itself isn't much fun, but it runs on toasters and can still look good.

MGSV is truly an underrated master peace.

seamless 15 fps ofc.

But on a real note, frames over EVERYTHING; quality settings, resolution, doesn't matter. Frames come first, with the probable exception of FOV, but I'm not sure if increasing FOV affects framerate.

...

Best game in the series.

No textures 60fps

No need to get your pantries in a cyst over a little mistake.

would you rather fuck off back to
>>>/reddit/
or
>>>/4chan/

I'd rather have a good game?

You don't fit in yet
or
You really don't fit in yet

I choose C. Set the options to Ultra and lower the few settings that affect performance. Typically they are shadow render distance, god rays, lighting quality, and post processing effects. Most settings can be set to Ultra without impacting performance. The Low, Medium, High, and Ultra settings are often poorly thought out.

As long as it's 60fps and it works.

High to medium 60 fps, as long as draw distance is good and object pop-in is minimal. Those are the only two things that manage to break my immersion and bother me to the point of spending hours tweaking config files.

Are you for real? I'll take my goddamn 60 fps with high settings, grafix are good and all but would you really sacrifice the smoothness of the gameplay just for some extra polygons?

This nigga right here

This guy gets it. Jet Set Radio for example is locked at 30fps but feels like it runs at 60 due to the great frame pacing.


It is though.

A. 30 FPS doesn't bother me as much as shitty textures.

I pretty much had to do that to play multiplayer games such as Unreal and Counter-Strike growing up. So did a lot of people straddling the border between oldfag and newfag.

Here's the problem with OP's comparison: normal and up look almost the same as ultra. There are discernible differences, but actually playing the games I can't see how anyone would notice.

Ultra settings, one of the biggest perks of PC gaming, haven't meant shit for like 9 years. Engines don't work like that any more, where they pick from low textures that are 256x256 or ultra textures that are 2048x2048. I actually think those high resolution textures are where the diminishing returns are. I don't see how a game could take advantage of such high resolution textures; at least no artists have shown that there's an advantage to it. Most games rely on overlapping "painted" texture systems now which blend textures to create the final image instead of tiling one massive texture.
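
Rough idea of what those "painted"/splatted systems do per pixel, assuming a weight map with one channel per layer (toy example, not any specific engine's shader):

def blend_layers(layer_colors, weights):
    """Weighted blend of texture layers, the core of texture splatting.

    layer_colors: list of (r, g, b) samples, one per layer, for this pixel
    weights:      floats from the splat/weight map, same length
    """
    total = sum(weights) or 1.0
    return tuple(
        sum(w * c[i] for w, c in zip(weights, layer_colors)) / total
        for i in range(3)
    )

# e.g. 70% grass, 30% rock at this pixel:
print(blend_layers([(90, 140, 60), (120, 115, 110)], [0.7, 0.3]))
# -> roughly (99, 132.5, 75); the final image comes from a handful of small
#    tiling layers instead of one gigantic unique texture.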

And this is supposedly from a game made for the PC from the ground up. Graphics settings do not mean shit any more. Low in the Source engine vs high is a world of difference.

Ultra graphics have been a meme used to sell, to be honest, unnecessarily overpowered and overpriced hardware to stupid normalfags who then go on reddit and post about "muh $20k PC, muh $1000 PC case which is literally just a black plastic box with a window and some neon lights"

You still see fags here parading around high settings on their 900 dollar oil rigs as if it's an actual leg up compared to what consoles are getting.

It's technically superior by like a single-digit percentage. That's it, it's fucking weak.

I'd consider being able to run games above 60fps a significant leg up.

I can run most shit that comes out these days at 60 fps with a 750ti and some shitty quadcore i5. Sure, most of the AAA games will be on low or medium-low settings, but it'll run at sixty or more.

Ultra settings are a meme and making the game look maybe 10% better at half the framerate is fucking stupid

if devs actually optimized games we would have Ultra 60

Likewise, but it's not like the majority of multiplatform shit will actually benefit from this. The games that benefit from it already target it even on consoles.

I'd rather have
A.) Buffalo
Take a Diarrhea dump in my ear
60 fps all day every day.

Often in competitive games you get a significant benefit by lowering texture quality and shadows. Visual fluff does nothing to help you win most of the time.

He'd rather eat the rotten asshole of
A.) Roadkill skunk and down it with beer

It's hilarious how often lowest settings in a game like battlefield will cause hiding spots like bushes to not render or low poly versions of assets will reveal enemy positions.

It's clear these devs give even less of a fuck about properly supporting PC games

Depends on how important graphics are to enjoying the game. As a general rule of thumb the latter sounds superior, but for some games a lower/locked FPS is better, usually the older ones.

I'd rather not have this thread.

literally unplayable, consoles can run 60fps games now, so you should aim for 120fps

fps > everything else

hit me up with those settings

Frame rate is always more important than graphics.

pleb

C.) I own two 1080ti's I'll play with whatever settings I want faggot

...

I'd rather run it with no fps limit; unless you want to purposely bottleneck your PC and the gameplay, then that's what you're going to get.

144fps on a 2K ultrawide curved monitor running at the absolute maximum, with useless settings like motion blur turned off.

Depends on the game. For fighting, racing, extreme sports and whatnot I'd want 60 FPS for sure. For action adventure games I'm less picky. 30 FPS is abysmal though, I'd want at least 40.

If your FPS starts going into the thousands, you can expect it to be a strain on your graphics card.

With the midgen upgrades (PS4P and boneX), that's now becoming a thing in quite a few games with explicit support for them. I still have no idea what flavor of cocaine they were on when they started that midgen upgrade plan, but so far it's working out better than expected.

...

Yeah, provided you have enough vram, else you're screwed.

Yeah, provided you have enough vram, else it runs like pure trash.

You should only ever run games at the highest framerate your monitor's refresh rate supports, and make sure frame timings are consistent. You don't need to be going 100% GPU in a 10 year old game for 560FPS; it doesn't make it play better, and if anything it will cause inconsistent frame data to be transmitted to your monitor. Worse yet is judder in frame times, which could make a game running at 56 fps feel worse than a game running at 30fps.
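
Toy numbers showing why judder matters more than the average (frame times in ms, made up for illustration):

from statistics import mean, pstdev

steady_30 = [33.3] * 24                   # locked 30 fps
juddery_56 = ([13.0] * 7 + [50.0]) * 3    # ~57 fps average, 50 ms spike every 8th frame

for name, frames in (("steady 30", steady_30), ("juddery ~56", juddery_56)):
    print(f"{name:12s}: {1000 / mean(frames):5.1f} fps avg, "
          f"worst frame {max(frames):.1f} ms, jitter {pstdev(frames):.1f} ms")

# The juddery run averages nearly twice the fps, but every 8th frame sits on
# screen for 50 ms (a 20 fps moment), which you notice far more than the average.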

How do you actually get it like that?
I would play it like that if I could get more than 10 fps on tf2-v like that. It actually looks kinda cool.

WEW
E
W

How come low looks the best?

...

ya gotta stay optimistic, nothing wrong with 2nd hand hardware. Stay true to yourself, stay off consoles and you'll make it.

It has no bloom, HDR, motion blur, or depth of field. You see those effects everywhere because they're easy to implement and drop your framerate (gotta make ultra seem better somehow), but they just make everything look like a blurry fucking mess.

Jesus Christ, talk about cherrypicking. Not saying most "Ultra" settings don't look like shit, like chromatic aberration and sometimes even a bad AA implementation. But texture quality and draw distance are NOT something that you really should be doing without. You're also ignoring native resolutions entirely. At a low resolution, of course it's going to be impossible to resolve a difference between low and high settings.


With online games the higher the fps the better because it means faster response times. Even if those response times are negligible at best you need all you can get if you want to press that trigger precious milliseconds before your enemy.

Dumb question because you're already asking PC players specifically who already have the choice to fine-tune whatever they want, and
get the fuck on my level m8 I have a 144hz monitor.

But yes, framerate is objectively more important than graphical fidelity.

Game developers need to step up their LOD game and make more clever use of texturing. Basically, make all future games with the FOX Engine.

Tell that to game devs that run the physics engines off the GPU

server tick/fps, some games are capped even as low as something like 20 tick on a server.

TBH it depends on the game. But my PC is good enough to run all games at high settings at around 150+ FPS. Only redditors like 60 FPS.

How about C: Fuck your false dichotomy and get 60+ FPS AND ultra settings at the same time.
Unless it's Fallout 4, then you can't do full ultra unless you have SLI'd Titans, because godrays on ultra are Clear Sky-tier.
Or Gamebryo/Creation games in general, where you need to cap it at 100 to prevent the game running in super speed mode.


Either Tim Sweeney left after UE2 or he forgot how to make a good engine, because UE3 and 4 are SHIT.

Sounds like a case of sour grapes.


That's nothing to brag about and has zero effect on anything. There's probably someone out there with a core 2 quad and an 8800GT trying to justify his use of decade-old hardware because he can still run games at 640x480 low.

It's at 1920x1080, but sure, whatever you say fam.

...

Ultra settings at 60 fps faggot

I just reinstalled Dark Souls 1, applied DSFix but as usual capped it at 30 for the sake of consistency and to avoid glitches… I can't play it anymore, it hurts my eyes.

Even if your computer is capable of playing at ultra and 60+ FPS, you also have to consider consistency

Would you rather have 60 fps with occasional dips to 40 fps or just a constant 30 fps with no inconsistency at all?

I really encountered only one glitch during my last DS 60fps play-through.

Dips
Constant 30 is rare as fuck as well.

I got one important one… jump distance and pyromancy.

...

You know, I thought at first you guys had some sort of RAID setup to get a drive bigger than 1TB, or you were ballers or something with 2TB drives

But I looked it up and 2TB HDDs are a thing for less than 100 bucks now. I mean fuck, this shit evolves fast

Anything less than 60fps is shit.


I got a 5TB HDD for less than $250 this spring. Storage is rather cheap.

Why would anyone proudly show off their mistake? You can get a great 1TB SSD for $300. You don't need more than 1TB unless you're a filthy data hoarder who never cleans up their drive.

Enjoy needing to replace your overpriced shiny garbage every 3-5 years and having a perpetual record of everything in it in the event you get arrested for wrongthink. I'll keep my 8TB of encrypted data that has mostly disappeared from the internet over the years of censorship and corporate takeover.

At this point HDD storage is just too fucking slow. I would be able to run GTA V perfectly fine on ultra if not for the HDD bottleneck. Even with 16GB of DDR4 memory the model pop-in becomes way beyond acceptable when I'm driving too fast

Depends on the game, for something like the Total War games I'd rather have a more graphically impressive spectacle of my hundreds of heavy cavalrymen plowing through my enemies.
I would even go so far as to say that in Arma 3 I'd rather have a superior rendering distance and more detailed terrain (so that even at 3-4km distance, soldiers don't end up UNDER where the ground is rendered) than 60fps.
It all depends on how much the gameplay benefits from a higher framerate.
For the vast majority of games however, I'd definitely go with higher framerates over graphical fidelity.

...

High to medium. Smooth gameplay is important.
Plus, I'm not really someone who gives that much a shit about graphics.

The problem with that image there, I believe, is effects. Bloom, depth of field, and that sort of shit make stuff look worse than it actually is for no real reason.
I mean, hell, in this, I'd make the case low looks better.

Original Doom remakes can have hundreds of thousands of demons.

I need socialism so I can listen to the Mona Lisa on my PC while eating an apple in a cell phone factory.

No shit, midgen upgrades are DLC/microtransaction-type kikery and I've seen faggots here defending it.
It's making these shit-tier consoles look even more like PCs, and yet these fucking retards will be buying parts to build a console one day and still refrain from building a PC.
Brand loyalty is a fucking disease and most people are fucking retarded.

You would have a point here if the console business were about profit through HW sales, or if support for the midgen upgrades were sold as DLC. Neither is the case.

...

23 fps because human eye can't see more anyway

Both with dynamic resolution scaling and async time warp.

The problem is that while deferred shading simplifies rendering to one shaded fragment per pixel, it makes it impossible to do proper transparency blending or edge-based anti-aliasing. That ugly checkerboarding is the only way to fade in higher quality view models without doubling the amount of draw calls. NuDoom only managed to introduce its hybrid forward/deferred/clustered rendering solution last year, and even that ended up using checkerboarding for the holograms.
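
The checkerboard/screen-door trick itself is dead simple: compare the fade value against a repeating threshold pattern and just drop the texels that lose, so no blending is ever needed. A toy version, assuming a 4x4 Bayer matrix:

# 4x4 Bayer thresholds; (value + 0.5) / 16 maps them into (0, 1)
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_visible(x, y, fade):
    """True if the pixel at (x, y) should be drawn for a fade level in [0, 1].

    Instead of alpha blending (which a deferred renderer can't do cheaply),
    each pixel is either fully drawn or fully skipped; the pattern of
    skipped pixels approximates the fade level.
    """
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return fade > threshold

# At fade = 0.5, roughly half the pixels in every 4x4 tile survive:
visible = sum(dither_visible(x, y, 0.5) for y in range(4) for x in range(4))
print(visible, "of 16 pixels drawn")  # 8 of 16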

framerate uber alles

I'm actually seeing some really low quality shit in some threads today. What retarded shit happened on the internet now?

How about Ultra settings and no fps cap?

60 fps always. Smoothness of the game is far more important than looks.

enjoy your widely variable fps dips and screen tearing up the ass