Anybody know any games that can take advantage of my 1070 that aren't shit?

Why even buy one if you didn't have plans for it? You should have known better and waited until the industry gets unfucked.

What said. It isn't even the best purchase you can make in the price range. Stop being a stupid faggot.

75% off, couldn't resist.

old games

Just play Crysis at maxed out settings and pretend it's 2007 again.

Mass Effect Andromeda

Doom

dishonored 2 on medium settings

Emulate WiiU games in 4k.

Where? How?

designed to run on a 386, just a smidgen too old to 'take advantage' of a 1070.

AMD aint perfect but at least they keep their standards open, why not go with them?

Just play good games, most new games are shit.

I wanted to create a thread, but then saw this one, so not to shit this place up I'll post here.

Are there anons with either 1080 or 1070? I know internet is filled with detailed analyses and comparisons and shilling and all sorts of shit, but I want to hear from somebody with personal experience.
Do you or do you not regret getting either 1070 or 1080?
Did you have to compromise at any point in any game?
Did you ever wish you bought another one?
Does a new card give you any advantage in playing older games, emulating, or running virtual machines?
I only care about video games, so I don't need whatever benefits are there for video editing or rendering or whatever.
Are there any real video game disadvantages of using 1070 instead of 1080?


user, with all due respect towards fellow anonymous poster, AMD is just not up to scratch and is just AS jewish.
And while the fact that most games deliberately fuck with AMD and their drivers is pure evil, it's not like you can deal with it in any way other than getting an nvidia card. Or write your own drivers, I dunno.

Buying Nvidia always means you need to upgrade sooner.

AMD drivers really aren't that bad anymore.

That doesn't answer any of my questions.

If you use real operating systems, the AMD drivers are actually better than the Nvidia ones: the open source AMD driver beats both vendors' proprietary drivers and the open source Nvidia driver (which is still surprisingly good nowadays).

AMD is nowhere near as jewish as Nvidia though. All the tech they invent is open standard and their equivalent to Geforce Experience isn't literal spyware unlike Nvidia's.

My answer was directed at none of your questions, but instead directed at your statement of AMD having bad drivers.

Literally the only thing a card like that is good for is GPGPU.

I own a 1070 and I'm very happy with it so far, I haven't encountered a single issue. If you're considering an upgrade I'd either go with the 1070 or wait for the 1080 Ti.

...

Did you play any of the recent AAA games? I know Holla Forums tends to hate on most of them, but they serve as good benchmarks.
Do you emulate shit?
Do you play at 4k/120+ fps?


There is still hope for man.

You ignorant fuck.

AMD tends to have better free games bundled with their cards.

lmgtfy.com/?iie=1&q=good games for 1070

It's Mass Effect Androgynous

Dark Souls 3, Titanfall 2, and what I played of Witcher 3 all ran perfectly on ultra at 1080p 60fps; I have no input on higher resolutions or refresh rates. As far as emulation goes I think that's more dependent on the CPU. I do have a 144hz monitor coming in Friday, I think, so I could try some benchmarks.

...

That seems like overkill if you're just doing 1080/60, I've got a GTX 760 which is pretty old by now and I run DaS3 on max settings with no issues as well.

I think the only time I'd get a card that ahead of most games is if I wanted 4K/144 or to use it for things other than games like rendering.

Dipping to 40 fps is not "no issues" user.
I can play Automata with 760 at stable 30-40 fps, but that's not "no issues" either.
It's time to upgrade, and the question is whether 1080 gives any real benefits over 1070.

Well it's fine on mine, maybe you have a limiting factor other than your GPU

also

also
getting a 760 was a yuge mistake that I deeply regret, the next time I find a good game that I can't max out at 60 I'm definitely going AMD.

This. Try the PS3 emulator as well

Just did a PC build two weeks ago and did a shitload of research on this, user. I went with the 1070. Why?

>no real games worth playing that need something this beefy anyway, so it's mostly an indulgence

Just go with the 1070. Take the extra cash you'd save and put it into a SSD for games you want to play often and for the OS; you'll see a much better improvement there.

Then you aren't playing it "maxed".


Isn't it just a meme? Does it really improve anything?

Sure it makes a huge difference, user. No mechanical components on SSDs, faster read / write / seek times, etc. I've noticed a huge difference in load times on my OS and in big games like GTA:O.

Just get a cheap, 100GB SSD for your OS and a bigger one (250GB+) for your favorite games.

Holy fuck, are you serious? They take loading times in games down to nearly nothing, and read/write speeds max out what SATA can do. They are a total game changer and you seem pretty dumb calling SSDs a meme.
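
If anyone wants numbers instead of anecdotes, here's a rough sequential-read timer you can point at a big file on each drive (a quick sketch, nothing official; note the OS page cache will inflate results on a re-read, so use a large file you haven't touched recently):

```python
import time

def read_speed_mb_s(path, chunk_mb=4):
    """Time a sequential read of the whole file; returns MB/s.
    Caveat: a recently-read file comes from the OS page cache,
    so benchmark a large, cold file for a meaningful number."""
    chunk = chunk_mb * 1024 * 1024
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    elapsed = time.perf_counter() - start
    return total / (1024 * 1024) / elapsed
```

Run it once against a file on the SSD and once against one on the HDD and the gap speaks for itself.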

...

You don't want to store shit like pictures or important documents on an SSD because unlike a hard drive, when it fails it just dies completely and you're completely fucked if you want to get anything off it.

Yes, this is a very good point. Keep all of your movies, pictures, animu, porn, and important stuff on a normal platter drive. Basically, SSDs provide speed, but they come at the cost of total data loss in the event of a hardware failure.

For mine, I just use them for the OS and certain Steam games; stuff I can easily replace if the drives shit themselves.

Should've thought about it before buying, good goy.

What is an M.2 SSD?

What about hybrid drives? Those seem like a better option.

You can play old games and supersample the fuck out of them on a high res monitor, other than that there aren't any.

I have a 1070

I do not regret getting it.
I have had to compromise in Rainbow Six Siege because I want 120+ FPS at 1440p. But you will almost always get 60 FPS at 1440p at max in games that aren't future-proofed.
I don't wish I bought another one.
It does give advantages when emulating harder to emulate consoles. AFAIK it doesn't matter with older games and VMs.

It's a good card, I got mine for about $320. I had been using a 570 SC for four years and then a free 670 FTW for eight months prior.

Stolen silly faggot
Here in taco country they steal these things plenty. Even my 1060 was raising eyebrows and getting mean looks when I bought it.
People are starved like that.

For 1080p/60hz vidya, you really don't need anything more powerful than a 970.

I still have GTX780 and play all relevant games at 60fps 1080p, so who gives a shit? Most AAA is garbage and the games which are less garbage have no problem running full-speed (like MGSV).

M.2 is actually a newer and more capable connector that replaces the SATA connections we all use now to hook up most things in our computers, like hard drives and disc readers.

It can be used for a lot of things, but the main thing M.2 is used for is SSDs. An M.2 SSD is similar to a normal SSD except in a much smaller form factor, and it's capable of even faster speeds if it's an NVMe drive running over the slot's PCIe lanes (an M.2 slot can also just carry SATA, in which case it's no faster than a regular SSD).

If you have a motherboard with M.2 on it, it's probably worthwhile getting an M.2 SSD instead of a standard SSD, because it would be faster and take up less space for a similar price.

Hybrid drives are just a standard platter HDD with a small cache of flash memory on it. It basically behaves like a normal HDD, but files that get written to the flash cache will load quicker. Overall it isn't really comparable to an SSD, but they could be more capable in the future as it gets easier for manufacturers to put larger portions of flash memory in a hybrid drive. Right now hybrid is mostly a meme, but if they start putting gigabytes of flash on them you could get SSD-like speeds more often, which would be cool.
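
To put the "faster speeds" talk in numbers: the interface ceilings alone tell the story. Back-of-envelope, using the published link rates and encoding overheads (SATA III is 6 Gb/s with 8b/10b encoding; PCIe 3.0 carries ~985 MB/s per lane after 128b/130b encoding):

```python
# SATA III: 6 Gb/s line rate, 8b/10b encoding -> ~600 MB/s usable
sata3_mb_s = 6e9 * (8 / 10) / 8 / 1e6

# NVMe over PCIe 3.0 x4: ~985 MB/s usable per lane
nvme_pcie3_x4_mb_s = 4 * 985

print(f"SATA III ceiling:        ~{sata3_mb_s:.0f} MB/s")
print(f"NVMe PCIe 3.0 x4 ceiling: ~{nvme_pcie3_x4_mb_s} MB/s")
```

An M.2 slot wired for SATA still hits the same ~600 MB/s wall as a 2.5" SSD; only an NVMe M.2 drive gets the PCIe ceiling.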

(780 is also basically just a tad beefier version of 980, so that you know)
I had it before, but it got fucked and replaced by 780.

The 1060 is basically a cheaper 970. If you get the 6GB version, you are pretty much set for mods and high resolution texture packs too.

Wrong.

How so?

Oh fuck, I mean 680.

My point still stands though

Well fuck, 1070 + a couple of SSDs seems like a viable alternative to the 1080.
I just fear that when I get a 144Hz monitor my games won't be able to perform at that framerate.

I have a 1070 and can play games 1080p @144hz just fine

I sort of regret it only because my Motherboard+CPU is underpowered in comparison, so my build is inefficient.

However, I got it 10% off on Black Friday so there's that. In addition, I'm being told that even though my build is inefficient, it'll last me a few more years or so before I should upgrade the motherboard/CPU.

What games?

Well, that question rules out ps3, ps4 and psvita emulation.

It doesn't matter because your brain can't process anything higher than 60, don't buy into the fucking myths. Anyone who actually bought a 144hz monitor is the same kind of dipshit with a collection of Betamax tapes sitting on top of his HD-DVD player, right next to his N-Gage. Idiots who buy shiny things because they don't know any better deserve the rope.

Lucky shitskin. Can't say that often.

Everything, you will have to adjust graphics settings for some of the more intensive games though.

Nigger I know for a fact that my brain can discern between 60 and 120 fps.
So fuck off.

Btw my full specs, if you're curious: i5 6600K @ 4.5GHz, 32GB RAM, 1070 with a modest 300MHz OC (have the FE edition).

Remember when your brain could only process 30fps? It's always changing

Bloch's Law says you're full of shit, and that was made by actual scientists and not some autist on a Tibetan moving tapestries imageboard. Bloch's Law applies even MORE for the subset of gamers that those snake oil monitors are made for: the FPS crowd who spends most of their time staring at the middle of the screen. Read a book sometime, you mongoloid fuck. Or don't, I've got stock in Asus and I am loving this free equity courtesy of idiots like you.

Yeah sure some random ass "scientist" is more important than my personal experience lol.
Fuck off.

This. I get triggered when I notice anything dipping below 60fps.

What an embarrassing post.

Far Cry 4 was breddy gudd. Also MGSV, Doom, The Witcher 3, Dark Souls III, etc. There's lots of shit, nigger. Pick something.

Hahaha holy shit, 144tards everybody. They write your jokes for you. Lemme dunk on your stupid ass while I'm at it. The foveal region of your sight (that's the front, most important part, since I am sure you didn't know) has been proven to only function at 25hz thanks to the vanRullen Report. Your peripherals can extend to 90hz. Neither of those come even close to 144hz.

Hey buddy post your portfolio, I want to avoid putting money into anything you've ever thought was a good idea.

Posting smug niggers doesn't change science, shill. Go sell your vaporware somewhere else.

Post your portfolio, bitch.

I can tell you what I didn't invest in: a shiny new useless monitor that slams my power bill. Why does that make you so upset? Don't take your poor financial decisions out on me, bro. Take those 25hz peepers and Google yourself some of those papers I posted about. You'll be talking with the rest of us grownups in no time, sweetpea.

Fuck off, retard. Air Force testing showed that frames as quick as 1/250th of a second were noticeable; there is no FPS limit the human eye can detect.

Look, I just want to use your talent at being superhumanly fucking wrong to help my dad retire. What's wrong with that? Just post your portfolio.

Oh god, someone save this thread. This is some good shit. Post like five more times, I wanna see what other funny shit you admit being wrong about.

Hey taco user, you should start up a racket smuggling jacked graphics cards over the border.

Uh… it's detected images that flash on screen for 1/250th of a second, that doesn't mean 24fps you fucking dipshit

Man, it is really easy to get you upset, cupcake. Please allow me to continue doing so.

1/250th of a second means nothing by itself, but you are completely ignoring the logistics of sight. Your eyes are like a camera, and their aperture only works to a certain point. Thanks to scientific research we have discovered the most comfortable point for panning and stationary is 24hz, which is reflected in the fact that many movies and TV shows are made in 24p, which is 24 frames a second, each one exposed for, wouldn't you know, 1/250th of a second.
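
Since everyone is throwing fractions of a second around, the per-frame times at the rates being argued about are trivial to check (plain arithmetic, nothing vendor-specific):

```python
# Milliseconds per frame at the refresh rates in this thread
for hz in (24, 30, 60, 120, 144, 250):
    print(f"{hz:>3} Hz -> {1000 / hz:6.2f} ms per frame")
```

Worth noting: a 1/250 s exposure is 4 ms, shorter than even a 144hz frame (~6.94 ms), so shutter speed and frame rate are not the same quantity.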

I don't think you quite realise who's upset right now.

...

Nigger what? The XX70 is literally the exact same fucking thing as the XX80 but a lower binned chip with some cores disabled.

Pulling shit out of your ass I see.

...

FFXIV

ProjectCars is glorious in 1440p @ 144fps
if you didn't get a monitor with that card you are out of luck

The observations Bloch's "law" is based on would still leave us with over 1000 fps.

Technically it is, because there's nothing competing against it except a 980 Ti, if you can still find one somewhere.

hahahaha

Nah I don't do that stuff, but know where to buy them stolen though.

This is the most intensely autistic argument I've witnessed on Holla Forums in quite some time. Keep going: why stop now, girls?

Yeah autism this autism that, beech.

OH SHIT SON

Are there really niggers that can't see a difference beyond 30 FPS?

...

Shit b8 m8.

Local retailer

You can't win no matter what you do, faggot.

There is an instant noticeable difference between 60 and 59.99. It's bizarre but things look far less smooth when they move when it drops below 60.

that's because of shit vsync implementation.

You can fix it by overriding the game's internal vsync most of the time.

Or just get a gsync/freesync monitor and never think about it again like I did.

If you're gonna spend $500, why not just get the 1080 TI for $700 just in case you decide to get a higher resolution monitor or VR later?

but he didn't spend $500

If you re gonna spend 500 why not spend 700, why not 1000? Save money later by spending it now!

My latest GPU upgrade was from a 460 to a 770, which I still use, and my rule of thumb is that I upgrade whenever graphically demanding games start to list my hardware in the minimum requirements. Now Nier Automata listed a 2GB 770 as their minimum, and while they're just retarded japs who can't into optimization, it shows the rest of the world's devs aren't too far from doing the same.
I'll probably go for another upgrade once a 1170 or equivalent comes out.

...

Did you upgrade from a fucking ATI Wonder to that while trying to run PS2 emulators?


I don't struggle with 30FPS. Anyone that says 30FPS is unplayable is probably on the same tier as "muh realistic grafix".
You can see a difference when you do a side-by-side of 30 and 60, however. 30FPS stable with not a single drop looks fine, and 60FPS dropping down even by a single frame will really jar your eyes.

...

If we go by basic math, then a moment lasting 1/250th of a second means you can fit in 250 of those moments, meaning it would be 250fps.
t. retard

...

Top-grade, weaponised autism there, friend.
One could assume that, if you can fit 1/250th of a second into a whole second 250 times, it would actually be 250FPS.

Anyways, the human eye has no legitimate cap when it comes to FPS. There are known stages where things get smoother and what-have-you, but since the eyes are, and I'm dumbing this down, analog rather than digital, there's no legitimate end to what the eyes and the brain can process. You can keep splitting a second down to smaller and smaller fragments.

...

Sorry, all good games can easily run on integrated graphics. Dedicated GPUs are only good for heating your room and letting everyone know that you have shit taste in videogames.

Starting to suspect you're just a faggot that gets triggered when other people have nice things. Pic related is my current set-up and everyone that's played on it loves the 144hz main monitor and the 120hz TV. They go on and on about how "crisp" it looks and how there is less input lag. They mostly play /fightan/ on it but the jump in quality is noticeable even with a locked frame rate of 60fps. A lot of it boils down to less input lag compared to the consoles they play those games on, or the fact that I edited config files to remove built-in lag in games like SF5. Also, since they use consumer sets locked at 60hz with no ability to turn off post-processing (no game mode), it would feel much better even if I hadn't taken the time to dial everything in.

FPS games are even better since I don't play those with vsync. The difference in those on the 144hz monitor is massive, especially when it's combined with other features of that monitor like being able to dial out all the blacks (no more dark corners to hide in). Games like Super Meat Boy become even more fun with the frame rate locked to the hz. Ever play it at 144fps? I doubt you're good enough to make it past the second world when it isn't running at its native frame rate.


Betamax was better than VHS so I own one to transfer old/rare movies that never got a re-release on a good format. It's also fun just to play around with. It was my Dad's so I doubt I'll ever get rid of it.

What a lewd sentiment.

Same, running the 770 and it's still mostly fine but the VRAM has made good progress and it's getting too small for the games that do need it.
Considering to just go all out and get the 1080Ti

They don't, unless you're playing at 4k or something. It's just lazy devs.

Wut?

Yes there are games that do need more than 2GB of VRAM for large scenes let alone if you want some edge smoothing.

The biggest irony, in my opinion, is that ever since devs stepped away from the ancient 360 and PS3, optimization has basically vanished. Lots of games were playable on 3-4 year old PC hardware, later on even 5, when suddenly 4 GB VRAM is the new 1 GB VRAM and requirements went through the roof for frankly very little visual gain.

no
Come on now, that's just adding stuff on top. In fact the way MSAA works is rendering the depth and stencil buffers at a higher resolution, and SSAA is just rendering everything at a higher resolution.
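
The VRAM side of that is easy to estimate: SSAA at scale N multiplies the pixel count by N², and the color buffer with it (a rough sketch; real renderers also carry depth, stencil and intermediate buffers on top of this):

```python
def color_buffer_mb(width, height, scale=1, bytes_per_pixel=4):
    """Size of one RGBA8 color buffer rendered at scale x the output resolution."""
    return width * scale * height * scale * bytes_per_pixel / (1024 ** 2)

base = color_buffer_mb(1920, 1080)      # ~7.9 MB at native 1080p
ssaa2 = color_buffer_mb(1920, 1080, 2)  # 2x SSAA = 4x the pixels, 4x the memory
```

Which is why supersampling old games on a beefy card chews through VRAM so fast.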

Yeah, that's for sure. I think it has less to do with the actual frame-rate difference and more to do with stuff like response.

Much shorter flashes can be as visible as to leave clear afterimages. Case in point - lightning bolts, 30 microsecond average, 30 000 FPS equivalent, still clearly visible.

Also daily reminder that 24 FPS is the lowest framerate that the human eye still considers a moving picture rather than a rapid slideshow.

It's called microstutter. You notice it because the motion is no longer smooth. You probably couldn't notice it on a freesync monitor, but I wouldn't know.

Actually that's around 12-16FPS, which is why silent movies were shown at that rate. The film industry only switched to 24FPS as a side-effect of optical soundtracks requiring the film to move fast enough for comprehensible sound when "talkies" happened:
straightdope.com/columns/read/973/why-is-the-action-in-old-silent-movies-so-fast

As for the 25p/50i PAL/SECAM framerate, that had to do with the mains electrical frequency in most of Europe being 50Hz.

In other words, 24FPS has no inherent significance whatsoever.

That's some >implying owls are birds shit right there

amd would be cool if they could release a single card that can compete with the higher-end nvidia cards but they'll forever be a poorfag/budget pc part maker