I just got a GTX 1060

What should I do with it?

Mine buttcoins with it

Trash it and play shit that doesn't need an insane graphics card.

Hotglue it

Return it and slap your own shit for wasting dosh on a graphics card that you don't need.

STICK YOUR DICK IN IT

Throw it away, just like you did your money.
Buy a real video card instead.

Throw away the entire computer and buy this.
raptorcs.com/TALOSII/

1080s are like 300-500 on Amazon. That's insane. No game worth shit needs that.

...

They used to. There's just been nothing good from AAA in a long time and indies can't into graffix.

You could have bought something useful with that money.

Wait, I read it wrong. It's a 1060, not a 1080. Still, 200-300 for a graphics card is a lot, and I wouldn't pay that, especially when a 400 dollar laptop plays everything good anyway.

If 30 fps is "good" then you might as well buy a console.

take a shit, scrape it along your asscrack and post pics

You sound like a BR. My monitor alone was $800.

Learn to english, retard. I used good as in "good games", not as an adverb. If I'd meant that I would have used the word "well".

Unironically can't think of anything. Every game I play can be run on a toaster.

Oh if by good you mean old and emulated games, then my point that you wasted your money stands even taller.

...

I can use a computer for other things too. There is just no point in buying anything more than a laptop because I can play everything good on it.

BR.

...

...

Play Doom 4 at 140 FPS maxed out

...

...

Play EDF 4.1 OP, it's pretty good

Is there a single modern game that needs a high-end graphics card? I thought all PC games were tied down to console level anyway, aside from shit that is coded like garbage, in which case it doesn't even matter how good your CPU/GPU is. A computer I built like 5-6 years ago for $700 with outdated hardware not even on the fucking shelves can run most games at a reasonable FPS, although I don't play AAA garbage.

shove it up your ass then jack off on it

Use it to crack hashes.

Play shit with it, duh

I used a high-end graphics card on a modern game, but it was to crack the Steam IDs in their "anonymized" telemetry (toy sketch below).
Other than that, GTA Online was fun with a 1080 until I discovered the hard way people weren't making up that "banned for no reason" stuff and I was banned.
Really, the money is better blown on a ThreadRipper 1950X so you have good single core for Factorio and Dorf Fort.
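
For the curious, hashing an ID barely counts as anonymizing when the ID space is small enough to just enumerate. A toy sketch of the idea; the SHA-1 hashing and the ID range are my assumptions for illustration, not the game's actual telemetry format, and a real run would use a GPU cracker like hashcat rather than Python:

```python
import hashlib

STEAMID64_BASE = 76561197960265728  # SteamID64 = this base + 32-bit account id

def crack_hashed_id(target_hash, start, count):
    """Brute-force a hashed SteamID64 by hashing every candidate account id."""
    for account_id in range(start, start + count):
        candidate = str(STEAMID64_BASE + account_id).encode()
        if hashlib.sha1(candidate).hexdigest() == target_hash:
            return STEAMID64_BASE + account_id
    return None

# Example: hash a known ID, then recover it by brute force.
victim = hashlib.sha1(str(STEAMID64_BASE + 123456).encode()).hexdigest()
print(crack_hashed_id(victim, 0, 200_000))  # finds 76561197960389184
```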

:^)

I still can't believe that game isn't multi-threaded. I haven't tried Factorio yet. Lately I've been in the mood for a game like Dark Souls. I just want to run around and smack things, but I also don't want the game wiping my ass, and I'd like half-decent graphics. I've been emulating everything from DOS to XP-era games on Wine and I'm starting to get sick of polygons.

...

Very guilty of this.

Hollow Knight is an OK Dark Souls-like experience. Didn't really catch me but a lot of anons like it. And Factorio is fucking fantastic if you were into the autism that was modded Minecraft, as it's optimized to ridiculous levels and can scale to huge megabases. Takes a lot of effort to hit FPS/UPS death.

On the surface it doesn't even look that hard to multithread Dorf Fort; there are a trillion things that can be run in parallel and already happen in parallel in-game anyway.
t. pissed off he had to overclock his CPU to 4.7GHz to delay FPS death on a FUN world and just had it creep back in.
I'm thinking about pumping up the BCLK a bit and setting my 1600MHz memory to 1866MHz while keeping the (relative) latency the same (napkin math below).
I actually broke a 10+ year overclocking hiatus for a game that has text for textures.
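
The "keep the (relative) latency the same" part boils down to absolute CAS latency in nanoseconds. Napkin math, with CL9 as an assumed starting timing rather than the actual kit:

```python
def cas_latency_ns(cl, transfer_rate_mts):
    """CAS latency in ns: CL clock cycles at half the DDR transfer rate."""
    return cl * 2000.0 / transfer_rate_mts

print(cas_latency_ns(9, 1600))    # 11.25 ns at DDR3-1600 CL9
print(cas_latency_ns(10, 1866))   # ~10.7 ns, slightly tighter
print(cas_latency_ns(11, 1866))   # ~11.8 ns, roughly the same absolute latency
```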

Overclocking used to have some crazy bullshit you could do.

Ghetto overclocking has always been the best.
I did it all. This kind of autism was why I told myself to stop.

What?

I came very close to having a custom waterblock milled so I could fucking watercool my workstation ThinkPad because Lenovo can't be bothered to give a fuck anymore. The heat sink and pump would have been mounted to the lid.
Then I stopped moving around so much due to a new job so I finally got a real computer instead.

Basically, in the past there have been CPUs whose overclocking lock was really just a missing pin.
You could defeat it by sticking anything conductive and pin-shaped into the spot with the missing pin.
Intel also used to do stuff like this: overclockers.com/intel-lga775-pad-modding/
The FSB frequency on those CPUs was set by the voltage levels on a few contact pads, and once people figured out the encoding they started modding their CPUs to get higher FSB frequencies.

Smash it and use the shards to cut your wrists up you braggart.

Mine Ethereum for a month to make some cash, then list it on eBay as second-hand, barely used, for even more cash.

Mordheim is good, Styx is good.

Could probably do it for $600 and it'd still run everything there is.

Play a current and demanding game like Nier:Automata
Realize that you bought a weak GPU
Upgrade to 1080

Download Nvidia Inspector. Set AA to "override", AA mode to "32xS" and TAA to "8xSGSSAA". If your framerate still doesn't tank, use DSR to enable resolutions higher than your display natively supports and choose the one that's twice as wide and high as your display's native res. Congratulations, you're now playing with 128x supersampling in realtime. To put that into perspective, some Pixar movies are rendered with 64x supersampling.

The best looking ones tend to also be well optimized but if you've spent the money then there is no reason to settle for anything less than 60FPS/1920x1080 at Ultra.

Misleading thing to say, since they are rendered with raytracers, something not even 1000 GTX 1060s would be able to do in realtime at that scale.

Nvidia is completely retarded when it comes to naming their anti-aliasing settings, and your math is wrong.

A few things to point out: Nvidia calls one of their transparency supersampling settings "SGSSAA" or "sparse grid supersampling". It's actually "RGSSAA", which means "rotated grid supersampling".
Different kinds of filters are used when downsampling the higher resolution framebuffer; when not specified, the filter is generally a linear filter in 2 dimensions, i.e. a bilinear filter (because of course your monitor isn't just a single row of pixels).
Pixel grids can be rendered in different patterns; generally they're done in an ordered grid pattern, which looks just like the way your pixels are ordered on your screen. So when someone says "SSAA" they mean "ordered grid linear supersampling", when someone says RGSSAA they mean "rotated grid linear supersampling", and when someone says gaussian SSAA they mean "gaussian ordered grid supersampling".
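As a minimal sketch of the plain case described above, ordered-grid supersampling downsampled with a simple box/linear average; the 2x2 factor and NumPy are just for illustration, not what the driver actually does internally:

```python
import numpy as np

def downsample_2x2_box(hi_res):
    """Average each 2x2 block of an ordered-grid supersampled image (2H, 2W, 3)
    into one output pixel. A box average is the simplest linear filter; a driver
    could just as well use a gaussian or other kernel here."""
    h, w, c = hi_res.shape
    return hi_res.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# 2x2 OGSSAA of a 4x4 target: render at 8x8, then filter down to 4x4.
hi = np.random.rand(8, 8, 3)
print(downsample_2x2_box(hi).shape)  # (4, 4, 3)
```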

SGSSAA is when you supersample but your pixel grid is sparse. How they're actually sparse can be randomized or follow a pattern; there are some snowflake names based on how exactly you make your sparse pattern, but discussing it is irrelevant because sparse grid supersampling is useless and nobody ever does it. SGSSAA only really exists in theory because it's completely retarded to do and use, and Nvidia doesn't do it despite naming their option after it.
RGSSAA is when you supersample but your pixel grid is rotated. Picture it as if someone drew a square and used a corner as the base instead of a side, the way people generally draw other lozenges. The idea is that it'll anti-alias vertically as well as horizontally, whereas an ordered grid generally anti-aliases horizontally more than vertically. It's then mostly preference.

The 32x mode in Nvidia's settings isn't 32x of any kind of AA. It's 8x MSAA with "24 coverage samples". I put "coverage samples" in quotes because Nvidia's explanation of what exactly CSAA is makes no sense, but it is what they say.
You'll be running the game with 16x transparency supersampling (2x ordered grid gaussian supersampling x 8x rotated grid linear supersampling) and 16x hybrid sampling anti-aliasing (8x ordered grid linear MSAA x 2x ordered grid gaussian supersampling + 24 of those coverage samples).

It's still very heavy and effective, but it's nowhere near the "128x supersampling" you talk about.
If you're going to do heavy anti-aliasing on Nvidia, I recommend you instead use multisampling and transparency SGSSAA (as Nvidia calls it) set to the same number. There's a bug in the Windows Nvidia drivers that makes the entire scene be supersampled, instead of just the transparency, if you set MSAA to the same value as whichever of the transparency supersampling options you pick, and then it adds MSAA on top. The story behind it is pretty good: people found out about the bug and started using it because they liked the look of RGSSAA, Nvidia went and fixed the bug, but people asked them for an option to do what they had been doing, and instead of adding one Nvidia just added the bug back instead.

I don't know why I autism'd over this but there it is.

blow it out your ass

throw it into the same trash bin you left your brain in

Fun fact of the day: only a few scenes of Cars were rendered with raytracing, to get the reflections right; otherwise it's simple rasterization like in vidya. The remaining difference is that they do most computation per vertex instead of per pixel, so they need vastly higher polycounts to reach sufficient quality.


Only 4x could be argued to be a rotated and scaled grid; both 2x and 8x MSAA/SGSSAA use a sparse sampling pattern. 4x is implemented the same way as 2x and 8x, it's just that the best sparse sampling pattern for 4 samples resembles a scaled and rotated OG. If you want a true RGSSAA implementation you need to hunt down an ancient S3 DeltaChrome GPU; those are literally the only GPUs to ever use a rotated OG for AA, which is why the step above 4x on an S3 DC is 9x instead of the usual 8x. Well, I'm not sure how it was on the even more ancient R200-based Radeon 8500, maybe that's a second case of a true rotated ordered grid.
No. 32xS is a hybrid mode of 8x MSAA combined with 2x2 OGSSAA, delivering 32 color samples in total. Upgrading the MSAA samples to SGSSAA via the TAA option delivers 32x SSAA, of which the upgraded 8x MSAA part delivers a SG. Combining that with a further 2x2 OGSSAA step, by using a virtual resolution twice as high and wide as the native display res, you really do get 128x supersampling: a 2x2 OGSSAA of a 2x2 OGSSAA of an 8x SGSSAA image. Also, I know that story very well, as I was a part of it. Damn shame 3dcenter is a shadow of its former self, we got so much shit done in the good old days.
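
For anyone who wants to tally that breakdown, the factors multiply out like this (just restating the post above, labels informal):

```python
sgssaa_samples = 8   # the 8x MSAA part upgraded to SGSSAA via the TAA option
ogssaa_32xs    = 4   # the 2x2 ordered-grid supersampling half of 32xS
ogssaa_dsr     = 4   # 2x2 downsampling from the doubled virtual resolution (DSR)

print(sgssaa_samples * ogssaa_32xs)               # 32  -> "32x SSAA"
print(sgssaa_samples * ogssaa_32xs * ogssaa_dsr)  # 128 -> the 128x supersampling figure
```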

Very interesting. Was Cars an exception, or are they moving away from raytracing in general? What about the movies with a lot of fur, subsurface scattering and so on?

All I know is that's the first time they raytraced anything at all.

Maybe you could max out Crysis. Give it a try.

What you do with every $400 GPU, extra 8GB of RAM and new $300 CPU.

Browse imageboards because there sure as fuck aren't any games for the PC.

Try to run Doom 2016 at 1440p at 144fps on a $600 box.
A streetshitter box can technically "run" everything, but at console quality. Who would buy a PC to run games like that?

You can try modding Skyrim Special Edition and maxing out all textures to 4K or 8K.

Why the fuck would you want to play nudoom?

Because it's a good game, if played on Nightmare, where it reduces the hand-holding.

Can anyone explain this mining bullshit? I don't get it at all. Please explain it like I'm a complete retard.

So this is why new GPUs and CPUs still come out. I'm okay with it.

Alright, that's probably–
why? Most monitors are 1080p@60Hz, unless you pay out the ass for a 144Hz monitor; that money could have gone into your budget for a better graphics card now, and have it last even longer if you actually play AAA games.

Imagine you could make money by doing literally nothing. It's everyone's dream I'm sure.
With cryptocurrency it became a reality: there are tards who will pay you to waste computer time.

if you mined bitcoin for 2 hours years ago you'd be a trillionaire by now but you didn't so now you better jump off a bridge

Terrible explanation. Cryptocurrencies have complicated math behind them, and require that math to be done in order to function. The ones who do the math are paid for their efforts with a little bit of the currency. You don't get the money for free, since doing these calculations pushes your hardware to its limits and consumes a lot of power. Mining used to be the exclusive domain of chinks with purpose-built hardware and access to very cheap energy, but there are new currencies with new algorithms where an everyman's GPU does the task pretty well, and there are no purpose-built ASICs yet. You can legitimately mine ETH in the west and make a profit with almost any GPU.
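
A toy sketch of the "math" in question, hash-based proof of work. This is a simplification of Bitcoin-style mining for illustration, not how Ethereum's actual Ethash algorithm works:

```python
import hashlib

def mine(block_header, difficulty_bits):
    """Find a nonce so that sha256(header + nonce) falls below the target."""
    target = 1 << (256 - difficulty_bits)  # smaller target = more work on average
    nonce = 0
    while True:
        digest = hashlib.sha256(block_header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # valid proof of work; whoever finds it gets the reward
        nonce += 1

print(mine(b"example block header", 20))  # ~2^20 hashes on average
```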


Kek. What is this shit feature doing here in the first place by the way?
I can say that high refresh rates really are worth it; they make a significant difference even just in desktop use. It's one of those things that you can't possibly evaluate without seeing and using it.
If you skip the gsync meme, they are not that horribly expensive. I got a sweet 1440p 144Hz IPS 27" monitor for around 550 eurodollars.

Being tsundere aren't you?

I guess you're right, just wanted to give a clean explanation without the cynicism on top.

You can get nice 1440p 144hz IPS gsync (yes it matters) monitors, and they're fucking amazing. They're expensive, but the point of a PC build is to have higher quality than a console, not find a price point that matches it.

No it fucking doesn't. I have a gsync laptop and I can't tell if I have it on or off. Also, buying gsync, on top of the price premium, harshly limits your choices to very few monitors, all of which in the 1440p 27" range have terrible panel issues, like the Dell or the Acer Predator one.

On Nvidia cards you could overclock with a piece of pencil lead. You used it as a resistor.

The main point you should keep in mind about gsync and freesync is that they matter more at lower refresh rates than at higher ones, which makes the available monitors utterly pointless. Say a monitor has a max refresh rate of 75Hz: there gsync/freesync would make a real difference, because judder from lower framerates, say 50fps, would be really bad and look horribly unsmooth, as every even frame gets displayed once and every odd frame twice. Map the same framerate to a refresh rate of 144Hz and the judder is 1. almost gone and 2. so small it becomes practically imperceptible.
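
A rough way to see those judder numbers: with plain vsync on a fixed-refresh display, each frame stays on screen for a whole number of refresh intervals. A small sketch (render time and buffering ignored, 50fps assumed as in the example above):

```python
import math

def hold_times_ms(fps, hz, frames=12):
    """How long each frame of fps content stays on screen with plain vsync
    on a fixed hz display (each frame waits for the next vblank)."""
    refresh = 1000.0 / hz
    shown = [math.ceil((i * 1000.0 / fps) / refresh) * refresh for i in range(frames + 1)]
    return [round(shown[i + 1] - shown[i], 1) for i in range(frames)]

print(hold_times_ms(50, 75))   # alternates 26.7 / 13.3 ms -> every other frame shown twice
print(hold_times_ms(50, 144))  # mostly 20.8 ms with an occasional 13.9 ms -> far smaller steps
```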

Son, I got a good enough FreeSync monitor, 1080p, for $150. That extra $400 could have bought me a Vega 56, or upgraded the CPU, or bought an SSD, or just flat-out saved me $400.

Listen, I get you don't want to have substandard experiences in games, but at some point, your idea of a "standard" is far removed from "beating consoles" and is well into "overkill".


I've found that I was less blown away by FreeSync than I had hoped, but removing screen tearing in MAME games that run at a funny refresh rate makes it nice. Considering GSync is the Nvidia equivalent, I don't see why I would want to pay out the ass for a "neat" factor rather than just stick with what I've got, which is good enough. I don't have experience with 144Hz so I can't say if it's stupendously worth it over 60/75Hz.

I agree that on a budget you shouldn't put a lot of money into the monitor, but I've had my 1080p 60Hz screen for more than 7 years; I don't think it's unreasonable or overkill to upgrade at all. Let it sink in how old the 1080p 60Hz standard is; every other device has made gigantic leaps in resolution, but the PC mainstream stays locked to it.

sell it and give the money to your local natsoc party

they need all the help they can get

...

Pretty pleb of you tbqh famalam.

Buy shitty french games

...

Patrician taste, user.

Then you're just unobservant, user. Either that, or you've fucked up and have it doing vsync. Tearing is very visible in FPS games.

The issue is your turning speed, and unless you're a Polygon reviewer you turn fast enough at 144Hz to create quite dramatic tears. It's not solved by a high frame rate.

Then enable vsync and make sure it's double-buffered vsync, not triple-buffered vsync, to avoid lag. At fps = max refresh rate, gsync/freesync behaves exactly like double-buffered vsync. Fastsync, aka true triple-buffered vsync, is even better, as it allows fps to rise above the display's max refresh and doesn't require an overpriced gsync monitor.

Whenever it dips below the refresh rate, that causes a dramatic shift in FPS. It's why we have gsync. It just stays smooth.
dohoho, enjoy your jitter, poorfag.

return it and get an actual good videocard
t. 1060 owner

With fps below max refresh, fastsync behaves like classical triple-buffered vsync and therefore worse than gsync/freesync, that's true. It also doesn't mean shit at 144Hz, where the remaining judder is at worst 1/144th of a second. If that's worth the added cost to you, have fun with your overpriced monitor. It's still a mostly imagined benefit, though not quite vacuum-sealed gold speaker cable tier, as there is a real difference, just an imperceptible one. It'd be another thing if we were talking about ultra-high-res displays where the limiting factor is the connection between GPU and display; there gsync/freesync is a worthy addition to a monitor, as it helps achieve smooth frametimes within a limited refresh maximum. But that's not the typical gsync/freesync monitor; those are overpriced placebo 144Hz rockets.

How would you know, you've never experienced it as you're a poorfag. As someone who enjoys the finer things in life, let me tell you that it makes a huge difference. When NVidia updates its telemetry/driver and it screws up the gsync settings it's immediately obvious.

Well, you got reduced to ad hominems as you don't have any real arguments left, I got trips of truth. Enjoy your placebo rocket.

dohoho

TRIPS VS TRIPS

WHO WILL WIN IN THE MONOLITHIC BATTLE OF INTERNET FAGGOTS THAT EVERYONE ELSE IN THIS THREAD IS TIRED OF

Get a PS3 controller and emulate any PS2 game you want. 360 controllers may not offer the best experience when playing some PS2 games that heavily depended on the analog buttons of the PS2 controller.

Did you mean "monumental" or do you just like misusing words? Oh, right. You're retarded. Sorry