Fucking nvidia

so it looks like nvidia is only going to be optimizing for DX12 on their new cards

the 960 Ti in SLI wipes the floor with the GTX 1080, but ONLY when you don't turn on DX12


Why haven't you gone to AMD yet?

I'm so glad I went back to playing old vidya, best decision of my life

...

Isn't DX12 for both Xbone and Winshit10?

Go eat a bag of dicks, nvidia

yes and yes

there's no reason to port between xbone and windows 10 because

even games ported from xbone to windows 10 run like ass and are borderline unplayable.

I feel like vulkan is going to come out on top, because having your game accessible to more platforms brings in more cash, and making lots of cash with very little effort is a good way to win.

I feel like I've seen these two (is it technically two?) giving someone a blowjob

someone should post that

...

Unfortunately, at least going by Steam's charts of most used OSes, Windows 10 is on top.

I wish it wasn't, but tons of people are using it.

But yeah, at least performance for all DX12 games is complete shit, and no dev that hasn't been paid by microsoft is going to be dumb enough to use DX12.


I've seen that, but I don't have it

because windows 10 has been forced upon plenty of users

What's the point? What was the last good game that required a powerful grafiks kard?

console ports, which make up 80% of the PC game library.

crysis

nice

Try again, now with a bit more thinking.

don't be silly

...

Crysis 1.

here's your reply

my 770 plays every recent console port fine tho. like, i can't think of a game I want to upgrade for unless it's like Planet Coaster which has a lot of objects and shit in it. and even then in beta it's been running fine.

thanks, but I'm still waiting for your game. Crysis was nearly 10 years ago. Surely someone must be making major PC games that people actually care about.

Child, we're not talking about pc vs consoles here. Leave.

because AMD has shit drivers. games are broken for weeks before they patch shit. fuck that company. i hate both AMD and nvidia

you were the first one to mention consoles m8, tell me, what are you upgrading your PC for, why should this new hardware entice you

This kid is dumber than Will Smith's son.

why would anyone not upgrade to Windows 10 when it's free?


just dual-boot Win10 and your favorite linux distro and only use Windows for gaming. it doesn't matter if it's spyware if you have literally nothing to hide.

He's right about the demanding part. Some games a PS4 and Xbone can run are unplayable on PC even on low settings.

Here is y-

o_O

So you actually buy modern games for $60 dollarydoos on release?

Because if you weren't a top-tier goy, waiting a week or so for a patch wouldn't affect you in the slightest.

Why would any company give you something for free? These people are not your friends, they want to make money.

its "free"

techraptor.net/content/windows-10s-next-update-doubles-start-menu-ads

get ready for start menu ads, nigger. can't use ad block on them, they got you by the balls.

Demanding and 'made by technically illiterate mexicans' are two different things.

it's actually just a piece of shit that sucks to use even without all the spying

console ports are PC games, you dumb fuck

muh free windows 10

Console ports are garbage and not demanding, billy.

oh, I guess arkham knight never happened. Or any of the major PC releases of the last year

nice opinion

enjoy playing modern games at 15-20fps instead of a silky smooth 40-50fps lmao

Alright, I'll bite

>Red Orchestra series (don't actually buy Tripwire)
A small taste of recently released games, excluding older titles that consoles will never get and soon-to-be-released ones.


Christ OP, this thread is really fucking pointless.

where's the good, finished games man
where are they

Keep trying.

they're part of the OS, you can't disable them. outside of not updating your computer

you cant do that on windows 10

...

Jesus Christ user, apply yourself

...

What super needy games do you play, user?

Those are indeed finished, baiting-kun

0/10
Where are you retards coming from?
I'm just going to ignore you, this is beyond retardation.

AHAHAHAHAHAHAHA

nigger, apart from there being like 40 players online right now, how is just one finished map even close to a finished game

Am I still on 4chan/NeoGAF?

kek, to think my resolve was weakening and I was going to throw up my hands and get an NVIDIA GPU for my next rig.

God has intervened to keep me on the straight and narrow.

We never left it seems.

Yes.

nvidia should invest in some actual video games journalism that'd tear shit games apart because right now there's no reason to get any new gfx card and won't be for a long time.

stellaris wasn't finished though lol.

i mean, not trying to rain on your parade cause i play pc games too, but most of what you list are not finished or were so shit that they're still patching them.

sshhh don't run counter to the pc gaymen narrative, pc gaming is the savior we all need, heil reddit and praise be to gaben le master race

there are no flaws if you don't recognize them! remember the mantra

Game getting patches = unfinished
Okay fam

My 670 and 3570k still kill all modern games worth playing.

Thanks consoles

...

that's not what i mean and you're just being contrarian to prove me wrong or something.

Sword of the Stars 2 is a good example of a game not finished and being so shit they had to patch it for years and it's still not really there. Red Orchestra 2 was shit on launch, and is still not very good. The "patches" for Stellaris will be dlc for features that should've been there to begin with, like trade or espionage. X-Rebirth is another.

lol you're legit retarded.

next you're going to tell me ubisoft games are finished products cause at least they came out on the day they said they would kek.

user that isn't a very good example.

That guy didn't even mention any swords of stars, buddy. Nor x-rebirth.
You sure you're in the right place buddy? You seem lost.

No, I don't talk about ubisoft games because they're shit. Try harder, chromosome hoarder.

he's not the one fanatically defending pc gaming like some redditor.

How is it even possible to fuck up ports in this generation?

I can understand all the previous generations, where you had crazy hardware architecture on consoles, like a 7 core PPC processor.

But the xbone and PS4 both use X86 AMD CPUs with integrated graphics. They're literally just PCs with a bare bones OS that checks for specific encryption keys before any code is allowed to run. Porting to PC should be a straight copy-paste job at this point.

Oh hey, you're the batman arkham knight guy, how's it going, still stuffed with tyrone's dick?

the topic of the conversation was "big finished games on pc" lol. there's not that many of them currently coming out.

it's a problem on console too, but instead of admitting it and being angry at devs you rather be a shill and defend horse shit like a fanboy while arguing with me. so i don't know what you want me to tell you tbh fam.

And one of his examples was Stellaris to which you responded with "nu-uh because they'll dlc it!!"
As some retard say around these parts
lol

Jesus Christ you are pathetic.

2 victory conditions that are only militaristic, a boring mid-game and basically 3 exactly-the-same crises in what's supposed to be a grand strategy 4x doesn't equal finished lol.

In what world does "wipes the floor" entail "5-10% better, for $400 more, nearly twice the power draw with all of the disadvantages SLI brings to the table, and only in DX11".

I already pre-ordered two GTX 1080s and upgraded to Windows 10 so I can play games like Rise of the Tomb Raider, Fallout 4, and the new DOOM at a blistering fast 60fps.

Oh buddy, you misunderstood me, possibly because of your mental retardation.
There's no golden gaming utopia right now, hence why I'm baffled by people who want to upgrade their pcs.

You should have just said "I am a consumer whore", it would have been shorter.

And you know they're gonna dlc it because your dad and his sister who's also your mom work at nintendo.

Take a game that is designed to run at

or I could have said "I am rich, you are not, therefore I can buy whatever I want and experience the finest things in life".

stay mad poorfags.

Uhuh. That explains why you're furiously defending your PC gaming waifu's honor and autistically insisting that not only is upgrading a PC not required but that people should subsist on the same games they've already beat a million times before because nothing that requires a GPU is any good anyway.

The finest things in life indeed.

all platforms have their problems, but unfinished, technically crippled games aren't much of an issue on console. Generally, if you're making an exclusive, the console mfg gives you tons of support to make sure the game looks great and runs well on their system, with as little technical issues as possible. No one wants a big title to represent their system poorly. It's a much more PC centric issue.


because having a dedicated hardware platform really, really matters. It also seriously helps to have someone who can support and fund you to make the game play well on their system. This means devs focus less on technical issues, and more on making their game. Fewer obstacles, more production. PC is a different story; the audience is also significantly smaller. If something goes wrong and your game doesn't run on 10% of potential players' machines? Not the end of the world, a patch will be done in a week or two. 10% of a few hundred thousand isn't a deal breaker. On consoles, on the other hand, that 10% is 10% of several million. A much more sizable and profitable player base.

This is all very logical, common sense stuff. It's something that tossing a $300 GPU and $250 CPU at won't fix with pure horsepower, and something that does hurt people's interest in PC gaming

Shit nigger, if you want to upgrade your pc to play more amazing AAA titles then go right ahead, don't forget to pre-order too.

actually I'm dual-booting Win10 with Ubuntu and only use Windows for gaming, so there is literally nothing to spy on. it's like going to an arcade for a few hours per day and crying that the security cameras there are spying on you.

I know how to disable the ads. I do it by not installing windows 10.

do i need to remind you colonial nations was part of a dlc for eu4? lol

what is bethesda, ubisoft, ea, and others for 10 alex? you're as bad as the pc fanboy lol. enjoy your bloodborne and the order tho lol.

enjoy never EVER experiencing the superior DirectX 12 gaming experience. keep clinging to your Vulkan meme lmao.

nvidia is doing vulkan tho.

Ok, I will.

ok

but most games aren't doing vulkan. even emulators aren't. look at Dolphin, the Wii emulator. huge performance boost with DX12, but it doesn't support Vulkan at all. :^)

in fact here's a video that nvidia did of DOOM with vulkan. i remember above you saying you wanted to play doom at blistering fast fps right?

i already have a gamecube tho.

er i mean wii and gamecube, i forget dolphin does both.

good, you can sell them to help save up money to buy the GTX 1080 and emulate Wii/GC games at 4K resolution with 8x AA and other enhancements thanks to DX12.

but i'm not a poorfag like you, so i don't need more money. i can just buy the console and play it there lol.

A fool and his money are soon parted.

why use Vulkan when I have Win10 and the game runs better with DX12?


lmao

What are you, a faggot? Just run a hardware accelerated VM inside linux using pcie passthrough.

Is 2e8353 just baiting or is he for real?

I can't tell

it's just mario and shit lol. i just have them for multiplayer mario kart and stuff. why would i care about having anti-aliasing with no multiplayer support when i rather just want to hook up multiple controllers and get drunk? lol. i'm not poor enough to not have friends like you.

lol lol

yea lol, ikr?

damn this game actually looks great, why does Holla Forums always shit on it? is it because it's not cracked yet and Holla Forums is full of poorfags who are mad they can't buy it?


why would anyone do that when dual-booting gives better performance?

inb4 more epic linuxdrone botnet memes


casual detected. also if you are rich you don't need to have friends. all peasants have friends so they can share the few possessions they have.

It's obvious bait user

It's more likely that he's one of those memer kids of average intelligence who made a simple mistake by having a bad opinion on the internet, but due to his limited mental capacity, is unable to see past the argument in front of him. This causes him to spout more bad opinions that he likely doesn't even personally agree with just to defend himself, which is hilarious considering this website is anonymous and there is no such thing as a personal attack on here.


Yeah, like 1-2fps difference, Wow.

Enjoy the reply.

more like 5-10% better performance, and it is far less of a hassle to set up than using a VM and all that passthrough shit.

lol. wizard detected.

Come on now, I had a good chuckle at that

The Windows Store is fucking aids DRM. It treats programs like proprietary closed-off phone "Apps", and none of the Xbox One DX12 games that have been ported are optimized; they're a broken mess.

Because the games are "Apps" you can't access or modify game files, and Microsoft enforces their own approved settings for you, such as;

It's just Windows Live 2.0, why would I support this?

what i think is really funny is that people bitch about how origin, steam, uplay are all shitty drm.

leave it to microsoft to outdo all of them and do something that is actually worse.

video games dying is pretty funny to me tbh.

nice try linux shill

Wait, what?

I'm a poorfag in the process of putting a computer together for the first time. I got all the essentials, but was waiting on a videocard when this was announced the day before I put my order in. My last videocard was a GTX 295, so almost anything is a step up, but I was excited to hear about the performance leap this had made.

Do you have to have Windows 10 to benefit from DX12? If not this, what should I get? I've heard of the Vulkan thing, but am not familiar with it. It's just a DX alternative?

Halp.

You are better off asking somewhere else, this thread is full of baiters baiting baiters

You really shouldn't buy a newly released card. Honestly a 980 is still way overkill.

If you actually want to play shitty console ports, which are mostly terrible, anything from a few years ago will be more than enough.

Don't fall for Nvidia jewings.

My interest in overkill was actually in wanting to teach myself 3D modeling and animation.

t. Someone who has never pirated a Windows Store exclusive game on Windows 10.

Oh dude, I'm currently using a laptop my uni gave me with a 2GB GTX 860M and using Modo and Zbrush really easily.

As a beginner, which basically means the first 5 years you'll be doing 3D modeling, you won't need anything stronger than a mid range card. Especially since these programs are actually very well optimized, unlike console ports.

You don't need much for 3D modeling or animation, because it's all done at low resolutions.

It's the rendering that you need power for, but even then, better hardware just improves completion speed and saves your personal time.
It's not GPU based anyway, it's all CPU and RAM speed.

Huh. I see. I'm glad I don't have to worry on that front, but for the sake of argument, let's say I wanted to future proof for games. I'm looking at the NewEgg page for the 980, and it's not that much cheaper than the 1080. (Which will apparently be $600.)

newegg.com/Product/Product.aspx?Item=N82E16814487084

Is there something fundamentally wrong with the 1080? Someone mentioned something that loosely implied that it required Windows 10, which is an instant no-go for me.

This guy is COMPLETELY full of shit, pay no attention to him. Don't get a videocard yet though, wait for the new 16nm cards to come out.

Technically, the 900 series supports DX12, but it lacks async compute features that are necessary for the mad gains in FPS inherent to DX12. There's also the fact that SLI is shit.


It's about $600.
The benchmark is 4K. Almost no configurations can run modern games at 4K 60FPS. 1080p 60fps, on the other hand, will definitely be easy to achieve.

There's nothing wrong with the 1080, no. Sure it's overkill unless you're wanting 4k gaming, but if you're only on 1440p or 1080p with a 144hz display, you're golden with the 970 or 980

Got to love how all AMD tech is open source so everyone can use it to optimize their shit, while JewVidia is all locked down so they can fuck over AMD.

Say, I recall someone saying JewVidia sued AMD over their hair tessellation code because it made CrapVidia cards run like shit. You know, the same shit they do to AMD all the time. Anyways, is there any truth to this?

that would only be good in their lesser models, no one will buy their 1080 card to play 1080p games, the 970 does that and even then it can do fine in most 1440p games.

native and always max settings @ 1440p will however be the territory of a single 1080

OP's post was mediocre, but goddamn, this thread went to shit fast.

Yeah, that's where I see this card getting used the most. I might just get one at some point because 1080p 144fps makes my dick diamonds and is super easy on this card according to some benchmarks

The problem with Nvidia is that they push the devs to release shitty ports to force you to buy a new card every few years to play everything at max.

If you bought a 1080 it would be old in 2 years already.

There's no such thing as future proof for ports. For anything else on PC, a 980 is way overkill.

user some people aren't shills, they just have very very shitty opinions and standards.
Look at Nu-com 2

Arguably GTA V

other than that I'm gonna have to say Crysis 1

You mean NuDoom?

but GTA 5 was extremely well optimized, it even ran at playable framerates on integrated on-die apus.

...

I'm pretty sure that was a photoshop.

That would be unwise considering there are hardly any DX12 games.

...

He's likely a shill by M$ to convince people to upgrade.

There's been rumors some execs at Redmond are pushing hard to hit the 1 billion device install target to get bonus money.

What's funny is the adoption rate has fallen and W7 is still on 50% of windows machines.

...

bullshit. Substance Painter is a GPU, RAM and CPU hog.


just get a 1080, or a 1070 when it comes out. You will need a lot of ram. I've got 16GB and I still run out if I have, say, Max, Painter, Firefox and Photoshop open at the same time. AMD cards are good, but since I use Painter I need an nvidia card to use iray (nvidia's GPU based raytracer). you might want to wait for the new version of the Titan, which will probably be better for modeling and shit, but it'll be ridiculously expensive.

I can't imagine how salty these guys are. SLI hasn't worked properly ever except for a couple of specific games. what a rort.

...

It's been 8 fucking years can you shills not give it a fucking rest?
especially when goyvidia regularly releases drivers that literally kill older gen cards so the good goyim will upgrade

AMD employee, the point is amd has shittier drivers.

>>>/oven/

Amd is indeed shit, you heard about that corrupted cursor? That shit still happens on amd drivers and I doubt they are ever going to fix it.

I hate my amd card so hard, I consider buying a nvidia card, but I honestly don't want to support those dickheads either.

Just this year nvidia released "game ready" drivers that made the game run worse (DS3), and several unstable non-beta drivers. They're both shit. However, looks like AMD is the better value for money shit this time.

So Way of The Samurai or the upcoming EDF are not good?

Even the poorest people have friends user.

First time I ever heard of this.

I had only one amd card so far, and with that one I got that error. And when I google it I can find people from early 2000 having the same problem and also very recent threads about it.

Can it be at least a steady 38 fps? Like it will never drop below 38fps? Just curious.

Not calling you a liar, just never heard of it and I owned AMD cards since early 2000.

You did this to yourself, I'm still running a 670 and will be replacing it with a 1070 because I only upgrade when there are die shrinks.

...

Can confirm corrupted cursor is an amd thing and they are never going to fix it. Whenever I get it unplugging and reconnecting my second monitor seems to fix it.

I wasn't saying never buy any GPU ever, and like it or not nVidia's jewish anti-consumer practices have made them the best option for many games as they are optimized for nVidia. What I'm saying is, when you know DX12 and Vulkan are coming, don't buy a new GPU.
By redesigning the core they can physically optimize for these after the APIs are released. Clearly they can't do it before they have the APIs, so why would you expect a core designed before DX12 and Vulkan existed to perform as well as cores designed specifically to be optimized for these APIs?

Use your fucking brain and wait for major changes before upgrading if you want your hardware to support those major changes.

No asshole, it's called Legacy Code. Something that NVidia is phasing out all the time so that it can stop slowing down modern cards. Because they don't need that special shadow technology that they invented for Doom 3 10 fucking years ago, now they have a new hardware implementation of shadows.

The reason that the 1080 gets better performance than the 980 is simple: because HBM, because they've improved the features supported in hardware (again), and because they don't force newer cards to brute force render features that used to be hardware-supported, so they swap out the code from previous versions of a technology for one that works best for the modern hardware.

This is why older cards benefit most from using older drivers, because you want to use the driver that best supports the hardware virtualization that your GPU can support. Right now the 900-series is the premiere platform, and the 700-series is not, so if you're updating your drivers on a 700-series GPU you're actually reducing your performance because they're removing 700-series hardware virtualization to get ready for the release of the 1000-series.

AMD does this shit too, but less people notice because there's less AMD users and the AMD SDK is a bit lackluster. You know why AMD cards outperform NVidia cards for some games? Because NVidia's SDK relies on hardware driven shaders (which is where AMD cards falter, and then some faggot claims NVidia is "jewing AMD") and AMD relies on brute force rendering (which is where you render all effects manually, resulting in graphics that don't look as good because there's no hardware to take advantage of).

When it comes down to it, AMD is still the choice of people who want to use things like OpenCL, or want to use older API versions like DirectX 9.0c/OpenGL 2.0.

How the fuck does this card only use a single 8-pin power connector?

HOW?

I have to use it temporarily for work and it's so bad I want to die. I'd rather install gentoo.

you only upgrade when its the biggest waste of money?

16nm core (so less cooling needed), efficient cooler fins, and it's a stock card. Someone will probably release a version that requires 2 power connectors that doesn't use stock settings.

forgive my ignorance if this is some Freetard meme but

yes you can?

Eh, it's his money, let him throw it away if he wants to. Although to be fair, if he's still using a 600-series card, he's in sore need of an upgrade anyway.

When they shrink the die they have to use lower voltages which generally results in lower current and lower wattage.
That is why new CPUs and GPUs are using less power than previous generations.
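
A rough way to see why (the standard dynamic power approximation; this assumes switching power dominates and ignores leakage):

P_dyn ≈ a · C · V² · f

where a is the activity factor, C the switched capacitance, V the core voltage and f the clock. A shrink cuts C, and voltage enters squared, so dropping V from, say, 1.2V to 1.0V already cuts dynamic power to (1.0/1.2)² ≈ 69% of what it was, before any capacitance savings.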


I ran my 9800 XT for years and I have gotten years out of my 670 GTX, I don't see any reason I won't get the same life out of a 1070 as it's optimized for DX12 / Vulkan and they will be the best APIs for the foreseeable future.

Not through the control panel like with every other program.

I actually know a guy who works at Microsoft. He only really stays there because the money is really good, he doesn't really like them.

He said the reason they're doing such a heavy-handed approach to making people use Windows 10 (and offering it for free) is because they're trying to make their OS into a services platform. The more people on your platform = the more money you're going to make. That's why DirectX 12 is W10 only, that's why the Microsoft Store is being pushed so hard.

They do sort of track you, but only for the use of targeted advertisements. They don't really give a shit about collecting your information for the government.

They removed a lot of user control mainly to make it retard proof. How many people do you know who can't into computer and completely ruined their system because they messed around with settings they weren't supposed to?

I mean it'd be stupid as fuck to remove them without a browser already installed, but you can.


See, that's exactly why I stopped using Win10: they don't have a power-user mode anymore, it's now rounding off the corners for stupid. And I don't want that. I want to be in control of my desktop, from customization to installing a third-party driver without needing to put my computer into developer mode to do it.

I mean, they might add that in the future? You never know, man, if enough people want it it'll probably happen. Companies are motivated by money, they won't make the move unless it's there.

If they add it in the future, I'll consider going back to Win10. Until then, I'm good. I don't care about DX11 or DX12 since there aren't enough games using them anyway.

How reliable are nvidia cards compared to sapphire cards? The price of the 1070 is really great for me, but I'd like to be sure I'm getting something as durable as my old sapphire radeon hd 5770 (like 7 years old) that never gave me any kind of problem. I was planning to replace it with a newer sapphire card, hoping it could be as durable as my old one.

You're half-right

another reason for pushing a free Windows 10 was to make it easier to give Chinese Tablet makers free upgrade licenses after how poorly received Windows 8 was

I've only ever had one video card die on me in my lifetime, and that's because of a faulty water cooling system. But otherwise, I've never had a GPU die, using either NVidia or AMD.

So I wouldn't worry about lifetime of your GPU unless you're overclocking it, then you're on your own.

I can only tell you about my 670 which is 4 years old and 2 of those years I have been NEET playing 8+ hours a day, it also spent a couple of weeks mining crypto.

...

Did you get any buttcoins?

I mined shitcoins for awhile then built a 3x R9 270 miner for shitcoins, I paid off the build and ended up ~$300 up. That said I made much more by trading shitcoins on exchanges.

No fucking kidding, huh? It's like DirectX 12 relies on Windows 10, you'd think these people- OH WAIT, IT DOES.

Very nice.

Whether Windows 10 spies on you or not, and I really don't fucking think it does, it's definitely a PR disaster Microsoft should address but probably won't

If you are considering getting into it I highly recommend poloniex.com/ as they are the only exchange I know of that lost coins due to a technical error and actually paid back every cent.
XMR has a bright future but don't expect anything major short term

Nope, I'm not using an AMD card, and I don't need the money. The last time I tried bitcoin farming, I lasted all of like 3 hours before I killed it because I didn't like my GPU being so hot all the time, there's no way I wanted to keep that up for the weeks it would take to finally earn a bitcoin.

I'm not saying you should mine, just buy a little BTC and move it to an exchange to start trading between shitcoins.
After paying off my build and having a party I had 0.5 BTC left, which I moved to an exchange; that is now 6 BTC just from buying low and selling high.

How much are microsoft paying them for this? Because there's no rational universe in which nVidia is doing this off their own backs.

Ah, that is pretty decent. Especially with bitcoin currently having a value of $455.91 USD right now. Anything you're going to do with the coins eventually?

I'm just going to keep building it up. if I can get to the point I'm making ~3 BTC a week I'll go "professional", read: self-employed NEET geek

...

...

The 8-pin power connector on a graphics card is intended to pull up to 150w from the power supply while staying within the ATX specifications.

The PCI-E slot itself pulls up to 75w to power devices. 225w is more than enough for the GTX 1080 which has a 180w power target for stock operation.

6 pin power connectors can draw up to 75w FWIW, so a single 8 pin is really no different to two 6 pin connectors, which is what the GTX 980 and GTX 680 used. So it's not really new for Nvidia or GPUs in that sense.
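
Quick sanity check on those numbers, as a minimal Python sketch (the constants are just the spec limits and the stock power target quoted above):

# power budget for a single-8-pin GTX 1080
PCIE_SLOT_W = 75     # a PCI-E x16 slot supplies up to 75w
EIGHT_PIN_W = 150    # one 8-pin PEG connector is specced for 150w
SIX_PIN_W = 75       # a 6-pin PEG connector is specced for 75w

available = PCIE_SLOT_W + EIGHT_PIN_W     # 225w total
target = 180                              # GTX 1080 stock power target

print(available, target, available - target)   # 225 180 45 -> 45w of headroom
assert 2 * SIX_PIN_W == EIGHT_PIN_W            # two 6-pins (980/680 style) = one 8-pin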

Vulkan might spell the end of DX (and by extension, microsoft lock-in), and it needs DX12 level hardware (not software), so they might also have that in mind.

If I got to the point I was making 3 BTC a week, I'd just keep making 3 BTC a week. That's $1367.73 a week, or $5470.92 a month. As long as you're not buying big flashy things, no one needs to know how much money you're making since you can spend BTC without the government needing to know.

Microsoft has been doing this forced upgrade trash since day 1. DX10 was vista only.

Just wait for the inevitable HBM2 version that will almost surely release in 2017 you stupid fucking goys.

Also who gives a shit about DX12, UE4 has already added Vulkan (which means anyone using it, AKA a lot of devs from indie to AAA, can release a Vulkan build easily), so I imagine the rest will follow soon. What retarded developer would limit their game to Win10 when Vulkan does the same and runs on most things?

AMD has an exclusivity deal on HBM2 until at least 2018. NVidia can't get their hands on it.

In the context of the post it's the same fucking thing. If you want to play PC multi-plats you need a top of the line card because no one optimizes for PC anymore. It was pretty clear what the guy was saying but you dumbshit PC only casual fuckwads love to play like retards to prove non-points so you can argue how much superior your shit taste in vidya is to other peoples shit taste in vidya.

AMD has priority access with Hynix, not exclusivity, let alone exclusivity over HBM2 (which Hynix isn't the only producer of, for example Samsung is doing it too).

Alright, but there's the other factor which is that HBM hasn't provided any noticeable performance improvements in currently seen GPUs.

If you want a GPU with HBM on it, NVidia has the Tesla series. But what we saw from AMD's HBM offerings is that it really didn't do anything to improve their framerates, they still got BTFO on benchmarks. You can claim "THIS IS BECAUSE NVIDIA IS SABOTAGING AMD!" (which is ignorant and stupid to say), but overall if AMD's offerings of HBM units can't get their performance into noticeable levels of difference, what fucking hope do you have that HBM will in some way improve graphics card technology at all.

I mean sure, it's feasibly possible that with HBM we can see GPUs reach 32 and 64gb of RAM, and smash through the 8K barrier. But for actual performance, HBM does nothing. If you don't intend to play games at 4K or better, then HBM is a waste of your money, and yeah it'll definitely cost more than GDDR5 does.

Also I just looked, AMD's next generation of cards, the 400-series isn't going to use HBM AT ALL. Apparently HBM is already fucking dead, the 400-series of AMD cards is 100% GDDR5.

don't even waste your time user, everyone who browses Holla Forums has no idea what they're talking about when it comes to tech

This is what you get for using dual-GPU laptop that switches them on the fly.

all the power it needs you vaignal wart

I can tell that goyvidya lasts a long time and i only ever changed them because they were outdated; AMD processors and AMD/ATI gpus on the other hand have failed multiple times. However i still have an all-in-one wonder that ran on full load for 6 years, is stored in the shittiest place possible and still works like new. Sapphire is the best AMD partner and their quality is much better than stock though

nigga windows has been spying on you before you were even born

nvidia has been in bed with openGL and khronos for a long time, but i think they would rather play safe and play with the consumer's butt with dx12 this time

GDDR5 has been on its last legs for a long time, however performance-wise something like HBM is really only of use if you're pixar or something. The speed, while rarely a problem in video games, is welcome, and the capacity (assuming companies won't just use it as a cheaper memory) is a great addition for higher resolutions. a thing about HBM is that it decreases the complexity of the hardware you make immensely while performing better and keeping shit like temperatures and power low; with smaller transistors and new architectures we will for sure see a huge spike in performance soon enough.

Also, HBM cards will probably perform much better with supersampling anti-aliasing

hbm2 was having yield issues, currently expected in late 2017

What are you jabbering about? The 400-series has been out for more than a year now.

I turn AA off always.

So what does this mean, are they going to release reballs with HBM2 on them, or are they just going to hold off on using HBM2 until the 1100/500 series?

Ignore that, I'm retarded. Or from the future, pick one.

holding off yeah

For normalfags? I doubt it. Normalfags accepted Windows 10 with open arms.

Yeah there isn't going to be another civil war against tyrannical government. Not when youth holds that kind of pathetic attitude.

I would like to remind people that nvidia has/is going to have forced registration for driver updates:

extremetech.com/gaming/216285-nvidia-announces-new-streaming-options-twitch-integration-future-game-ready-drivers-will-require-registration


>In the future, only GeForce owners who both install GeForce Experience and register the service by providing Nvidia with an email address will have access to Game-Ready driver downloads, which will be pushed exclusively through GFE.

I would also like to remind you that Windows 10 being a 1984 spyware OS is not a 'meme' no matter how much you try to call it that. You can quite literally go to the Microsoft website right now and read the Windows 10 privacy policy and see for yourself. It's not a lie, it's not tinfoil, it's not a meme and it's not a conspiracy theory. Just go read their own policies listed on their own website.
And no, you can't 'turn it off'. Not only does the os quite literally tell you that you can't turn certain things off, and that some of the ones you can will just turn back on automatically anyway, the fact of the matter is that the source code for Windows 10 is not available, thus you have no way to verify anything that Microsoft says. You have no way of knowing if you're turning anything off.

Nigga WAT
Additional bandwidth is the absolute last concern you have when using SSAA; you're confusing it with MSAA. MSAA eats bandwidth for breakfast, lunch, dinner and midnight snacks. SSAA hits everything equally hard, with the only exception being some parts of geometry processing.
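
To put rough numbers on that (a back-of-the-envelope Python sketch; it assumes a 1920x1080 target with 4 bytes/pixel of color and 4 of depth, and ignores the compression tricks real GPUs use):

# framebuffer memory for no AA vs 4x MSAA vs 4x SSAA at 1080p
W, H = 1920, 1080
BPP = 4 + 4                          # RGBA8 color + 32-bit depth, per pixel/sample

no_aa = W * H * BPP                  # ~15.8 MB
msaa_4x = W * H * BPP * 4            # 4 stored samples per pixel -> ~63.3 MB
ssaa_4x = (2 * W) * (2 * H) * BPP    # render at 2x2 resolution -> also ~63.3 MB

print(no_aa / 2**20, msaa_4x / 2**20, ssaa_4x / 2**20)
# the buffer cost is about the same; the difference is that classic MSAA
# shades once per pixel while SSAA shades every sample, so SSAA multiplies
# the shading work on top of the memory/bandwidth hit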

...

But what if you like the shaft up your ass?

Do you even know what the fuck you are talking about, nigger? MSAA just takes multiple samples for each fragment to determine what fraction of it is going to fill the pixel, it has precisely dick to do with bandwidth and only tanks processing. MSAA scans over the picture as a single texture of those dimensions and applies some filtering, it has minimum impact on either processing or memory bandwidth. And finally SSAA is simply rendering at a higher resolution and downsampling it to the output resolution.

Then we'll gladly fuck you in the ass if you stop voting.

FXAA.

...

Okay. But the fact that Linux's source code is available to you is not much more availing than Windows' source being unavailable. The source code for an operating system is in the millions of lines of code, if not billions, and that's just for the kernel.

And if you do find something that looks questionable, you'll probably not even realize it is questionable and dismiss it on the virtue of "I don't know what this does, but it doesn't LOOK like it's doing anything I don't like, therefore, it's probably fine."

And then you can enjoy the community of GNU assholes who shit on everything if you're not using the latest and worst implementation of something because they have a televangelist bug up their asses against anything even the slightest bit closed source or proprietary where they want everything to be made FOSS so that it's easier for hackers to find the holes and exploit them in software…

Windows 10 is awful, but it's better than Linux for a variety of reasons, and everything on the market is better than OS X.

I'll vote for whoever senpai wants me to if senpai gives me his cock

MSAA isn't post process AA you retard. Where's the depth test in your description how it works? Where are the multiple color buffers needed to hold the samples? MSAA isn't FXAA.

no, they report the bugs, patch the holes.
like what difference does it make

Wow, you really are naive.

Daily reminder that not putting your real full name as a nickname whenever possible on the internet is breaking the law in the united states

yeah and sometimes, the developers bother patching the older versions

see

Also, because MSAA can generate multiple fragments per pixel while completely discarding any information about the polygons they came from, it's fundamentally incompatible with deferred lighting.

wut

Cite the law you're talking about.

Businesses that require security of their shit pay more for fixing that shit than shady hackers pay for exclusive disclosure of it.

Computer Fraud and Abuse Act

They don't have to fucking pay, why would they have to pay for something they can find themselves?

See, also corrected you about your misunderstanding of the tech.
It needs to shade each fragment so it effectively becomes SSAA for a significant part of the workload.

Go to beyond3d or 3dcenter to read some of their old articles about AA. They're from the time before deferred lighting but you need some refresher on the fundamentals.

Because it's absurdly fucking difficult, if you think hacking is sitting behind a green-on-black console and typing shit like this guy and finding 10 vulnerabilities in a few minutes then you're a fucking retard. And yes they do sell them, more often than not they simply couldn't find any good use for the exploit they discovered. Meanwhile malicious parties that would exploit such vulnerabilities hardly ever, if at all, have any hackers on their team, much less ones that have managed to actually find anything. Cracking some DRM is child's play.

You don't have a fucking clue do you.

Good job projecting your own stupidity. Filtered.

Except there are millions of linux users all around the globe to help out with that. Your point is moot. When ubuntu added a spyware feature into their search bar, it was found out quite fast and there was a huge backlash. Multiple modified versions of ubuntu without the spyware were released very soon afterwards, and the spyware feature was eventually taken out of ubuntu. youtube.com/watch?v=CP8CNp-vksc

None of this would have happened on Windows 10. In fact, Microsoft literally tells you on their own website that they spy the fuck out of you, and nothing has happened to change that.

Also, just use a user friendly distro, like Mint, or Manjaro or Trisquel. If my grandfather can use Manjaro, so can you.

Both Mint and Manjaro are not 100% 'Free-as-in-Freedom', perhaps look into those?

I know you're going to tell me "virtualization" as the solution to the next issue, but software availability on Linux is less than 5% of all software. If I have to run a software compatibility layer, emulate Windows or run Windows in a virtual machine (all different things, but all inadequate) then it's just not worth it.

At the end of the day, I'm not paranoid enough to sit on a bed of nails and start humping a penguin as my mascot. And alternatives that aren't Linux are being made by people who don't like Windows anymore than you or I do.

Linux software makers like to "make the software work the way you want to use it," and then wonder why you don't like the non-neutral way that it works. Windows software makers like to make their software with a pay wall that activates in 10 days.

It's a shitty compromise no matter which way you go.

It was me you dumbass. I typoed some shit and made a correction.

You also seem to have precisely zero knowledge of the rendering process. Without going into much detail: deferred lighting, instead of just displaying the appropriate texture piece for every pixel that shows up on the screen after some shader processing, generates a bunch of snapshots with different kinds of information about the shit it captured. I think there were diffuse, specular and normal maps, maybe also some other shit. During light rendering, for each light source, the GPU renders a model of that light's area of effect, i.e. a sphere or a cone, you get the idea. Then it checks it against the depth buffer so only the geometry that actually gets lit generates final fragments that are then sent into "main" rendering. The final renderer picks up the previously buffered information about base texture, shininess and light deflection direction and, combined with data about the light source, shades the fragment. Now if you just mash together data from multiple polygons into a single pixel you'll simply get bullshit results, like light reflecting at weird angles, weird shade colors, some finicky reflectivity spazzing on the edges of objects and whatnot. And since fragments are discarded and only pixels are left, you can't have any more than 1 fragment per pixel. This is why deferred lighting is fundamentally incompatible with multisampling.
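
For what it's worth, here's a toy, runnable Python sketch of the two passes being described; the miniature "scene" and all the names are made up for illustration, not any real engine's API:

# toy deferred lighting: geometry pass fills a G-buffer, lighting pass shades from it
W, H = 4, 4
INF = float("inf")

# G-buffer: per-pixel depth + surface data, exactly one slot per pixel
gbuf = {(x, y): {"depth": INF, "albedo": (0, 0, 0)}
        for x in range(W) for y in range(H)}

# pretend the rasterizer produced these fragments: (pixel, depth, albedo)
fragments = [
    ((1, 1), 0.5, (255, 0, 0)),
    ((1, 1), 0.8, (0, 255, 0)),   # behind the red fragment -> discarded
    ((2, 2), 0.3, (0, 0, 255)),
]

# geometry pass: only the nearest fragment per pixel survives
for xy, depth, albedo in fragments:
    if depth < gbuf[xy]["depth"]:
        gbuf[xy].update(depth=depth, albedo=albedo)

# lighting pass: for each light, walk only the pixels its volume covers,
# depth-test against the stored geometry, then shade from the G-buffer
lights = [{"covers": [(1, 1), (2, 2), (3, 3)], "max_depth": 0.6, "intensity": 0.5}]
image = {}
for light in lights:
    for xy in light["covers"]:
        if gbuf[xy]["depth"] <= light["max_depth"]:
            image[xy] = tuple(int(c * light["intensity"]) for c in gbuf[xy]["albedo"])

print(image)   # {(1, 1): (127, 0, 0), (2, 2): (0, 0, 127)}
# the G-buffer holds exactly one fragment per pixel, so MSAA's extra
# per-pixel samples have nowhere to live; that's the incompatibility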

Aye it's also why it's fundamentally incompatible with transparency. If you need to render transparent shit, you must do it in a whole separate pass.

That's just you being stubborn though. For most people, outside of videogames, linux has a FOSS counterpart for basically all the software they use. For most people there are only a few things (like photoshop, if they really aren't content with GIMP) that they'll ever need to run through Wine or a VM.
I'm not trying to shit on you, but you're exaggerating a very minor issue.

You have direct access to color samples and related depth values since when? Was it DX10 or DX10.1? DX9 deferred lighting was incompatible with MSAA. For a very long time now that isn't the case anymore, it just lost most of its efficiency advantages over SSAA so most devs don't bother supporting it and just smear PP-AA like FXAA or something else cheap to implement like OGSSAA over the scenery.

The people who are committed enough to read the OS/kernel source code are usually committed enough to their OS/kernel to report bugs. And given the vast number of people who are doing just that, bugs and such are very frequently spotted and fixed.


This guy gets it. I've been running one of several distros as my main OS for the last year or so and I've found myself missing very little Windows-exclusive software.

Nigger, deferred lighting is STILL incompatible with MSAA because of the way it fucking works. It's like direct current is incompatible with coil transformers.

Holy shit crack open a book about computer graphics.

I have an EVGA 670 GTX and frankly, save a tweak here and there, mostly removing shitty effects which I always remove anyway, I run everything fine in 1080p (which is my native res anyway). Sure it could be more fluid, but to play what??? Console ports that are so slow they're literally made to be played at 20-25 FPS in 720p, that I run at 40-50 in 1080? Indie games?

I'll upgrade when they've made corrections after the initial launch of the 10X0 series.

But the fact is I haven't gotten any incentive to upgrade since I bought my 670… And frankly, if my config wasn't showing signs of obsolescence (as in, things not meant to last more than this many years) I wouldn't…


You guys are so retarded… the government has access to all the data collected by private firms. Worse, they HAVE to give it to them. The gov' is using them as the fucking middle men in their massive surveillance state, which would never get voted in otherwise, then goes "it's ok, it's not us doing the stalking, we just pay the stalkers to do it for us".

really i thought seam OS was going to be the gaming OS. and i would just drop windows altogether. but that shit seems to have fallen flat.

Holy shit do so yourself, the DX9 era is gone for good.

steam os*

I have a GTX470 and was planning on getting the 1070 when it came out. Dark Souls 3 is the only title that's giving me trouble and it's been a gajillion years since I upgraded the gfxxxx

I'm using Win10 Pro right now. I used the FixW10 tool immediately after installing. I didn't install this for the purposes of using it as a permanent OS, only for evaluating the claims made by people here about the OS, as well as feeling it up a bit. The only things I can really disprove are the claims of ads in your start bar and of Windows keylogging you (I checked for traffic via Wireshark). My genuine complaint about the OS is that it obfuscates user control behind a very simplistic UI and settings menu; that isn't to say that Win7-style control is non-existent, it's there. Win10 does borrow some elements from Linux, such as workspaces and the trim feature from Openbox DEs; coming from #!, I like the trimming of windows. I've also not used the Windows App Store or even been pestered to open it or interact with it in any way. I also hate the new prompt for default applications. I bought a $20 OEM key off Kinguin, which may account for the lack of ads on my system. Take from this what you will; I'm not shilling the system by any means, just sharing my experience in using it.

Also, benchmarking at 4k to prove newer cards are better than older cards is retarded.

SteamOS is okay as a dedicated gaming OS for Steam users who want to put as little effort into setup as possible. For anyone else, they're better off using something like Linux Mint or an Ubuntu flavor.

Steam OS fell flat because developers basically decided they'd rather keep making games with DirectX because Microsoft is really good at propaganda. People still think OpenGL is a shit API because of Microsoft's bullshit.

Nevermind that OpenGL got them almost twice the FPS on the same hardware for HL2, there's still no Windows implementation of the OpenGL renderer, you can't even select Vulkan/DirectX 10 or 11 in Source games at this time.

It isn't that it has failed, it's just that it's marketed at the console user. If you are used to PC, steamOS will be too bare-bones for you, but if you are coming from consoles it would be perfect.
It's going to take a long time to gain ground as it's a whole new idea, but I believe it will grow over the next couple of years, with many console users getting sick of the shit and PC prices falling to competitive levels.

Thanks for not getting the fact that I'm goddamn amazed that this thing uses less power than the 980 Ti and is still more powerful than it.

...

He's saying he was impressed, not necessarily questioning how it worked or that he couldn't understand how technology improves. Too many pseudo-/g/ faggots wanna talk down to everyone, "IF YOU DON'T AGREE WITH ME YOU JUST DON'T UNDERSTAND, LET ME EDUCATE YOU. NO YOU SAID IT IN A WAY THAT SAYS THE SAME THING I SAID, BUT NOT THE SAME WAY I SAY IT, SO I WILL TELL YOU THAT YOU'RE WRONG AND TELL YOU THE SAME THING YOU SAID IN MY OWN WAY!"

That pretty much boils down to talking to a /g/ wannabe mutant.

HEY LOOK AT THIS GUY, HE'S TRYING TO BE KNOWLEDGEABLE AND HELPFUL, WHAT A LOSER

Fair enough, I avoid /g/ and Holla Forums unless I have a very specific technical question I can't find an answer for elsewhere.

Yeah, I meant it more as amazement. I have a GTX 770 and the fucker needs two 8-pins and it overheats like crazy (it's only ever gone to like 80% power because it hits its thermal limit).

Meanwhile the 1080 uses half the power and probably delivers twice the performance.

That looks like the beginning of one of those videos where a Japanese girl is doing something seemingly normal (albeit looking cute while doing it) and then some guy comes up from behind and starts fucking her hard while she squeals and loses her shit.

I think it's a safe assumption based on their traffic that this is commonplace.

lmao

with everything absolutely maxed my 760 gets like 40fps while a 1080 would get 120+

I wouldn't be surprised if she was a tranny.

nonsense, trannies are pc gamers. She's clearly just a model.

All PC gamers are trannies?

All trannies are pc gamers but not all pc gamers are trannies
I have no idea what that user is trying to say

no, but all trannies are pc gamers, you need to have a mental illness to stick with the platform

It's actually true for those that do play but the vast majority don't play anything from my limited exposure to them. That said I have old known older trannies so I wouldn't be surprised if the younger ones are on console.

*only known

can someone confirm this?

Google mate, this has been confirmed by so many independent sources it's safe to say it's fact at this point.

Why would you have a key for colors on a chart and then violate it in the very first entry?

I'm still using a 260 GTX.

Not all the time. Ok, witcher 3 ran like shit, but it's drm free and genuinely looks good and has a non-trash open world. Waiting on that fucking Linux update.

MGSV is flawed but runs like magic. 90% of modern games run fine on my 660ti. If you keep post processing to a minimum, shadow and lighting to medium, I hit triple digit fps often. Low shadows sometimes look even better than high for certain games.

Singular reminder that jew laws are not laws.

Nvidia never changes.

Just give it time, this will be worldwide and just as legal

The law will be as legal as their religion.

Currently have an A8-5500 and a 750ti. Both have insane OCs and are garbage in everything.

38fps Skyrim

Attempting to emulate anything


What's a good upgrade?

Those numbers probably just mean that the game doesn't support SLI properly at all, so the second card doesn't even get recognized. I know this happened to a buddy of mine when Far Cry 4 came out.

...

...

holy shit.
do you really think that? can I spy on you in your house? what's the matter, got something to hide?
in the shower?
in the car?
at a funeral?
why are you annoyed? got something to hide?


fuck off

lol what, I get better fps in Skyrim and ME1 through wine on Lunix using the "garbage" AMD open source drivers


anti-semite!

...

What are your parts?

that first image makes me feel sick.
It's almost disgust in a religious sense.
Legends of demons and evil gods always gave you a gift in return for your freedom and privacy if you signed their contract up front.

The demons are here, the devils have come and we tell them we don't care what the cost is, it doesn't faze us anymore.
We've sold our soul for rented hardware

Phenom II, 12GB Kingston Value RAM, Radeon 7850

if you're on Loonix it's important to use a recent mesa version and enable DRI 3, and if you're using an AMD card, to use a recent gallium-nine-patched wine version alongside a mesa version that supports the nine state tracker
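
for the DRI 3 part, if you're on the radeon DDX it's a one-line xorg option (the file path here is just a typical choice, adjust to taste):

# e.g. /etc/X11/xorg.conf.d/20-radeon.conf
Section "Device"
    Identifier "AMD"
    Driver "radeon"
    Option "DRI" "3"
EndSection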

...

But both Intel and AMD have factories in Israel, no?
I just want my Zen and a Polaris at $250 with an actual jump in performance.


Funny enough, the bible has been telling us this for a good chunk of time, as "the things of the world" or something like that.

Not counting that a good chunk of the people running the world tend to have some kind of satanist affiliation

Because I have had shit performance in 1080p, I have paired my PC with a 900p (1440x900) 75hz monitor.

A 860k and a GTX 680/7950-70 could be had for total

user, you seem to know your shit.
Why have none of the wine performance improvements been mainlined? Especially the CSMT patchset?

sorry, I have no idea about current hardware. I'm waiting it out till openRISC becomes consumer market

10 years, it will happen!


beats me but last I heard they're working closer together now, see: wine-staging.com/news/2015-09-25-winehq-integration.html

I've still got a Phenom II sitting in its motherboard in antistatic wrap, one day I might put it in a cabinet with Linux and build an arcade cabinet out of it. But for actual gaming, the Phenom II is pretty much done for.

my Deneb runs at 3.6GHz with vcore lowered by 25mV and it will continue to do so until we ever get some CPUs without some black box security module on silicon

it's not so great for emulating CV1k games in MAME or Xenoblade Chronicles in Dolphin but normal Windows games run fine as long as the graphics driver is no issue

I had to upgrade to a newer CPU; for GTA V (which I was playing heavily with friends at the time) the CPU was a huge bottleneck and could hardly handle the game. Now I've got an "8 core" (which is really just 4 physical, 4 logical) CPU, but I made the mistake of buying AMD, so it's not as good as if I'd bought an i7 for $100 more.

CPU intensive games are coming out more and more, and yeah I know, it's all console ports, yadda yadda, but might as well keep up with the curve.

I don't regret buying this CPU as much as I did at first, but yeah I should have paid the extra for the i7.

I'm not the one defending them by buying into it retard.

Yeah, that's what I thought. It is AMD's crap CPUs.

I had a quad 760K in an old machine, but it was garbage. I don't know why I trusted AMD again. I do know that 2500Ks and their motherboards can be had for about $70. Apparently the 2500K can keep up with Intel's latest models just fine, so I might just grab an $80-90 7950-70/680 and the 2500K for my 900p monitor. Thanks for confirming my thoughts

IPC on AMD CPUs hasn't been competitive since Athlon days. Do you know the "add moar cores" meme? That's not a meme.

Overclocked 2500k is pretty gud, but if you keep it at stock clock you're probably gonna have issues with CPU intensive games.

Yeah, if I ever get into running a server I'll throw the most retarded Xeon I can get at it just to see how fucking retarded it will be.

Sauce please.

oh lookie here wine-staging.com/news/2016-05-18-release-1.9.10.html

alright, i guess that means it's ok for me to rummage through your house without your permission, and I won't do anything, as long as you have nothing to hide :^)

Tell me your SSN and address then user, you don't have anything to hide, do you?

Why bother asking him? He is already ok with it, can a mod post his IP? Thanks

A trusted company or person spying on you isn't the same as having some random creep on an imageboard spying on you

if Zen fails then we can bury AMD for good.
it's a bit silly comparing bullshitozer with i7s

Simply ebin

yes goy trust them!

That's exactly how I felt 3 months after buying it and then it came out that the Bulldozer is a fake 8-core.

...

Can't wait to watch all these new video gaymes in silky smooth 31 fps
Old games are best games.

I didn't say I did care about it, but the discussion was about DirectX 12. That's what this thread is about. DirectX 12 and NVidia's newest GPU platform.

And what do you think has changed that would have made it possible for multisampling to work with deferred lighting? Just to recap: the deferred lighting geometry pass stores a bunch of information about geometry into a buffer, with each buffer pixel corresponding to one particular polygon, so that when the shading pass comes, there is no confusion about the data and the picture is rendered correctly. If you wanted to store information about multiple polygons in one pixel, you'd need multiple buffers to do it, because one buffer only has precisely enough room to store info about one polygon. And the buffer is large enough as it is; for a 4k screen it would be 270 megabytes, so just to store 2 polygons per pixel you'd need to double the amount of memory, and double the amount of processing, since you'd need to shade both buffers now.

But not that it fucking matters, because fundamentally multisampling just generates multiple fragments and assigns them a "fraction" of a pixel that they take up, and after a fragment is rendered (to a screen or buffer, doesn't matter) the resulting data is averaged together based on what fraction of the pixel each polygon took. That works alright with forward rendering, and is even a better performing approach than supersampling, but with deferred rendering you'd only want that on the diffuse map, i.e. the bare texture; there's also going to be a metallicity map and a smoothness map, and most importantly a normal map, for which you really don't want results simply averaged between competing polygons, you want to store individual data from each polygon to shade them correctly. MSAA discards all of that data, and the edges of models, where there's more than 1 polygon per pixel, will have data in those buffers that matches neither of the polygons on that pixel and will look wrong. Same reason why transparency doesn't work with deferred lighting: you can blend the bare texture from multiple transparent polygons, but you'll want to keep normal maps and shit separate for each of them.

Oh and, that's exactly how it works in both DX11 and DX12. You don't just magically sample textures and shit from polygons; you rasterize them the same way as you did 20 years ago. It's just that by now hardware has become flexible enough to do all sorts of shit with it, rather than just asking it nicely to render your 3.5 polygons with that particular texture on them.

You mean 980 Ti SLI.

And how do we know they aren't just not bothering with SLI support on DX12 yet? I mean would 1080 SLI act as intended on Tomb Raider DX12 or would it give no real boost much like the 980 Ti in SLI? Sounds to me more like DX12 (or at least just Tomb Raider in DX12 mode) doesn't support SLI properly yet. I haven't seen any GTX 1080 SLI DX12 benchmarks yet.

In case you're confused, this is what deferred lighting buffer looks like if you decompose it by purpose. This one has position map, normal map, specular map and diffuse map (called "albedo" for some fucking reason). Since it has 10 channels, it takes 40 bytes per pixel.
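
Putting numbers on that layout (a quick Python sketch; it assumes one 32-bit float per channel, which is what the 40 bytes/pixel figure implies, and real engines pack channels into smaller formats, so exact totals vary):

# size of the G-buffer decomposition described above
channels = {"position": 3, "normal": 3, "specular": 1, "albedo": 3}
assert sum(channels.values()) == 10

bytes_per_pixel = 4 * sum(channels.values())   # 4 bytes per float channel -> 40
w, h = 3840, 2160                              # 4K
total_mib = w * h * bytes_per_pixel / 2**20

print(bytes_per_pixel, round(total_mib))       # 40 bytes/pixel, ~316 MiB per copy
# storing a second polygon per pixel doubles both the memory and the number
# of buffers the shading pass has to run over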

Keep in mind that the benchmark is for 4K. The 1080 has 8gb of RAM, while the 980 TI has 6gb of RAM. 4K gaming requires more RAM, so the 1080 has a significant advantage, and the benchmark (once again) is for 4K.

The 980 Ti has to flush its memory more often because of the 2gb difference, so the 1080 already has an advantage. Then there's the technological hardware implementation of features on the 1080. The 1080 has an advantage in that it runs cooler, is faster, has more up-to-date hardware implementations and has more RAM.

The 980 Ti in SLI doesn't provide as significant an advantage over the 980 Ti on its own as you'd expect. But it hits equal pacing with the Titan X, which has a very large memory pool to work with.

The 1080's improvements put it ahead of both. I'd like to see the same benchmark with the TESLA on the table though.

So what should be a good upgrade from a GTX 260? I'm looking at nvidia cards and under a hundred bucks.

gtx 670 off of eBay

a 750ti if you go for brand new stuff, you could probably get a 680/770 for just as cheap from people who upgrade to a 1080/1070.

I've wanted to get into bitcoin mining for a really long time, but I didn't have the stuff for it because I was, and still am a poorfag. What are the programs you should use to start, and how are these coins traded?

What kind of hardware does it take to start mining? I've got an R9-390x and an FX-8350 with only 8 gigs/ram, so if it's CPU dependent, it looks like I'm going to continue being a poorfag. Or set up a fan right next to my open-air computer. To be honest, I'm not using the computer for anything other than terrible videogames and shitposting, so it wouldn't be a terrible loss if I used it as a dedicated mining rig.

Pretty sure that at this point, unless you have an ASIC farm or a large botnet, mining is a waste of time

I guess I can still ask Holla Forums if they know anything about it, but they might hang me for not knowing what I'm talking about.
I also ctrl-f'd the catalogue and there are no threads on it, so looks like I'm not learning about mining anytime soon. Guess it's time to check to see if there's a board for this.

I hope not. even if it takes a full week or even a month just to get a single bitcoin, it would be worth it, because at least then I would have some form of income, which I would probably spend on better mining gear, so I could get more coins in less time.

Look, friend, we're really approaching peak butthurt here. You're wrong. Not just in theory; here's a working example of what you say is impossible. I appreciate the time you've spent describing the tech as you understand it, but you're out of touch.
Learn how to sample the pre-resolve buffers instead of wasting your time with damage control. It's not shameful to be wrong; it's shameful not to yield to the truth.

forgot pic

while your gpu is half decent entry level, your cpu is utter trash, even by AMD standards