Official Statement from AMD on the PCI-Express Overcurrent Issue

archive.is/3JrZg


Get it together AMD

Other urls found in this thread:

please
archive.is/Xqzc4
ixbtlabs.com/articles/atitruform/
en.wikipedia.org/wiki/Unified_shader_model
en.wikipedia.org/wiki/Graphics_Core_Next#Compute_Units
twitter.com/NSFWRedditImage

How high did the power consumption reach in some cases?

I'm beginning to think AMD fucked up on purpose.

Enough to fry the PCI-E slot apparently.

And to everyone in the thread:
Power limits on certain mobos caused the PCI-E issues. If anyone bought the RX 480, disable power limiting in your BIOS.

...

180 W stock, 200+ W with an OC.

So AMD and NVidia are both fucked.

Who should I be looking at for a graphics card?

Looks like it is time for Intel to start making graphics cards as well.

What?


AMD has been fucked for the last 10 years, if it didn't stop you buying their GPUs before, nothing has changed.


Please no, last thing we need is an even worse Jew.

Nvidia has:

More like underclock and gimp yourself out of what would have been a decent card.

Sit tight for a few months and wait for people to sort their stuff out.

Nvidia pulls some insanely unethical stuff like intentionally crippling games for non nvidia users.

Funny enough, they used to make GPUs but got rekt and dropped out, sticking to CPUs/APUs only, even though AMD is far better when it comes to laptop APUs.

Didn't they release a card with less memory than advertised? And then proceeded not to give a shit.

Hopefully they fix the problem, then, and the fix doesn't affect the card's performance.

Reboot into the BIOS. Depending on your motherboard, one of the F keys will do it when your computer boots. Google your mobo or find its manual.

God damn. I really want to like AMD since they have the best looking cards, but fuck. Get your shit together AMD.

Don't forget the 224-bit-illion.

The 3.5 Crisis

Cards advertised with 4 GB actually had only 3.5 GB of memory available at full speed; the last 0.5 GB sat on a slower partition, so whenever the card needed it, performance would slow to a fucking crawl and fuck everything up.

The fix is extremely simple: put a fucking 8-pin connector on the board instead of a shitty 6-pin one, and then the card can draw up to 225 W without any problems.
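The arithmetic behind that 225 W figure comes from the PCIe spec's rated limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). A minimal sketch of the budget math, with the connector names as illustrative labels:

```python
# Rated power limits from the PCIe spec: the slot itself, a 6-pin
# auxiliary connector, and an 8-pin auxiliary connector.
CONNECTOR_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Sum the rated limits of the slot plus any auxiliary connectors."""
    return sum(CONNECTOR_LIMITS_W[c] for c in connectors)

# Reference RX 480: slot + 6-pin gives a 150 W budget, so a ~165 W
# draw has to overdraw somewhere; an 8-pin raises the budget to 225 W.
print(board_power_budget(["slot", "6-pin"]))  # 150
print(board_power_budget(["slot", "8-pin"]))  # 225
```

This is why the thread keeps coming back to the 8-pin: the card's total draw doesn't change, but the overdraw moves off the motherboard slot and onto the PSU cable, which is rated for it.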

You're fucked either way.

But that would raise production costs by $0.05.

Oy vey goyim don't you shoah me out of my shekels.

I really wish there was some dark horse company that would suddenly start making GPUs, just to fuck with Nvidia and AMD's sorry excuse for a "competition".

Actually, they picked 6-pin because they feared people would see 8-pin and think it was power hungry, something Nvidia would've pounced on.

Also, why the blue FUCK are you buying reference? Anyone who buys reference cards deserves exactly what they get. Seriously, what the fuck?

Why do you hate them?

So my 660 is starting to show signs of obsolescence.

I was thinking of upgrading to a 1070, either MSI or Gigabyte, but I'm holding out for better prices. This ATI vs. NVIDIA false competition is really getting on my nerves.

I'm waiting on a price drop for the 900 series myself. Like that will ever fucking happen.

So basically a hard DRM. Like a console.

A lot of people are sick of both Nvidia's AND AMD's bullshit. It's not some one-sided hatred here.

Well, can't argue with those dubs.

Let's add their complete refusal to release specs and software for their mobile hardware, rendering it completely obsolete years early, or locking 3D vision to nvidia-branded hardware only, removing even the option to use it on generic monitors.

Don't forget the Freesync/g-sync bullshit.

Man, just saying Gameworks doesn't do it justice.
AMD are incompetent, but Nvidia are evil. I worry for the day Nvidia has no competition. I can see it easily becoming a telecom-style issue.

Is AMD the place where first year engineers go after they get their degree or something? I would think that the mistakes AMD makes would indicate a few rookie players on their team.

How do you know Nvidia are bribing devs?

I'm pretty sure most cards pull more than the recommended PCIe power.

I think even more so with all the stock-OC'd cards out there nowadays.

The only worry is for people with old mobos that can't handle it, aka really poor poorfags, aka the people AMD is targeting with this card.

They pay them off to add in Gameworks, which is their hair-tessellation shit. Look at The Witcher 3 and a lot of newer games. They often have Gameworks, which takes heavy advantage of tessellation for hair, water and other shit.

please use archive.is/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/

On top of that, older Nvidia cards and AMD cards can't handle it as well. This tessellation shit is still pretty new. So unless you've got a new card, you're SOL.

And because they're adding in Gameworks, they often work closely with the developers, whereas AMD has to wait until the game ships. Just look at Fallout 4: it took AMD what, like three weeks to update drivers for it? Granted, Fallout 4 is shit so whatever, but it's still a disturbing precedent that Nvidia can do this.

Don't get me wrong, I'm not an AMD fanboy. I have an AMD card, but I largely think they're oftentimes shit, and I'll most likely upgrade to Nvidia later this year. But I worry about where gaming is going if AMD goes bankrupt and Nvidia is the only dog in the show. I'm getting real Comcast feels, and that worries me.

Most cards do not stress the PCIe slot's power delivery, because they have a large margin thanks to adequate additional power connectors.

archive.is/Xqzc4
Bah, archived link. Hard to remember every site that's been wordfiltered.

Do you think this issue will hit the 470 too? Since I'm waiting on that for my new cheapest-of-cheap rig, I don't want it to fry everything, since it's all new.

It would not be that hard to get into the enthusiast graphics card market. Not quite "Kickstarter possible", but certainly doable with VC money. The reason nobody is doing it is that the market is tiny, and there is no real profit in it.

Nvidia is running entirely on business momentum and its quasi monopoly. ATI's corpse only keeps belching up products, because they can leech the technology from AMD's APU team. Which, in turn, survives purely on AMD's console contracts.

Probably not, unless they're stupid enough not to put a 6-pin connector on the default design.

They already have no competition.

You can play Quake on it

AMDs existence means they cant rest too easy. Though with them releasing vulkan to everyone and from what i've heard about em almost being bankrupt, you're not really wrong. When AMD is completely gone though, guaranteed you're gonna see some real jewery.

Come on, AMD and Intel lie in the same bed, splitting the normalfags between themselves. I just wish there were some Chinese knock-off cards.

It's literally fucking nothing. They said it's fixable with just a BIOS update.

If AMD went down, the jews behind Intel and Nvidia would found other fake competitors. They need some form of proforma competition to keep pretending they are not monopolists in their markets. Otherwise, they would risk having to deal with a bunch of unfavorable laws and regulations.

Somehow I doubt that. If they're anything like the telecom industry, they've most likely already lobbied for special privileges in that market, as well as to be above anything that could happen to them.


You and me both. China doesn't give a fuck either, so they'd probably repurpose Nvidia's tech while making it slightly shittier. Granted, if it's not good enough it might fry your board, but it's not like everything Chinese is shit. And competition is always good.

Dude, forget the fucking 470. What you should get is either the 1060 or the RX 490.

Well, it looks like it DOES need the 8-pin connector, because it IS power hungry, like every other fucking AMD card in existence.

Please leave Mr. Shekelberg.

That does not work everywhere. Chinese and EU regulators would laugh about the sort of corruption the jews can get away with in places like the US or Africa. They only offer special privileges to companies who have a significant effect on their GDP or employment figures.

Your wording implies AMD is already a fake competitor? Considering AMD is on its way to bankruptcy and soon Nvidia might be the only company around, what do you think will happen? Also, do you believe AMD is just "controlled opposition" so there's a cheaper option than Nvidia cards?

BUT IT'S OKAY WHEN AMD DOES IT

Let's ignore the whole patents and crosslicensing stuff for a moment and look at the only part that isn't buried under legalese:
Game coding is trash. Seriously. This isn't Holla Forums saying what Holla Forums says; this is completely serious. The only reason most games work at all is that AMD's and nVidia's driver teams unfuck well-established practices like treating APIs as a loose suggestion of how you could communicate with the HW. Having the legalese BS out of the way doesn't help you if you can't jump over the hurdle of bugfixing the bazillion fuckups of other people. Ask IMGtech. Ask the guys working on WINE. It's a clusterfuck of epic proportions.

Also, lower-quality components can fry, since AMD is going over the tolerance level by quite a lot.
I was really looking at AMD again, but this whole 480 thing is a fiasco.

How is it even possible to notice something like this only after release?

The 1060 will run circles around the RX 480, and the RX 490 will have an 8-pin to cover the power problems the RX 480 has. You cannot OC the RX 480.

I hate the ignorance that gets thrown around, because it's bullshit. AMD doesn't lean on hardware shaders; they build cards with higher brute-force rendering performance and expect software rendering to get them through the process. Well, guess what? It's because of this that ever since Doom 3, AMD has been losing consistently. Hardware shaders are an important part of development, and relying on brute-force rendering is fucking retarded, because you have to convert those shader programs into rendering operations on the die, which means using up more of the GPU's time to render something, whereas a dedicated shader processor could do in 5 ms an operation that might take 40 ms on the die. Seems like not much, until you remember that we're doing this operation as many as 200, 144, 120, or 60 times a second. That number adds up.
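To put the 40 ms vs. 5 ms claim above in perspective, here's a minimal sketch of the frame-budget arithmetic (the frame rates are the ones named in the post; everything else is just the definition of milliseconds-per-frame):

```python
# How many milliseconds each frame gets at a target frame rate.
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 120, 144, 200):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):6.2f} ms per frame")
```

A 40 ms operation blows through even the 60 fps budget (~16.67 ms per frame), while a 5 ms one fits comfortably, which is the post's point about doing the work on dedicated shader hardware instead.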

Tessellation is the new black; it's a good technology. What you want is for developers to hold back game development the same way consoles do, to give a handicap to AMD, because AMD doesn't want to redesign an architecture THAT THEY KNOW DOESN'T WORK.

I've tried developing games on an AMD card, it's a fucking nightmare to see how many things I have to disable to get competitive performance, and that's not with GameWorks, that's just using OpenGL shader language.

AMD sucks because AMD sucks, not because NVidia is a mustache twirling Hitler in disguise. Stop being so fucking ignorant.

But the 470 is cheaper and likely perfectly adequate for 1080p 60fps on a budget.

Reminder that aftermarket 480s will be out soon with 8-pins and better fan designs. Anybody who bought a reference design is not just a double, but a triple retard.

I found it rather strange that AMD bought ATI, and kept producing discrete graphics cards. I found it outright suspicious that Intel and Nvidia did not revoke ATI's licenses for their technology. Which would have been both their right, and in their best interest.


You would not have to bother with any of that. There are still plenty of OEM, embedded, and mobile manufacturers around who would, and do, license their tech for pennies. And there is absolutely no reason why you could not produce cards that actually fulfill established standards.

Yeah, aftermarket is the only good kind of video card on either side. I've never met anyone who owned a reference card. Not only are they flimsier and have lower clock speeds, they're also more expensive, and I never understood why the reference card costs so much more. You'd think it would be the other way around.

It's funny because there have been no confirmed examples of an RX 480 "frying" any motherboards - keep in mind that THOUSANDS of RX 480 units flew off the shelves in 4GB and 8GB variants, with some units being shipped with an OC (I believe XFX and another AIB?)

Literally the only game that sees the RX 480 draw more than 75 W from the PCIe slot is Metro Last Light at 4K resolution, and even there the average is 80 W from the slot.

There have been quite a few other cards that pulled over 75 W from the PCIe slot - the GTX 750 Ti, GTX 950 (Asus edition with no 6-pin) and the GTX 960 Asus STRIX have all done it, in the case of the GTX 960 STRIX upwards of 250-FUCKING-WATTS.

Even so, this is no excuse for AMD, because if AMD shits the bed, everyone loses their shit.
They have to have better quality control than this.

RX 480 AIBs will probably be closer to R9 Nano/GTX 980 levels of performance.

Because it's a matter of competition. Instead of trying to get a handicap, move the goalposts, or take away ATI's ability to use their technology, they continue to beat AMD just by making better products. I'm sorry, it's just fact. It used to be that AMD's processors were always 20% or more better, and back before shader languages and hardware shaders, ATI video cards were so much better. But AMD still makes products as if it's 2002, and it doesn't work that way anymore.

It doesn't help that the leader of AMD is a pennypinching gook, so any chance of actually putting some money into redesigning the architecture is an improbability.

I don't know about the GPU side, but on the CPU side, the one reason Intel cannot put AMD out of business by ceasing to license x86 to them is that AMD actually owns the x86-64 extension to the instruction set and licenses it back to Intel.

They're trying to prevent the problem from happening before it happens, man. Do you have to wait for the lightning to strike you before you say "there's a storm out here, I should get indoors so I don't get hit by lightning again?"

Nvidia works closely with developers and gives them kickbacks and help to implement Gameworks - a feature that works well only with newer Nvidia cards, to the detriment of their old customers and AMD customers. Reconcile that. If this wasn't to corner a market, why bribe and help developers to get Gameworks in and optimize the game quickest?


Hm, I should look into that more. Could be interesting to find out more about.

I don't think anybody does anything to keep AMD down. But I am unsure if they would still be around if they had any real potential to change the market.

I didn't say there wasn't a problem, because obviously there is.
But what I am saying is that people have turned a molehill into a mountain.

It is an issue, and AMD was fucking retarded for not just including an 8-pin by default and just leaving their RX 470 with a 6-pin.

At this point I doubt they'll be around much longer, unless the deals they have for consoles keep them afloat.

If the GTX 960 is your idea of a "good card" then of course, but don't expect even GTX 970-tier performance.

Yes, it's like hardware support for new technology only shows up after the technology becomes standard. Did you miss that part? You can't add hardware acceleration for something that came into existence and got standardized after the reference design was completed.

That would mean releasing versioned video card references, which would both kill end-users' wallets and weaken third-party manufacturers' ability to move product.

Adding tessellation to a 780 and re-releasing it after the 900 series already exists is kind of silly when you think about it: once they start mass-producing the 900 series, they've ceased production on the 700 series and are just moving old stock. Now they've started mass-producing the 1000 series, so the 900 series is as good as done. They're not going to revise the hardware.

Games using tessellation more isn't a conspiracy; it's a forward-looking move. If you want PC to keep having the best graphics, it's a bit absurd to suddenly say "NO, YOU SHOULD NOT USE THAT BECAUSE THE OTHER GUYS DON'T DO IT AS WELL!" - that's like consoles vs. PC.

It's not "good", it gets the job done. Sure you can't play Witcher 3 on Ultra, 4K, 120fps but it's a mid-range card so who gives a shit.

This is the same shit that's been said for literally more than half a decade.
It's not going to happen. In fact, AMD is in a better position now than they have been in years.
Their stock is on a steady rise, they have released compelling products, their marketing is finally not as shit, and their market share is growing.
The RX 480 sold as many units in TWO DAYS as the GTX 1080/1070 sold at launch.

If anything, you have it backwards - Nvidia are the ones being forced out of the GPU market.
All console games run on AMD hardware - so whose hardware are those games going to be optimized for? Whose hardware are the console APIs going to be developed to make use of? AMD's.
And we are already seeing this today with DX12 and Vulkan.

There's no way Nvidia can beat a similar AMD product in DX12, because the API was designed around AMD's hardware - specifically asynchronous compute, which AMD has dedicated hardware for in all their GPUs in the form of ACE units.

All AMD has to do at this point is make decent products; Nvidia is fucked from here on out.

You don't have the slightest clue what you're talking about. Many IHVs can make good HW; it's the SW side that's the problem. IMGtech is in the best possible position as far as HW and legalese are concerned, and even they couldn't provide drivers when Intel licensed their GPUs - Intel had to write the drivers themselves, and it showed. You can't walk into this clusterfuck and emerge victorious.

fuck, I thought I found my next graphics card

Oh yes, it's completely crazy to think a company is bribing devs to add in this one feature. There's a whole host of other features that improve graphics, but this one has to be everywhere: inside lakes where you can't see it, on the sides of small objects, on strands of hair, etc., and miraculously it only works well on your cards. How convenient that this one small feature, which slows down not only the competition but your older customers, forcing everyone to upgrade to you, is everywhere in the games with your logo on them. Oh, but now we see in the news that you've been giving kickbacks and preferential treatment to whoever installs your Gameworks and tessellates more than necessary. What a coincidence that this one feature, which helps only you and provides a marginal increase in quality at the expense of massive performance loss for everyone but you, is in these games!

This sounds like a Linux fag heralding the death of Windows when someone releases an article about the "broader acceptance of Linux."


And then 10 years later, Linux is still obscure and Windows is still the standard. AMD users upgrading old video cards to a cheap new one isn't a boom of new customers; it's just people updating their PCs. It's not going to force NVidia out of the market any more than the Tesla forced AMD out of the developer-card market.

Read the thread, fuccboi. Non-reference cards will have 8-pin power connectors and better coolers, making this problem absolutely nothing. Reference cards from both sides are trash anyway.

M-MUH AMD
YOU'RE WRONG SHILL CUCK FAG
AMD IS PERFECT BECAUSE I SAID SO
NVIDIA IS EVIL INCARNATE RUN BY SATAN

Shit man, I don't feel compelled to explain anything and there's no reason I should.
But just watch when the Vega GPUs and the 1080 Ti come out - Vega will put the beatdown on the 1080 Ti and the Titan, and it'll stay like that as long as AMD controls console hardware.

Frankly, I just want them on even footing. With healthy competition I can get the best price from both and save money. I'd love it if more companies would spring up too to diversify the market.

Try using paragraphs, mate. Split your bullshit up.


If you mean that one speculative "article" on a site where users can submit articles, then sure? I've yet to see a source saying "NVIDIA PAID ME TO DO X TO FUCK OVER AMD," credible or not. Not only is it slander, it's as ridiculous as it sounds. You can get sued for saying stupid things you can't prove, but if you're a pussy and merely "speculate" that someone's doing this, somehow that's okay to say. And yet it's just a fucking clickbait article: no official sources, no proof, nobody from NVidia or an anonymous employee saying "I WAS THERE WHEN THEY EXCHANGED MONEY!"

Prove this bullshit, or shut the fuck up once and for all, and don't link me another goddamn speculative article. Unless Valve says "NVIDIA PAID US TO GIMP AMD CARDS IN DOTURD 2" or Blizzard says "NVIDIA PAID US TO ADD TESSELLATION TO STUFF IN OVERWATCH," just shut the fuck up.


What I'm saying is that doom and gloom is stupid. Don't do it. AMD is doing just fine; people love to make up the story that AMD is on the verge of bankruptcy because they love to think their GPU is the underdog. And while it is the underdog, it's not because of a financial situation - it's because they're not adding shader support to their hardware.

AMD will continue to be the cheapo cards, and it's because of that they're not going to die out. NVidia will continue to be the expensive but more capable cards, and it's because of enthusiasts that they're not going to die out.

The belief that Intel is going to kill AMD, or that NVidia is going to kill AMD, is silly.

We already know the lake in The Witcher 3 was tessellated to hell; we've seen games with lots of unnecessary tessellation - hell, even the shitheap Fallout 4 had it - and what do all these games have in common? Nvidia Gameworks. You ask for proof as if some developer will come out and get sued for breaking his NDA with Nvidia after taking their money. It's not hard to extrapolate, though, that when a developer starts adding unnecessary amounts of your new feature, which only works well with your new cards and no one else's, something might be up. Hell, why don't you explain why a lake you can't see inside, and wouldn't notice the difference in, has tessellation out the ass? Go ahead, explain all the unnecessary tessellation developers suddenly add once they team up with Nvidia.

You seem awfully angry over this, user. Attacking others doesn't make your opinion any more valid. It just makes you come off as an asshole.

Can't play Doom either. That game has a good level editor.

Good thing this is an anonymous image board where your opinion of me does not matter. For all intents and purposes, yes, I am an asshole about this, because I see this shit get thrown around every single fucking day by buyer's-remorse AMD fags who so badly want AMD to dominate NVidia in a generation of games again, and it doesn't happen. Why doesn't it happen? Because AMD refuses to change their architecture.

You keep citing The Witcher 3 as proof of an NVidia conspiracy, and it's not. Can you find me documentation in Gameworks showing me the NV_FuckOverAMD() function call? I don't think it exists.

If you refuse to listen to what I tell you, and continue to repeat yourself, then you're never going to understand.

C'mon.

I think the idea is to use the "spawners" and not so much individual demons.

I want my night of 100,000 revenants, man.

Interesting? Are these niggas fucking supervillains?

Won't happen with the modern game industry, though.

Explain the tessellation in the lake then.

Also, if the only reason you're not an asshole is that there could be repercussions in real life, I pity you.

Okay, first I'll ask, do you understand what tessellation is? Because if you don't, then I'm going to be wasting my time.


Well, it could mean that the AMD card might not be able to reach its full load, because it'll stop long before then to prevent the power from surging the slot.


Like I said man, call me an asshole, I don't care. I know I'm not. But I know why certain things work the way they do, because I work with these things. I'm not just spouting some NVidia fanboy faggotry trying to keep the AMD brotherman down.

Intel iGPUs

Intel has been making their own GPUs for a long time, I don't think I've ever seen anyone say anything like the Intel GPU is going to be the magic bullet. It's usually derided as a half-assed solution for a problem that doesn't exist.

Or buy a nonshit card. Like Nvidia.

Nice non-answer. Are you going to explain why a lake you can't see inside of is tessellated to hell, or why a single feature that works well only for Nvidia is being overused to a significant degree by anyone who works with Nvidia and gets Gameworks put in?

Really?

Okay, I'll tell you. You would tessellate a lake because of sand. You know that area of Half-Life where the jet goes screeching by and you're on the cliff face? And you look down and you're like, "what the fuck is this texture? It looks so wrong and out of place…" - that's why you tessellate. Natural surfaces.

You would tessellate a lake bed so that the bottom of the lake actually looks like the bottom of a lake, and not like a fucking flat texture. Tessellation is a repetition of the same shapes: you make a nice-looking rock-bed texture in memory, then tessellate it all over the place, rotating as you go, and it actually ends up looking like a realistic lake bed.

You could render the cliff-side in Half-Life 1 using tessellation and it would actually look like a rocky cliff face and not like someone just painted a cheap ass texture onto geometry. Do you understand?
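The geometric idea behind the explanation above can be sketched with one of the simplest tessellation schemes, midpoint subdivision: each pass splits every triangle into four, giving the surface finer geometry to displace. This is a toy illustration, not how GPU tessellation units actually work internally:

```python
# Toy midpoint-subdivision tessellation: each level splits every
# triangle into 4 smaller ones by connecting the edge midpoints.

def midpoint(a, b):
    """Component-wise midpoint of two vertices."""
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

def subdivide(tri):
    """Split one triangle (3 vertices) into 4 smaller triangles."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    """Apply `levels` passes of subdivision to a list of triangles."""
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

base = [((0, 0), (1, 0), (0, 1))]
print(len(tessellate(base, 3)))  # 64: triangle count grows 4x per level
```

The 4x-per-level growth is also why over-tessellating things nobody sees (the thread's lake-bed complaint) gets expensive so quickly.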

Right, the lake bed you never see? Tessellating things I never see but that still affect performance? For what purpose?

How do you know it's there if you've never seen it?

Well, for one thing there's the whole issue of transparency, and for another, no culling is used. It's a bad implementation by the developers, not NVidia out to get your GPU cycles.

Just like how Minecraft could get decent performance if it properly implemented backface culling and did its shadow calculations on the GPU instead of the CPU.
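The backface culling mentioned above is a cheap test: in 2D screen space, a triangle's winding order tells you which way it faces, so back-facing ones can be skipped before rasterization. A minimal sketch (the winding convention here is an assumption; engines pick their own):

```python
# Backface culling via winding order: with counter-clockwise defined
# as front-facing, a clockwise (negative signed area) triangle faces
# away from the viewer and can be skipped.

def signed_area(a, b, c):
    """Signed area of triangle abc in 2D screen space."""
    return 0.5 * ((b[0] - a[0]) * (c[1] - a[1])
                  - (c[0] - a[0]) * (b[1] - a[1]))

def is_front_facing(tri):
    return signed_area(*tri) > 0  # counter-clockwise winding

tris = [((0, 0), (1, 0), (0, 1)),   # CCW: front-facing, keep
        ((0, 0), (0, 1), (1, 0))]   # CW: back-facing, cull
visible = [t for t in tris if is_front_facing(t)]
print(len(visible))  # 1
```

For a closed mesh this discards roughly half the triangles before any shading happens, which is the performance win the post is pointing at.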

That might make sense when tessellating objects within the current grid/environment, but why do this for a large portion of the game's environment that will never be seen?

Nvidia fanboys who actually want an Intel/Nvidia monopoly are going to look back one day - when PC hardware development for hobbyists and builders has completely stagnated, the open-source community has been dumped and told to just use Windows: Spyware Edition to play games, and prices have gone through the roof - and ask themselves if it was all worth it. All the while, the open-source community gets an even colder shoulder than Nvidia and Intel have always given it.

I've never bought a single AMD card in my life; maybe something from the ATI era comparable to a pre-6800 Nvidia card. I'VE NEVER READ MORE BULLSHIT THAN THIS CLAIM.

Of course, it's the devs' fault. It's not like this is a recurring theme in games that add in Gameworks.


There was an article a while back where it was found out. Lots of pissed-off people on Holla Forums about it, as well as about the graphics downgrade.


It's why I want more competition. I'd love it if there were more than just AMD and Nvidia as options.

we're fucked.

Kikes in the grave?

One would hope that anti-monopoly laws would kick in, but then again, I look at MS after they went through that circus and figure it would do fuck-all.

And considering just how big the overhead to enter the GPU market is, there is next to no chance of someone rising up to challenge nVidia before nVidia buys them out.

America is literally the worst western country ever existed

The US is a cancer.

"But it's faaaaaake!" [insert decade-old memes about woodscrews & Fermi]
Anyone else tired of this sonygger-like behavior from AMDfags?

It goes without saying, but has to be said anyway because "SHILL SHILL!!" will be screamed otherwise: Nvidia ain't perfect, and they engage in plenty of aggressive business tactics that aren't helpful, but lately it's AMD dropping the ball and making inefficient, power-hungry, heat-spewing, Fermi-like cards. It pisses me off that AMD isn't doing better, because it leaves Nvidia little or no reason to lower prices across the board, and the entire market suffers. Because of this, nobody can get a really good card capable of the newer, higher resolutions for some decent goddamn dollarydoos. I'm tired of all the drones defending such poor competition. The notion that "AMD did nothing wrong" is bullshit shill-speak. Stop falling for it and stop regurgitating it.

I'm not sure I understand what this guy is saying. Is he saying that it's not legitimate?

I think everyone wanted to see AMD make a good card, not a mobo-frying 2014-tier card.

You know what the worst part is? ATI had hardware tessellation 15 goddamn years ago. The Unreal engine had an option for enabling it, but I don't know if it was ever used.

ixbtlabs.com/articles/atitruform/

Oh I was just making a jab at the video. The guy just keeps yelling IT'S FAKE, and I thought it was a bit amusing.

That's Terrible Tim. He's frustrated because people still think Sandy Hook wasn't a false flag & really happened. I can somewhat sympathize, since the people behind it are still taunting the non-believer "conspiracy theorists" by inviting the supposedly-dead victims to the Super Bowl. The groups that do this manipulative shit get off on openly taunting everyone afterwards, like Urban Moving Systems & the dancing Israelis.

Different computer configurations - not everybody has the same build. This problem didn't happen to everyone.

Niggers were a mistake.

That is besides the point. If we let nature run its course they'd starve to death and we'd be rid of them.

He even has Hitler's mustache. Reverse Hitler?

...

Hrm, I didn't mean to put the > in there.

Let's not insult a great man by comparing him to a senile silverback.

The fuck is he saying?

He's talking about dog breeds and racemixing. He asks what happens if you mix a Dachshund with a German Shepherd. The woman replies that you get a Shepherd Dachshund. Hitler then asks what happens if two of those Shepherd Dachshunds breed, do you get a German Shepherd out of it? The woman says no.

You really need to watch the movie; it's fucking hilarious, though I don't know how well it translates Hitler's monologues into English. The title is "Er ist wieder da" (Look Who's Back).

He's talking to her about how mixing dogs is wrong because you end up with strange-looking mixed breeds. As a dog breeder she knows this is true and can't argue with him, but she gets uncomfortable because she knows he's trying to get her to say racemixing is bad.

Typical AMD. They don't give a fuck if your PC catches fire and burns your house down.

N-nuh uh! AMD is the good guys, just trying to make obviously superior cards for less money, because, uh… Gameworks is evil!

My gtx780 does a great job of heating the room in the winter.

Hitler based as fuck.

en.wikipedia.org/wiki/Unified_shader_model
en.wikipedia.org/wiki/Graphics_Core_Next#Compute_Units

Please explain how AMD doesn't have hardware shaders.

AMD will never reach Nvidia's energy efficiency, even at 14 nm versus Nvidia's 16 nm. Is the dream over? But I suppose you can at least use it to cook your breakfast.

The movie is funny but also very sad. The more he goes up and talks to average Germans - and remember, most of the encounters with random strangers were real; just read how the movie was made - the more it comes to the surface, as they open their hearts to the actor as if they wanted to hug him or desired his return, that Germans don't hate Adolf because he's evil-racist-nazi-muh-6-gorillion. They hate him because he lost, or because they feel they failed him.

That's about as delusional as anything I've seen here. Intel and AMD couldn't do it with their billions; how is some VC startup going to do it easily?

I find the movie pathetic, because it equates people protesting the kind of fundamentalist Muslims who are wreaking havoc through Europe while being bankrolled by the Gulf states with Nazis.

At one point an old guy says he has no problem with Muslims, but he doesn't like Salafists because they tell him how to live and what to think. How is that being a fucking Nazi? If anything, it's the Salafists who are pretty fucking similar to Nazis.

That's not… man, this isn't some M.C. Escher shit or your grade 8 maths class. Fucking hell.

That is NOT what tessellation is for. lmfao

LOL

Matrox

NVIDIA Gameworks will do bullshit like over-tessellating wolf fur or rendering non-visible underground tessellated water, purely to make AMD cards slow down and look bad in benchmarks.

I fucking forgot about that Crysis shit. A concrete block had hundreds of thousands more triangles than needed.

What else than having their solution be proprietary did jewidia do?

No, it's not. But between the cancer that is Nvidia and the significantly less cancerous cancer of AMD, AMD is by far the lesser of two evils.

Go back to PCMasterRace on Reddit or 4chan, kid; only those morons are dumb enough to support a company with business practices as cancerous as Nvidia's. Also, you're probably so stupid you didn't even notice that no game yet fully utilizes even half of these "high-end cards'" performance.
A fool and his money are soon parted.

OY GEVALT THAT WOULD BE ANUDA SHOAH