Vega

That's four thousand GCN cores bitch

>required to purchase with a (((bundle))) including two shit games and a mobo/cpu combo you don't want since a 7700k crushes it
Pretty disgusting, tbh.

Yeah, I'm sticking with my 1070, thanks.

On their own, the new products seem pretty solid and welcome competition against the others, but the bundle shit (because muh miners) essentially destroys any plans for people who can't afford to throw that much cash at them.
Also that power draw…

Why would you even buy anything newer?
The only games that need new video cards are unoptimized messes like Fallout 4

Just you wait! TESVI is going to be bigger than the resurrection of Jesus Christ himself! If you want to witness such a marvel, you'll need to upgrade to current year hardware.

Have this, brother.

You can run all these games at low and they'll still look good either way
Every modern game still looks amazing at low settings

I mean shit if I didn't need a video card for other shit I'd just use an integrated one and it would probably still work perfectly


noice

But how nice do butts look with it?

This, AMD is garbage. Their approach to everything is just to throw brute force at it. No efficiency, no clever engineering; anything that doesn't utilize multi-core to its maximum potential is guaranteed to run worse on an AMD anything.

Bought a 144Hz 1440p FreeSync screen last year because people had told me the 480 would be a high-end card for a low price. Once I found out that it wouldn't be able to play most games at 120fps at 1440p, I decided to wait for Vega, and now Vega turns out to be total trash.

Let's face it, you're better off playing old games.

Good goy, keep giving intel your shekels so they can spy on you

AMD hasn't been the anti-jew for years; they're just as bad now. You may as well buy the better product that will last you longer.

What a time to be alive

Just a quick calculation on the power draw for the air-cooled Vega 64 vs a 1080, which is 295 watts vs 180 watts, respectively. In New York, for example, you pay about 20¢/kWh. On average, 18+ burgers spend five hours a day watching TV. So let's say you spend that time playing vidya on your new gaming PC instead. Over the course of a year, that 115-watt difference is going to cost you an extra ~$42, so if you upgrade your GPU every 4 years, you're looking at an extra $168 tacked on to the actual cost of going with a Vega 64 over a 1080.
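
If anyone wants to sanity-check that or plug in their own electricity rate, here's the same back-of-the-envelope math as a quick Python sketch. The wattages, the 20¢/kWh rate, and the five hours a day are just the assumptions above, not measurements:

# Rough cost of the Vega 64 vs 1080 power gap. Inputs are the post's
# assumptions: 295 W vs 180 W, $0.20/kWh, 5 hours of gaming per day.
VEGA_64_WATTS = 295
GTX_1080_WATTS = 180
PRICE_PER_KWH = 0.20   # dollars, rough NY residential rate
HOURS_PER_DAY = 5
YEARS = 4

extra_watts = VEGA_64_WATTS - GTX_1080_WATTS              # 115 W difference
extra_kwh_per_year = extra_watts * HOURS_PER_DAY * 365 / 1000
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH
print(f"~{extra_kwh_per_year:.0f} kWh/yr -> ${extra_cost_per_year:.0f}/yr, "
      f"${extra_cost_per_year * YEARS:.0f} over {YEARS} years")
# prints: ~210 kWh/yr -> $42/yr, $168 over 4 years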

I'm not saying anons shouldn't buy a Vega (if you already fucked up and got a FreeSync monitor, this is what you're stuck with), but I just want to put into perspective that Vega isn't a particularly good deal when it comes to bang for your buck, even before we take all of that bundle garbage into consideration.

Asking the real questions here

$42 over a year is a pittance but I'd go for the less power-hungry option because it probably runs cooler and won't heat my room up as much.

Core count only matters if you are upgrading within the same architecture. For example, a new architecture can come out with a lower core count than the previous series but still outperform it, since the architecture has improved.

From what I've seen, Vega 56 had a shitty power limitation making overclocking fairly useless until a BIOS flash or registry hack makes it possible to fuck with the power. However, as has already been stated ITT, Vega is already sucking a lot of power to begin with. You'll have to put it under water to deal with all the extra heat that comes with high power consumption, unless you set the fan to ridiculously loud RPMs and still run high temps. Power efficiency matters, as it provides lower heat plus greater overclocking headroom.

IMO Threadripper looks good, Vega not so much.

You are fucking stupid. ALL major hardware companies will spy on you

...

AMD CPUs are great right now. Their GPUs aren't worth shit. The best thing about them is that if you are using a FreeSync monitor, it's far less expensive than getting a G-Sync build. Talking $300-600 in savings.

Not like it matters because the only good PC games that have been coming out that utilize modern hardware are console scraps.

RX 550 on sale for $45. I want something cold/low power. Is this good?

Might as well find a 1050 Ti, as it performs better than the RX 550 by quite a bit.

It's shit tbh; it would have been worth it if the price was cheaper, but that isn't the case. I would have liked it if AMD had good competitive products to go against Nvidia in regards to high end gaming, but oh well. Wait for Navi. :^) Also fuck miners driving up GPU prices.

Define quite a bit.
I was thinking of getting a 550 for an emulation cabinet I got the idea to work on.

...

It's not the Nvidia killer AMD fanboys were shilling it to be, but did anyone actually expect it to be, realistically? I'm just happy we have actual competition now for the first time in over a decade. PC is supposed to be about options, and AMD is delivering actual alternatives at competitive prices. That's fantastic. Competition benefits everyone, whether you main Intel, Nvidia, or AMD.

Twice the RAM, unless you want a 4GB card, and I doubt the 4GB variant of the 550 is $45. Twice the pixel rate. Twice the renderers. So on and so forth. The RX 550 is much more akin to a 750 Ti in performance, just slightly better.

Arigato

But it's not competitive though… Vega, that is, in regards to gaming. AMD's Ryzen CPUs are great though; well done to AMD for pulling that off after all the shit Intel pulled on them.

We speak english around here amigo, not Chinese

IIRC Vega is slightly cheaper than its Nvidia counterparts, so that puts it in a competitive range price-to-performance-wise

The price is wrong, but I bet it shrinks quite well over time.

gpu.userbenchmark.com/Compare/AMD-RX-560-vs-AMD-RX-550/3926vs3925
pretty weak card

Fine Thank you Very Much Mr. Roboto for doing the jobs that nobody wants to

Cheaper MSRP, but retailers are selling it $100 over MSRP, so there is no point in getting it when you can get an Nvidia GPU that is more powerful, consumes less power, and costs less. You'd be lucky to get a first-batch Vega card at MSRP; after those have all gone the prices are gonna be fucked.

The prices will go down closer to MSRP after the hype dies down and people refuse to buy AMD cards over Nvidia cards for the exact reasons you just mentioned

watashi-wa nippongo desu

they're only good for production and muh open source drivers for ganoo+loonix, i really don't like the higher power draw.

Like, the Vega 56 would be a good deal against the 1070 if it were going to be sold at $400 MSRP, but that isn't going to happen. On a side note, the MSRP of the 1070 is $350, but you won't get one at that price either, most likely because of miners.

This is true. I ended up getting an RX 480 last year around Christmas for $160. Now I check Newegg and the prices are up to $400-$600 depending on the manufacturer.

So do you fags bitcoin mine? Because there doesn't seem to be any good reason to upgrade to the latest top of the line hardware.

Gendaijin?

>The prices will go down closer to MSRP after the hype dies down and people refuse to buy AMD cards over Nvidia cards for the exact reasons you just mentioned
I hope so. Hoping miners won't drive up the price of them too, since they consume a lot of power and aren't very good at mining from what I've read. Idk if undervolting would help mining much for them though.

Shit sucks, I wanted an RX 480 for a Ryzen build, but now I'm fucked. Not sure if I should just buy a shitty AM4 APU, wait for prices to go down, then get one along with a Ryzen CPU, or just say fuck it and try to preorder one of the cheaper (but still inflated in price) RX 580s. I think they might be bringing out weaker Vega GPUs between RX 580 and Vega levels too sometime, but idk when.
FUCK

Where’d they get all those gamecube cores?

AMD got the rights to GameCube cores when they bought ATI, which had earlier bought ArtX, the company responsible for the GameCube's graphics architecture

I think it's better to just go with a 1070 and get a Ryzen 1700 for 3D work and graphic design. The 1070 will still run everything at over 80 frames, so like fuck even getting a 1080. But everything is too expensive right now thanks to the retarded as fuck miners, so fuck even building a PC right now.

i'm still on a gtx 760, the performance is ok for what i do but i'd like better.
i'd rather have amd just because nvidia has been fucking me on drivers the last two cards i've used, but i suspect i won't be able to run limelight on an amd card.
is there anything besides steam/limelight that allows for lan streaming?

The Vega 56 seems to compete favorably against the 1070 on price:performance, but I'll wait for Vega cards in the $250 range. I just want a modest upgrade to my 7950 to drive a 1440p freesync display at middling graphical settings.

290X and still going strong after all this time. Been wanting to upgrade, but can't justify dropping a few K on a new system when there is nothing worth playing and my old system handles shit just fine.

Moar thx

mmmmmmm, no.

At least it'll have Ryzen.

The Ryzen 1600 could be enough too and is much cheaper. Also, don't buy Shillvidia shit; wait for the new RX Vega 56, it costs around $400 and is as fast as the 1070.

OwO meaniepaws

Thanks fam, I mean it. Without people out there making sacrifices and artificially propping up AMD on principle, the market would be even more fucked since Nvidia would have no reason to innovate. I honestly wish Vega wasn't such a shit show, because now there's little incentive to push Volta out the door since AMD still has nothing that can even touch the 1080 ti, so Volta would only cannibalize their own sales.

My 670 has been dead for a month now, and it looks like I'll be waiting another 8+ fucking months for Volta. Thanks AMD.

You could have prevented this.

But the problem is the 1600 doesn't have 8 cores and I want 8 cores, and the Vega 56 doesn't have PhysX and I like PhysX.
Plus I also like whispering nvidia when people ask me how good my computer runs.


Should have gotten an itx case that can fit a full atx psu. Like this one which also fits a noctua d-15 and doesn't cost a million dollars.

My psu popped a capacitor and I let it keep going because I didn't realize what had happened. The HDD fried, but the other parts seem okay. I replaced the psu, and I tried installing windows onto a new refurbished hdd (using the old sata cable), but it crashed and did a "File record segment ### is unreadable" thing. Was that HDD just faulty or did some other potentially damaged part ruin it?

New cards are useful for emulators, but they usually work better with Nvidia hardware anyway.

Possible band/album/poem title

Wew, and here I was thinking this is actually it, this is AMD's winning ticket. How much I hoped it would be. Doesn't matter though, because nu-games won't even be worth playing and needing a GPU upgrade. I'm on a 780 and there are some games I want to play that my GPU won't cut it for; I'm just gonna buy a 1080 Ti and be done with system upgrades forever.

Nvidia, Love
The Green Queen Whispers To Me
Oh Fug, My Wallet

Quite possible; shorts can have weird knock-on effects on other components, so you're probably best off replacing your cabling at least.

I only have a GameCube

I learned my lesson last time around. AMDs are like the Chink knock-off GPUs: they squeal, run twice as hot, last half as long, and the driver bullshit is endless.


What can't you play with that? I get by on a 750Ti and have yet to encounter anything it can't get running reasonably smoothly.

What games? I have a 770Ti and it runs everything perfectly fine on high

780ti* fuck

I can run everywhere too when I am high, don't even need a gpu.

...

Yeah, double it from half a cent a week. Hash rates of these cards are pitiful and aren't economically worth it. Hell, it can't even cover the electricity to run it, not by a long shot. Needs to shell big shekels out for dedicated chips, goy.

Well yeah, a 780 ti is slightly faster than a 970, which is itself good enough to run pretty much every game worth playing on high. As for the 780 vs 780 ti, the ti offers roughly 14% better performance.

You'd think if you didn't like AMD you wouldn't come to the thread and talk shit about a graphics card and a company you know nothing about, unless of course you were perhaps getting paid to.

It's like every time people try to talk about AMD, you shills and fanboys are immediately on the scene; it's almost as impressive as it is pathetic.

ohh my post is the wrong one, not the guy above me that speaks like a total retard. Nigger go kill yourself.

>being disappointed that amd went the jew route with (((bundles))), further skewing cost/performance ratios in nvidia's favor makes you a fanboy/paid shill

Your only valid point is the fact that it's bundled, and the prices. Even though the card is only $400-500 compared to the 1080, which was $600-700 (and compared to the 980, which was $550 when it came out in 2014), the benchmarks make it look like the 1080 has a slight advantage at a matching price, and anyone who worries about an extra $50 a year in power consumption must live in a favela.

Everything else you said is pretty retarded

You're being deliberately deceptive, and I'm not sure why. What is your agenda?

You know, the argument goes both ways. No matter what, when an Intel or an NVidia thread pops up, suddenly it's laughing kikes.gif/webm, SHUT IT DOWN THE GOYIM KNOW, housefire, etc.

But the moment someone posts an AMD thread and some skeptical people show up to say something, THEY MUST BE SHILLS FOR NVIDIA OR INTEL!!

Fuck yourself, you massive shiteater.

I guess if you mine bitcoin power cost might be a problem, since you'd have your machine running at full power 24/7; otherwise you might be retarded.
You're also probably correct that after release the price will drop substantially within a few months.

Minus the power consumption, Vega is a much better deal if you're a Linuxfag.
The latest open source drivers are leagues faster than AMD's infamous OpenGL driver and competitive with their Windows DX11 driver, and even without any Vega-specific optimizations yet, Vega+Mesa is extremely competitive with the 1080 and 1080 Ti. Nvidia's Linux drivers are also finicky and don't play nicely with Wayland (X11's replacement, which Nvidia keeps holding up with their awful, nonstandard implementation), KMS (mostly useful for RetroArch), distro/kernel upgrades, or non-glibc distros, so if you're a Linuxfag and do anything besides basic bitch Steam gaming on Ubuntu, Vega is a no-brainer.

...

K, I guess since we both greentexted that means everything we both wrote is invalid as well

Very well.

Did linux ever start getting more games ported or is there still basically nothing modern?

It's perfectly fine that you feel that way. For the rest of us, doing some simple math on how much you're spending per kWh on your power bill versus how many hours a day you're going to be using your GPU under heavy load shows that the massive TDP disparity between the 64 and the 1080 makes a pretty significant difference.

I suspect that you either don't pay for your own electricity, or you're emotionally invested in Vega's success, and unable to come at this from a rational standpoint.

I make around 1600-2200 dollars a month and live with a room mate, so I'm not rich. Not that it matters, because if you have trouble with the power consumption of either of these cards you probably have some serious financial issues I'd work out before buying either one.

...

Don't care, I'm buying a 1050 Ti because I'm a poor fuck.

That's how I know you're a n'shitia shill.


WINE

Gamers Nexus on YouTube keeps showing that you can undervolt Vega and get better results out of it. I think AMD is just too scared of stability issues with their cards, so they deliberately set the voltage higher than it needs to be, thus using more power than needed.

Also, HBM2 is cool, but it drives up the cost and delays the product; I don't think it was worth it for them to go with it.
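
Rough sketch of why undervolting cuts the draw so much: dynamic power scales roughly with voltage squared times clock (P ≈ C·V²·f), so even a modest voltage drop at the same clock takes a big bite out of the wattage. The voltages below are made-up example numbers, not actual Vega figures:

# Dynamic power scales roughly as P ~ V^2 * f for the same chip.
# Voltages here are hypothetical examples, not real Vega values.
def relative_dynamic_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Ratio of dynamic power after a voltage/clock change."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

stock_v, undervolt_v = 1.20, 1.05   # volts, made up for illustration
ratio = relative_dynamic_power(undervolt_v, stock_v)
print(f"Same clock at {undervolt_v} V instead of {stock_v} V: "
      f"~{(1 - ratio) * 100:.0f}% less dynamic power")
# prints: Same clock at 1.05 V instead of 1.2 V: ~23% less dynamic power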

wine is fine, but kvm has low overhead and no compatibility issues with dx12.
i do wish you could completely remove the gpu from the host to use just one graphics card.

It's odd how much better Vega is on open source drivers on Linux as compared to the 1080ti. Almost like DirectX is the main culprit in limiting the hardware……

phoronix.com/scan.php?page=article&item=rx-vega-linux1&num=1

i'm upgrading more as a hobby than a necessity.
i have the occasional game that won't run 60fps on low, but that's hardly my motivation.

You think that 12 fps is "so much better?" That's 9%. The thing is that that performance gap comes down to AMD's drivers on Windows. The difference is that on Linux no one uses AMD's proprietary drivers. The problem lies with AMD, not with DirectX.

12 frames is much better, you retard. Imagine if these cards were playing a game where the 1080 Ti struggled to maintain 60fps. Suddenly those 12 frames are pretty fucking important, huh?

This.
I got a nice bonus a few months ago and built a new PC with a kabylake CPU and 1080ti to replace my 10 year old toaster. I've used it to play Atelier Sophie and Trails of Cold Steel, and the only game I'm looking forward to is Sonic Mania, which is delayed.
All of those games would have run on my toaster. At least it does a nice job encoding videos I guess.

If it's indie and doesn't run on some fancy custom engine, it probably supports Linux. If it's AA or triple-A and not published by EA or Microsoft, there's a reasonable chance they'll hire a third party to port it.
As said, there's also >Wine for Windows vidya. It's still not GPU passthrough level for most games but it's improving quickly.


Most double-A and triple-A games on Linux were ported using a DX11 to OpenGL compatibility layer, so there's usually a little overhead unless you get a miracle like DiRT Showdown. Even with that, Mesa still comes respectably close to the DX11 original on most games and regularly beats Windows when the game uses OpenGL on both platforms.

For what purpose?

I see them cheap, are they better than gtx 960 4gb?

Has anyone carefully checked the resulting screens to be sure it's rendering the same thing and not running different effects where they didn't have time to port 1:1?

You really are a fucking idiot. You think you're going to get a 12 fps improvement over the 1080 Ti in that situation? No, if you're struggling to maintain 60, let's say you hit 58, that means you're getting 63 fps on the Vega in optimal situations. 9% doesn't turn into a magical 20%, you twink.
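
For anyone following along at home, the arithmetic being argued over is just this (the fps figures are the hypothetical ones from the posts above, not benchmarks):

# The percentage argument, spelled out. Numbers are hypothetical.
gain_fps = 12
baseline_fps = 133                 # roughly what a +12 fps / ~9% gap implies
gain_pct = gain_fps / baseline_fps # ~0.09, i.e. about 9%

struggling_fps = 58                          # "struggling to maintain 60"
scaled_fps = struggling_fps * (1 + gain_pct) # same 9% applied at low fps
print(f"{gain_pct:.0%} faster: {struggling_fps} fps -> {scaled_fps:.0f} fps")
# prints: 9% faster: 58 fps -> 63 fps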

DirectX 12 isn't limiting AMD GPUs. If anything it's helping them, as they seem to be more optimized for it than NVIDIA cards, although I'm not sure if it's due to AMD having chips in the Xbox consoles or due to drivers. Games like Total War: Warhammer and Hitman showed performance gains across the 400 and 500 series of AMD's cards with DirectX 12 enabled over only a few months. The 1060 and 1070 were performing better than the RX 480, and now a year or two and some driver updates later the 480 runs faster than the 1060, and even the 1070 in some cases, with and without DirectX 12 or Vulkan. The best thing about higher end AMD GPUs is that they usually begin to outperform similar NVIDIA cards a few years after release.

Females are weird and gay

DirectX12 and Vulkan's asynchronous compute performance boosts are directly based on AMD's Mantle API that they developed but then discontinued. They handed over the code to Microsoft and Khronos so they could receive the benefit of their work. If you want to be a little pessimistic, you can think of it as a Trojan horse that allowed them to experience a larger boost to performance under their own architecture versus what Nvidia experiences because AMD was the ones that wrote the underlying code. If new games were still stuck with DX11 and OpenGL 4.X as the latest graphics API, AMD cards would fall even further behind Nvidia.

Vulkan tends to give a larger performance boost to AMD than DX12, and I'd largely consider that to be because of existing overhead/bloat built into DirectX.

What's the CrossFire situation on lower end GPUs?
Are there significant gains from it, or is there a degradation of quality compared to higher end cards?

GTA 5 mostly and a few others, plus emulation. I'm running a 1440p monitor so 3GB of VRAM isn't enough. But the majority of games still run well; I can play Shadow of Mordor on max settings with a little bit of slowdown.

Wait for Navi :^)

Running multiple GPUs for gaming is nearly on its death bed right now. NVIDIA officially announced they no longer support more than two GPUs for gaming, and AMD also recently stated they are heading in that direction soon too. The problem is that decades went by without any meaningful progress being made towards achieving nearly 100% performance scaling by having additional GPUs installed. DirectX12 and Vulkan finally are able to fulfill that role, but only if game developers are very directly programming and optimizing their games for it. The only example I can think of right now that actually hits around 98% scaling give or take is Sniper Elite 4 on DX12.
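
To put those scaling numbers in fps terms, here's what scaling efficiency works out to for a two-card setup. The 60 fps baseline and the lower efficiencies are made-up illustrations; the 98% figure is the Sniper Elite 4 DX12 number mentioned above:

# What "scaling efficiency" means for frame rates with a second GPU.
# The 60 fps baseline is a made-up example; 0.98 is the ~98% figure above.
def multi_gpu_fps(single_gpu_fps, num_gpus, scaling_efficiency):
    """Each extra GPU adds scaling_efficiency * single-GPU performance."""
    return single_gpu_fps * (1 + (num_gpus - 1) * scaling_efficiency)

for eff in (0.98, 0.70, 0.40):   # great / mediocre / poor scaling (illustrative)
    print(f"2 GPUs at {eff:.0%} scaling: {multi_gpu_fps(60, 2, eff):.0f} fps")
# prints: 119 fps, 102 fps, and 84 fps respectively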

Both AMD and Nvidia have plans to instead produce video cards with multiple GPU dies that are tightly linked together to provide high levels of performance scaling. AMD's Ryzen CPUs use their "Infinity Fabric" design to stitch together cores, core complexes, and dies, but this technology was originally developed with GPUs in mind. They just happened to put it into use on their CPUs first.

So, then it'd be more worth it to try and invest in a higher grade 5XX series card instead of running dual 550s.
Thanks.

Who cares? You can run any new game on the highest settings on ten year old hardware. They really need to get someone to make an actually hardware intensive game before they keep marketing new cards that nobody needs.

...

...

Not true. As I said before, AMD cards have been gaining on their NVIDIA counterparts due to having far better drivers than NVIDIA at this point. I've lived long enough to see the:
Meme finally die.

Source: hardwarecanucks.com/forum/hardware-canucks-reviews/73945-gtx-1060-vs-rx-480-updated-review.html

Who gives a fuck about this monstrosity? Raven Ridge when? I want to see how Vega performs when scaled down and slapped onto a cpu.

Spoiler: Still better than Intel's iGPU, still worse than a $100 discrete GPU.

I still understand that you wanna go for the 1700; especially for coding or converting it's better, since it has 8 cores/16 threads while the 1600 has only 6/12.

If you would use it only for games, I would still recommend the 1600 over the 1700 because it's faster.

PhysX as the only argument for Nvidia is stupid. I could agree with better power usage or stronger cards like the 1080 (Ti) or Titan, but after that there are only fanboy arguments.

What is your CPU? I don't do 1440p, at least not yet, so of course my games run better, not to mention GTA5 is more CPU intensive.

I would be ok with that, on the condition that it exceeds or at least matches Intel in power efficiency.

kill me

Somewhat of a double-edged sword, since Pascal also undervolts well, which makes it draw even less power. AMD made the cards work at stable voltages even if only 1% of those cards actually need to run that high. I will say, with prices as nuts as they are, why wouldn't someone either
a) buy a 1080Ti for maximum frames and decent compute/CUDA cores which is useful for Waifu2x, or
b) buy a Frontier Edition and get 16GB of VRAM and access to some better workstation performance? If I was buying Vega, it'd either be a 56 or drop a shitload and get the FE, the middle ground seems silly to me.

that's what i'm rocking
hd4000 is pretty impressive for integrated. minus the modern AAA trash it runs a lot of games at a decent fps with low-mid settings/resolution

Honestly I've become so used to 800x600 that I find anything else weird now. I really hate when a game doesn't support it too.

...

Well let's see how the next few months turn out. If my current gpu doesn't die I could wait for navi. But honestly I am kinda pissed at amd right now. Not going to support those nvidia jews, but amd's last move was pretty jewish too.

...

wouldn't the radeon chill shit lower the electricity consumption when the card isn't under heavy load?

I doubt he meant bitcoin, pretty sure he meant ethereum.

not by much. youtube.com/watch?v=bIGpvBrwXvA

We're talking going from nearly 600W power draw to nearly 400W. That's still 400 fucking watts for your damn graphics card. Seriously, what the fuck.

(checked)

Classic shit, user. Good post. I wish they made more of these. Got the Christmas one too, by any chance?

That is seriously fucked. I Waited For Ryzen(tm) and now regret I didn't buy a damned middle-of-the-road card when I had the chance.

Watch the video I posted, m8. Jim makes a legit case for Vega + FreeSync, and he's totally right. But it depends if that's what you're after.

It could be 1080/70 performance for a $300-400 price tag compared to 500-600. The power draw means you're gonna want a killer PSU, but there's a real argument to be made about freesync.

That's the measurement at the wall, i.e. total system power draw, not the GPU itself.

I'd love vega-freesync if I could just find a damned vega card! I waited for Ryzen, then waited for Vega, and now I'm stuck on a Jewtel Core i5 760 which can't emulate Wii and PS2 well enough for me to play what I want, so I'm a little upset.

Oh.

If you don't have a Kill-A-Watt meter you should get one. They're fun.

And you'd be surprised how much energy your damned refrigerator draws.

Why is this shit world so unfair? Why do worthless Jewish companies like Nvidia and intel get so much money for R&D while AMD doesn't? Why can't AMD just BTFO both of them with ease?

The thing is, AMD isn't really aiming for the gaymen crowd with either Ryzen or Vega. Vega is for ethereum mining and machine learning shit, with some standard GPU shit that kinda works for games just thrown in, and the Ryzen shit is just awful for games because most games can't take advantage of multiple threads for shit, for some reason, and AMD has no low end processors with few cores to compete with a G4560.
I mean I'm an absolute pleb when it comes to technology but this is what it looks like to me.

just buy ryzen and nvidia if you want. freesync is a good option too

...

Ryzen CPUs have absolutely wrecked Intel in every single way. But AMD GPUs haven't been competitive for years. The last good AMD GPU was the R9 290, because Nvidia had absolutely nothing to compete with it at that price. You would either buy a GTX 770 for the same price as the vastly superior R9 290, or you would throw $200 more at a GTX 780 that had less than a 10fps difference.
But today? AMD has fucking nothing. Nvidia keeps winning.

It shows.

tfw nvidia will never get as brutalized by a gpu 1/8th the size of their flagship card that is 1/4th the price and provides identical performance

the hd4850 was an event. you had to be there.

Well, unlike most people in this thread I admit it.

Certainly not in gaming, and this is, after all, Holla Forums. In modern AAA shit, the 1700X matches the 7700K while costing $45 more, and it gets absolutely manhandled in STP, which is what you need for high-end emulation.

if you are just a gaymer don't buy a $300 cpu. buy a $200 one or cheaper, where Ryzen is the best value still. If you are buying a 1700X you are using it for the 8c16t capabilities it has which means streaming, rendering, multitasking (how many of you fags play games borderless windowed?) and all those kinds of high demand tasks.

also having 4 cores in 2017 is becoming a joke. Even consoles have more cores.

Like a Ryzen 5, which has even worse STP than my fucking 5 year old 3570K, and would be absolutely worthless for high-end emulation? Sheeeeeit. Here's AMD's own marketing material. Why should I pay more for slightly worse AAAshit performance, and significantly worse emulation performance, in exchange for better performance in a bunch of eceleb shit?

The 1600X or even the 1500X would be a closer comparison, and both trade blows with the 7700K in game benchmarks at notably lower prices.
anandtech.com/bench/product/1864?vs=1826

See

I'd argue encryption matters in that pic. Wanting privacy doesn't equate to being a degenerate.

I'd counter that the only consumers going hard on encryption are either full-on tinfoil, or degenerates partaking in illegal activity. If you're going for productivity purposes, or are some kind of (((eceleb))), by all means grab a Ryzen. But do note that you are sacrificing vidya performance while paying a premium, which goes back to my original point, which is merely that he is wrong when he says, and I quote, "Ryzen CPUs have absolutely wrecked Intel in every single way", which is contradicted by AMD's own marketing material.

So, what you are saying is, invest in an AM3/+ cpu?

Fuck this shit

Nvidia cards are inflated too. 1070s were ~$350 a few months ago.

I'm just arguing that extra encryption isn't something that should be considered a moot point. If there's an option between 2 products and 1 is better in performance but worse in security then that's a different argument.

It's not "extra encryption", though, it's merely faster encryption, just like it's faster for video encoding and other non-vidya productivity shit.

It's an improvement on security.

Locking and dead-bolting my door in 3 seconds versus 4.17 is an improvement to my quality of life, but it's not improving the actual security of my house, and thus could not be called "extra security".

By this logic you should just keep your curtains open 24/7.

It would be easier to just admit you misunderstood what "Encryption" meant on the marketing material I posted instead of repeatedly doubling down and backing yourself into a corner with increasingly nonsensical responses.

It's basically security fam. If you want to call it pointless then go for it. I would say it's there for people who care more about security than others. Streaming shit is useless to anybody with a life but I can't criticize anybody for wanting added security. Especially with the context to that pic.

You're still doing it after I've already explained this to you in detail.

are you guys fuckin retarded?

Don't forget the free telemetry with every nvidia driver update! :^)

majorgeeks.com/news/story/nvidia_adds_telemetry_to_latest_drivers_heres_how_to_disable_it.html

The principle here is more protection. Whether you find it useless or not really doesn't have much to do with it, since it is factually more protection.


Are you?

Heh, you're still doing it.
Ironic choice of words.

Yea k fag.

...

That's how shovelware can stay up you retard.

You want to know waiting, you fucking nigger? I'm using an R9 Nano, a card that couldn't even maintain 30fps on a pirated copy of Far Cry 4. My i5 2500k died after five years of intense overclocking, now I'm running a shitty workstation-grade first-gen i7 on a motherboard that was literally ripped from a discarded Dell machine at work. I have a 4U rackmount case that used to sit proudly in my server rack in my apartment next to my 40 core 128gb Poweredge, now I'm a literal NEET and live in my parents garage on 20/5 internet bottlenecked to 10mbps total throughput instead of my 200/20 cable. I even had to run ethernet myself through 30 feet of spider-infested crawlspace complete with fallen fiberglass insulation, because pops was too fucking jewish to buy a decent wireless AP.

The case that I spent $120 on doesn't even have enough room for your retarded oversized modern GPUs, I even had to remove the built-in cable organizer just to fit the nano. Now I'm sitting in a painful wooden chair at a tiny writing desk, secondary 1680x1050 monitor perched on top of the case (because my fucking IPS monitor died when I was overseas and screenburned ecchi in the process) with my IBM Model M knockoff, drinking straight gin and despairing for my own future.

Fucking hang yourself. You have nothing to complain about.

pic related, from the glory days but before I moved in my new servers

...

Windows Store is a mistake.

I'd imagine if you benchmark these Vega cards three or so years from now with mature drivers and games that better utilize DX12 and Vulkan they'll probably beat the GTX 10XX cards in everything except power usage. If you're the type to buy a new card every year that doesn't matter though.

Seriously the only people surprised by Vega's shortcomings are the people that aren't really into the tech industry or are AMD fanboys that think RTG can work miracles. Still it's really fucking hilarious watching Nvidia card owners do the whole

i7-4790K. Last I checked, GTA5 runs well on it. It's good enough for emulation and CURRENT games; I don't need anything more.

surprised google is this specific

re-terminate that cable boy, shit looks messy

AMD is 50% less kiketastic

I read that and I want my IQ points back.

...

Oh no! Instead of 130 fps I'll get 110!
R5 is cheaper than i7 buddy
What is rendering?
What are VMs?

See and and try again, marketer-kun :^)

And why am I comparing the 1700X to the 7700K? Because that's what AMD themselves are slotting as their answer to the 7700K. While being 14% more expensive, shittier with gaming (again, by AMD's own admission in their own marketing material), and significantly shittier with emulation due to the laughably bad STP. It's being marketed as a jack-of-all-trades CPU, showing significant gains over the competition in productivity applications, without sacrificing too much performance for AAAshit gaming, and it does indeed fit that bill, as I've already mentioned in this very thread multiple times.

However, if you're like the majority of Holla Forums and would be interested in it specifically for how it performs with gaming, it's a bad deal compared to the 7700K in cost vs performance, costing more and performing worse.

But why would I want it if I don't play console shit or have a 144 Hz monitor?

Because at the end of the day, you'd still be paying 14% more for less gaming performance.

But Ryzen 5 1600 is cheaper and its gaming performance is almost identical to Ryzen 7.

You see where I'm going with this. Did you recently buy an R5? Is this all just post-purchase rationalization?

Nah, but you sound like somebody who got buttfucked by Intel pretty hard. Did you buy a 7700K in January only to realise that 7 months later you'd get a 6-core i7 for the same price? Tell me 1 (one) thing in which the 7700K is better than the R5 1600, besides emulation and playing vidya at 144 fps.

Welp, that explains it. You aren't a shill, you're just one stupid motherfucker with an axe to grind.

It's ok user. Just use some ass ointment.

PCs are not consoles; nobody just "games", you will always have a lot of shit running in the background. And having twice as many cores makes it more future-proof: once Intel's 8/16 solution gets widely adopted and devs start targeting that in production, the 7700K will be dead, just like 4/4 i5s are today.
What kind of things do you want to emulate where the 1700's STP will not cut it but the 7700K's will? This sounds like grasping at straws to me; it's like if AMD removed real mode from their processors, you'd be complaining you can't natively run DOS games on it.
I don't know, you sound half reasonable, half shill to me. As far as I know the only difference between pic related and the 1700X is binning, so I see this as a better choice for your bucks, and it comes with an OK cooler.

good thing i stocked on intelvidia last season, bet you fags even use linux, oh wait you actually do, that's why inferior gaming hardware is no problem for you

Except you're conveniently ignoring a point you were trying to make: cost to performance. The 1600 would be cheaper than the 7700K, offering similar gaming performance (though in anecdotal cases, more consistent frame times) while still being good for productivity since it's 6c/12t.


Why would you want to emulate a shitty console? And as far as I knew, Citra doesn't run well on anything.

It's almost like he's just whoring for (You)s. He picked one marketing slide that was probably drawn up before the R5s were even at retail, and has made it the end-all, be-all of the discussion.

Where the 7700K destroys the R5s, which can't even match the STP of 5 year old Intel CPUs, rendering them worthless for emulation and thus cutting you off from hundreds of top-tier free games. If you exclusively play (CURRENT YEAR+2) garbage, your R5 does indeed offer a better $/performance ratio, in the same way that a $110 1050 technically offers better $/performance over a $330 1070.


Is this really the hill you want to die on? :^)

amd.com/en/products/cpu/amd-ryzen-7-1700x
newegg.com/Product/Product.aspx?Item=N82E16819113429
amazon.com/dp/B06X3W9NGG

You keep making that assertion, but you have yet to show a single supporting benchmark.

I mean, those are easily found. I don't know who's questioning that part of the statement.

archive.is/VMN2i

The 1600 is better than or nearly on par with a 7700k in some multithreaded productivity, but lags behind in games by a little bit. I wouldn't call it enough to say that magically a 7700k will emulate perfectly while an R5, it's just that it will be a bit peppier in single-threaded applications.

Apologies if my ID changes, I'm not trying to >(1) and done.

won't*. Fuck me. So yeah, the R5 has some good price to performance ratios and in modern games, isn't that far off. Need to look for benchmarks of something single threaded and gaming related since that's the point he's trying to drive home.

cpubenchmark.net/singleThread.html
Got that covered too. On this one we can play Spot the Ryzen. Disclaimer: Some scrolling may be required :^)
docs.google.com/spreadsheets/d/1k12sv1NXGGuSOY0NhsuONtRCte51GHKdgA7ciL76mBs/edit#gid=485052351

STP really is the bee's knees when it comes to emulating 6th gen and beyond, which leaves Intel as the only rational choice if that's important to you. Likewise, if you don't give a fuck about vidya and want the CPU that's going to save you the most time when encoding video, Ryzen is the answer. I'd actually prefer not to give my shekels to the staunchly anti-white kikes at Intel, but those principles only extend to the point where I'd be willing to spend slightly more, if and only if I could get equivalent performance in the applications that are most important to me.

Full-size graphics cards AND almost-ATX power supplies? Sure.

I got an R9 390X as my first AMD card in over 10 years; I'd stayed away because of the AMD drivers being so fucking dog shit (assuming you could even find them, jesus christ). But they've finally got their shit together. I'm happy Intel/Nvidia have some competition again, it's about time.

I appreciate you refuting your own opinion. Saved us from the spergs (who would be right, honestly).

wow what a faggot

Imagine thinking a console is the smart choice. You've just imagined what it's like to have Fetal Alcohol Syndrome. Remember not to drink while pregnant folks.

Agreed, brand loyalty is for degenerates and retards.

He's right that AMD doesn't care about gaymers though. Ryzen is a repurposed server CPU and Vega is originally a workstation card. AMD marketing might say whatever they want, but they simply can't hide it.

That doesn't impress me too much.

That, on the other hand, is a nice show. Ryzen seems to match older/quad core chips though, not sure how much that affects real world performance as my only other system is a modest 3.2GHz Xeon so I don't have the ability to gauge how effective it is. Runs Dolphin just fine, so I assume Ryzen is similar. Definitely shows that the 7700k is better on single threaded performance, though.

That's because AMD hasn't fixed their shitty OpenGL drivers on Windows for as long as they've existed. I am not just talking about framerate here. Recent AMD drivers are buggy as fuck for OpenGL on Windows.

except 99% of this board does nothing productive at all, except maybe re-encoding youtube poop videos in vp9 because autism

You think we still live in the age of plaintext netcode and quake1 packet sniffer wallhacks, son? AES speed is your fucking ping time now.

The RX 480 was good.

Considering that pc parts tend to depreciate quite a bit with time, and that future-proofing is pretty hard when new instruction sets are constantly being added and improved upon, the optimal course of action is to buy the product that offers the best overall price/performance ratio and maybe the lowest Cost of Ownership. Eventually, one could also take into account upgrade paths, in which the longer-lived platform helps the user save potentially more money.
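
As a toy example of that price/performance plus cost-of-ownership point (every number below is an invented placeholder, not a real card's price, performance, or wattage):

# Toy price/performance and total-cost-of-ownership comparison.
# All figures are invented placeholders, not real product numbers.
def total_cost_of_ownership(price, watts, hours_per_day, years, price_per_kwh):
    energy_kwh = watts * hours_per_day * 365 * years / 1000
    return price + energy_kwh * price_per_kwh

cards = {
    "Card A": {"price": 500, "perf": 100, "watts": 295},
    "Card B": {"price": 550, "perf": 105, "watts": 180},
}
for name, c in cards.items():
    tco = total_cost_of_ownership(c["price"], c["watts"], 5, 4, 0.20)
    print(f"{name}: {c['perf'] / c['price']:.3f} perf/$ upfront, "
          f"{c['perf'] / tco:.3f} perf/$ over 4 years (TCO ${tco:.0f})")
# The cheaper, hungrier card wins on sticker price but loses once the
# four years of electricity are counted, which is the whole point.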

Of course, this doesn't apply when one very specific characteristic is required, because special user requirements usually come with an added price on the product (assuming that official support is required and it can't be compensated for by software). Note that the argument can be generalized to computing hardware in general and not just general purpose hardware.

Some people don't know this, but AMD was able to charge pretty steep prices for high end CPUs back then (the AMD64 era), so it's important to try and use objective criteria for building systems. We could also get into the discussion of Intel's anti-consumer practices, but it's unlikely consumers will take that into account because of the involved negotiation costs, especially when there might not even be an available alternative to their offerings. Finally, one could argue that it's precisely because of the low market exposure AMD suffers (and has historically suffered) due to such practices that they're stuck in a vicious circle, but that also involves discussing market regulation and stuff outside the scope of this thread.

Should we all stop complaining because people in Africa are starving and have a harder time than all of us? Honestly, sucks to be you, but I am in a cancer situation too.

Stuck with a 6950, don't even remember when I bought this shit. I have the money lying around to buy a new card but the market offers nothing worth buying.

On a scale from not mad to MAD, I'm not even

the 460 isn't even that old, do you fuckers upgrade your hardware every year?

...

Until a third party appears we as consumers are just going to be fucked over by these two again and again.

What is Intel?

same as nVidia, they are just as cancer as the rest of them.

Intel is part of the reason AMD's GPU side is dying.
With them losing to Nvidia on performance, and for a while recently also on price, they were relegated to the low end market. But then everyone buys Intel CPUs and integrated GPUs work just fine, which means AMD lost on the top end and the low end, and they don't look like they'll be able to compete in the mid range for long.

I'm not saying that Intel as a whole is good or anything, but the Intel Open Source Technology Center deserves recognition. Their drivers are amazing and they are by far the best gpu vendor.

use your big goy words

best purchase of me life tbh lads

Darkstalker Kaathe did nothing wrong

Sweet Jesus why did they waste so much on that fucking case and cube? Or was this some special box that went out to faggots to unbox?

Because PowerVR sucks ass, otherwise we'd see some other GPUs make it. And anything that is used with an ARM SoC is too fucking proprietary/tied to the ARM CPU to bring to a normal desktop.

he can just buy another one, you dirty commie

...

Got the rx550, and it's coming tomorrow. 450w enough for that+i5-2500k?

Definitely. RX550 should be fairly light on power requirements, and a 2500k isn't that heavy either. You're probably going to be at about the optimal efficiency for your PSU unless something in your system decides to draw fuckloads of power.

What the fuck are you smoking?

Some Vega news: GN broke the power limits of Vega 56 with a registry hack, but for now they only showed a hybrid mod to put the GPU under water so it could handle the 242% power offset. They should be posting benchmarking results of the power bypass today though.

AMD pulled a bad move and made it impossible to flash a functioning modified BIOS (WHY THE FUCK HAVE DUAL BIOS THEN?!), so the power limits could not be bypassed via BIOS. People had to wait until another registry hack was created, like the registry hack for Vega FE.

Interesting news, looks like AMD did a bait and switch with those rebates (they're already expiring), and the actual pricing is significantly higher than what was originally being teased, shitting all over the supposedly decent price/performance of Vega for gaming.

On the CPU side of things, the 8700K was announced and it's going to be about 11% faster than the 7700K in STP, and a whopping 51% faster in MTP with 6 cores.

...

Whether it's a good buy depends on what you can OC it to. For the most part, games only seem to gain from higher frequencies. Moar coars may result in moar heat, thus less overclocking headroom. The R7 1700 offered slightly less performance for gaming but far superior performance for streaming+gaming compared to the 7700K. The extra cores going from the 7700K to the 8700K, plus architectural improvements, may bridge that gap in streaming performance. Personally, I have a 6-core 5930K for video editing/rendering. I'm curious how the 8700K does in those tasks.

More information:

Well yeah, consoles are the lowest common denominator devices like capeshit for film, of course retards (ie most people) are going to prefer them.

...

I read that AMD delayed Vega like this because they couldn't afford to sell it cheaper, so they tried to improve it. Sounds like a pretty shit excuse to me. Either way, how come AMD is short on money? All the miners are buying their cards to the point where they get inflated in price so gamers can't buy them anymore.

who falls for the "AMD is in need of your support" meme in the current year

yeah that's exactly my point. Everyone says that AMD is out of money, but to me it looks like they are doing fine.

I see you weren't following that thread of conversation. If the 1800 offers similar gaming performance to the 7700k in modern games, and the 1600 performs similarly to the 1800, then the 1600 will logically perform similarly to the 7700k.

It's not really a noticeable difference in performance in modern games. It will be noticeable in heavily singlethreaded things like emulators.


They also fucked up Skylake-X; they can't really afford to fuck up Coffee Lake.

The 7740 and 7640 are borderline scam-tier, but the rest of the lineup looks fine at a glance for professional use.

Update: Vega has a clock bug. People are getting misreported clock frequencies and claiming crazy overclocks that aren't actually occurring. Gauging the actual GPU score and/or framerate improvements demonstrates the reported frequency is wrong when the bug happens.

Ryzen 3 1200 can beat the G4560 when overclocked. The thing is price disparity. The G4560 is $80, but the R3-1200 is fucking $130. What's a real joke is that the i3-7100 is fucking $150 when it's no stronger than the G4560.

But right now the 4560 is being scalped by cryptominers for an extra $50 ($130); however, the G4600 (which is stronger than the G4560) can be picked up for a mere $100.

They don't
It doesn't

Don't fall for this meme. I paid $150 for a GTX 1050 Ti and it's been able to play fucking everything on Ultra at 1920x1080 without even breaking 50C.

Some people want more than just 1080, though. Some people want 1440p, some want 4K. And if your name is Linus, you might even want something retarded like 8K and 16K.

I wanna fuck that samurai

Does 8K at 600fps exist?

Fuck no, you can't even get 60 fps at 8K.

Damn. I thought the technology was there already. I mean, for sub-$5000 prices.
I doubt any consumer grade monitors even exist for such scarily beastly specs.

Okay, here goes since CPU discussion came up.

The IPC improvements of Ryzen do not necessarily make them equivalent to the Intel chips of comparable lines. I have seen so many fucking videos on YouTube of AMD fanboys finally feeling like they can revel in it, because HEY THE FUCKING RYZEN IS BEATING DOWN THE INTEL.

Now hold on. Because that's not entirely accurate. You see, a lot of these videos are comparing the Ryzen 3 1200 with, say, an i5 5600 (the 5000 series was replaced pretty quickly for a reason).

The Ryzen 3 1200 is the budget line of the Ryzen family: it's a quad-core with a stock clock speed of something like 3.1 GHz. You can overclock that boy to 3.8GHz on stock voltage, making it attractive to people who like to overclock. The comparable lines of chips to the R3 are the Pentium and the i3, which are both budget chips themselves. The Pentium comes in 4560, 4580, 4600 and 4620 flavors. The price tags are $130 for the Ryzen 3 1200, $80 for the 4560, $90 for the 4580, $100 for the 4600, and $120 for the 4620, whose clock speeds are 3.5, 3.6, 3.7 and 3.8 GHz with no overclocking capability. And then there's the i3 7100 with a clock speed of 3.7, also with no overclocking capability, but with support for Optane memory and slightly better IPC (which nets no real performance boost in games).

The Ryzen 3 1200's IPC improvements are great compared to the Bulldozer and Piledriver lines. But the 1200 at stock frequency is easily beaten by the i3 and the Pentium; it's only when OC'd to 3.8 (or even 3.9 if you're lucky) that the Ryzen beats out the competition.
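
A crude way to see that last point: to a first approximation, single-thread performance is roughly IPC times clock, so a chip with slightly lower IPC can pull ahead once it's overclocked. The relative IPC figures below are made up purely to illustrate; only the clocks echo the ones quoted above:

# First-order approximation: single-thread performance ~ IPC * clock.
# The IPC ratios are invented for illustration, not measured values.
def relative_st_perf(relative_ipc, clock_ghz):
    return relative_ipc * clock_ghz

chip_a_ipc, chip_b_ipc = 1.00, 0.95   # hypothetical: B ~5% behind on IPC
chip_a_clock = 3.5                    # GHz, no overclocking headroom
chip_b_stock, chip_b_oc = 3.1, 3.8    # GHz, stock vs overclocked

print(f"{relative_st_perf(chip_a_ipc, chip_a_clock):.2f}")   # 3.50
print(f"{relative_st_perf(chip_b_ipc, chip_b_stock):.2f}")   # ~2.95, behind at stock
print(f"{relative_st_perf(chip_b_ipc, chip_b_oc):.2f}")      # ~3.61, ahead once OC'd
# Same shape as the R3 1200 vs Pentium story: loses at stock, wins when OC'd.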

That said, the idea that if the Ryzen 3 beats the i3 like that, it must mean that the Ryzen 5 beats the i5 and the Ryzen 7 beats the i7 is… wrong. The Ryzen 7 does not beat the i7, except perhaps in synthetic benchmarks, which only really happens because of unrealistic scenarios that don't come up in video games. The fact of the matter is, Intel is still stronger as a gaming CPU.

Even the Threadripper does not beat the i9, not to mention the Core-X series. It just doesn't. HOWEVER .. Threadripper IS competitive. It's also super expensive, with the cheapest one being as expensive as Intel's super expensive chips.

Threadripper is awesome, no disputing that by anyone who isn't a fucking faggot. When it comes down to gaming, Ryzen is able to compete with Intel on levels that actually matter. They might not get better 1% and 0.1% lows, but they get fps scores that are just as playable. Ryzen 5 can totally do 4K Ultra and compete against the i5; it won't get better scores than the i5, but it's still playable and it's possibly cheaper.

Cinebench? Not a realistic benchmark, because it does things that games don't do. Don't trust Cinebench scores. Synthetics? Not realistic benchmarks; don't trust them because they're not realistic benchmarks. The only thing that matters is how many frames per second a chip will get in your favorite games. Most games aren't CPU-bound, so your CPU isn't really going to determine your FPS in most situations unless it's a bottleneck.

If you're building a budget gaming PC to start, the Ryzen 3 and the Pentium are great places to start. They won't get you super huge FPS, but the socket type matches their better brethren. So you could start with a G4600 (since it's not being scalped) and later upgrade to an i7, or start with a Ryzen 3 1200 and later upgrade to a Ryzen 5/7, or whatever you want to do.

There's nothing wrong with either set of chips; the old days of the AMD tax are pretty much done. There's no more 30-60% gap between comparable CPU lines. And while the Ryzen 3 1200 does beat the G4600, it only just does so. It's not a big beat down. The IPCs are similar now, and Ryzen loves fast memory, so you can actually improve your performance by using the fastest possible memory that your CPU supports.

This all having been said, how about we just do away with this retarded fanboyism crap? Stop hating Intel. Stop hating AMD. And even stop hating NVidia. Enjoy meritocracy and know that everyone on the fucking playing field is pretty much competing again.

AMD has finally come back to play in the races, so let's all just enjoy the fruits of these labors and not worry about what color your CPU or GPU is associated with?

That was a nice read. Now, I don't know what to buy.

If you're looking to stream or convert webms at the fastest speeds, AMD is the better buy. They have more cores, and more cores equates to better performance when multitasking like that.

If you're just looking to game, get whatever you like. And if you're just looking to browse and play emulators, buy an android instead.

...

"Who gives you more" is a pretty wacky question to ask. You know why? Because clock speed is not indicative of the level of performance you will receive. I could tell you that the i7 7700 for the same price as the Ryzen 7 1700 is 4/8 on the Intel chip with a 3.6GHz stock and a 4.2GHz turbo clock compared to the 1700's 3.0GHz stock and a 3.7GHz turbo clock. So what does that mean? Absolutely dick. It doesn't mean fucking anything, because you're comparing numbers that don't equate to real world performance numbers.

Same price for the chips, the i7 still wins but it's a meager victory. Also because you tried to stir up more platform war discussion, fuck you.

The woman IS Asian, but she looks Korean. Oh god don't tell me she's a feminazi please

She is Korean and she is a feminazi.

Both of them are total SJW's and push diversity.

Why am I always right. God fucking damn it.

Meme-tier for vidya.
Even more meme-tier for vidya, although high-end CRT shaders like Kurozumi's Royale tweak do an incredible job of recreating a Sony BVM on a 4K screen; but at that point you spent more on a meme monitor for shaders rather than just buying a fucking good condition BVM and having the real thing.

For 4K to be truly mainstream for vidya, we would need to reach a point where either GPU tech rapidly outpaces whatever the current graphical standard is, or it gets to the point where it's just not financially feasible for developers to create higher-fidelity content. I can't see the former happening any time soon, and even with the latter, they'll just treat that as a license to throw tons of heavy post-processing on everything and skip optimization techniques like LOD assets.


Honestly, I think as consumers it's in our best interest to hate, or at least have distaste for all of them. We should be basing our purchasing decisions exclusively on 1) what meets our specific performance requirements, be they vidya or productivity, and then 2) what offers us the best $/performance ratio among the GPUs and CPUs that meet or exceed those requirements.

Do their philosophies hamper your ability to game at all? No? Then just fucking let it go. Honestly, if you were to stop patronizing products and services on their politics alone, you would have stopped subscribing to the internet years ago, considering what kind of policies your ISP pushes.

Their philosophies directly and harmfully impact THEIR ability to produce quality products and software. Diversity hires destroy industry.

And that affects my ability to game by resulting in lower quality hardware that costs more money.

You try to avoid them as well as you can. If only one of them was SJW cancer you could easily go for the other one.

FAIR.

I just hate this stupid platform war crap. "OH FUCK INTEL, THEY'RE ALL JEWS!" It's bullshit; Intel is an American-based company just like AMD, and their main manufacturing plant isn't in Israel or anything of the sort like people constantly say. AMD just spent a few billion dollars to put an R&D plant in Israel, does that mean that AMD are the devil? Fuck no.

If you want to hate all the players, I am 100% okay with that. If you want to love all the players, I am 100% okay with that. But for the love of fucking god, if you keep saying X is good but Y is bad because muh underdogs or whatever, then fucking kill yourself.

Intel are not some super mega corp with AMD on the verge of bankruptcy, yet people go into threads and push that idea, and it's just ridiculous. Would a bankruptcy-tier company have been able to purchase ATI? I don't think so.

The point is that we as the consumers have to stick together so they can't pull their jewish tricks on us. And people side with AMD to push back against nVidia's and Intel's jewish bullshit.

RX Vega should never really have been released. It's full-on a compute card, not a graphics card. It has almost no optimizations or design decisions made with gaming in mind; everything about it is geared toward raw compute, which is why it wrecks the 1080 in that arena. It's a ridiculously powerful compute card for the money, but it's not a good high end gaming card.

I'm honestly baffled why they didn't just leave Vega to the prosumer market and make an RX 590 which was just a scaled up Polaris (570 and 580).

rx 480 for the RAM, games will need it
VEGA for something between 1070 and 1080 that doesn't cost a fortune

There's a tiny bit of truth to that, insofar as it's probably in our best interest as individuals if others make the "sacrifice" of purchasing from AMD on principle to spur them along and keep them competitive with Nvidia/Intel, which drives Nvidia/Intel's innovation.

In other words, when AMD starts swinging at them and taking a bigger bite out of their sales, Nvidia/Intel get ready to release the next big thing. AMD actually did that to Intel this time, and lo and behold, Coffee Lake looks to be a good counter to Ryzen in productivity, while widening Intel's lead in gaming. Who wins there? We do, because we've got more choices and the pricing should become even more competitive. On the flip side, Vega is a fucking flop (for vidya), so instead of AMD eating Nvidia's lunch and Nvidia being forced to roll out Volta, Nvidia is just going to continue to sit on it, since releasing it now would only cannibalize their own sales, so we the consumer lose out.

I know the fanboy shit is obnoxious. If you think this is bad, though, you should see the firearms community. They take corporate cocksucking "brand loyalty" to a whole new level. But from a completely selfish perspective, I think it's in our rational self interest to perpetuate (or at least not actively work against) the meme that AMD are the underdogs and need support.

I don't know how much more they can get out of the 480.
Regardless, I'm not convinced we are seeing all Vega has to offer for games. I've got the feeling they haven't had time to come up with proper game drivers yet. Still, I doubt they will have something on the 1080ti level with the current Vega architecture; it doesn't seem like something built with games in mind.
The 56 is not a bad deal though, close to or better than a reference 1070, and far better on production stuff that benefits from a powerful GPU. Seems like a good deal, at least if you already have a suitable PSU to spare.

It still doesn't make sense. Vega isn't a failure, because the chip is doing exactly what it was designed to do and doing it very well. Compute. The failure here is them trying to sell this compute card as a high end gaming card.

Could they just not afford to make a gaming-optimized high end chip AND a compute chip, so they said 'fuck it, the compute chip will game well enough'? If so, then they need to sack the fuck up and just admit that. Position this as a card for people who want a cheap compute card that can also game fairly well, not as a pure high end gaming card, because it's just not one. It's not designed to be.

By that logic the Tesla shouldn't have been released, it's a compute card not a real graphics card.

It's just a new way to target old problems, and I don't see the harm in it. Remember when Ageia was going to push a whole new add-on card, the PhysX line of PPUs (physics processing units), and they were advertising it to fuck and back? For $300 they said you could supercharge the physics engines in your games for the future, offloading that particular workload from the GPU and CPU.

They collapsed and Nvidia bought them up, but it was an interesting idea in theory. That's kind of where we could be going with compute cards; it's no secret that the CPU is too generalized to be a raw horsepower part, and that's why OpenCL is so popular nowadays. Imagine if we saw a new line of cards to complement the GPU by handling physics and logic compute… It would mean games have almost no hit on overall system performance, so it would be a win for pretty much everything: you'd be able to game, convert a webm, and stream on twitch all at the same time without having to hook up a bunch of bullshit like a TV capture card.
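
For anyone curious what that kind of offload actually looks like, here's a minimal pyopencl sketch. It only adds two arrays on whatever OpenCL device is available, but the pattern (copy data over, run a kernel, copy results back) is the same one you'd use to push physics or other game-logic compute off the CPU. Treat it as an illustration, not anyone's shipping code.

# Minimal OpenCL offload sketch using pyopencl (pip install pyopencl numpy).
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()   # pick an available OpenCL device
queue = cl.CommandQueue(ctx)

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out)
{
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)   # run on the device

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)                   # copy results back
print(out[:5], (a + b)[:5])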

They tried to do the Ryzen trick again.


Shit just didn't line up for them. HBM2 is a lemon, GDDR5X is just as good and cheaper, and unlike Intel Nvidia hasn't been slacking.

HBM2 could have been a success, the problem was that they took an old board, slapped a new type of RAM onto it, and released it as a product expecting that somehow the new RAM type would supercharge the old card.

If they put that shit on a proper video card, it could be a hit.

The next line of Nvidia GPUs will offer an HBM2 model. That type of RAM is actually pretty great in many regards people often ignore; the frame latency people get on the Fury X is better than anything Nvidia has to offer, even now.
I think the issue with Vega and gaming lies in the architecture, not in the RAM.

...

GDDR5X is getting more expensive to make as well. Honestly though, I think AMD should just move away from GCN altogether.

also lower power draw

So it looks like the Vega 64 is actually the same price as the 1080 Ti, while getting positively blown the fuck out in vidya performance, even when overclocked. Looks like it's not even efficient for miners either, since Vega is performing far worse at mining than originally anticipated, and the heavy power draw necessitates more expensive PSUs, which severely ramps up the time it takes for a mining build to become profitable.
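
Back-of-the-envelope on the mining point, with placeholder numbers only (plug in real hash rates, coin prices, and your own electricity rate). The sketch just shows how a hungrier card plus a pricier build stretches the break-even time.

# Rough break-even sketch; every figure here is a stand-in, not real data.
def days_to_break_even(hw_cost_usd, daily_revenue_usd, card_watts, usd_per_kwh):
    daily_power_cost = card_watts / 1000 * 24 * usd_per_kwh
    daily_profit = daily_revenue_usd - daily_power_cost
    if daily_profit <= 0:
        return float("inf")  # never pays for itself
    return hw_cost_usd / daily_profit

# Same hypothetical daily revenue, but the second build draws more power
# and costs more up front (bigger PSU), so it takes longer to pay off.
print(days_to_break_even(hw_cost_usd=550, daily_revenue_usd=3.0, card_watts=180, usd_per_kwh=0.20))
print(days_to_break_even(hw_cost_usd=650, daily_revenue_usd=3.0, card_watts=295, usd_per_kwh=0.20))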


I almost want to give them the benefit of the doubt on that and blame Samsung. That's indescribably jewish.

That's the bundle, you twat.

No, these are the bundles you twat :^)

Finally, a little bit of positive news for AMD. Hardwareluxx.de undervolted the Vega 56 and 64.

Vega 56 undervolted beats the regular nVidia GTX 1080 in FPS while using less power than the 1080. The Vega 64 is still complete shit.

hardwareluxx.de/index.php/artikel/hardware/grafikkarten/44084-amd-radeon-rx-vega-56-und-vega-64-im-undervolting-test.html

Look at the prices, vega is total trash. Same price as 1080ti but way worse performance.

Now that's interesting. I'd like to see how many other tech sources can reliably reproduce those results, or if that outlet just won the silicon lottery. I highly suspect the latter, because dyke gook CEO or no, if that's something that the average 56 was capable of, I think they'd be doing that out of the factory and marketing it as a $400 1080 which would shit in Nvidia's cornflakes, versus selling it as a 1070 equivalent that's going to shatter your shitter in increased power costs.

not sure why this got taken down. has some great points

So here's a question that needs to be asked. With all the controversy surrounding Vega why did AMD decide to legitimately shoot themselves in the foot? Everything about this launch has been a gigantic trainwreck.

...

Muh trophies, muh achievements. Buy ps4 or a fag box faggot.

yea well i am still on an (((sli))) GTX 760 setup (the 4GB version, so i don't have issues with unoptimized PC ports)

and still no reason to change

The (((mining))) craze is still in full swing, and at this point consumers are desperate to even have a chance at buying upper-mid to high end GPUs for $100+ over MSRP, so I'm sure this will be a financial success for AMD.

more like
console peasants saying the PC master race is bad because they don't eat in a refectory like them.

There are so many posts by people who watched a linus tech tips video or read some reddit posts and now think they have a fucking degree in astrophysics or some shit that I can't begin to describe my disgust.
It ranges from completely random nonsense by someone who read a dictionary to find smart-sounding words, all the way to what's probably a shill.

pc gamers are in an optimal position to parade around superiority

Not dollar for dollar nigger. If you can afford the best go with Intel. The other 90% of us would be better off with an AMD, even for gaming.

how much do you pay for an RX Vega 64? Is the 1080 Ti still more expensive?

Some people actually buy this shit. How much of a dicksucking fanboy do you have to be to do this? This is why apple got away with all the bullshit they did.

That image is nothing but lies anyway. You can game without a fucking service. You don't need Steam to play games. In the old days you had something called installers, which worked just like installing CCleaner or any other fucking program.

Just that everyone goes to Steam cause of the massive user base.

Agreed. Let's hope others can reproduce the results as well.

I'm mad now.

Every time.
I wonder how many watts of heat the LEDs of the average faggot generate.

SSDs are nice and all, but for the money I'd rather go for more storage capacity. Besides, if you use an operating system that's anything but Windows, things won't take ages to load even without an SSD.

A 1TB SSD is very cheap now; do you actually have more than 1TB you need to store locally? You might just be a data hoarder who never throws anything away and lives in digital filth.
I've been doing development on SSDs since they came out (and my old X25-M is still working great) as they make everything ridiculously fast. I had been doing all sorts of stupid shit prior to that, like VPATH builds from tmpfs, so that version control operations on large trees like Linux's would complete in a few seconds rather than a few minutes. Same with PostgreSQL, which no longer takes a guru DBA to tune thanks to SSDs. Video editing as well is awesome - no need to do wacky media filesystems or RAID anymore to get AE to shuttle smoothly. All our servers are on SSDs now, and all our "cloud" ones too. Amazon defaults everyone to SSD-based storage. I've yet to have to replace any of them in 10 years. I can't imagine choosing to go back to that magnetic hell, and I can't imagine not having switched a decade ago.

wew lad

...

Go on, explain how we're just overlooking the greatness of the RX Vega 64.

How do people live like this

A lot of shilling going on on Reddit. Either paid shills, or some huge fan boys there. Either way, they are gay as fuck.

Not using crippled drivers/OS helps:

THIS

More Vega news: Shortages are expected to last into October, so if you were hoping for prices to go down any time soon, you can go fuck yourself. Also (((Industry Insiders))) are claiming that Nvidia is indeed pushing back the release of Volta from Q4 2017 to Q1 2018.

Rumors about Nvidia pushing the release of Volta to Q1 2018 were floating around before even the professional Vega card was released.


That's free market at work

I've wanted to buy a new PC for quite a while now.
is buying a 1700 (for 40€ more) better? i'd rather have 2 fewer cores but a higher clock speed.
the prices for jewvidia are:
1070: 450€ - 550€ (ASUS Strix: 530€)
1080: 520€ - 650€ (ASUS Strix: 630€)
1080Ti: 690€ - 850€ (ASUS Strix: 770€)
VEGA64 would cost me less than a 1070 and it outperforms it. the 1080 would cost me much more and be better, but not by that much.
should i wait for VEGA56?
and do i really need a 750 watt power supply for the VEGA64? pcpartpicker tells me 515 watts
does anyone know a good 27" freesync monitor? it can come later since i'm planning on upgrading again in a couple of months

pic related is my wishlist. tell me if you know better specs or have found an error
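
>do i really need a 750 watt power supply for the VEGA64? pcpartpicker tells me 515 watts
Quick sanity check, as a sketch only; the headroom factor below is a rule-of-thumb assumption for transient spikes and the PSU efficiency sweet spot, not an official figure.

# Rule-of-thumb PSU sizing; headroom_factor is an assumption, not a spec.
estimated_draw_w = 515   # pcpartpicker's estimate for the whole build
headroom_factor = 1.4    # margin for GPU power spikes + running the PSU near its sweet spot

recommended_psu_w = estimated_draw_w * headroom_factor
print(round(recommended_psu_w))  # ~720, so a quality 750W unit is a reasonable pick

If the undervolting results posted earlier pan out, you could probably get away with less, but the 750W recommendation isn't crazy for a reference Vega 64.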

And still no drivers.

I actually laughed out loud, thanks fam.

Nvidia is pretty notorious for its shilling and bacterial marketing campaigns. They also have EA-tier marketing practices, deliberately sabotaging competition, which keeps me morally opposed to buying any of their products.

Rather than making a reasonably priced high-quality product and competing respectfully, they make mediocre garbage and sell it at a premium, while shelling out to developers to build games with their hardware in mind or with Nvidia's own software, keeping everyone else out of the deal so shit naturally runs better on their products out of the gate. Then they parade their numbers around to show off how much better their product is, even though those developers had months and sometimes years of additional time to optimize for their hardware. And people wonder why AMD has such a hard time getting proper drivers out compared to Nvidia, who had a relatively humongous headstart.

Not to say AMD is all that great either. They make mediocre garbage too, and their prices are inflated due to buttcoin mining, but at least they're not padding their numbers in the most underhanded ways possible.

...

You really need to buckle down and do your research on PSUs.
No, it's really not.