Budget PC gaming

So, with the release of the Ryzen 3 2200G and Ryzen 5 2400G, budget PC gaming is once again a reality. Not just a reality, but cheaper than it's ever been. The 2200G is only $99 US and overclocks incredibly well with just a 20 dollar cooler. Video related. The 2400G is less cost effective at $169, but if it overclocks as well, it might very well be worth it.

With this performance it can handle most games at 1080p and is excellent at most emulation save for Cemu, though the current rumor is the APU has the horses, they just need to optimize for it. With the 2200G being a hundred bucks, people are putting together rather competent builds for as low as 300 bucks. The CPU side of these chips is fairly powerful too, so once GPU prices calm down, slapping in a mid-range GPU becomes a reasonable upgrade. Budget PC gaming is back, it seems.


ha but are these digits back my friend????

Not possible nowadays. Not with games like Kingdom Come: Deliverance, Assassin's Creed Origins and Dynasty Warriors 9, which are so unoptimized that they push the limits of what most modern computers can handle.

Untrue. While some super unoptimized games like Kingdom Come require you to drop to 900p or even 720p, you can still get playable frame rates. I couldn't find a direct test, both the game and the chip are just too new, but I found a benchmark with the i3-6100 and a GT 1030, which is a fairly comparable setup. It almost kept a locked 30fps at 900p low, but at 720p medium, it's actually quite playable.

I'm still running my 970 from 4 years ago, and I doubt I'll be replacing it unless it literally dies, because fuck the artificial inflation on GPUs now.

I don't consider anything below 60fps 1080p "playable". I'd rather sacrifice graphics than resolution and fps.

I bought a gaming laptop back in 2016 with a Skylake i7 and GTX 970M and I can still play the latest shit on high to max settings at 1080p

If you're on a budget you need to lower your expectations, lad. 30fps is perfectly fine as long as you use a gamepad, but as soon as you pick up a mouse 60fps is pretty much a requirement. I am fine playing single player games at 30, but for obvious reasons I will turn shit down for online games to get that minimum 60fps.

In most games, these APUs can do that, but for the unoptimized games mentioned, that just isn't going to happen, regardless of hardware. Even the 1070 Ti has drops into the 40s at 1080p Ultra. If that's your standard, it's pretty much impossible for the really janky titles. I was just trying to argue that with some modest compromises (720p for a small handful of titles isn't that bad), you can still play these games on something like a 2200G.

There are a shitload of unoptimized games though, I would say for the most part they outnumber optimized games, more so if the game is a port.
It's a side effect of the increase in resources PCs and, to a lesser extent, consoles have gained in recent years.
There is no reason to even attempt intelligent resource usage because you can just raise the minimum requirements.
It doesn't really help that console manufacturers have begun dumping halfassed "ports" onto the PC platform a year after release, at full price and sometimes with additional DLC, that carry a performance overhead of 40% or more.
It creates this retarded cycle: people buy shitty ports for franchises they want to play on PC because they want console devs to release more ports, so they can't stop rewarding horseshit, thinking the publishers will eventually stop pumping out shitty ones. Somehow they don't see this is a loss.

I don't entirely disagree, but again, the point of my posts is that the 2200G is remarkably capable, even at 1080p, with the vast majority of games. You have to go looking for the real terribly optimized ones to find things it really struggles with and even then it can usually manage at 720p.

You're better off spending the 30 bucks extra for an HD7770 or something.

Also untrue. The HD 7770 isn't any more powerful than either the GT 1030 or the RX 550, and those cards are what these APUs make obsolete. That's the entire reason I'm making a post; this isn't just another release. These two APUs make the entire sub-100 dollar GPU market obsolete. There is zero reason to buy a GT 1030, RX 550 or anything comparable to them anymore. Especially since these chips will feed a mid-range GPU very easily, making for a clear and simple upgrade path.

Capable of just above bare minimums which I believe is, to an extent, the definition of budget. I feel like once you're dropping to sub 1080p to hit 30fps it's less budget and more destitute

More or less this. The RX 560 4G coupled with a Ryzen 1400 that I have does its job really well at high/ultra even in many demanding games (not that many games, TBH, because I don't play triple-A shit in general). Maybe not a constant 60fps, but it's still powerful, even if it's considered the lowest tier of high-performance cards. And I think at release it cost only a little more than $100.

Why would you buy something like a RX550 or similar, unless you are a total poorfag?

Considering the 7770 will hit 55fps avg with 40fps lows vs the 1030's 35 avg and lows under 10 running GTAV at high/1080p,
I'm gonna go with no.

Incorrect again. Video related. The GT 1030 and RX 550 trade blows consistently. Also, I don't know where the hell you are getting '35 avg and under 10' for the GT 1030, because it can easily lock 60 FPS at 1080p.
youtube.com/watch?v=56VHYE3SRiQ&t=263s

The HD 7770 isn't more powerful than the GT 1030 or RX 550. You have to move up to the HD 7850 to find something that consistently beats them both.

Because I know the difference between low preset and high preset, unlike you apparently.

The 1030 is a badly designed poorfag card; the VRAM bandwidth bottleneck is too big. The RX 550 is better, but you can just get a secondhand card for half the price that will shit on it, like, say, late TeraScale/Fermi and early GCN/Kepler, which is better for a poorfag build.

While on the subject of poorfag builds
Reminder that fake chink cards are a way to source free or very cheap GPUs

Huh. Then what about this?
youtu.be/vD_yUozylPc?t=2m53s
Compared with
youtube.com/watch?v=N2NebRdMEds

Huh. It's about the same. And just to clarify, your solution to poor memory bandwidth is to use a card that often only came with 1GB of VRAM? Really?

Your contention on buying used has merit, but you have to move up to the 750 Ti and HD 7850 to get a real boost out of it; with the HD 7770 you're just sitting in the same ballpark, using more power with older hardware. I can't find direct comparisons of the HD 7770 or its rebrand, the R7 250X, with the GT 1030. Best I could find was one with the RX 550, where again, it lags behind slightly.
youtube.com/watch?v=Oo16cNur0v4

The 2200G and 2400G still obsolete the HD 7770 as a purchasing option, just as they do the GT 1030 and the RX 550, especially since it needs a 6-pin, whereas the more powerful 750 Ti doesn't, nor do the comparable GT 1030 and RX 550. If you are going to use a 6-pin, just get an HD 7850 or something comparable, not an HD 7770.

Custom settings for the VRAM bottleneck of the 1030

You don't need more VRAM if the rest of the specs don't follow; see the numerous cards with way too much VRAM for their own good that have popped up over the years.
I should know, I'm one of the suckers that bought a 4GB GTX 770 only to realize it has basically no fucking use, but hey, at least it wasn't a 4GB GTX 650 or a 4GB DDR3 GT 630.

You know what factors in last in a budget build? Yeah, that would be power consumption, mostly because budget build enthusiasts don't pay for their electricity.

If we're going to nitpick, then how about I point out that just 'minimum' is a fucking terrible benchmark method. There is a reason most places do 1% minimum and .1% minimum. Just 'minimum FPS' is nonsense and conveys very little.
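For what it's worth, those 1% and .1% lows are simple to compute yourself from a frametime dump (capture tools like PresentMon or OCAT log per-frame times as CSV). A minimal sketch, assuming a plain list of frametimes in milliseconds and the common "average FPS over the slowest N% of frames" convention:

```python
# Rough sketch: average FPS plus 1% and 0.1% lows from a frametime log.
# Assumes frametimes are in milliseconds, one entry per rendered frame.

def fps_stats(frametimes_ms):
    """Return (avg_fps, 1% low, 0.1% low) from a list of frametimes."""
    if not frametimes_ms:
        raise ValueError("no frametimes")
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # sort slowest frames first, then average FPS over the worst slice
    worst_first = sorted(frametimes_ms, reverse=True)

    def low(pct):
        n = max(1, int(len(worst_first) * pct / 100.0))
        return 1000.0 * n / sum(worst_first[:n])

    return avg_fps, low(1.0), low(0.1)
```

Some outlets instead report the single frametime at the 99th percentile; either convention conveys far more than a raw minimum does.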

1GB is still not a good choice for 1080p nowadays.

It's not power consumption, it's power requirement. Using a 6-pin and pulling over 100 watts means you need a beefier PSU, because if you stress a garbage one, it may catastrophically fail. The GT 1030, RX550 and especially the 2200G and 2400G are going to stress a PSU significantly less. That actually is important to a build.
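To put rough numbers on that (every wattage here is a ballpark assumption for illustration, not a measured figure), a back-of-the-envelope sizing sketch:

```python
# Back-of-the-envelope PSU sizing. All wattages are illustrative guesses.

def recommended_psu(component_watts, headroom=1.3):
    """Sum estimated draws and add ~30% headroom so a cheap PSU
    isn't stressed near its rated limit."""
    return sum(component_watts) * headroom

# Hypothetical 2200G build: APU ~65W, board + RAM + drives ~50W
apu_build = recommended_psu([65, 50])        # roughly 150W
# Same build with a 6-pin card pulling ~110W on top
gpu_build = recommended_psu([65, 50, 110])   # roughly 290W
```

The point being: the 6-pin card roughly doubles the PSU you'd want, while the APU-only build runs happily on almost anything.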

No good PSU comes in under 430W, which is enough for even a card with 8+6 pin on top of a decent CPU. Not a factor, I'd say.

...

Friendly reminder you can CrossFire the APU with a Radeon GPU.
You swine

Does Multi GPU still feel like utter shit even at very high framerates?

go be a cry baby somewhere else

With AMD it works better with low-end shit than high-end.

But that means buying a complete system and slapping the GPU in there isn't a viable option anymore, you have to also get a PSU, which makes the entire 'buy and build used' significantly more complex. Also, what CPU are you pairing with this HD 7770?

Go be a corporate whore somewhere else, half-cuck.

and ask your money back for receiving a fake product and your gpu is even free lole

someone wants to fit in

Why don't you explain how games which have worse graphics somehow require stronger systems, to not only look worse than games 5 years older than them but perform worse as well, Mr. Generational Gap?

why don't you post some examples of games that look worse yet perform worse

Listen you turbo nigger, there's no excuse for games taking up as much space as they do, especially when, way back with the Master Chief Collection, nearly all of the oversized glob of an install size was literally garbage data to pad it out (read: the localization audio over and over).

Then download repacks. The size of game installs has nothing to do with performance. It's not like it's trying to cram all of that install into memory. As for discs, there is a reason they use redundant data on them.

Mass Effect Andromeda to Mass Effect 2
Dead Rising 1 or 2 to Dead Rising 4.
Skyrim to Skyrim Special Edition
Arkham Asylum to Arkham Knight
Could go on and on. Shit, resolution packs which go up to 2K or 4K work better with old games than new shit games that barely function.
Compare Fallout 4 fps to Fallout: New Vegas with 4K textures and an ENB.

You still haven't explained how suddenly specs work worse for no reason because of "generational gap". Or how devs have to keep all their files uncompressed making games ridiculously large.

Are you a shill or just a retard?

...

I mean maybe in titles that scale well; going by embed related, the answer to my own question is that multi-GPU is still shit.


Imo you never ever want to keep the PSUs that come in prebuilts if you're gonna go the refurbished prebuilt route for your cheap build.
If you're sourcing every part individually it's not an issue at all; the 40 bucks you put into that PSU is well worth it.

My own backup build is using a weird mix of parts: I have a Phenom II X4 955 on some higher end AM2+ mobo and mismatched RAM. I use it for fucking around and testing stuff; the 7770 is my control GPU for when I test something.
But if you're asking for the build I pulled those GTAV numbers from, it's the one in that video (youtube.com/watch?v=N2NebRdMEds) you posted, though I'm not sure whether it was the 955 that couldn't hold 60fps.

Mass Effect 2 looks worse than Andromeda, I don't know what world you're living in.
Dead Rising 4 also looks better than the first. Different visual style; I think the first is sharper, but it's definitely not more detailed and uses simpler shaders.
go away todd
Knight objectively looks better.
Go on and on, because so far you haven't made a real point.
They do it for various reasons, from increasing load times, providing higher quality audio, cleaner textures, meeting new DXTC standards, etc.
Because there's a solution for it you whiny retard, a solution you don't seem to know about.
Is this tumblr or something? Who says "called out"?


how do you remember to breathe?

By increasing load times I mean decreasing them. Nobody likes seeing textures turn from a clay smudge to an eye-cutting sheen 2 feet in front of them.

But then you're kneecapping your ability to upgrade in the future. And to save, what, 100 bucks? I'm ballparking here, but let's say 40 PSU, 20 case, 40 GPU, 60 CPU, 60 RAM. You're still breaking 200 bucks for the build.

Or you can spend 300 and have a powerful, modern CPU that will pair well with future upgrades, have entirely new, modern, in warranty parts and your system performance will be as good or better. I'm sorry, I just don't see it.

Objectively wrong.
All of which could be avoided by being competent at compression and optimization, which they're not.
Oh yes, user, clearly the solution is piracy; now someone else needs to do a shit dev's job for them.
It's pretty clear you're either baiting or an idiot, corporate whore.

Which is an issue of poor optimization, which you seem to want to pretend doesn't exist. What are you, one of the shit no talent devs colleges are shitting out these days, or something?

In that case you can go for the same brand-new build as the 2400G, but get a secondhand dGPU with comparable performance and a regular, cheaper Ryzen instead. That removes any and all of the issues that come with an APU: having to pay extra for RAM (most of those benches are on 3000-3200MHz; 2400MHz is the minimum to not gimp CPU perf needlessly, but that's not gonna do your iGPU any good), which is easily 40 extra on 8GB; worse thermal constraints (never mind the lack of a soldered IHS); and, even if it's minor, needing to use system RAM for your APU.
Both have their arguments, but if you're looking for raw performance/cost I'm not sure the 2400G is the indisputable best option.

The 2200G uses the same socket as those $300 CPUs. It's a good temporary solution for solving the problem of getting games playable where you have a clear upgrade path. You spend a little now and curb your spending in the future.


It's an issue of over-optimization really; texture streaming is both a good and a bad thing. I think I'll take what it offers, more seamless gameplay, rather than getting Source engine style load screens when the map designers hit their memory limit. Having better hardware solves the problem.


do you expect me to take you seriously

What a surprise you can't offer any kind of counter evidence or explanation of your previous statements about how pc specs are somehow weaker for the same things over time because "generational leap". But by all means, keep avoiding the question at all costs.
Is this 1984?

Unless you can answer the question i'm done dealing with your answer avoiding ass. You've derailed this thread enough.

Not with modern GPU prices and shitty optimization it isn't.

Careful user, according to 90e68e shitty optimization doesn't real. It's cause of magical reasons he can't actually explain.

...

The 'cheaper Ryzen' IS the 2200G. The 2200G undercuts all the other Ryzen 3s right now at 99 bucks. Also, the gap between 2400MHz DDR4 and 3000MHz DDR4 is about 10 bucks: 90 bucks vs 100 for an 8GB kit. If you are willing to settle for 2800MHz, you can get a kit for almost the same price as a 2400MHz kit, 90 bucks.

Also, these APUs using TIM has been proven not to be an issue, because they overclock like bosses on even a modest air cooler, as the video I posted in the OP shows. With even a modest air cooler, instability becomes an issue far before temps do, so AMD going with TIM instead of solder is actually a moot point, probably thanks to the lower end, cool running nature of these chips.

I mean, you provided 4 comparisons which were too ridiculous to even grant any sort of validity to. All of them have convenient outs attached (MEA's bugginess, for example), and you've given up on defending your position on all of those. I'm not convinced you'd understand why DR4 looks better than the first, despite how I already described how it might in an earlier post. In fact…

you don't even seem to be able to read. So I'm thoroughly convinced you have nothing useful to say and are just here to run interference instead of actually having a discussion. Literal "if you don't agree with me and my narrative, you must be an outsider from cuckchan/reddit/wherever/a corporate shill despite you telling us all to pirate!"

tl;dr: You aren't smart enough to have these conversations and you're too much of a faggot to accept that your incredibly flawed world view is flawed, as demonstrated here.


watch the video if you're curious what's going on

Once again, you can't actually back up your claim, and you again avoid explaining why the same specs are somehow worse because of "generational leap".

And hilariously enough, the pictures you've posted look worse than the dead rising 1 pic. Of course, don't mistake this as me not noticing you keep refusing to explain your ridiculous claim before, which once again, you REFUSE to actually answer. But it's me who can't defend my position, lol okay.

The games don't look worse. There's more advanced shader techniques going on, higher quality textures, distinct effects for things like hair and foliage, SSR, global illumination, etc. All things that make it look far better than Dead Rising 1.

I'm waiting for you to post that mythical game that looks so much better than these. You've ducked away from 3 of them, and until you post that game, I won't have anything to work with. I've already explained why you're wrong about DR4 vs DR1. You can find subjective qualities you prefer more; I laid out an escape card for you to reuse here in order to explain why it's magically better looking. You can also point to the shadow render distance, which really doesn't compare to the performance impact of the SSAO being used, making DR4 more advanced still.

You're a slimy one, and a desperate one at that, but you're wrong. It's OK to accept that and reconsider your position next time you want to throw a tantrum.

But that one has a noticeably shittier IGP and then you have to look back at getting a dGPU anyways
Damn the price hike has really gotten that bad hasn't it.

65°C on a brand new chip with only a full CPU load @ stock on an AiO loop seems rather high to me, although I'm unsure as to what "good temps" even are for Ryzen.
The OC nearly touches 80°C, and I wouldn't be comfortable with that on any CPU, still keeping in mind that it would get worse with an actual GPU load.
I mean, you could delid and bring both back to good temps, but yeah, not sure you're gonna risk a $100-160 CPU if your build is barely twice that.

Is it just in my country or did the price of the Ryzen 5 1600X get increased by 10% after the release of the new AMD CPUs?
Is there a good alternative for the Ryzen 5 1600X or should I just eat it up?

Again, the video in the OP. The entire video is about overclocking the 2200G. With a cheap air cooler, you can push the GPU to 1.6GHz, which puts it over the 2400G stock. The video comments that with a 1.6GHz iGPU clock and 3.9GHz CPU clock, the 20 dollar air cooler kept temps at 55°C, though that was in gaming benchmarks, not in synthetics meant to fully load the chip. In the video, it only hit 80 degrees with that overclock on the stock cooler. Now, to be fair, silicon lottery, etc etc, but still, that's pretty incredible.

Even at stock, the IGP in the 2200G is hardly 'shitty'. It still competes very well with the 2400G and actually beats it when overclocked. The whole reason people are in a tizzy over these releases is because AMD actually fucking did it. They realized the potential of the APU and destroyed the entire concept of an entry-level GPU. That's why every tech site is benching the hell out of these chips.

The price has dropped from 260 to 220. I think Amazon has a sale on it for 210. Not sure where you're from, but the original price was 260 USD.

The only real alternatives are either going higher or getting fucked by Intel.
Unless you don't actually need the extra cores, in which case you can make do with something like a 1400.

You can get the 1600 for under 200, and the 1600x is just a higher clocked 1600, which is pointless considering all Ryzen chips are unlocked and you should be doing at least a modest overclock anyway.

If you don't need the two extra cores, a lot of the other Ryzen 5s are worth looking at and again, the 2200G is solid at 100 bucks, though it's four cores, four threads, no SMT like the Ryzen 5s.

Fucking Brazil, I get taxed literally 100% if I try to buy something from Amazon. Pic related.
Oh, there is also a chance they decide to tax me 160% instead once it's delivered.

I'm going to use it for gayman, so I guess the fewer cores the better, considering this usually leads to more processing power in each core, because of all the shitty devs out there. Still, I'm using a GTX 1060 6GB and I want the GPU to be the bottleneck and not the CPU.

Maybe I really should get a 1600 instead.

If it's anything recent 6 core is a thing you kinda want, I'd say go for a 1600 if you can find one at a reasonable price.

Sorry, user.
Nah, this is not the case. PC devs took quite a while to get real multicore support out there (many different architectures, different ways they communicate with each other, and no unified way to work with them, so it's no wonder console devs had it figured out back in 2006), but they're finally on top of it. You want to have at least 6 cores if you're playing new games, 8 cores ideally. Getting a quad core now will be like getting a dual core in 2012; there's no clear path forward. Consult the video for more information on how quad cores just won't cut it any more.

Not to scare you or anything, because I don't think there's been anything substantiated yet, but I wonder what the consequences will be of having CPUs use core0 for 90% of tasks for long enough. Will that core eventually crap out? Just food for thought.

Until bitcoin miners drive prices up again sure.

Which one?

the one I embedded

Windows has super aggressive physical load balancing (it led Ryzen to have shittier perf than it should because Windows wouldn't stop shuffling the load across CCXs), so nope. Linux also has physical CPU awareness and probably does some load balancing too.
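For the curious, on Linux you can see (or fight) that shuffling yourself by pinning a process to specific cores. A quick illustrative sketch using the standard-library `os.sched_setaffinity`; the actual core-to-CCX numbering depends on the chip and SMT layout, so the core set below is just an example:

```python
# Linux-only sketch: pin the current process to a set of logical cores
# so the scheduler stops migrating it. Purely illustrative.
import os

def pin_to_cores(cores):
    """Restrict this process to the given logical cores; return the new set."""
    os.sched_setaffinity(0, set(cores))  # pid 0 = current process
    return os.sched_getaffinity(0)

# e.g. keep a benchmark on cores 0-3 (hypothetically one CCX):
# pin_to_cores({0, 1, 2, 3})
```

Windows has an equivalent via processor affinity masks in Task Manager or `SetProcessAffinityMask`, which is how people worked around the early Ryzen scheduling complaints.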

I'm mostly just pondering the theoreticals of it. If it's running software that neglects the other cores and only uses that single core, what will happen to how the software operates once it can no longer access that core?

Your ID didn't embed a single video in this thread.

I just refreshed, and it's still there. Either the board is broken or you might have something wrong with your setup. hooktube.com/watch?v=RwF5XewHmU8 this is the video I embedded.

Yeah, something wrong, probably between dashchan and hooktube since hooktube embed was added recently.

Nigger, the pictures are both up there. Dead Rising 1 looks crisp with good and varied textures and nice shadows, while your Dead Rising 4 pics look like barely textured clay shit, and everything looks washed out. And once again, you have yet to answer for the claim you made about "generational leap" making the same specs somehow work worse for no reason. In fact, this is the fifth time you've refused to answer for your own claim, whereas I've actually answered and cited proof for my own, you disingenuous cunt.

There's almost no cases that would let that happen in the first place outside of you fucking with affinity settings.
I doubt the effect would be that important unless you did it for extended periods of time with heavy loads.
And obviously once the core craps out then it stops being able to run things properly which in real world usage would translate to high instability at first.


Oh boy it's that one AMD guy that literally cannot stop sucking AMD dick.
Not that there's anything wrong with AMD but this guy is really reaching

He's plenty critical of amd and acknowledges their shortcomings. I don't see what you are seeing.


Disappear again, this thread was so good without you showing off how dumb you are.

Budget PC gaming is a joke right now; you can't get good or playable FPS in many titles with a budget rig, the most glaring recent example being Kingdom Come: Deliverance, which runs decently on consoles with a stable enough 30 FPS despite texture streaming issues, but on a Ryzen 2200G you'd be lucky to not get stuttery shit all of the time. Games are just far better optimized on consoles nowadays. The 2200G isn't worth it at the moment; it's far more sensible to just wait until the GPU drought ends and RAM prices drop, then build a $600 rig.

I can't believe I'm saying this, but the Xbox One X (the more powerful one) is a viable option.

They didn't benchmark kings cum in the video, so who can tell how it would run? The game is ridiculously poorly optimized anyway, so it's not really the kind of thing you should build a PC to target.

What the fuck are you talking about
It drops down to 20 FPS at times much like MHW

What a crock of shit

who tf would wanna play any of these shit games anyway?

Someone explain these new CPUs to me.
Are we talking $99 with no video card required? How compatible are these motherboard sockets? Will I be pigeonholed into just using these budget ones? Are these things even out yet? How much of this supposed "powerhouse" is just marketing hot air like Ryzen?

Well, just in the video you posted: near the end, the part where he gloats about AMD being good for gaming because of encryption-based DRM, and also the whole part where he talks about AC:O scaling very well with high core counts while completely ignoring that anything above 6C/12T is well past the diminishing returns threshold.
Then there's the other video I've seen from him where he shits on Nvidia's driver issues, saying he doesn't understand why people still think Nvidia has better driver support, but conveniently ignoring or spinning the fact that they release hotfixes way more often than AMD, so major problems are a matter of weeks to months instead of years like with AMD. In that same video he points out that Nvidia has had big hardware problems several times, then says AMD had no such thing, completely ignoring the RX 480 fiasco and a few other issues (although those were generally limited to faster hardware wear instead of direct hardware damage, unlike the others I mentioned).
It's quite easy to pick up on the fact that he is extremely biased towards AMD, bordering on being dishonest at times.

The 2200G is essentially a GT 1030 with a Ryzen 1200 with less cache, so a slightly older i5 or i3 with an R7 360, GTX 750 Ti, GT 1030, RX 550 or RX 560 gets roughly comparable results.
It doesn't matter if it's poorly optimized or not; the only thing that matters is how playable it is on hardware priced comparably to consoles. If it runs like shit on PC, it runs like shit on PC, and you can't play it as well as you could on a $200-250 console. Not to mention this shitty 2200G rig would cost you a minimum of around $250-300, not including a monitor if you don't have one, and although a TV can work as a replacement, it's not optimal.


And on PC you get stuttering and 20 FPS instead in those same large cities. At least consoles don't stutter as much in those same areas.

You first double spacer.

it seems to be performing better than the 1030. It's got low end Vega architecture as the GPU, so I don't really think it's comparable.
Oh I definitely disagree. There's the upgrade path, how the hardware platform will mature, and many other things to consider. Ryzen for example has only gotten better since it launched last year. It has a good future. Intel's 3 architectures last year? Well the fact there's been 3 is pretty indicative of how well they've aged.
I think you'd buy a console for different reasons than a PC, but that's a very different discussion.

Not only that but they run even worse on consoles.


Highly dependent on the rig and the settings you use, settings you can change at will.

It's 99 dollars for a 4 Core CPU, basically comparable to a 7th gen Intel i5. It also comes with an integrated Vega 8 GPU that gives fairly decent performance, enough to play most games at 1080p, especially if you overclock it.

Also, what part of Ryzen exactly is marketing hot air?

Minimum everything, no resolution scaling.

At this point, maybe we ought to stick to pre-built?

Lie.

You can easily tweak that shit to run a constant 24-30fps.

More like medium settings at half the resolution scale at 1080p.

I'd rather buy parts second hand

Nobody uses that, even for a budget build.

Not only are those parts only really found second hand today, the game ran at essentially the same FPS as console.

I know, a Ryzen 1600 gives five times the performance and is cheaper.
Slavboy is just showing off his Russian high spec PC.

That's even shittier considering no support.

I dunno how, but the 8150 has started massively degrading in performance on newer stuff.

So?
They're also old as shit unlike most budget stuff you find which run even better.
While recording, with the same price and with hardware much older than said consoles.


What do you mean no support?
Most graphic cards still get support, mostly driver based.


It's a 7 year old CPU
Makes sense that it already struggles with some games.
Even my 2012 i5 3570K struggles with some recent stuff.

Should have said warranty instead.

That's 2-3 years
Just like most hardware people buy and keep for a whole decade or more.

If you use Shadowplay or AMD ReLive there's little to no performance hit; using recording as an excuse is not really valid anymore. Consoles can record too on their own hardware without capture cards today.

There is still a performance drop even on consoles due to encoding.
More negligible than something like FRAPS, but still there.

Near non-existent, only 1 to 5 frames but generally below 2. It only uses up some RAM and disk I/O, so loading times might be slower though.

Ryzen CPUs don't have integrated graphics, unlike Intel with their HD Graphics. This launch changes that: you now have two Ryzen CPUs with integrated graphics, the 2200G with 4C/4T and 8 Vega cores, and the 2400G with 4C/8T and 11 Vega cores.

Now for the motherboards: you will want a B350 mobo so you can overclock, since the cheaper A320 boards don't allow overclocking. The second thing is to look for display outputs on the boards themselves. They are there specifically for the integrated graphics. Since Ryzen's launch, some motherboards don't come with display outputs, so you wouldn't even be able to use the integrated graphics.

Did they end up fixing poor RAM support on those? There was a time when I was recommended an X370 over a B350 for that reason.

I'm putting together a Loonix build with one of these as soon as I can. Gonna play some Grim Dawn and E.Y.E. at maximum settings. And maybe finally pirate the Bethesda Fallout games.

Did you retards not buy a decent graphics card before the cryptonigger shit blew up?
Goddamn, this board is full of retards.

New videogames fucking suck, who the fuck cares.

I agree.
So why does this thread still exist, and why has it not been shitposted to death?

...