Don't fucking buy this shit

The only motherboard that works for this doesn't even work right. My best friend fell for this scam and is out a hefty amount of money.

He's got more money than sense.

I've had a 3570K for a while now and don't even know why I would upgrade it.

No shit, Sherlock. That's why you do fucking research before spending large sums of money on products. Your idiot friend deserves it.

Wasn't it proven that Threadripper is better than this at a third of the price? Is your friend also an idiot?

Yeah. He still fell for the meme though. I might be getting his stuff now that he's doing a new build. I told him to save himself some money and either do better research or pay someone to do it.

Joke's on you, I have no money.

Eh, I'll stick with this quad core for a good while.

And I'm strictly saying an i5 because most people here probably don't do anything complicated and just play vidya.

I even went to a government-funded program that finds jobs for people like me and I still can't get one.

If you can't get a job even with that kind of help, the problem is you.

Well, maybe if you didn't have such a shitty attitude, someone would hire you.

The two cheapest i5 and i7 look good.
Are there any emulator benchmarks for this?

Act like a normal human being and you might find employment. No one wants to hire someone who can't hold a conversation.

Are you budgeting for a three hundred dollar motherboard?

The 7740X is just a 7700K on a pointless platform. If you're going to go Intel, just get the 7700K instead. This whole release was rushed out the door because Intel has fucking nothing to compete with Ryzen and Threadripper at their price points.

Furthermore, they disabled the iGPU on the 7740X, so you can't even do PCIe passthrough to a VM without a separate PCIe GPU for the host. It's fucking useless.

Post pics

Enjoying that max 1080p gaming.

My i5 is still working after all these years, and the only thing holding me back is my video card, which is really old and outdated now.


CPUs really haven't improved all that much since that i5 came out. It's not like it is with video cards.

Next you're gonna tell me the competing megacorporation is any better.

Threadripper looks great. You'll want all of those cores in 8 years, which is how old the i5-750 I'm currently using is.

Intel does a huge chunk of its design and manufacturing in Israel.

My old Intel Core i7-2600K still works perfectly fine, and that's from 2011. I don't understand why you people need to constantly upgrade CPUs when nothing good requires anything top of the line or new.

And AMD?

Owned by a guy in Abu Dhabi, headquartered in California, manufacturing plants in Singapore, Germany, and the US.

So the same story everywhere.
K, thanks.

Good job contributing to the thread. Stop posting.

If Witcher 3 runs like shit then I'll scream.

So we're allowed to shit on one megacorp, but not allowed to shit on the others?

Threadripper looks sexy, and it brings me great joy that it lit a fire under Intel's ass.

Man, nothing Intel or AMD has done to date even remotely interests me yet. There's still absolutely zero reason for me to 'upgrade'.

Every big company is as bad as the next, you dope.

That's his own stupid fault lmao.

Your friend is a genuine idiot for buying the current X-series chips, that's for sure.


2600K here, and yeah, every Sandy Bridge person I've seen still doesn't have enough of a reason to upgrade, thanks in part to the great overclocks you can get with the right cooling. The only thing we really miss out on is PCI-E 3.0, but the difference is minimal for gaming and 4.0 is on the horizon anyway.

The next line up of Ryzen chips will make me pull the trigger though.

Going for a Pentium because I only play vidya. Would a Ryzen 3 1200 provide any future-proofing? Thinking of pairing it with a 1050.

Threadripper has the strangest installation procedure I've ever seen in a CPU. They must REALLY want to make sure you don't fuck up those pins (which makes sense for how much money you are paying).


Nope. Anyone who buys a quad core processor without hyper-threading in the [CURRENT YEAR] will get away with cheap "esports" gaming for now, but it likely won't age well as games become more and more optimized/demanding of additional cores and threads. Also it's fine to pair it with a cheap video card, but anything above a GTX 1060 gets bottle-necked HARD, so if you get a new GPU in a few years your CPU is going to hold it back, guaranteed.

I have a 480 8GB; how do I flip it for decent money?

Prices are going down now that Ethereum prices dropped and mining difficulty increased. Plus everyone is going to think your used card came out of a mining rig that abused it 24/7 for months on end. Try putting it on eBay if you want, but you're probably just wasting your time.

they just synergize

20 years ago, maybe, but never, EVER go for the lowest-end, worst-value purchase unless you want to spend hundreds of dollars next year just to keep up because your fucking web browser can't keep up with YouTube.

Thanks for the volume warning, you fucking asshole.

Ignore the name. The current line of Pentium chips is just i3s that were binned as something less than an i3. They still put out good performance for people playing shit like DOTA 2, CS:GO, Rocket League, etc.

They don't even deliver ideal performance for those bottom-feeder games. Don't recommend that people buy notebook-tier hardware.

AMD has manufacturing plants in Israel as well.

you're welcum

The Pentium G4560 is absolutely fine for 90% of video games from 2017. The only games it cannot run at 60 fps are Civilization VI, Total War: Warhammer, and other CPU-heavy games.

And I don't recommend it either, but the benchmarks say you're wrong, as long as it's paired with the right card (a 1060, or maybe a 1050 Ti if you lower settings to raise the framerate).

Ryzen and Vega are the future now

Ryzen, yes, as long as Intel can't produce dies that are worth the price premium or slash their prices even more than they've already been forced to.

Vega, not so much. It's seriously late to the game and under-performs, unless they pull a driver miracle out of their ass at launch this month. Their commitment to using HBM2 memory really bit them in the ass when it came to delivering the product on time. Maybe Navi will deliver where Vega failed though.

Ryzen, maybe.
Vega, not really. The only interesting Vega is that cut-down Vega 56, and only if it's better than a 1070 for the same price or less.

Games aren't going to use more than 8 logical cores before you'd be upgrading again as that's the limit on consoles. The AAA devs design for those limits and indie devs can't even handle a single core. It'd be pretty stupid to blow money on cores as future proofing.

Nope. Game engines are already getting optimized for 8-core chips because of Ryzen. Lots of games have gotten patches that improved performance on Ryzen recently, and CoD WW2 is advertising itself as "optimized for Ryzen." Plus, when those 8-core chips are cheaper than 4-core chips from Intel, there's basically no reason not to get them unless you are a MAJOR penny pincher.

Wait, you said MORE than 8 cores, meaning Threadripper. I misunderstood.

Threadripper (12 or 16 cores currently) is definitely overkill for gaming at the moment, those chips are for people who do actual work on them that involves heavy multithreading.

My dude, AMD overclocked a 16/32 Threadripper to 5.2 GHz and broke the Cinebench world record, and they probably did it with water cooling; since the die is huge, it has a bigger surface area for cooling. Heck, even the 1900X, which is an 8/6, is gonna overclock like a motherfucker. On top of that, all Threadripper CPUs have a 4.0 GHz boost, and it was rumored that with precision boost or whatever the fuck they get an extra 200 MHz.

Just like all those other times AMD was the future and it turned out to only be shilling?
They've been unable to compete on anything other than price since the Athlon 64 days, but it's like every time they're about to release something a certain segment of the internet population heralds it as the coming of the savior. The only people more delusional were the "year of the Linux desktop" guys.

8/16*
also here's an article digitaltrends.com/computing/ryzen-threadripper-1950x-overclocked-cinebench-record/

That overclock was liquid nitrogen cooled, not watercooled.

Overpriced: a 10-15% performance bump (that's unnoticeable) plus a price bump.
Intel and Nvidia are the future, guise.

8 logical cores, as in a quad core with hyperthreading, is more than enough until the next console gen. Nothing uses cores effectively because it's fucking hard to do and not worth the effort. You can set the affinity of GTA V to limit it to two physical cores and it won't run any differently.
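If you want to test that claim yourself, here's a minimal sketch (assuming Linux and only Python's standard library; the PID below is hypothetical) of pinning an already-running process to two logical CPUs:

```python
import os

# Hypothetical PID of the game process; find the real one with `pidof` or Task Manager.
game_pid = 12345

# Restrict the process to logical CPUs 0 and 1 (check /proc/cpuinfo to see which
# physical cores those map to on your chip). os.sched_setaffinity is Linux-only;
# on Windows you'd use Task Manager or `start /affinity 0x3 game.exe` instead.
os.sched_setaffinity(game_pid, {0, 1})

print(os.sched_getaffinity(game_pid))  # confirm the new affinity mask
```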

Ye i got my articles mixed up

Hey user, you are late to the party. Ryzen caught Intel with their pants down, and the entire X lineup of Intel CPUs this year as a response is a rushed mess that is a huge waste of money versus Ryzen chips.

Your statement is correct for Vega though.


GTA V is not a new engine. Stuff like Battlefield 1 and Total War: Warhammer got updates to better take advantage of Ryzen's extra cores and threads. Yeah I'm sure a 7700k will last someone a while, but 8 core chips will likely hold out even longer.

fugg i just realized, i don't need 8 cores but i also don't need more than 4gb of vram

Get on my level retards.

But Ryzen 7 was garbage, and with Threadripper I have no idea who they're selling to. Gamers aren't going to be buying a $1k processor, high-end server guys don't trust AMD at all because they take forever to patch their microcode and AMD-V isn't competitive with Intel's offerings, and embedded guys don't want to deal with all the ACPI and interrupt breakage they /always/ have.

Just buy an old Dell OptiPlex and toss in a GPU. Don't buy Pentiums or i3s unless you like wasting your money.

If you aren't making a new build with Ryzen and an AMD GPU, you're wasting your money and will just have to spend more sooner to correct your mistake of building an "on paper" console killer instead of a good PC.

*Nvidia GPU
AMD graphics suck this gen.

Nope. Its performance has been going up with BIOS updates and game updates, and it has had two price cuts already, making it a fantastic deal. Meanwhile, Intel advises people not to overclock their overclocking chips, because they put toothpaste-tier paste underneath their heatspreaders.

People who need lots of cores (video editors, 3D modellers, etc.) and who want 64 PCI-E lanes for multiple GPUs for CUDA or OpenCL computation heavy tasks. Intel chips are seriously lacking in the PCI-E lanes in comparison.

And they shouldn't be, because that's retarded.

Except for that time AMD Opteron servers were faster and cheaper than Intel server CPUs, and the only reason they didn't overtake Intel in market share (which they were close to doing at the time) is that Intel was paying BILLIONS of dollars to companies not to offer AMD chips in their systems. Dell alone at one point was getting $800 million per quarter not to sell machines with Opteron chips in them.

[citation needed]

What even needs a monster rig nowadays? I've been playing some really good games recently and they all run on this outdated 10 year old toaster

Only people gaming at 4K on Ultra settings in the newest AAA titles. If you're at 1080p, you can get away with much, much less than a "monster rig", that's for sure.

It's interesting how anyone saying [citation needed] always turns out to be a giant tool.

Nice try, shill.

Once again, performance has gone up as BIOS updates have rolled out and game updates have come out. Go look up recent revisit benchmarks for stuff like Battlefield 1, Rise of the Tomb Raider, Total War: Warhammer, etc. The gains are between 10 and 20 fps depending on the game. Furthermore, saying that getting 105 fps instead of 120 fps is somehow really bad is just absurd, especially considering the price difference (which has been widening as Ryzen has had two price cuts).

And if you want to stream or generally produce game videos, having 8 cores from Ryzen kicks the shit out of four cores from Intel.

That's some turbo shilling right there. "Performance doesn't matter". I'm impressed you went there.

Fuck off, Intel shill.

Like I said, they're always stuck competing on price because they can't compete on performance. They've been stuck in "value buy" territory for 15 years.

Yes, and R7 shits all over Intel in benches that actually test performance, buddy.
Yes, because everybody has 144 Hz monitors, right?

I wasn't going to. I already have an i7-5930K, and the only thing I use it for that benefits from 12 threads is video editing/rendering. Most games just want higher core frequency.

If you're on a 60 Hz monitor, the perceived difference between 105 and 120 fps wouldn't be very high. On a 120 Hz monitor it would of course matter.
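For a rough sense of scale, here's a quick back-of-the-envelope sketch (plain Python, frame rates taken from this argument) of the per-frame times involved:

```python
# Frame time in milliseconds for the frame rates being argued about.
for fps in (60, 105, 120, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

# Output:
#  60 fps -> 16.67 ms per frame
# 105 fps -> 9.52 ms per frame
# 120 fps -> 8.33 ms per frame
# 144 fps -> 6.94 ms per frame
```

On a 60 Hz panel both 105 and 120 fps already exceed the refresh rate, so the roughly 1.2 ms gap between them is mostly invisible; on a 144 Hz panel it isn't.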

Still doesn't change the fact that updates have closed the gap, prices have come down, and you can overclock any Ryzen chip instead of paying for an "unlocked" chip that Intel then turns around and tells you not to overclock.

What kind of moron would buy a high-end processor and pair it with a 60 Hz monitor?

Go be poor somewhere else.

Yes, indeed, who would buy a Ryzen 7 for gaming, right? Especially when the 1600X achieves the same or better results in vidya and costs less.

ahh showing true colors

People who don't want to spend $400 on a monitor and would rather put that money into more cores and better GPU.

Was looking for that webm thanks

I think AMD made a very sleazy move here. I bought a computer in July 2016. My previous build was about to die, so I thought, why not build a new one now? People had told me that the RX 480 would be a high-performance card for a mid-range price. So I built a pretty decent computer. I had 2k lying around, so I bought an i7 6700K and everything else other than a GPU. Got myself a 1440p 120 Hz screen too, which supports FreeSync. Now I just had to wait for AMD to release their GPUs, but it turned out the 480 is indeed just a mid-range card. Now we are about to get the RX Vega, and from what I've heard it's not that good either.

So AMD basically just created a lot of hype to get people to buy FreeSync screens so they would be forced to buy an AMD card. Pretty scummy move of them.

But that would just be dumb. Games utilize multiple cores very poorly, so money spent beyond a dual core is wasted for a gamer; it's all about having the fastest individual core. The difference in quality and control latency going from 60 to 144, on the other hand, is massive. Learn to build a rig properly.
Also,
Good luck getting anything good for that. Btw, I'd recommend the Predator XB271HU - I have two of them.

user, you seem confused. The 480 was always the top of the mid-range at the time. It was never the high end; they have no high end since Vega kept getting delayed.

Furthermore, you don't pay extra for FreeSync. It's literally free. Monitor manufacturers can throw it in without paying for it. Nvidia G-Sync costs money and is locked to Nvidia cards. Nvidia could support FreeSync if they wanted, but they don't because they are greedy. However, as the market shifts further and further towards FreeSync monitors, they may end up supporting it in the end.

What the fuck

It's not 2012 anymore. Modern AAA game engines not only utilize more cores pretty well, they require four cores at a minimum. DirectX 12 and Vulkan achieve their speed boosts through greater usage of multiple CPU cores. It will only continue to get better.

You also still don't seem to understand that there are other uses BEYOND GAMING that benefit greatly from having more cores, and people with 60 Hz monitors could easily consider those needs more important than gaming. An 8-core Ryzen chip will beat the shit out of any 4-core Intel chip when it comes to video encoding, for example.
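As a toy illustration of why core count matters for that kind of work, here's a minimal sketch (plain Python, with a made-up CPU-bound function standing in for something like encoding one segment of a video) that spreads independent work units across all available cores:

```python
from multiprocessing import Pool, cpu_count

def crunch(n):
    # Stand-in for a CPU-bound task such as encoding one chunk of a video.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 16  # sixteen independent work units
    # Throughput on work like this scales roughly with core count, which is why
    # an 8-core chip pulls ahead of a 4-core chip at a similar clock speed.
    with Pool(cpu_count()) as pool:
        results = pool.map(crunch, chunks)
    print(len(results), "chunks processed")
```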

4k is a mistake today. The hardware doesn't exist yet at any price to run it at high enough frame rates.
I've got two of those 1440p monitors and two 1080s. It's a his and hers setup as I also have a wife.

I think the hardware is somewhat there; the problem lies with optimization. Consoles have neither. At least some PC games manage to be optimized, though even saying that makes the situation sad.

And 1440p 144 Hz isn't?
Not true, the 1080 Ti is powerful enough to run most games at around 60 fps at 4K. Which is funny, because the 1080 Ti is the only card that's good for 4K at 60 fps.

No, they don't. Pick a modern game on your system and set the affinity to only run on two physical cores. You won't notice a difference with a good processor.
You don't seem to understand that the "high-end workstation" is a dinosaur. Most of those have been converted in the last few years to using cloud and on-site cloud as an infinite thread pool. My company uses OpenStack that way, I know Google internally does builds similarly on the smoking remains of App Engine like a turbo distcc, and the data analysis guys moved to EC2's rent-a-GPU almost the day it was announced. While there are still industries that give employees high-end workstations, they're the ones that are slow to retool, and those machines are understood to be deprecated.

I opened up my old HP laptop and the cpu looked like that except with even more thermal paste spilling from the sides.

Honestly, I'd have stuck with 1080p, but the quality wasn't there. The monitor companies wanted to drag people to where the cheapo brands can't compete, so they kept the good features off the lower-res panels (IPS + G-Sync + 144 Hz, etc.). But I still get over 100 at 1440p on the highest settings in everything I play except Subnautica, so it's fine.
60 fps is shit though, user. It was shit 17 years ago, what are you doing? Like I said, 4K isn't viable yet as there is nothing on earth that can get the frame rate up to acceptable levels. Maybe in 2 more years.

Games like Battlefield 1 REQUIRE four cores. You can technically fulfill the requirements with two cores and hyperthreading, but you are gonna have a bad time.

My dad's electrical engineering firm buys HEDT computers every year like clockwork. In the past they have been Xeon chips but Threadripper could show up next time around because the prices are fucking phenomenal for the amount of cores you get versus Intel's X lineup. But your average YouTuber isn't buying Threadripper, he's buying an R7 1700 and overclocking it.

Here, let me break down the necessary processing power in simple terms:
web browsing with images only = Pentium 4
web browsing with video and medium gaming = new-age Pentium
heavy gaming and light emulation = i5
games requiring heavy emulation, and compiling programs = i7+
video editing = dedicated GPU or bust for large videos.

Now let me explain in more advanced terms. Games and programs these days are bloated. Importing a whole library like node.js just to use one function adds more complexity than necessary and slows everything down (see the toy example below). Bigger programs mean even more bloat. Web browsing and watching video shouldn't take much in terms of FLOPS. But gaming sometimes uses specialized functions of the x86 processors' instruction set, which generates more heat and thereby slows things down. Emulators and compilers use even more specialized functions to emulate another CPU, hence even more heat and even slower programs.
TLDR
You don't need anything greater than a Sandy Bridge processor with the right OS and software. Exceptions if you're into video editing, compiling, or emulators.
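As a toy illustration of that library-bloat point (a hypothetical example, not from any real codebase; the Python equivalent of pulling in a whole dependency for one trivial function):

```python
# Bloated: import an entire third-party library just to average three numbers.
import numpy as np
print(np.mean([3, 5, 7]))

# Lean: the same result with nothing but the language itself.
values = [3, 5, 7]
print(sum(values) / len(values))
```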

Most emulators benefit little from an i7; pretty much only RPCS3 and PCSX2 do, and even then not by much.

This was before the 480 got released. We had no information on that GPU back then. Holla Forums and other communities praised and hyped the 480 and said it would have high-end performance. So I decided to build a computer and buy everything other than the GPU.

For the price you shouldn't have expected all that. Also, never
EVER
believe the hype. Think rationally.

sorry, but your friend absolutely deserved to get burned if he was dumb enough to buy this shit even after everything that's been said about it

why on earth did you fall for the hyperthreading meme when it made no sense to have it back then?

Why would you buy a K if you're not going to overclock? Good fucking lord. "Master race" indeed.

Have you seen any benchmarks at all in the last two years?

cemu