i9 has almost no performance improvement over i7

Where were you when Intel officially died?

Other urls found in this thread:

archive.is/2faBr#selection-3403.163-3409.289
archive.is/YVT5P
archive.is/9oxp2#selection-495.0-495.545
cpu.userbenchmark.com/Compare/AMD-Ryzen-5-1600-vs-AMD-Ryzen-5-1400/3919vs3922
twitter.com/SFWRedditVideos

I wish

A few months back, when Ryzen was announced.

Intel should just dump all its patents and stop holding humanity back so we can welcome our new Nvidia overlord and pay for all his leather jackets

Eh, I can wait a few more years.

Now, be honest with me, is there any reason to own a gayman pc in the current age?
Is there any game on PC that actually allows me to use the hardware for something good or is everything that needs good CPU and GPU just really bad ports?

Gay-MD is coming up with some crazy shit from what I've been hearing about Zen

Bad Rats

Obviously, what Intel needs is more diversity.

Intel is pure Israel, so I'll be happy if they end up fucked.

Intel did it. They made the Nvidia Fermi of CPUs.
Old meme though since AMD's Hawaii was hotter than even Fermi

only thing you'd want these for is non-gaming tasks, but I think Ryzen has Intel beat on price-for-performance when it comes to higher core count chips.

waiting on decent IOMMU groups for Ryzen so I can do an AMD Linux build and run Win10 for gaming in a VM. i9 could do that too but the price point is just too high.

It's the Preshott all over again.

It's not for gaming. It's for duping corporate goys.

If you're excluding poor optimization you only need POWERRRRR for high resolutions.

But..
MEGA
TASKING

I'm now left wondering if AMD's retarded 6 core processor shit was actually a ruse.

You're just wasting your money if you buy anything better than an i5 or equivalent anyway

As long as people keep buying these "new" chips, Intel won't die

To play non-shit games with good graphics.

Well not really, but the Armed Assault series has always been bottlenecked by CPUs, though that's more to do with its shitty game engine.

I have no sympathy for early adopters, they reap what they sow. I haven't really found a game that requires even an i7 anyway

It's been a pretty bad year for Intel user. Lotta people are sick of their shit.

in terms of CPU gaming performance there's barely been any reason to upgrade from 3rd gen i5's

Not only that, but there have been plenty of layoffs as well.

That's a good sign then, maybe Intel will consider investing more into research and development instead of marketing and using the same tech but moar

The funniest part of this is a marketing company with Intel as a client owns Tom's Hardware. They can't even make it look good.

Way to go, Purch Media!

archive.is/2faBr#selection-3403.163-3409.289

Brand Spotlight: Tom's Hardware
archive.is/YVT5P

archive.is/9oxp2#selection-495.0-495.545

They own Anandtech too. Tom's and Anandtech are literally shills for Intel. And they can't make i9 look good.

poor user, you think Intel will actually try and innovate instead of just bribing OEMs and paying off tech sites. I feel so sorry for you. polite sage

Either it's that bad of a product or they have some sort of integrity.

If you look over their site, part of their marketing is to make it believable that they're independent and not shills.

Their last recommended CPU guide was hilarious, even after Ryzen launched, every single recommendation was for an Intel CPU. I don't think they are that innocent or independent. Someone who recommends a $300 Intel over a $300 Ryzen is not your friend.

Daily reminder to buy AMD CPUs and GPUs.
They're Jewish, but don't have as great a share of the market and offer better price-for-performance.
Intel and nVidia are for the gaymurz, but AMD are for the computer users who don't want to spend an extra $800 for the privilege of the NSA having a livestream of your computer and your house burning down when they pin you for made up charges.

if you're running any version of windows you're already giving up your data to Big Bruv anyway

AMD really needs to make their cards play nice with PCSX2. Then I'd be happy.

CPUs have improved 10% since Oblivion came out 11 years ago.

Gee, who would have thought a 10 core CPU would use a lot of power and have little practical use for regular people

Still impressive it can hold those kinds of clocks on all cores though

And with that we're back to the good old Pentium 4 days except with MORE CORES

You need a gayman PC in order to properly emulate consoles :^).

I laughed and mocked it when they announced it and am feeling rather smug right now.

Stay snug, smug

The only games I found that benefit from having an i7 are Parajew's GSGs, due to shit optimization and ridiculous amounts of data. Everything else runs fine using an i5

That's not on AMD, that's on the PCSX2 devs refusing to fix their shitty code because they all use nVidia and don't care.
Inversely, CEMU works better on AMD.


I'm sure there's a variation of Moore's Law for GPUs.

Is this a trick question? All of the old reasons apply and more. I haven't not known a disadvantage in owning a gayman pc as the years go by. Gaming itself is collectively worse as its potential is squandered. Thus, all of gaming suffers, but owning a gayman pc gets you the most out of what little gaymen has left to offer.

I'll take it.

Dorf fort and megabases in Factorio benefit. Autism games still need powah.

Why wouldn't they be? Ryzen was disappointing and landed firmly in the 'value' category which Tom's generally gives zero shits about.

Is this the $1K one or the $2K one? Also is Intel charging for RAID capability?

I'll just pick whatever has more power, only poorfags take less than that

Is there any reason to own a console in the current year?

I want it as much as you but let's not lie to ourselves.

In heaven.

...

It's just nice to have good specs just in case. Even if you don't need the extra power, it's good to have for when you will. Not saying you should go overkill, but if you can overbuild for fairly cheap, then I don't see why that would be a bad thing.

...

I mean, if money isn't an issue at all, then do it. No one is telling you not to.

I've been having fun playing ARMS with friends and coworkers. Even playing with motion controls

I see no reason at all to upgrade. Games aren't making the same major jumps they were in the past and even my gtx 570 is still holding up after all of these years.

Supporting PSP and making the future even worse. Thanks for that.
I hope you enjoy your built-in spyware and "accidental" software removal after updates.
Sage because I know this is nothing of value.

Doesn't really matter though as the Stockholm syndrome is still going strong.

GPUs yes.
One word: PSP (Platform "Security" Processor).
FX is shit (especially in power draw and performance, though I can handle tweaking settings), but Ryzen is way worse because of PSP, AMD's version of Intel's AMT. PSP is the only thing stopping me from upgrading from my 8320. My 290 should still have plenty of life left in it to play the latest and greatest games, if I wanted to deal with Windows 10 on top of an industry that's almost 90% "muh agenda" over anything else.

>Implying they haven't been doing that since day one. Also it's Intel, so of course, why leave potential money on the table, goy sheep customer. You don't want us to starve to death by only earning 109% rather than 109.0001%, right?

This. That power draw though.

Hang draw and quarter yourself then commit seppuku even though you are not even remotely worthy of it.

Have we reached the end of upgrading?

That was likely reached a while ago, this is just it becoming more known.

If AMD really wanted to rape Intel then they should have made Threadripper multi-processor compatible.

That way you get 32 cores, 64 threads, 128 lanes for 2000 dollars.

Bundle two of those CPUs, a mobo, an AIO liquid cooler for dual CPUs, and a Caselabs case for 2500-3000 dollars and Intel would be put into the grave.

I get that Naples will be multi-proc compatible, but considering how they're going to stuff twice the cores and threads onto the same die size, the per-core performance for Naples won't be so great compared to Threadripper.

Also, I wish somebody would come out with a case like this.

upgrading hardware in less than 5 years at a time is a bad meme

I never gave enough of a shit to dig through Intel's marketing speak for the i7 series, since it was too expensive for me, but wasn't the jump from i5 to i7 also pretty much nonexistent?

i5 to i7 doesn't matter for gaming PCs, it's just a waste of money and TDP. An i7 is only good over an i5 for processor-intensive tasks that enjoy performance boosts from multithreading. Shit like render farms, home servers/DBs and so on.

The only thing Ryzen did was make Intel drop their prices. An i7-7700k still outperforms a Ryzen 1800x in gaming. And now the Intel chip is cheaper. Yes, I know it beats it in sanitized multi-threading benchmarks, but this is Holla Forums and I'm fairly certain that most of us will only be using it for games. I was really hoping AMD would make something a lot better than what we got and actually force Intel to innovate instead of just slashing prices. Make a processor that I might actually buy; I buy both Intel and AMD processors, generally depending on what I'm building the computer for. As it stands now, as long as you have a 3rd generation Intel i5 or above, there is no reason to upgrade your processor for gaming.

Why are you posting videos from a stupid fag

The only time you should upgrade is when the hardware fails or can't keep up to your current necessary tasks
Only cancerous hipsters think you always need the latest hardware
And I'm not even a Holla Forumsfag

...

Is this a trick question? All of the old reasons apply and more. I haven't not known a disadvantage in owning a gayman pc as the years go by. Gaming itself is collectively worse as its potential is squandered. Thus, all of gaming suffers, but owning a gayman pc gets you the most out of what little gaymen has left to offer.

Mostly because a PC is a good thing to have in general, and they have no generations to worry about.

Most people nowadays are simply not competent enough to use a PC, so they should just stay away from it.

I like emulation and older games, and multitasking, so PCs are good. Easy piracy for the games that are available as well.

However, a used console is enough. In fact, I need consoles anyway, because a lot of the more recent games that interest me are on PS3/4.

Also, if you don't want to pirate, just stay away from PCs. Steam is complete shit, and it's a monopoly. You will never get real discs with actual games in them anymore, so what is the point of spending the money?

Paying for shit that you don't own and can't resell is dumb.

You can argue that games that are on PC are better on PC, but really, the games are made for console anyway, so the good games work well enough on console.

The reason to have a PC is that if you are not an idiot, you are probably going to need a PC anyway, so it might as well be a good one that runs games and does everything better and faster.

My general rule is to have a good PC, and wait until I can get consoles used and for a good price, so that I can play the exclusives (generally not too many, but enough to convince me). It's just a versatile thing to own, that does fucking everything.

People that argue about this shit are retarded and should own neither. That's pretty much where most people stand.


My PC died in February. After 10 years. And it was actually killed by a storm while I wasn't around. I was pretty sure that it was protected, but it wasn't. It was dumb.

No reason to really get anything new until it dies.

I have an i7 now, because I can. Good enough for the next 10 years. Maybe more.

Everything that I own tends to last more than that, so I don't mind spending a little more than maybe I should.

What game is even optimized for anything like that?

RIP little guy. It's a fantastic processor, best one I've ever owned.

Anyone else get really attached to their hardware? I still have my Radeon 4890 lying around despite replacing it due to the fan dying. It was my first introduction to true PC gaming, as in 1080p 60 fps, all settings in all games maxed sort of thing. I just can't bring myself to throw it away, I consider it retired and having a well deserved rest.

Well-designed things make me feel like that very easily.

It's something that I fight against, when it happens, so that I don't become a hoarder.

"I can use the parts later" doesn't help.

I know that feel, I'm still on a 2500k myself. These current-gen AMD CPUs are only just breaking even with the 2500k in single-thread, multithreading in vidya is still just a meme, and Intel are just doubling down on their jew tactics lately, so I probably won't be upgrading until at least Cannonlake or whatever AMD have out at that point.

Wish they'd hurry up and hit 10nm already. Weren't we supposed to be on 10nm last year?

could be worse.

Intel is in full panic mode, releasing chips they planned to sell for servers. All X299 motherboards are required to support all X-series CPUs, so they all come with 8 RAM slots and 44 PCI-E lane support, but the lower end CPUs themselves only support 4 RAM slots and far fewer PCI-E lanes, meaning you are buying a $250-400 motherboard and then not using it to its fullest potential. What a fucking waste of money. Meanwhile all AMD Threadripper chips will support 64 PCI-E lanes.


Are you talking shit about the 1600X? Because it's basically the best bang-for-buck CPU on the market today. It's a fantastic choice for the majority of people. The money you save should be put into a faster GPU (though prices of those are crazy inflated right now, sadly).


The 7700k is an overclockable CPU that Intel recommends you don't overclock. It's shit.


I have a trophy collection of sorts of my old GPUs. I can get rid of plenty of other parts but something about those video cards makes me want to hang on to them.

The AM4 socket will be supported by AMD until at least 2020, but I'd wager it would go a little longer.

what a load of shit, I have a build like yours and newer games struggle at native resolutions at medium settings

Might as well just get a 1600, the x-series Ryzen CPUs are just higher binned chips with higher base TDP and clocks - usually you can overclock the non-x variants to about the same level and you get a free cooler to boot.

Isn't the problem that games don't properly make use of that many cores? Same reason AMD's 8 core chips are worse than a good ol' i5.

True, but if you really like maximum overclock headroom and already own a good heatsink (most manufacturers are offering free or cheap brackets/adapters) then the 1600X is a solid choice.

You are clearly referring to the old FX line, which actually aged better as newer games are increasingly making better use of multiple cores. AMD's 8 core Ryzen chips are well beyond what an i5 can dish out.

I don't think it'll last that long, but I think we'll see AM4+ with some new features, and AM4+ Ryzen would be somewhat backward compatible with AM4. AMD usually goes out of their way to make that stuff work, and Intel likes to make 5 different incompatible versions of the same socket with the same number of LGA contacts.

AMD's official statement is that they will support it for 4 years, and yeah AM4+ is probably going to happen and I'd bet plenty of AM4 mobos will get BIOS updates that make it support AM4+ CPUs in the future.

Never gonna happen.

Resolution you play at may be the culprit there. Playing with everything maxed out isn't what it used to be anyway.

You forgot about RTSs in your reasons to own a gaming PC, my dude.

I suppose.


A really good example of what happens in modern games is DOOM 2016. The Medium, High, and Ultra Presets look almost identical (aside from the texture cache size, maybe). The low preset is the only one where it's easy to point out something is missing - namely the SSAO.

i5 760 here. Can't wait for Never Ever(tm) AMD Vega to be released and available for non-miners so I can get my fucking Ryzen computer built.

Just so I can emulate PS2 better.

VEGA cards will likely be sold out in a matter of minutes after they become available. Far too many people anticipating the launch, and of course you know miners will be all over that shit since the 400/500 series cards are all gone and these things are bound to hit a new level of mining performance.

This guy could have just said what most people with half a brain already know, that 4k is a meme resolution that marketers and TV/monitor companies invented to sell the next HD, the HD of HD, and ended the video at the one minute mark. The one that suffers most is of course gaming. 4k is wholly and absolutely unnecessary in this day and age.

To give anyone who still doesn't get this an example, what's the first thing people with toasters do when they start a game their computer barely handles aside from turning off most of the graphical settings if not all? Lowering the resolution. Forcing the computer to display demanding games at near-HD level resolutions is death for toaster-tier PCs. But some toaster-only PC gamers find that if they set the resolution low they can sometimes play a game on medium or near-high graphical settings.

So back to 4k. In a time where consoles are barely catching up to 1080p and games are already console-tier in terms of what hardware developers build their games around, this results in developers reducing the amount of entities that will require graphics so as to ensure that 4k will "work." Remember when the 360/ps3 generation started and developers were excited that they will have hardware to use to draw so many enemies on the screen in high detail? We're going back to the ps2/xbox era of hardware capability but with shinier graphics. We have the CPU power to support intelligent AI enemies but we're squandering the graphics to draw these enemies in high detail on a resolution we shouldn't be delving into. The end result is another reason for why gaming is stagnating.

By the sounds of things, in an alternate dimension to the one where Intel died.


No goy, buy that PS4 Pro, upscaled 1440 @ 24fps is perfectly acceptable, not being able to toggle off some of the fucking egregious visual effects is completely acceptable, third party exclusives are completely acceptable.

Well, anything that is better with a mouse should be played on PC, obviously.

I mostly play games with my stick or my PS3 controller (for actual PC games), though. Maybe I should play some old PC games with good old mouse and keyboard. I just don't play games like that very much nowadays.

Consoles should just get on the Dreamcast's level and fully support this. Everything is USB, so why don't they?

Hell, they should also just imitate Atari and label every game with "Used with X Controllers".

Well, it's never going to happen. Because we live in a shit consumerist world dominated by casuals that can't handle more than one type of controller. It's all about just shoving more garbage down the masses' throats, so it's better not to force them to learn something. Optimizing controls doesn't matter in the current industry.

I can't play Beach Head or IS Defense on a console.

Never had any problem overclocking them.

How far are you pushing it though? There were enough people complaining about temperature issues with their chips that Intel had to tell them not to overclock, when overclocking is the whole point of paying the K model tax. Best way around the issue is to de-lid the CPU and replace the thermal paste Intel used, or go with direct-die cooling.

Most I've pushed it to was 5.1 GHz, but I normally run at 5.0 GHz. At 5.1 it will crash every once in a while. Though the K isn't just an overclocking tax; it also has a 0.6 GHz higher base clock.

Which is disabled and irrelevant to anyone overclocking. The i7-2600k boosts to 3.8 GHz but I have mine running at 4.2 GHz at all times.

I meant that it's an added benefit over the standard version even if you don't overclock. Of course I'm going to overclock, but some normalfag might just buy the K for the higher base clock.

You definitely want to have a good gaming PC in case of a flunk.
Back in 2015, I wondered like you too, because for 4 years I hadn't played a single game that I couldn't with a 2006 rig. But I kept upgrading anyway. Then Rainbow Six Siege happened and I'm glad it was at least decent.
The Vanquish port also made me glad I kept the PC up-to-date to at least play it at 60 FPS on HD.

The reason it's an overclocking tax is because non-K CPUs are locked out of overclocking capabilities. If you want to overclock, you have no choice but to get a K model. AMD chips can all be overclocked, they give no fucks.

There was a time when Intel let you overclock budget CPUs, like embed related was one of their best examples of what you could do with a cheap chip.

I know. I was just pointing out the higher base clock was also a "feature". Even though it doesn't matter to us.

Pentium 4 here, upgrading is for nerds.

How's TempleOS treating you? Have you basked in the glory of our Lord today?

It runs great, theres none of that ring 3 bullshit to hold me down.

Blessings of the Lord upon you brother, may the holiness of 640x480 fill with joy. Amen.

maybe you amd fags wouldn't be forever playing catch-up if you worked on your shit instead of shilling here

Newsflash, Intel is the one playing catch up right now. Ryzen caught them with their pants down.

Though in defense of Intel, the K series might have come about due to yield rates. The i5 and the i7 are the same chip; during manufacturing, the i5 chips were just too damaged to become an i7. It's possible that the standard i7s simply couldn't be overclocked to the K's stock clocks.
Or more likely it's just a cash grab, or both.

Best option is to get a mid-range one that won't break the bank, this lets you play games from 2005 onwards that won't run on a toaster but are still worth playing. Almost everything that actually needs a top end PC is bloated AAA trash.

Are you watching the same video? I'm still watching but this is about ultra settings, complaining about 4k being pushed.

You must have missed the part where it's a virtual machine with nothing on it other than games

For about $500-600 you can build a computer that should be capable of fast PS2/GC/Wii/3/DS and probably PS3/Xbox (if they ever come out) emulation, run everything up to 2010 in max settings and play the rare good CY era game in medium-high.
Going full gayman (>$1000) is only really recommended if you're shooting for 4K with the aforementioned rare good games. Old games in 4K will usually look like shit. You could potentially get purty PS2 graphics with upscaling though.

You're going to do some video editing, watch anime while shitposting and play video games on three separate displays, or you're one of those twitch/youtube faggots.

Using a gaming PC for actual work isn't recommended as you'll probably just get bored and play video games.

gaymen pc worth it if you play any sort of competitive game and want muh 120 frames

I can't think of any "competitive" game that is demanding enough to require a monster rig for 120+ FPS. They need low requirements to appeal to a wide userbase, after all.

your post hurts to read

When they released the Pentium 4 underage faggot.

This thread is full of the lamest excuses for PC players I've heard in my life.

Surely your 2010 hardware can run NieR at 1280x800, maybe you can play Tekken 7 at 30 fps or you are satisfied by running Endless Legend at min settings and waiting 2 minutes every time a turn passes while your Pentium G processes the other turns.

But for people that care, maybe there are only 3 games worth playing, but I will want to play them at their best.

All of you who can't stop owning literal toasters or won't upgrade your old shit: I shiver to think you call yourselves PC players. Don't ever think you are part of the Mustard Race.

So I have. Why use 10 for games when 7 is rock stable most of the time, its UI doesn't have multiple personality disorder, it can't automatically remove unsupported software after an update, and it doesn't love resetting drivers back to stock? Finally, there are no good games that are Windows 10 exclusive, because why lock yourself to DX12 when userbase growth has stalled?

Denial at its finest

>>>/reddit/

Mine's still going strong.

I need to know if there is any reason to play vidya at all these days

Something to keep you busy until the sweet release of death.
And of course being murder and rape simulators they are the perfect training material for the coming race war against the kikes and their army of niggers and mussies.

I'll rephrase then: "willingly installed".

Isn't the old "escaping the shithole that is the real world" good enough anymore?

End your life.

That is Reddit tier faggotry. Kill yourself.

Those are just poorly optimized; they aren't actually built for the latest hardware. Don't know about Tekken, it's been shit since 4.

...

So the i9 is the new Pentium 4?

why do you fucks keep spouting this nonsense?
this isn't 2015 anymore and most games coming out now actually make use of more than 4 cores

Moore's Law is dead

wow op, i'm sure that intel is doomed

I thought i7 had 6 cores

Epic memes my dude!
It's great to see AMD is actually trying though. I hope they pull through just to get Intel off their lazy monopoly-driven stagnated ass. In all honesty though, Ryzen is fairly disappointing.

The 6900k is actually 8 cores if that's what we're talking about here

Honestly, everyone who uses consumer CPU's is a faggot.

I meant the first one

The 6950x? That's a 10 core chip

Nigger, you're not making any sense. If you mean the current flagship i7, AKA the 7700k, then that's a quad-core chip

Xeons usually have poor single-core performance. They're designed for massive parallel workloads like what you'd find in servers or supercomputers. The ability to give many small tasks their own cores is a significantly more efficient use of die size than giving the same small tasks to 4 faster cores. Conversely, it makes more sense for an application processor to have fewer but faster cores, because a typical application workload usually has only one heavy CPU-intensive application running at a given time, as opposed to 18 CPU-light programs running at the same time in a supercomputer workload.
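To put numbers on that tradeoff, here's a toy back-of-the-envelope model (every cost and speed below is made up for illustration, not a real benchmark) of why many slower cores win a batch of independent small tasks while a few faster cores win a single heavy task:

```python
import math

# Toy model, not a benchmark: all numbers here are made up.
# Independent tasks are split across cores; each core runs its share sequentially.
def batch_time(n_tasks, task_cost, n_cores, core_speed):
    tasks_per_core = math.ceil(n_tasks / n_cores)
    return tasks_per_core * task_cost / core_speed

# Server-style workload: 64 small independent tasks.
xeon_batch = batch_time(64, 1.0, n_cores=16, core_speed=1.0)  # many slower cores
i7_batch   = batch_time(64, 1.0, n_cores=4,  core_speed=1.5)  # few faster cores

# Desktop-style workload: one heavy serial task, only core speed matters.
xeon_heavy = 1.0 / 1.0
i7_heavy   = 1.0 / 1.5

print(xeon_batch < i7_batch)  # many cores finish the batch sooner
print(i7_heavy < xeon_heavy)  # the fast core finishes the single task sooner
```

Crude as it is, this is the whole argument: batch throughput scales with core count, single-task latency scales with core speed.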

first one meant to reply to;

Guess I'll wait for Ryzen 2, or whenever they roll out 7nm.

The fuck do I know
I don't know computers

Me too, and not even overclocked… if I could even find a chipset for OCing lol.

wew

I'm getting a prebuilt and nobody can stop me!
prebuilds are actually cheaper in my slavshit country

I'm going to stop you

You're same person who thinks that skinny kids who can't afford to eat right "aren't real weightlifters," aren't you?

as far as I know Xeons don't overclock

Is this place filled to the brim with pro video editors, developers and hydraulics engineers running CFD? If not how is this even relevant?

we're mocking it

Why would that call for an i9 or even an i7? Please stop talking about subjects you know nothing about

Unless you're trying to push enough pixels to get smooth gameplay on a 4k monitor, no.

Unless you have specific demands on performance for x or y intended application there is absolutely no reason to spend more than $600 on a PC in the current era. A computer in that price range will play all current and past games at 1080p/60fps — unless you're one of those morons who buys prebuilt in which case you ought to choke on a car's exhaust.

Good luck getting anywhere with CFD without HT/SMT.

It's called using a processor actually designed for massive parallel workloads, like an Intel Xeon, idiot. Why the fuck would running CFD call for fewer-but-faster application cores over a processor actually designed to handle the multiple computations CFD requires at once? You know, like an Intel Xeon.

glad we have you in the same thread as the guy that seemed to think i7 and i9 have 7 and 9 cores respectively
I guess it evens out

PC has reached its top, boi. Face it, no more relevant improvements until quantum stuff

I think even the latest AMD processors incorporated SMT if SMT was all he was getting at.

he was getting at i5 not being a great buy for that workload
which is a "not even wrong" kind of statement
xeons go up to higher core counts than consumer CPUs, which include i9

You might be calling me daft for suggesting it's appropriate to be doing CFD or FEA on a consumer CPU, but the kind of roles you'd have a Xeon for are the only roles that are even remotely appropriate for a $1000+ 10+ core chip.

It makes all of zero fucking sense to discuss any i9 here, it'd be like going to a yacht club for a conversation on bulk carriers.

What the fuck are you even on about?

why zero sense?
there's 6 and 8 core i9's
are you one of those other fucks that for some reason keeps holding on to the meme of games never moving beyond 4 cores?

You're right that discussing i9 is pointless here, but you were wrong in saying that i9s are convenient for the applications you mentioned.
Basically they have no use, that's the point.

I never did, you're putting words in my mouth.

You clearly implied here that talking about i9 is only relevant if you're a pro video editor, etc.

Which holds true. That doesn't mean it's a good choice for such work.

Do you get off on arguing semantics or what?

That makes it irrelevant for that work. It's like discussing server processors for videogames. Possible, but not even worth discussing unless it's for giggles.

I'm not the guy you were talking to who said Xeons are better. I was simply explaining why your post was stupid because you seemed confused.

Considering how most devs struggle with basics like not tying physics to framerate, it's ambitious to expect them to manage 10+ threads.
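For anyone who doesn't know what "tying physics to framerate" means: the usual fix is a fixed-timestep loop, where the simulation always advances in constant steps no matter how long a frame took. A minimal sketch of the idea (all names and numbers are illustrative, not from any real engine):

```python
# Fixed-timestep sketch: physics advances in constant DT increments,
# so simulation speed doesn't depend on how fast frames render.
# All names and numbers are illustrative, not from any real engine.
DT = 1.0 / 60.0  # one physics step = 1/60 s

def step_physics(state, dt):
    # toy "physics": constant-velocity motion
    state["x"] += state["vx"] * dt
    return state

def advance(state, frame_time, accumulator):
    # bank the elapsed frame time, then consume it in fixed DT steps
    accumulator += frame_time
    while accumulator >= DT:
        state = step_physics(state, DT)
        accumulator -= DT
    return state, accumulator

state = {"x": 0.0, "vx": 60.0}  # 60 units/s, so each step moves ~1 unit
acc = 0.0
# mix of slow (30 fps) and fast (60 fps) frames covering 0.1 s total:
for frame_time in (1 / 30, 1 / 60, 1 / 30, 1 / 60):
    state, acc = advance(state, frame_time, acc)
# state["x"] ends near 6.0: six fixed steps ran, regardless of frame pacing
```

The bug being mocked is stepping physics once per rendered frame with a hardcoded dt, so a 120 fps machine literally runs the game twice as fast as a 60 fps one.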

What's so wrong about inciting competition?

Intel trying and failing to deliver MOAR COARS in consumer packaging in no way incites competition.

Get on my level.

...

Intel pretty much dropped the ball and failed.

Same here. Only my humble GTX 650ti is a bottleneck. I wonder if a GTX 960 would be a good, low-budget upgrade.

BeamNG.drive loves many-core CPUs.

4670k at 4.4 here, even at 37C outside temp it's fine.

I'm still on an i5-750; quite frankly, I'm waiting for really cheap SSDs before upgrading. I mean, who knows what interface they'll use by then. Remember SATA Express?

Nigga, just wait until Ethereum et al crashes and buy a super cheap RX480. I can't wait until the crash happens, so I can finally upgrade to 2x4GB RX480s.

Not only could you run these stable with +50% MHz… you could physically mod them to enable SMP and run dual Celeron 300A @ 450MHz

You are absolutely correct and I FUCKING HATE it. I do not want to support the Intel/Jewvidya cabal.

I want my Totinos pizza rolls

Literally all Intel CPUs after 2007 are kiked to hell. I want to get a Pentium 4 machine or a good old AMD machine for opsec.

FX Series CPUs do not have botnet, are much cheaper since Ryzen, and are actually half-decent for gaymen in real world usage.

Yeah I heard about the FX 8350. It's on my radar too.

Unless you want to emulate PS2, because PCSX2 sucks ass.
inb4 le PCSX2 defence squad

...

So you can alt-tab between 50 tabs of 8ch, your Dorf Fort game and Metroid Prime at 4xIR/60FPS

PCSX2 absolutely does suck ass compared to a masterwork like Dolphin that always manages to outdo itself. FX series CPUs would probably be the worst choice for that right now, but I'd recommend watching YT videos of emulation performance in said games for those processors and a comparable video card.

Where is the archive.is? OP you said there would be an archive.is!

Why hasn't somebody taken 64 AVRs and strung them together?

AMD are so far behind Intel and Nvidia they've effectively given them a monopoly where they can do whatever they please. 15 years ago this would have hit Intel bad; now they couldn't care less.

Just wait for it

This retard is still posting. You are entirely at fault for using shit hardware and software then complaining pcsx2 wasn't designed for it.

how not to bait PC goyim:


how to bait PC goyim:


you can just feel the sheer insecurity oozing from over half of the replies as they scramble to justify their purchases.

console wars are OK when it's PC lol

See I fucking told you. Took less than 4 mins

Dubs confirm that shit is on its way. It always fucking happens.


One (you) has been deposited into your account.

Your poor little ass seems so hurt

Tekken 7 runs like a dream and can even be run on integrated graphics if you had to, with some choice config file tweaks. Don't talk shit about fighting games, nigger. It's the only genre men have left that is inherently impenetrable by bitches and whores.

Fucking snap your own neck if any of you play that shit

This, I bought an I7 back in 2012, with DF I can run a large fort with 200 dwarfs around 30-60 frames.

Was there a drop in tv sales or some shit, that they had to push this 4k nonsense for sales?

Why are redditors so bad at making OC?

the fact that you're trying to attack the response only proves two things: one, you're from reddit, and two, you have nothing to offer logically to the conversation and thus must resort to semantic assassination in order to keep even a tiny foothold in it.

If you can't take an insult, then get off reddit.

That just tells everyone that you really aren't from here, fool.

I'm going to start using this as bait. Thanks user!
Also
Christ user

Fucking right!? They spill their spaghetti harder than anyone on this board and think their ironic shitposting is a genuine way to argue and convince people. God fuck, they're a living, breathing, walking joke.

go home reddit

I wanna fuck that

hmmm?

oh fuck, I forgot that I was in mod.php.

sorry guys

The smug makes my dick twitch

What kind of case would be considered "good" for building a PC? Can you just use anything? It's a dumb question, but I've never really seen anyone mentioning it.

WHAT DID YOU DO MARK!?

I used the name field while forgetting about forced user.

A nicer case, you're basically paying for better cable management, it being easier to replace hard drives and stuff, better air flow, the looks, etc. It's generally not worth it, though I usually try and spend money on a nice enough case to have fan filters and enough space for fans and a long graphics card.

I'm thinking of getting a new PC to replace my old prebuilt one, and I thought of getting an RX 570 (when they are buyable again, THANKS MINERS) or similar (AMD), a Ryzen 1400, 8GB of RAM, a NOX 650W HUMMER PSU, and a WD 1/2 TB HDD. Case would be something between 30-40 euros

Besides browsing/non-heavy stuff, I will game at 1080p, and I'm not a graphics-obsessed guy. Any thoughts/objections to this build?

RX580 should drop in price dramatically when those minerfags get assraped by an altcoin crash; the R5 1600 is 20% faster and has two extra cores for around 50€ more: cpu.userbenchmark.com/Compare/AMD-Ryzen-5-1600-vs-AMD-Ryzen-5-1400/3919vs3922

Up to you, though. Or better yet: buy a used RX480 when the prices crash, something I am planning on doing.
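Since the question is purely price/performance, here's a back-of-envelope in Python. The prices and the relative score below are illustrative placeholders (roughly matching the "20% faster for ~50€ more" claim), not actual quotes:

```python
# Rough perf-per-euro comparison; all numbers are illustrative placeholders.
cpus = {
    "R5 1400": {"price_eur": 170, "score": 100},  # baseline score
    "R5 1600": {"price_eur": 220, "score": 120},  # ~20% faster, ~50 EUR more
}

for name, c in cpus.items():
    ratio = c["score"] / c["price_eur"]
    print(f"{name}: {ratio:.3f} score/EUR")
```

With those guesses the 1400 actually edges ahead on raw score-per-euro; the 1600's extra two cores and longevity are what the premium buys you.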

If that happens, then it would be good.

The 1600 is really good for the price, but it still costs too much for what I want, and even if it drops in price, so will the 1400. And unfortunately, there isn't a cheaper AMD Ryzen hexacore CPU (and Ryzen 3 will be all quadcores…)

I'm interested in emulation too; if the 1400 isn't as good as the 1600, would it still be good enough for it?

profitmargins on 1080p TVs got too small

Yes a massive drop, or more accurately sales regressed back to the pre-HD mean.
It's actually a very similar situation to Nintendo, who caught lightning in a bottle with the Wii and have pushed every gimmick they can think of since to try and replicate its success.

>>>/reddit/

I imagine. I hate the mentality that companies have of needing constant income instead of providing good products/service.

Making profit for their owners is the single purpose of companies in a capitalist system.


Do you think they are paying sony/M$? It makes no sense for consoles to go after 4k at all atm.

You forget Sony is also one of the biggest TV manufacturers in the world, of course they'd want to leverage their gaming division in order to sell more 4k Televisions. M$ is likely just copying the rest of the industry in an effort to not fall any further behind the competition, they probably think the more they throw around the 4k buzzword the more units they'll sell.

What kind of monitor do people recommend for gaming or 3d modeling? Should I stay on 1080p or move to 4k? Or is an intermediate solution good like a 2k panel?

The latest fad actually is to push 4K HDR displays, they are dubbing them as Premium UHD or SUHD even though resolution is completely different from contrast, color reproduction and so on.

2K is good, I mean, 4K IS technically 2K.
You don't see people announcing 1080 as 1920.
Anyways. If you don't care about a PSP, get the Ryzen.
Are you on a budget? Do you want to save money? If you give no shits about power consumption or botnet, get the i9
Ryzen is the best option for power and voltage.

Does Sony even still own their TV division?

I plan on getting a new PC next January. I need to save a few bucks but am also waiting on Vega. I want to get a Ryzen 7 with Vega but I do not know which monitor I should get. I have a 1080p but am looking to get something for 3d modeling. Should I be getting 4k?

Forgot pic. Sony tried to do the same thing during the short lived 3d fad, most people probably don't even remember this it flopped so hard.

Steamfags and/or redditors basically ruined the case market, it's flooded with overpriced pieces of trash.
Anyway on the cheapest side you want a case with the slots you need, the fan mounts you need and you want it made out of metal. There isn't much else one would want on a case but some good but more expensive ones are modular, have built in filters, holes for cable management, screwless drive mounts and rubber-padded fan/drive mounts.

If you do any kind of texture work you'll want a monitor with better than 8bit color, and then you want it to have the best possible maximum contrast and color accuracy.
Other than that, yeah, higher resolutions are beneficial for everything, especially modelling. It's all about whether you feel like you want the increased precision of a higher resolution.

I think sony only chopped off SOE and that was basically because SOE's workers didn't actually do anything and just lazed around collecting their paychecks.

You are bottlenecking your video card with that, but it certainly isn't the worst thing in the world. I dropped a pretty good video card into my dad's old Phenom II X4 machine and it plays modern games quite well now.


That's great and all, but the fact remains that people bought a 7700k because it's the chip you're allowed to overclock, and Intel is telling people not to overclock it due to heat transfer issues. Imagine buying a sports car and then being told you should never drive faster than 60mph with it.


Not if the eSports corporate sponsors get their way. Look at how shit SFV is thanks to their "make it more fun for the viewer" mentality. Tekken 7 does seem pretty pure still though, thankfully.


And yet they went full retard by not putting in a 4K Blu-Ray drive into the PS4 Pro.

I like the Fractal Design stuff.
Good features with sometimes elegant and/or understated looks, while not being unreasonably expensive.

SimulView was a really nice idea, though. It allowed same screen multiplayer without having to split the screen. Instead, each player got a full screen that was only visible through his pair of 3d glasses.

i7-2600k OCed at 4.0

i literally have no reason to upgrade, it runs almost everything maxed

intel fucked themselves with their windows 10 kikery

I was running an overclocked amd phenom x3 with an unlocked 4th core and an amd 5770 until I had money to blow and got an i7-4770k and a gtx 780 ti.
It still runs everything great, and I'll get a waterblock so I can push the OCs harder; I managed to get the 780 Ti to almost match a 980 in benchmarks.

user I still own my ATI RAGE all-in-one.
I still have a geforce 440 MX.
I still have a working pentium 3 with the infamous mount.
I still have a Cyrix 486 DX

Is it worth going to a ryzen 7 from a i7 2600k? Or should I get a 7700k?

True, but i'm pretty sure they were deathly afraid of "599$ U.S. dollars part 2 electric boogaloo"

In your case, wait for Zen+ so you can benefit from an upgrade that will give you higher multi-threaded AND single-threaded performance.

I was very close to dropping my 2600k for a Ryzen chip, but I decided that I can wait one more year for the next gen of Ryzen before I pull the trigger. I do video/audio editing and 3D modelling, so MOAR CORES is more beneficial for me, gaming boosts would just be a nice bonus.

If you are purely concerned with gaming, I suggest watching this 2600k revisit video to determine how much you would gain from various upgrade choices.

No idea if ryzen would beat a 2xxx series but intel still has the lead on performance. If you're looking for the best cpu for the price go AMD otherwise intel is still it.

there was an infamous mount?
I remember us having a cyrix
really sucked not having a genuine intel 133mhz back then
fuckin MMX magic

buying a quadcore in 2017 is something people are gonna regret, even if the performance seems tempting at the moment

I have been getting into 3d modeling bigly so just wait and get a Ryzen plus next year?


Yes, this seems to be the case. AMD seems to be goading (((Intel))) into a core war. Would we have mainstream 12-core chips now if Intel did not have a monopoly?

Ryzen 7 are 8 core/16 thread chips.

Well I guess I should specify 3D rendering instead of just the act of modelling. If you actually spend time and CPU cycles doing a finished image or video render of your models, then a Ryzen chip would dramatically outperform the 2600k.

One thing to consider is that RAM prices are currently inflated due to shortages. A year from now, the DDR4 you will need for a Ryzen (or a modern Intel chip) should be cheaper, and in theory PCI-E 4.0 will also be available on motherboards so you can be prepared for those video cards as well. This is another reason I chose to wait.

that's what I'm saying

Right, the user is asking if it's worth it for an 8 core, and he's saying, "You'll be sorry if you buy a quad-core."

Assuming he is talking about the "real cores" which I also seem to do well with, then he has a point.

The thing that is very obvious to me about real vs virtual cores is their performance: you are not really getting exactly 2x performance from a split core, much like you are not really getting exactly 2x the performance with SLI or Crossfire. The point? When you're doing really high single-thread-performance tasks like emulation (where Dolphin and PCSX2 can actually offload some tasks to a second core, but only a second), I've seen steep performance improvements with the virtual cores turned off.

The rest of the cores will remain useful with new software becoming increasingly multi-threaded. Just look: Intel is hopping on the MOAR COARS bandwagon with the i9, and assuming they are still the leaders, software writers are going to begin to get the hint. If we are building our computers with the intent of them lasting over 5 years, investment now pays dividends later, because as usual, every extra year that you can stand to wait to upgrade your hardware, you are getting more performance per dollar.
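If you want to test the SMT-off claim without a BIOS trip, on Linux you can just pin a process to one logical CPU per core. Rough sketch, assuming (as on many boxes) that logical CPUs 0, 2, 4, … are each core's first SMT thread — verify your actual topology with `lscpu -e` before trusting it:

```python
import os

# Linux-only sketch: restrict this process to every other logical CPU,
# on the ASSUMPTION that 0,2,4,... are each physical core's first SMT
# thread. Check `lscpu -e` on your own machine; the numbering varies.
all_cpus = sorted(os.sched_getaffinity(0))
primary = all_cpus[::2]  # every other logical CPU

os.sched_setaffinity(0, primary)
print("now limited to logical CPUs:", sorted(os.sched_getaffinity(0)))
```

Launch the emulator from a wrapper that does this and you get roughly the SMT-off behaviour for that one program, without rebooting.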

Intel has been jewing the market with quadcores for years now and actually got so complacent as to let themselves get caught off-guard by Ryzen.
Which is something I have trouble comprehending, but apparently Intel sometimes really doesn't know what it's doing. Case in point: the entire 2017 HEDT line-up not being soldered to the heatspreader, and other shenanigans with the X299 platform.

Ah alright, sounded like you were implying buying a Ryzen 7 would be buying another quad core. I think quad core chips with hyperthreading will still be good for gaming longer than you might expect, especially for the price. Essentially they will be the new i3's (and they are already the Ryzen 3's).

Yeah, I get where you're coming from, but my hunch is something like a Ryzen 1600X is gonna age a lot better than the 7700K, even if the 7700K's performance is pretty fucking amazing in today's games. I truly pity the people that bought an i5 this year though.

They're testing the waters regarding how much they can get away with without having to do anything but more marketing.

If I am doing rendering on my machine would it be better to use the gpu?

I guess I can't blame a company for trying, but goddamn if the list of "shit Intel pulled" isn't long enough already. People just don't seem to know or care. There's just this idea of Intel, and Nvidia as well, out there that makes people buy their shit without thinking.

Yeah those dudes got boned. I expect to see a lot of used ones on ebay and craigslist soon.


They are also having issues doing their next die shrink. The limits of silicon are being approached. Even AMD won't be able to do the MOAR CORES trick forever. The Ryzen Epyc CPU is four of their 8 core chips packed into one. The socket design and power requirements are actually built as though it's two CPUs in one unit. If they wanted to do something like 64 cores in a single CPU in the future, they would need to reduce their die sizes by half or so in order to make something that isn't nearly as wide as the motherboard itself.


That depends on what your needs are, I suppose. The trade-off with GPU is faster speed but lower accuracy in your renders. It's up to you to decide what is acceptable. I don't think I've ever done a GPU render out of 3DS Max, now that I think about it. Never really learned how to set that up.

I totally forgot that I was making a WebM of the most important part of this video. Here you go.

What other choice do people have besides Intel and Nvidia? The entire tech industry is a monopoly, and the government does nothing because the techies backdoor their products for the FEDs. Anti-trust 100 years ago happened because the Rockefellers tried to set up their own "Cathedral" and eclipse the Federal Government. The US government does not care about monopolies unless they threaten the oligarchs' power.


Did you acquire 3ds max legally? I use blender atm.

I was using the free student version, didn't do any commercial work with it. I recently gave Blender a spin and I would love it if it had actual spline modelling. Still solid without it though.

AMD has never really stopped competing price/performance wise, even if their previous CPUs had no real high-end taking any sort of performance crown. RX 470/570 8gb was/is one of the best cards money can buy.

if you can find the damn things, that is

Before ryzen basically any intel cpu was better than AMD for gayming, or at least that is the perception out there. I have a 480, good card.

5nm is looking good, but you're right. After that, no one knows for sure how to shrink things even further (at least without resorting to other materials)

I'd say anything tool-less for drive bays is good enough for a sign of quality. In addition, go for

-Boxy chassis, the less frills the better for fan expansion
-USB 3.0 (duh)
-A case T H I C C enough to hold a real CPU cooler, if you expect to go beyond an 84W TDP CPU. Also, get a case with a cutout behind the motherboard tray, so you can take off big coolers easily without removing the motherboard
-Don't worry if the PC doesn't fill the case with fans, you only need one case fan in the back for budget PCs, and the ones coming with it are a bit loud.
-Case holes on top of the motherboard spot come in handy for cable management.

My first case was $30 on sale. I can only use one USB 2.0 front slot at a time (one of them recently started getting loose), and there's no room for an AIO watercooling loop or a Hyper 212 EVO CPU air cooler to take advantage of overclocking. Although it has served its purpose, there's little reason to shove a 4790K in for max rendering if I can't fit an aftermarket CPU cooler like the Hyper 212 EVO.

But if you don't care and are a little creative, then you can even make the case out of cardboard. You'd still be down some front USB ports, though.

...

SupCom FA is the only one that needs the power.

It's technically still thick if the case is sideways

NO

Picked that TV up for $100 a year or two after release. It is pretty good, especially for $100. Definitely wasn't worth whatever they were charging for it originally, though. Simulview also works really well. It's a shame that only a few games used it.

...

I see a large difference going from a 1080p display to 4k; flipping between 1080p and 4k in many games is a night-and-day difference on my 35" monitor. Also, having a large 4k display allows greater multi-tasking, since your screen is the equivalent of four 17" monitors tiled together with no borders. And 4k isn't very hard to run: any high-end card made in the last year and a half can run most games at 4k (barring very intensive games like The Bitcher: Wild Cunt) at high to medium settings, because when you're playing at 4k you need little to no filtering for your game to look great. I can run Rising Storm 2 at medium settings on a 980 Ti and get 70+ fps at 4k, and to me it looks better than ultra settings at 1080p on my old rig (FX 8150, GTX 970) because 4k brings out more detail in the textures. Although I agree that 4k is a waste on any monitor smaller than 30", because you don't have as much screen real estate for split-screen multitasking, and the difference is much less noticeable.

There's no question we're nowhere near the limit of our own eyes when it comes to monitor resolutions, that's a meme spread by consoletards and b8ers

After the 6-core Ryzen 5 hit, I knew Intel was fucked. 200 bucks for a 6c/12t, unlocked CPU which has similar IPC and uses less power?

Intel hasn't had to deal with competition like that in years and I fully expected them to panic and fuck it up. They did not disappoint.

144 @ 4K When.
My phone is 1080, I'm spoiled by high pixel density

Holy shit, I forgot all about hole technology

I still have an i7 920 running at 3.6ghz. Would a newer cpu that could hit ~4.5ghz really make that much of a difference in performance? I've had this i7 920 since 2008.

Intel has barely progressed with each cpu generation for a long time, but you're really pushing it with almost 10 years on the same cpu. Yes you'll get a huge performance boost with a newer one.

Soon tbh


Just buy a Xeon X5650 if your motherboard can handle those.

It sounds like you're just saying that. Is there an actual significant difference in performance between an i7 920 @ 3.6ghz and a modern i7 @ 4.4ghz?

Yes and no. Intel hasn't really done a massive architecture improvement since Sandy Bridge. Most i5s are just the same arch with some special sauce and adjustments, new features, etc. However, it's been 5 generations of those tiny boosts and adjustments. We're getting to the point that, taken all together, they actually make a difference.

1st and 2nd gen i5s for instance are starting to get CPU capped on a lot of modern games with a GTX 1070 or stronger. It's not quite the point where you NEED to upgrade, but if you are rocking a first or second gen intel, it's time to at least start considering it. If Zen2 (which sounds like it's gonna be another die shrink to 7nm) is as good as rumors are saying, that might be the time to pull the trigger.

If amd pulls that off they'll have objectively better parts than intel
pls be true

it's just marketing
Intel's manufacturing node is always smaller than others, so their 10nm node might not be as far off from others' 7nm as you would think.

Yeah I know it's measured differently, but 7nm AMD is smaller than 14nm intel

AMD's 7nm products are coming late 2018. Intel 10nm products are coming early 2018.
I'm holding out for late 2018 myself before upgrading. Just glad there's more competition and options now.

Globalfoundries' 7nm is like 0.7nm smaller than Intel's 10nm.

AMD's prioritizing power efficiency over high clocks because they can really only afford one line of chips and they want it to be server chips because that's where the real money is. Ryzen is a clever design that let them hit consumers, high-end workstations, mobile, and servers with what's basically parts of one server chip. If their server chips make as much money as it looks like they will, maybe they'll be able to afford to produce a line focused on clock speed as well, but if they have to choose they're going to stick with power efficiency.

lulwat?
Isn't pretty much all *coin mining done by chinks with custom ASICs for the past few years?

New coins were made more resistant to ASICs so we're back to AMD cards.

There is a new 'coin' called Ethereum which is massively profitable to mine, especially on the RX 470, 480, 570 and 580. Unfortunately, those are also the best mid-tier GPUs, generally being superior to and less expensive than the 1060.

Demand for them is sky high, making them unbuyable and letting Nvidia jack the price up on the GTX 1060 when they would normally be slashing prices to compete with the generally superior AMD cards.

AMD cards are favored for mining due to architecture differences and their use of OpenCL, giving them a significant performance advantage over any Nvidia card. It is possible to mine on Nvidia via CUDA, but it is considered a last resort if you already have that hardware and are unable to obtain AMD cards.

Nvidia card prices are going up not because miners are buying them, but because gamers are being denied AMD cards and are settling for Nvidia instead. You now have twice the normal gamer demand draining the Nvidia supply. Couple this with RAM shortages and we have the perfect storm of price gouging.

I think you should reread my post, that's what I was saying. AMD cards are sky high due to mining and sold out, and Nvidia is taking advantage by jacking up prices because they are unopposed by the AMD cards that were kicking their ass in that price bracket.

Of course it is you idiot. Any resolution that isn't your monitors native res is going to look like shit.

Except 1080p on 4k panels scales perfectly you idiot.

...

You should learn about what you're talking about before actually doing it

Thanks diversity hires.

waste of money, IPS is the best

Lets have a little math lesson just for special little you!

2160 divided by 1080 equals 2

3840 divided by 1920 equals 2

This means that when you set a 4K monitor to 1080p, one rendered pixel will always take up a 2x2 block of 4 real pixels on the display!

Therefore, 1080p on a 4k display looks identical to a 1080p monitor of the same size!

Now go back to playing with your action figures you retard.
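If you don't trust the arithmetic, here's a quick Python sanity check (panel and source resolutions hard-coded as the example):

```python
# Sanity check: 1080p on a 4K UHD panel is an exact integer scale.
panel = (3840, 2160)    # 4K UHD
source = (1920, 1080)   # 1080p

# Integer scale factor in each dimension
sx = panel[0] // source[0]
sy = panel[1] // source[1]

# No remainder means every source pixel maps onto a whole block of panel pixels
assert panel[0] % source[0] == 0 and panel[1] % source[1] == 0

print(sx, sy, sx * sy)  # 2 2 4 -> each 1080p pixel covers a 2x2 block of 4 pixels
```

Whether a given panel's scaler actually does a clean pixel-doubling instead of interpolating anyway is a separate question, of course.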

Modern IPS panels can do high refresh rates though.

i5 3570k OC'd to 4.2 GHz

Fuck intel's overpriced i7s

I do a lot of CAD and photo work; single-thread performance is pretty much the only thing I give a shit about, and the 7700k was the first CPU that made me think maybe it was time to retire my 5ghz 2600k. Now that I'm seeing the 7900x, though, and how goddamn easy it is to delid, I'm seriously reconsidering my options. If people are consistently able to hit 5ghz with circumcised 7900x chips, then it suddenly becomes very attractive.

We'll see; all the 144hz monitors I've seen have TN panels, because companies do not want to pay for beefier display controllers, and also because most e-spurts gamurz (who these monitors are marketed to) can't tell the difference between grayscale and full color.

8K when?

They're not dyeing any time soon. They and AMD just got a massive gov grant for hundreds of millions of dollars to make our infrastructure's next super computers.

Pics or it never happened.

hello dyslexia my old fiend

The tech doesn't work like that. Your panel will be upscaling the 1080 to fill the screen. It will look like garbage.

i really recommend you learn about the subjects you talk about

May want to explain that to him, considering that's exactly what happens when a 1080p image gets stretched to 4k

Your image always reminds me of this

That's what I addressed in the first sentence of
The point I was making is that 1080p on a 4k screen looks exactly the same as if the monitor was native 1080p, so flipping back and forth between 1080p and 4k on a 4k screen is demonstrative of the difference between the two resolutions.

Did you even pass the 1st grade, or are you fucking blind so you can't read what I wrote.
The fuck on out of this thread retard, go play some minecraft.

You too nigger, read this

Remember, goy: Single core performance is everything!

It doesn't. The math works in theory, but it doesn't work that way in practice: the image is scaled/interpolated regardless and looks like shit.