Is mobile the reason desktop CPUs haven't advanced since 2008?

Yes and no; it's mostly the death of Moore's law.
However, since your average consumer only cares about three things:

1. Facebook
2. Youtube
3. World of Warcraft/Call of Duty

None of which require a powerful CPU, so Intel figured that instead of pouring billions into R&D to figure out how to overcome not being able to shrink gates much further, they'd funnel the money into their "power efficiency" department and call it a day.
Consumers ain't gonna give two fucks if your new processor has 24 cores or can run at 5 GHz without requiring liquid nitrogen, but they'll hail you as the second coming of Christ if you tell them that with your new product they can browse Facebook and 9gag for six hours instead of three.

Heck, your average Joe doesn't even want a desktop PC anymore.
They just want a phone so they can like their bestie's Taco Bell burrito photos while on the bus, a tablet so they can watch PewDiePie videos in bed, and maybe a gaming console so they can play the new [Call of Duty 12: Modern Warfare In The Past] while telling everyone how many times they banged your mother.

You forgot about Pokemon Go. It's all about catching pokemon fam, get with the times.

Every watt shaved off a CPU or GPU means roughly another watt saved in cooling; when you are running hundreds of servers 24/7, every watt you save by going with a more power-efficient CPU/GPU adds up in operating costs.
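
To put rough numbers on that, here is a back-of-the-envelope sketch; the fleet size, wattage, electricity price, and cooling overhead are all made-up assumptions for illustration, not figures from the thread:

```python
# Rough estimate of yearly savings from a more efficient CPU across a server fleet.
# All inputs below are illustrative assumptions.
servers = 500              # number of machines
watts_saved_each = 20      # W saved per machine by the more efficient CPU
price_per_kwh = 0.10       # USD per kWh
cooling_overhead = 1.0     # ~1 extra watt of cooling per watt of IT load (PUE ~2.0)

hours_per_year = 24 * 365
kwh_saved = servers * watts_saved_each * (1 + cooling_overhead) * hours_per_year / 1000
print(f"~{kwh_saved:,.0f} kWh/year, ~${kwh_saved * price_per_kwh:,.0f}/year saved")
# -> ~175,200 kWh/year, ~$17,520/year saved
```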

Heat and Moore's law are what's holding them back.

I don't think Moore's law is holding them back. I think the market is shifting because people are moving to cloud computing. That's why server processors are getting preferential treatment for development. Clients (the average consumer) in the mainstream computing market are no longer required to be fast. They just have to connect to a faster computer.

Look at Dell. Ever since they bought EMC (Sept 7, 2016) for their superior server hardware, Dell is no longer "Dell Computers"; they are now "Dell Technologies" and are shifting their focus even more towards enterprise servers and cloud computing services.

If you look at Xeons, you will find they tend to be clocked lower, with more cache and deeper queues. They also have many cores.

This is for many reasons (the nature of server applications, etc.), but also to lower heat and energy use, and the reason you need to do that is Moore's law.

We are at the point that transistors can't get much smaller before hitting the quantum tunneling effect or get much faster without hitting diminishing returns on heat/energy/speed.

CPUs are being intentionally held back to promote thin client "cloud" solutions.

Stop please, stop. I cannot take it anymore.

A monopoly is holding Intel back. Why bother competing when AMD has no viable products until Zen drops?

Asians are strange. You never see attractive ones in real life, but you rarely see ugly ones posted online.

As a cynic, I believe the military and surveillance state want a bigger comparative advantage over their subjects, and that's why they choose to delay the release of new technologies.

LEARN THE PHONE'S PRONOUNS YOU SHITLORD

A good-looking Asian girl has a career almost guaranteed.

It's because multicore programming is hard. It's done deep in industry (I do it in networking) but Pajeet-tier software domains can't do much with it. So there is a market for servers with 20 cores but desktop users don't need more than 2.
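
For a sense of what "using more cores" actually demands of the programmer, here is a minimal Python sketch (the workload and chunk sizes are invented for illustration): even a trivially parallel job needs an explicit pool, manual chunking, and a main-guard, and anything with shared state gets much harder than this.

```python
import multiprocessing as mp

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- a deliberately CPU-bound toy job."""
    lo, hi = bounds
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    # Split the range into chunks and farm them out to worker processes.
    chunks = [(i, i + 50_000) for i in range(0, 400_000, 50_000)]
    with mp.Pool() as pool:                      # one worker per core by default
        total = sum(pool.map(count_primes, chunks))
    print("primes below 400000:", total)
```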

Why would they focus on desktop CPUs right now? That's retarded.

No. We've hit a wall. Way back, new models had higher frequencies and with that came a huge speedup. But you can't keep increasing the frequency forever: the higher the frequency, the more heat you produce, and good luck dissipating that much heat at such a small scale.
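
A rough way to see why, using the usual dynamic-power approximation P ≈ C·V²·f; the voltage figures below are illustrative assumptions, not measurements of any real chip:

```python
# Dynamic power scales roughly with capacitance * voltage^2 * frequency,
# and higher clocks usually need higher voltage, so heat grows faster than linearly.
def rel_power(f_ghz, v, f0_ghz=3.0, v0=1.0):
    """Power relative to a 3.0 GHz / 1.0 V baseline, using P ~ C*V^2*f."""
    return (f_ghz / f0_ghz) * (v / v0) ** 2

print(rel_power(4.0, 1.15))   # ~1.76x the baseline heat for a ~33% clock bump
print(rel_power(5.0, 1.30))   # ~2.82x the baseline heat for a ~67% clock bump
```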

Read about indium gallium arsenide. For years it's been considered to have more development potential than silicon, but it's harder to work with. Intel has suggested they may use it in 2017, and that they may need to resort to it to get beyond 10 nm.

en.wikipedia.org/wiki/Indium_gallium_arsenide

eetimes.com/document.asp?doc_id=1323892

I remember reading (although I can't find the source now; the Wikipedia page used to link to it, but it's not there anymore) that even in the '90s U.S. government labs were experimenting with InGaAs and had transistors in the 100-200 GHz range, so I wonder what they might have now.
Also that InGaAs was expected to lead to processors with an upper limit of around 850 GHz.

If we could get stronger desktop hardware, desktop users would be able to do more than what they currently do. Limiting them to the same old crap year after year, while suing almost everyone who creates anything worth a damn over some retarded patent infringement that covers bubble sorting or some other retardation, is a recipe for stale innovation and tech communities that rot from the inside out.

2017: Year of the Netburst™

Why the fuck is it impossible for me to come to this board without having to cringe at least once at how embarrassingly disconnected from reality you people are?

Do any of you dipshits even own a CPU made in the last 5 years?

No wait, don't answer that, I already know the answer: "Hurr no because muh Intel botnet!"

AMD has no viable products until Zen drops
I came here to preemptively tell you that Zen will disappoint everyone.

Zen will be an 8 core Haswell. Not a disappointment.


WHY

Yes, they're all ARM :^)


Would be sweet if it works out


This is likely to be a major problem when the question of how to dispose of such chips comes up.

Yes, several personally, plus a DC full of Xeons.

You might be full of shit, you might want to have that looked at.

Wouldn't surprise me if a new law were lobbied into place making it illegal to discard the CPUs or store them in "unsafe" conditions (i.e. outside of a motherboard socket), instead forcing you to pay to return them to Intel, so Intel can reuse the raw materials you so graciously gave them for free to make new processors they can sell at huge markups.

It's your wet dream for this to become true to support your Jew memes, isn't it

I see no problem with that, my shitpad will still be useful in the current decade.

youtube.com/watch?v=IuLxX07isNg


What the fuck is this user talking about? What does he want, us telling him to look into overclocking, or telling him that a CPU is only as good as the rest of the computer? IBM makes server processors that clock at 5.0 GHz. GPU chips are trash in my opinion, though it's a biased opinion after making the mistake of buying a Surface RT.

Where I work right now there's just one really attractive girl and she's Asian, but yeah, they're rare.

You're not supposed to eat them, dingus

You know, it's amazing how much money is spent fabricating what is basically a tiny little imprint on some shitty piece of rare metal.

If only people had +10 IQ on average, as if by magic, in those autistic lobes of theirs; we would all be using Debian on custom-made chips printed by Intel. I heard they did Bitcoin rigs a while back for some user.

At least I can rest easy with the ThinkPad x220 I bought on eBay for $200.

Dear goyim, we are entering the first dark age of the silicon industry.

You obviously have no idea what you're talking about. Moore's law is still valid and has been valid this long for a reason: Moore worked for Intel, and since then Intel has been setting its R&D budget so that the law would hold.
5 GHz will never exist. Ever. Babby's first electronics shows 3.4 GHz is the physical limit.

uh wut?

There's been an 8 GHz processor before; the catch being that it needed to be cooled with liquid helium.

I personally have a 4 GHz CPU. It's clear that getting to 5 GHz may be bloody difficult with conventional cooling, but I have no idea where you got this 3.4 GHz figure from.

Some processors can work at 12 GHz, though they need liquid nitrogen cooling.
3.4 GHz is definitely not the physical limit, and it's obvious you've never done such a thing as "babby's first electronics".

You may want to try being less abrasive if there's a chance you're wrong.

Show me your 8 GHz Intel processors and how they were high-frequency AC engineered.

Commence back-pedalling.


There is no limit. It's an exponential curve of diminishing returns against heat dissipation.
Now go back on your meds, you American cunt.

singularity fags btfo

Moore's Law isn't holding them back, Moore's Law is what they've been unable to continue achieving. They simply cannot improve performance by cramming more transistors into the chip anymore, so they've plateaued and all we're seeing is minor improvements in power consumption.

Why else do you think they're wasting so much time integrating fully featured SoC NSA backdoors now? There's nothing else for them to do but twiddle their thumbs and wait for someone to get quantum computers all sorted out.

this

look at this outgroup faggot and laugh

Moore's law is still true. You can still cram more shit per chip.

On x86. :^)

What the fuck kind of rock are you living under? Intel hit 4 GHz forever ago; even historically shit-clocked architectures like SPARC are sporting 32 cores at 4.13 GHz now, and IBM is clocking their mainframes up to 5 GHz.

Why the fuck did you bother to make this distinction? What do you gain?

you are ignoring clock speed and heat in your assumption.

also this:

It's about time to push CPU development into areas other than raw power, which often does jack shit (see: AMD). I'm pretty happy to see that my i5-6600 runs at room temperature.

Desktop CPUs don't need to advance. Our CPUs are already very efficient. The software, on the other hand, sucks dick. We think we are so smart with our optimizing compilers and retarded, unsafe languages, but we're still in the middle ages of software development.

Hardware-wise, the only things we're missing are:
- Cheaper ways to produce CPUs in order to cram lots of them in a box
- Better memory/bus architecture to be more efficient with multiprocessing

It already existed three years ago, you cancerous retards.
amazon.com/Amd-FD9590FHHKWOF-Fx-9590-8-core-Black/dp/B00DGGW3MI

lots of overheating complaints in the comments.

The CPU has a lot of features stripped from it to achieve that speed.

This, to some degree. I wouldn't blame unsafe languages though.

I had no need to upgrade from my Core2Quad that I bought about 10 years ago. The main points of failure were

I still run my old C2Q PC with an SSD in my lounge room as a media center; it'll also run pre-2015 games just fine on the 680ti.
Ultimately, it's the failure of programmers to properly use the resources allocated to them. Modern software developers have almost no concept of how data is represented in physical hardware, or of efficiency, etc. So I wouldn't blame the languages, but the programmers using them. Most projects are a mess of libraries built on top of other messes of libraries; the inefficiency stack is pretty high.

I would risk it.

No, because the X5677 and X5650 (over seven years old) OC to 4 GHz+, cost about $50, and compare reasonably to newer CPUs.

Are you fucking retarded? Compare a Core 2 Duo CPU to what is out now and kill yourself.


What's your point in particular? An X5677 is plenty good enough to run any game even at its stock speed; the OC is just a bonus. It's a seven-year-old CPU and there's nothing wrong with it.

The better design of newer CPUs offers minimal improvement clock for clock. And oh no... it doesn't have AVX.

He's wrong, but change that to 2010 and he's right.

The irony being that WoW is mostly CPU-bound, performance-wise.

Posting from a month-old Shitlake box right now.
It feels no fucking different from any of my Nehalem or Sandy Bridge hardware in any way that can't be explained by the SSD.

Maybe if you're a gamertard you'll feel a difference in your garbage console ports, but for the rest of us the advancement is minimal, and even the on-paper advancement since ~2011 is fucking pathetic. Jerk off to your 5% diminishing returns somewhere else.

Gee, it's almost as if a CPU rated with a 220W TDP requires a high-performance cooling solution to keep temperatures in check.

Cue idiots running it in a case without proper airflow and this is what happens.

Wew, someone replied for me in another thread while I forgot about this one (the replies weren't worth reading anyway, so it's no big deal):

The maximum frequency beyond which EE fucks up is about c/L, where c is the speed of light and L is the length of the circuit. Beyond 3-4 GHz (for CPUs) there is no choice but to make smaller circuits. At frequencies beyond c/L there is no U = RI, no P = UI, and no Kirchhoff's laws, not to mention that it becomes impossible to guarantee two components stay synchronous enough, since the electrical potential won't be the same along a wire.
Now back to Reddit. All of you.
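
Plugging rough numbers into that c/L estimate (the circuit lengths below are illustrative assumptions, and the real limit is lower since signals travel slower than c in copper):

```python
# f_max ~ c / L: above this, a signal can't traverse the circuit within one clock cycle,
# so the lumped-circuit laws quoted above stop being good approximations.
c = 3.0e8  # speed of light, m/s

for label, length_m in [("10 cm board trace", 0.10),
                        ("3 cm package/die span", 0.03),
                        ("1 cm on-die path", 0.01)]:
    print(f"{label}: ~{c / length_m / 1e9:.0f} GHz")
# -> roughly 3 GHz, 10 GHz, 30 GHz
```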

Jews do not meme, they lie.

Yes, mobile became so popular that everyone has to cater to mobile devices, which are underpowered compared to desktops.
You can still use a computer from 2010; just put an SSD in it and it will do pretty much anything you need it to as well as a new one, except video games of course.
The only thing that's going to make older computers obsolete quickly is that old GPUs/IGPs lack hardware-accelerated decoding for the H.265 and VP9 codecs, so YouTube and other video streaming sites will become unusable on computers that ran 1080p H.264 perfectly (one possible workaround is sketched below).
It's not all bad though: all the focus on mobile means that, at least in 2016, cheap and shitty laptops have great battery life, whereas in 2006 you were happy if you could get a couple of hours out of a consumer-grade laptop.
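
As one way around that on an old box, here is a sketch using youtube-dl's Python API to prefer the H.264 (avc1) streams that old GPUs can still hardware-decode; the format filter uses youtube-dl's standard selector syntax, but treat the exact option values as assumptions to check against your installed version (the URL is just the one posted earlier in the thread):

```python
# Sketch: fetch the H.264 version of a video so an old GPU/IGP can hardware-decode it.
# Requires `pip install youtube_dl` (yt-dlp accepts the same options).
import youtube_dl

ydl_opts = {
    # Prefer avc1 (H.264) video over VP9/AV1, falling back to the best MP4, then anything.
    "format": "bestvideo[vcodec^=avc1]+bestaudio/best[ext=mp4]/best",
    "outtmpl": "%(title)s.%(ext)s",  # save as <video title>.<extension>
}

with youtube_dl.YoutubeDL(ydl_opts) as ydl:
    ydl.download(["https://www.youtube.com/watch?v=IuLxX07isNg"])
```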

People want convenience, not productivity