Commercial microprocessor transistors to stop shrinking past 2021

semiconductors.org/clientuploads/Research_Technology/ITRS/2015/0_2015 ITRS 2.0 Executive Report (1).pdf

fudzilla.com/news/processors/41190-transisters-will-stop-shrinking-in-five-years

After 2021, the report forecasts, it will no longer be economically desirable for companies to continue to shrink the dimensions of transistors in microprocessors. Instead, chip manufacturers will turn to other means of boosting density.

In fact, this is the last ITRS roadmap, and the end of a more-than-20-year-old coordinated planning effort that began in the United States and was later expanded to include the rest of the world.

may as well buy that 1070


Why bother making better processors in current year when you can just use (((cloud computing)))?

Why the parenthesis? Some weird attempt at bolding text?
The board supports text bolding natively, fyi.

In the immediate future stacked memory and stacked cores will continue scaling performance and memory density in parallel domains such as machine learning and computer graphics. We may see breakthroughs in heat management by necessity for this kind of scaling.

Overall I think this is a good thing because it will force the manufacturers to adapt and explore novel chip designs, architectures or even entirely new computing paradigms instead of making incremental improvements to the last generation of silicon semiconductor chips.

...

Because it's current year summer.

DAMN IT SLIM JIM

Cloud/HPC drives processor demand, user.

The parentheses are from Holla Forums and imply malign Jewish derivation.

What I don't understand about 3D processing is how it overcomes existing thermal performance limitations. I guess you could stack two 65w dies on top of each other, but that puts the chip at the current high end of power consumption. That's not very 3D.
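
Quick arithmetic on that point (the die area and the everything-else-held-equal assumption are illustrative, not real package specs): stacking doesn't reduce total heat, it concentrates it in one footprint.

```python
# Naive power-density arithmetic for die stacking.
# die_area_cm2 is an assumed footprint, not a real part's spec.

die_power_w = 65.0   # each die, as in the post above
die_area_cm2 = 1.5   # assumed footprint, roughly a midrange CPU die

single = die_power_w / die_area_cm2        # one die in the footprint
stacked = 2 * die_power_w / die_area_cm2   # same footprint, twice the heat

print(f"{single:.0f} W/cm^2 -> {stacked:.0f} W/cm^2")  # ~43 -> ~87 W/cm^2
```

The heat flux through the footprint doubles while the cooler's contact area stays the same, which is why stacking runs into thermal limits before it runs into fabrication limits.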

There was an addon for (((Chrome))) (Firefox too?) that did things like that, look it up. It's now a grease monkey script too I believe.

It doesn't even have HBM; none of the 1000 series cards do. AMD driver flops don't mean that 3D-stacked memory isn't the future for everything — just look at the new Samsung SSDs, for example. May as well hang on to what you have until there's an HBM device in your price range, and hold off on the glorified 900-series refresh with a price bump for now.

It's worth noting that this has very little to do with being able to make smaller transistors and everything to do with it being cost-effective. Just like having multiple metal layers was the holy grail in the past, so is having multiple silicon layers today. A 3nm FinFET was demonstrated in 2006[1], made by ion milling with something akin to an electron microscope.
Pretty much all of the time is spent in charging the metal these days. The wires are comparatively huge when put next to the transistors. Wires are also larger the higher the metal layer you're on, so when you hit metal 12 it just gets ridiculous. Being able to place transistors physically closer to each other will be the biggest win in recent memory. Basically all gains you're seeing nowadays are from shrinking the amount of metal per layer.
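
The wire-delay claim above can be sanity-checked with a crude Elmore-style distributed-RC estimate (the cross-section, resistivity, and fF/µm figures below are generic rules of thumb, not any particular process node):

```python
# Rough delay of an on-chip copper wire, modeled as a distributed RC line:
# delay ~ 0.5 * R_total * C_total. All dimensions are illustrative assumptions.

RHO_CU = 1.7e-8     # copper resistivity, ohm*m (bulk; thin wires are worse)
C_PER_UM = 0.2e-15  # ~0.2 fF/um, a common rule-of-thumb wire capacitance

def wire_delay_s(length_um, width_nm, thickness_nm):
    length_m = length_um * 1e-6
    r = RHO_CU * length_m / (width_nm * 1e-9 * thickness_nm * 1e-9)
    c = C_PER_UM * length_um
    return 0.5 * r * c

# A 1 mm minimum-pitch wire with a 50 nm x 100 nm cross-section:
d = wire_delay_s(1000, 50, 100)
print(f"{d * 1e12:.0f} ps")  # ~340 ps, i.e. hundreds of gate delays
```

Gate delays are on the order of picoseconds, so a millimeter of thin wire dominates by a couple of orders of magnitude — which is why shrinking wire length (by stacking or otherwise) pays off more than shrinking the transistors themselves.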

[1] eetimes.com/document.asp?doc_id=1160025

Wouldn't we move on to architectures such as RISC by then?

Is there any way to ballpark the performance increase of HBM on a chip like Pascal?
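
Very roughly, and only for memory-bound workloads. A sketch using public peak-bandwidth figures (GTX 1080's 10 Gb/s GDDR5X on a 256-bit bus vs. Tesla P100's 1.4 Gb/s HBM2 on a 4096-bit bus); the linear-scaling speedup model is a naive assumption, not a benchmark:

```python
# Back-of-envelope bandwidth comparison. Speedup model assumes the
# memory-bound fraction of a kernel scales linearly with bandwidth
# and the rest not at all (Amdahl-style) -- a deliberate simplification.

def bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak bandwidth in GB/s = per-pin data rate (Gb/s) * bus width / 8."""
    return data_rate_gbps * bus_width_bits / 8

gddr5x = bandwidth_gbs(10, 256)    # GTX 1080: 320 GB/s
hbm2 = bandwidth_gbs(1.4, 4096)    # Tesla P100: ~717 GB/s (4 stacks x 1024-bit)

def speedup(mem_bound_fraction, old_bw, new_bw):
    return 1 / ((1 - mem_bound_fraction) + mem_bound_fraction * old_bw / new_bw)

print(f"{speedup(0.5, gddr5x, hbm2):.2f}x")  # ~1.38x if half the time is memory-bound
```

So roughly 2.2x peak bandwidth, but real gains depend entirely on how memory-bound the workload is; compute-bound kernels see almost nothing.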

My interest in newer processors ended when I learned they are being backdoored.

and to that objectively worthless filth it should return

I've got bad news for you, buddy.

You know what?
I want a return to good old BJTs, none of that CMOS crap we're being served these days.

...

It wouldn't be so bad, having a 1070....

>>>Holla Forums
>>>/leftytv/
>>>/leftyb/

/leftyb/ was one thing, but this is getting ridiculous and autistic on a level I didn't think was possible.

yes Holla Forums it's the jews. the jews invented quantum tunneling and now it looks like 5nm transistors will never work.

we're not hitting the hard limits of fabrication and materials science it's the jews making you pay for amazon ec2.

are you triggered, snowflake~

if you removed the latency issues, the supercomputers from Permutation city seem Rad-o.

Can't come soon enough.


Once it starts it just don't stop.

Holy shit they need all of the safe spaces

The reason these limitations exist is because we've never had to deal with them before. Having to stack dies will force new innovations to be developed and adopted, such as electrolytic liquid cooling systems that can both efficiently deliver power and dispose of excess heat.

gnu.org/philosophy/who-does-that-server-really-serve.en.html

Maybe people will start writing better software now.

Good one.

Control? No, just calling these weak manchildren what they are. I can't cure incontinence over the internet.

I dunno. Once we get to the stage where liquid cooling becomes necessary, it quickly leaves consumer territory, because your average normalfag doesn't want to deal with plumbing just to play whatever new AAA game is programmed awfully enough to need 3D dies — and the normalfag market is where you make your moolah.

and we will show P = NP, right?