7nm 5nm 4nm the end

The End of Moore's "law".
What happens next, why quantum bullshit is wrong and how everybody is losing their minds.

Other urls found in this thread:

en.wikipedia.org/wiki/5_nanometer
metrics.torproject.org/bandwidth.html
cl.cam.ac.uk/teaching/2006/CompArch/documents/all/trends/cpu_history.html#8086
cl.cam.ac.uk/~sd601/papers/mov.pdf
twitter.com/SFWRedditImages

What comes next is RISC-based processors running Linux or *nix systems, since you can save energy on heat dissipation due to more cores dedicated to specific tasks, along with parallel computing to distribute loads. But that means programmers/pajeets learning to code in an order-of-execution-accountable way. Otherwise yes, you are correct. 10/10 bait with honeypot.archive and kike.com.

Gigakike's
I used to subscribe to them, but they progressively shifted more from their centrist positions to the left, and once Trump entered the election race, they became unhinged.

It's as if nobody had ever heard about logistic functions before. Should've been called "Moore's S-curve".

So Android?

The ARM architecture is built around a few cores running single lines of execution because pajeets programmed the platform. Even Linux or Unix based systems will have to be rewritten somewhat to accommodate execution order over, say, 32+ 800MHz cores. Android is a pile of dogshit because of pajeets not accounting for multiple cores and lines of execution in their programming.

The ARM architecture is also a pile of dogshit in its design for not accounting for parallel execution orders efficiently, being made for single-core execution orders. It is carrying the baggage that is h264 decoding/encoding, along with ARM (((kikezone))) being embedded on each chip, further reducing security and energy efficiency. The future is something like RISC-V, but in desktop/laptop form. Assuming kikes ever release a non-kiked consumer version to plebs. Also assuming someone designs an OS with a language that keeps large and different execution orders in mind at low cost to the programmer. So not C or Rust. It would have to be a new language taking advantage of the parallel execution order of RISC for reduced heat.

ARM was made by bongs at Acorn

I thought code parallelization was the programmer's job, not the job of the language.

If you are programming in assembly then yes.
The whole intent of the language is to be an abstraction of the assembly language beneath it. If the abstraction doesn't use the features of the hardware, like parallelization, then it's useless for RISC, i.e. C and Rust. Since every other language is just an abstraction of C or compiled by it, all current and past languages are useless for parallelization at the assembly level. Hence making a new language to use the hardware's features.

To continue: sure, C uses some parallelization, but most certainly not at the assembly level. More like jumps between different processors at random times to reduce instructions executed on a single processor. But the line of instructions is still serial and not parallel.

Circuits are implicitly parallel, and there is quite a lot of parallelism involved even in a single core, all to create the illusion of sequential steps. You as the programmer express intention, but you are not concerned with how the instructions are actually evaluated, only that they produce a correct output in a certain time. I imagine future designs will rely less on the programmer to decide what should be executed in parallel, and more on an interplay between the compiler and the CPU. If that's not an option, we're going to have to turn everything into a CUDA-like style.
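To make the "programmer expresses intention" point concrete, here is a minimal sketch (stdlib Python only; the helper names are made up for illustration) of the current division of labour: the programmer explicitly marks what may run concurrently, and the runtime decides how it is actually scheduled:

```python
from concurrent.futures import ThreadPoolExecutor

def square(n: int) -> int:
    # A pure function with no shared state is trivially safe to parallelize.
    return n * n

def parallel_map(values):
    # The programmer opts in to concurrency here; how (or whether) the
    # calls actually overlap is decided by the runtime and the hardware,
    # which is exactly the division of labour described above.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(square, values))

print(parallel_map(range(8)))  # → [0, 1, 4, 9, 16, 25, 36, 49]
```

In CPython the GIL means these particular calls won't overlap on separate cores; swapping in a process pool (or a CUDA-style kernel launch, as the post suggests) changes the execution strategy without changing the expressed intent.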

How does it feel being brain damaged?

Besides rust, name one.

Oh, you're a rustfaggot. That explains the damaged brain. Probably lacking in the frontal lobe department. But anyway, Fortran, Pascal, Ada, Lisp, Scheme, Forth, Haskell, Prolog, Parasail... I could go on, as at a minimum all languages that predate C count, but you get the idea.

Oh good. Now name one that accounts for parallelization without using jumps as a stand-in for parallelization at the assembly level. Or a processor that allows for such to begin with, and where heat doesn't destroy the performance or energy efficiency of said parallelization. Hint: it's not x86.

How about every language I just mentioned? Although Parasail tops the list when parallelism is involved, having implicit parallelism absolutely everywhere. Your brain damage is preventing you from understanding that as long as a language has some sort of parallel semantics, it can be compiled to involve parallelism at the assembly level should the architecture allow it.
And you know nothing of CPU design as well. What are pipelines, faggot? What is multicore? What about hyperthreading? What is out of order execution? What are multiple ALUs even used for anyway?

Slow instructions executed in serial that then
Have to be put into order by
A slow scheduler that takes up die space which causes large amounts of heat as it slows down the execution of
This. Those logic units are general, and not dedicated to a task, taking up further space on the die and creating more heat than necessary. No one needs ancient and slow assembly calling an ALU that is only on the die for backwards compatibility.
Right, but most (x86, x86_64, Itanium, MIPS, and PowerPC) don't, and what you see is out-of-order execution that then takes up CPU cycles that could be used for something else. ARM kinda does, but it's shit due to backwards compatibility à la h264 on the die along with (((kikezone))). RISC-V is the future to compensate for all the above. But that means a language that accounts for parallelization of commands at the assembly level, and not a dedicated out-of-order scheduler on the CPU die.

Like bees against the honey, right?
You americans are stoopid, having only 2 jew-controlled parties in a country really warps reality.

>ARM (((kikezone))) being embeded on each chip further reducing security and energy effeiciency
You have no idea what you are talking about, nigger. Come back when they start teaching you hardware architectures and virtualization at your college
So what, 16-bit microcontroller with (((le cuck licence))) in desktop form? R-right.
That's called javascript

10/10 bait

This is so bullshit. I can do 10 billion sha256 calculations per second on my fucking desktop now, and that's still scaling out of control. Just because Pajeet can't get his Visual .POO going any faster doesn't mean WHITE MALES aren't making it happen erry day with different methods.
It's time to learn how to write code that scales. That is what will set you apart from the flood of Hour of Code pythonistas and pink haired tech evangelists. The Cray X1 was one of the fastest supercomputers in the world in 2004 at 5.9 teraflops which is midrange for goddamn VIDEO GAMES today. Scaling is still happening, ignore the Intel Jews who want you to accept the engineering process they stalled with $300M of diversity.
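The hashing throughput claim can be sanity-checked with a short stdlib-only benchmark (a rough sketch; absolute numbers depend entirely on your CPU, and 10-billion-per-second figures come from GPUs/ASICs, not from an interpreted loop):

```python
import hashlib
import time

def sha256_rate(iterations: int = 100_000) -> float:
    """Rough single-threaded SHA-256 hashes-per-second estimate."""
    data = b"\x00" * 64  # one message block's worth of input
    start = time.perf_counter()
    for _ in range(iterations):
        hashlib.sha256(data).digest()
    elapsed = time.perf_counter() - start
    return iterations / elapsed

print(f"{sha256_rate():,.0f} hashes/sec (single core, interpreter overhead included)")
```

Most of what this measures is Python call overhead; the same point about scaling stands, since the hash itself parallelizes trivially across cores and devices.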

I don't care about your stupid sha calculations.
I care that my web browser is working slower and slower, every year even slower

Then stop using your browser to access things that think it's an operating system meant to run programs.
Install netsurf. Use that for everything that isn't both mandatory and script cancer.

Care to explain about ""kikezone"", fellow programmer? Is it about cortex A##?
i'm only using low level cortex m# for menial and industrial tasks, maybe some RTOSes along the way.

I don't see how the bourgeoise jew could be left.
Oh wait >>>Holla Forums

You're retarded, aren't you? Who's George Soros? Who do you think are the mega donors of the Democratic Party? Or of all the European Left?
The Bourgeoisie are the Left.
And they're both the Jews.

not a leftist. funded democratization efforts in post-soviet states, and dissident movements in soviet states. his organization cites "freedom of commerce" as an important part of free societies.
corporate shills.

"the bourgeois are the left" is in a very literal sense self-contradictory. It's like saying "anarchists are reactionaries", or "AI research is primitivist"
Unless of course what you mean by "left" is 'niggerdicks and feminism lol'

The "Classical Left", which must be what you're referring to, instead of the "De Facto Left", which is what the Left is today, simply doesn't exist anymore.
It lost traction everywhere and got replaced by IdPol. IdPol, and "Cultural Marxism" in general, is just Marxist (Hegelian) rhetoric expanded.
They're both the same thing, they're both used as disruptive agents as well. Their tactics are the same, their core peoples are the same, their trajectories are the same.

But it's hard for "Classical Leftists" to understand that because of the nature of it all. "Classical Leftists" see their movement as self-sufficient and as an ideology of its own, when in fact it's not, it's just a tool, and once it got exhausted, new ones replaced them.
But "Tool" for who? Well, the ruling class, Bankers and Elites - A revolt of the Economic Materialistic Class against the Monarchical/Local Spiritual Aristocracy.
Communism and Capitalism then, are quite the same, with Materialistic doctrines, being one of the most simple methods to transform a Capitalist system in a Communist one just stamping a 100% Taxation at everything and transferring responsibilities to the State therefore.
It's never going to happen, it was not made to happen and human nature prevents it from happening. If an Anarcho-Something order were to emerge anywhere, in one generation or two an hierarchical system and private ownership would follow.

In the end, again, it's hard for a "Classical Leftist" to see what's going on because they trust their movement as a legitimate force for good, when it's actually just a Capitalist Tool (one out of many) against a Sovereign Nation/Regime. What happened in Russia, for instance, was not based on the people's will, but on Bankers financing the destruction of the Romanov Dynasty, which held 1/4 of all the World's gold, which was a serious threat to the Jewish Banking Cartel (the Romanovs hated Jews, historically) and that were setting Russia to be the World Superpower in a few decades (The whole "peasant" shit people keep repeating is due to the Local "Feudal Lords" not wanting to abdicate of their lands - the Romanovs tried to free the peasants many times, just to see their base of support, the agricultural lords, turning against them. In fact, the Revolution was only possible due to the capitulation of these Lords, who forsaken the Romanovs due to their peasant reforms made by Stolypin and Witte).

...

No it isnt.
hello, bootlicking cuckold. Did you let the lord fuck your wife when you got married like a good little lowblood? God wills it.
human nature is anarcho primitivism, where leaders sometimes dont exist and when they do don't have a lot of power over anyone, and the most imposing authority is the need to get along with your group. insofar as some system of hierarchy is stronger and taller than that, its less aligned with human nature. Some think we'll achieve something with no more hierarchy than that at all, others that we'll get closer, but your delusional authoritarian retardation is the furthest possible thing from human nature.

Well, I knew you wouldn't understand.

It's still trashy destructive one-way computing, and the smaller we get, the shorter it lasts. When can we expect reversible computing on an OpenPower-like architecture?

waiter, there is a faglioli in my soup

I cannot choose browser, I have to use the one that Tor niggers provide.
You can use netsurf with Tor but it will be fingerprinted to death.

Never. At least not in a commercial environment.

That's not the answer I want

Then find an environment where money does not matter. There is always room for dreams in the academic community. Maybe you can convince some spooks or politicians in the intelligence crowd that your ideas would soothe their paranoia. High level corporate managers are always looking for prestige projects to demonstrate their commitment to future growth potential. And there is the possibility of creating an enthusiast community, if you can get the price of entry below that of a sports car.

That's a different issue, that's just creeping bloat.

molecular transistors have already been created, user, even transistors based on genetic organic material that are thus powered by the humans using them

Another level of planned obsolescence, I see.

hardware won't be expanding for the bloat to fill, though.
and with the attitudes the people responsible for the software normies use seem to hold about it, I don't expect them to change their habits much. Their mindset about computers is basically the same as any normie's, in some ways even worse; they're just trained.

Add in trans-rights and victimhood mentally and that is quite literally what the left has descended into, at least in America and Canada. The whole left/right divide is really silly though, what we do need is National Socialism to rebuild our nations and to help all of our people aspire to greatness.

...

There's a fantastic amount of alternatives to smaller FinFET devices, some merely plausible, a few already demonstrated:
en.wikipedia.org/wiki/5_nanometer
Things are going to slow down, but they won't quite stop. Also, quantum computing is genuinely useful for many applications, but it is orthogonal to general performance.


My favorite idea is that of using FPGAs as an intermediary supplemental layer between GP CPUs on one end, and ASICs on the other, complete with a standard API, and a design intended to load arbitrary SIP cores during runtime from software, blurring the line between code in RAM and acceleration hardware. The only such effort I'm aware of at the moment is by Intel, who is contemplating FPGAs as a complement to their Xeon Phi platform.


Molecular transistors nothing, read my link, there are already atomic transistors. Of course, processes to economically mass manufacture them are another matter.


Opinions on whether a long-term stall in hardware would cause coders to clean up their act, or if software would simply grind to the limits of unusability and stay there forever?

How many times does "Moore's Law" have to be broken before people understand it can simply be worked around with better tech?

This time is for real, they're reaching the walls of matter itself.

Why is "law" in quotes? In science, a law is a formula derived by an observation that explains a phenomenon. Moore's Law is definitely a law.

Because it wasn't a law when he conjectured it, nor was it a law over the course of the years.
He said that things would roughly double every 2 years; they didn't always double, and it wasn't every 2 years.
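The quibble above is just arithmetic over an exponential. A minimal sketch (the 8086's roughly 29,000 transistors is the only real datum here; the rest is the idealized doubling rule, which real parts never tracked exactly):

```python
def moore_projection(n0: float, years: float, doubling_period: float = 2.0) -> float:
    """Idealized Moore's-law projection: start at n0 transistors,
    double every doubling_period years."""
    return n0 * 2 ** (years / doubling_period)

# The 8086 (1978) had roughly 29,000 transistors. Forty years of ideal
# two-year doublings is 20 doublings:
projected = moore_projection(29_000, 40)
print(f"{projected:,.0f}")  # ~30.4 billion: the right order of magnitude
                            # for 2018-era chips, but only on average.
```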

This is your problem now.

the jewish capitalists are the one financing and promoting SJW left, to make people focus on imaginary conflicts and gay/dog/wymen rights, instead of fact that 80% of wealth is in jewish pockets

It is true that those things are being promoted in the first place for the sole purpose of destroying social cohesion, but the reason isn't quite right. People don't give a fuck about being poor when their society is generally healthy in all other respects.

Will germanium save us?

No, it would only postpone the inevitable.

risc based processors on gnu* systems

If Red Hat closes the source to all the GNU GPLv3 components they own when Stallman dies, then no. Linux or *nix. This is because the different init systems, the musl C library, and BusyBox are not completely under (((their))) control.

I thought gallium arsenide was the wonder material to drive 250 GHz chips of the future?
Didn't Intel claim their 7nm chips due in 2017 would be made of it several years ago?
Whatever happened to that?

Kikes deciding the goyim don't need more secure and fast chips. It's probably reserved for the botnet AIs of the not-so-secret agency in Utah and GCHQ in London.

Browsing Tor is slow because Tor itself is slow; no amount of client side optimization will change the fact that there are more clients than Tor relays can handle. Browse even the shittiest sites on the clear web with a stock browser and you'll find it's quite performant. Especially Chrome and new Rust-ified builds of Firefox.

In fact, doing SHA hashing faster improves Tor's speed for you because all hashing in the Tor network is done with SHA.

not really:
metrics.torproject.org/bandwidth.html

You realize nobody's made a CISC ISA since the Pentium Pro, right? Honestly, it's a wonder Intel/AMD don't just add a mode to optionally bypass the decoder in front of their native RISC ISA, eventually phasing out 80X86 for software emulation.

lul wut nigger? The whole point of a new RISC-based architecture is to avoid old design flaws like execution-order schedulers on the CPU die and shitty age-old instructions that no one uses. If you really wanted backwards compatibility with x86, just stick an LGA 775 socket on the mainboard with libreboot-based software for bringup of some old Core 2 Duo. That way you get a shitty x86 processor that stays powered off for backwards compatibility and isn't as kiked, and a RISC-based processor for energy efficiency and security. But I doubt someone is going to make those without Intel completely assraping them with litigation.

They'll have to abandon x86 sooner or later, there's no solution to the current problems, only a breakthrough.
If or when it happens (it may unleash a decade of staleness), there will be a new architecture for sure, probably something not invented yet.

Was it ARM who was talking about emulating x86 before Intel quickly shut them down?

Yup, there hasn't been a true x86 chip since the mid-'90s. The Pentium Pro and all subsequent Intel CPUs are actually RISC, using a decoder on the front end to convert x86 ISA input to the internal RISC ISA's native "micro"-instructions. In fact, many of the PPro's contemporaries from the likes of AMD & NexGen were older RISC ISAs repurposed for the x86 market with a decoder slapped on:
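The decode stage described above can be caricatured in a few lines (toy mnemonics and a made-up three-operand micro-op format, not any vendor's actual internal ISA): a CISC instruction with a memory operand cracks into separate load, compute, and store micro-ops, while a register-only form maps one-to-one.

```python
def decode(insn: str) -> list[str]:
    """Expand a CISC-style 'op dst, src' into RISC-like micro-ops (toy)."""
    op, dst, src = insn.replace(",", "").split()
    if dst.startswith("["):            # memory destination: crack into 3 uops
        addr = dst.strip("[]")
        return [f"load tmp, {addr}",
                f"{op} tmp, tmp, {src}",
                f"store {addr}, tmp"]
    return [f"{op} {dst}, {dst}, {src}"]  # register form maps 1:1

print(decode("add [rbx], rax"))
```

The real front end is vastly more involved (variable-length decode, fusion, microcode for rare instructions), but this is the shape of the CISC-to-RISC translation being described.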
cl.cam.ac.uk/teaching/2006/CompArch/documents/all/trends/cpu_history.html#8086

Top kek, so there hasn't been any talent in hardware design since the early 2000s. Just copies of old shitty decoders with shitty old designs of RISC CPUs slapped together. Can't say I'm surprised. Kikes are gunu kike after all.

What comes to mind now is: if these are all old hardware designs, who designed them all? Why design them all in advance and not just release them all at once? Why bother keeping everyone on x86 decoders and not just optimize for a RISC architecture where the compiler makes scheduling decisions? What the fuck?

RISC machine code tends to be considerably less memory dense. Backwards compatibility is valuable. And you underestimate the gap between research and mainstream implementation of, well, everything.

No, no I don't underestimate it. Mainstream is pretty far behind on literally everything. At this point like 80 years from ZOG global military tech and 15 from (((academic research))) tech. It's just confusing why they wouldn't switch to pure RISC at this point to save them money/time on hiring pajeets for political (((moves))) such as importing invaders to europe and america. (((They))) could still keep doing what (((they))) are doing with keeping software pajeet tier, but it would save them the effort of manufacturing such old shit in china. Along with keeping around engineers who might rat them out because of someone needing to understand backwards compatibility.

It was Qualcomm emulating x86 and Intel got pissy, but Intel can't really do anything about it except flaunt money. Qualcomm BTFO and now Intel is planning to phase out x86 within the next 20 years.

*btfo'd intel

Lol no. Scientifically speaking, it should be called either "Moore's Hypothesis" or better, "Moore's Conjecture." See thermodynamics for actual scientific laws.

Qualcomm is just making noise. They've run into hard times and have gone on a campaign to get people's attention back.

I just wish Spintronics were a thing.

What's spintronics?

Electrons:
Spin Up = 1
Spin Down = 0

Then build a computer. It's the highest mark ever achievable and I think it's being suppressed and underfunded due to planned obsolescence.

mov is turing complete
cl.cam.ac.uk/~sd601/papers/mov.pdf
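The paper's core trick, for anyone who won't read it: conditionals become indexed memory accesses, so no jump/branch instruction is needed. A sketch of the idea (in Python standing in for mov-only assembly):

```python
def mov_select(cond: int, if_false, if_true):
    """Branch-free selection; cond must be 0 or 1.

    In mov-only terms:  mov [table+0], if_false
                        mov [table+1], if_true
                        mov result, [table+cond]
    """
    table = [if_false, if_true]
    return table[cond]  # an indexed load stands in for the branch

print(mov_select(1, "not taken", "taken"))  # → taken
```

Chain enough of these loads and stores and you can simulate any Turing machine, which is the paper's point; it says nothing about doing so efficiently.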

So are subleq and addleq. What's your point? None of them are appropriate for making serious CPUs with for various practical reasons.