What went wrong?
Shitty performance per dollar; Intel's Bay Trail Atoms beat AMD's E series
...
Intel's anticompetitive bullshit.
Human-readable text displaying the name of the architecture is a conspiracy now? I'm sure it was just there for fucking debug logs, not sabotage
Bulldozer
The patent system
Intel lost the court case and had to pay AMD because Intel was caught red-handed manipulating benchmarks. I won't spoonfeed you, but look at SYSmark and BAPCo. Intel basically sabotaged the whole thing and pissed everyone else off, including Nvidia, VIA, AMD, etc.
AMD marketing is shit. Look at the launch of the 7970 and the launch of the GTX 680. People ran around saying the 7970 sucked and lost to the GTX 680 and AMD was terrible. And a year or two later, the GTX 680 wasn't even competitive with the 7970.
Zen is looking really good, and the new GCN has some promise. Both will sell less than if Nvidia or Intel were selling the exact same chip and design. That's just how it is for AMD; they can't shake their "value oriented" image, and it's going to be really difficult for them to get rid of it, no matter how good their products are or how much their drivers improve.
Well they DO sell chips much cheaper than Intel, so it's natural they gained such an image
Raising chip prices would be suicide for them, so all they can do is race to the bottom
The only way out of this mess would be clever marketing, which AMD doesn't have
And the fact of the matter is AMD simply does not perform as well as Intel's equivalent offerings. Call it Intel playing dirty if you will, but I don't think Intel has shit to do with AMD's embarrassing driver support for Linux, or with Linux having better support for Intel in general. You can't blame Intel for that. You could maybe blame Intel for paying Microsoft to support them better, but that argument holds no weight in FOSS. AMD a shit
That's actually working in AMD's favor, because Intel has to pay AMD royalties for the 64-bit extensions, and AMD still fails to milk it properly
What went right? Except for a few very brief moments in their history, AMD's never been better than Intel.
Fuck the poor.
On a related note, I have been loving the shit out of these new amdgpu drivers.
My distro shit the bed on automatic set up, so I had to manually edit xorg.conf, but no more catalyst control center cancer and it just werks.
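In case anyone else has to do the manual edit: here's a minimal sketch of the relevant xorg.conf device section for the amdgpu driver. The identifier is an arbitrary placeholder, and whether you want the DRI option at all depends on your setup, so treat it as a starting point rather than a known-good config.

    Section "Device"
        Identifier "AMD Graphics"
        Driver     "amdgpu"
        Option     "DRI" "3"
    EndSection

Drop it in /etc/X11/xorg.conf (or a file under xorg.conf.d), restart X, and check the Xorg log to confirm amdgpu actually loaded.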
Their marketing is shit. The entire idea behind selling chips cheaper than Intel and Nvidia was to help people get more bang for their buck. And they couldn't even convey the fact that their products were a better value without people thinking they're cheap and shitty. They can't do anything right.
Their drivers sucked. Now their drivers are good. No one now cares about drivers. They used to be behind in frame times in Crossfire compared to SLI. AMD fixed that, now no one cares about frame times. My guess with Zen is that they'll fix their IPC problems and make a decent CPU, and no one will care about IPC anymore.
That's just how it always is with AMD. They're always one step behind, and they don't have the marketing grunt to make their accomplishments well known and liked.
When will amdgpu support 280x cards?
No tears were shed.
AMD doesn't even have a good price/performance ratio; it just doesn't hold up against Intel. Intel is more expensive, but their performance is far better than AMD's equivalent offerings, which are only somewhat cheaper at best
I wonder.
Another problem with AMD is they CANNOT properly scale down their architecture
Their lowest-tier general-purpose APUs are the E series, and the E series doesn't scale down below notebook class; the quad-core E series APUs are fucking desktop class
Intel's lowest-tier general-purpose SoCs are the Atoms, which offer quad cores all the way down to smartphones, with equivalent performance in tablet-class Bay Trails. AMD just can't compete
Their CPUs are fucked right now. They released Bulldozer and it was a failure, mainly because it was a high-frequency, low-IPC design and GlobalFoundries couldn't deliver. The module approach isn't that bad; it's just that those chips are designed to run in at least the mid-4GHz range, and most of them don't. Especially considering the FX 8100 series didn't even break 4GHz.
Their HEDT competitor is still a Piledriver CPU, which is several years old. But, in all fairness, it's probably the first time in a long time that a CPU that old has remained remotely competitive.
What are you talking about? Their cat cores were better than the Atoms of the time, and the GPU on cat core APUs is usually competitive with Intel's full laptop GPU (minus the high-end Iris ones). AMD has stopped developing their cat cores for the most part, so they've fallen behind. But at the time they were superior chips, especially considering the instruction set they had over Atom and the GPU.
Anti-competitive shit on the part of Intel
Years of growing use (for Intel)
Stagnated by much lower profits (today)
Actually much better for low-end rigs. High end, yes, Intel has better performance, and a notably higher price along with it.
AMD's cat cores never scaled down properly to anything below laptop class; Silvermont has equivalent performance to Jaguar but runs fanless in tablets
They prematurely designed around parallelism which ended up not coming to consumer programs and games for a full decade.
This move sacrificed single-thread performance, which is all that the benchmark and gaming mags cared about.
It also didn't help that Tom's Hardware knowingly used code compiled with the Intel compiler on AMD, which made AMD's thread performance look even worse than it was.
It should also be noted that AMD typically has better multithreaded performance while Intel has better single-threaded performance
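For anyone who hasn't seen what that compiler dispatching looks like, here's a rough Python sketch of the pattern Agner Fog documented: branching on the CPU vendor string instead of on actual feature flags. It reads /proc/cpuinfo, so it's Linux-only, and the two code paths are obviously made up for illustration.

    def cpu_vendor():
        # Same 12-byte CPUID vendor string the dispatcher checks:
        # "GenuineIntel" on Intel, "AuthenticAMD" on AMD.
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("vendor_id"):
                    return line.split(":", 1)[1].strip()
        return "unknown"

    # Vendor-based dispatch: an AMD chip with full SSE/AVX support still
    # lands on the slow path, which is the whole complaint.
    if cpu_vendor() == "GenuineIntel":
        print("taking the optimized SIMD code path")
    else:
        print("taking the generic fallback path")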
Yes but POWER has defeated all.
POWER is the most overrated architecture of all time; people are only interested in it because it's different and was made by IBM
The few POWER-based computers I had were nothing to write home about. Hell, even the hipsters at Apple dropped POWER after IBM couldn't deliver new chips while Intel began running laps around their offerings
POWER is actually slower than top Xeons, and you need a nuclear reactor to run it
POWER 8 and 9 picked up the game so much that Google has ported its codebase to POWER and is poised to switch.
Per thread it either trades blows or is SLIGHTLY slower, but POWER8 has 96 threads and POWER9 has 192.
It's a dog-eat-dog world out there
x86 tablets were a failure. Jaguar is a better netbook chip, without a doubt. It's the only laptop chip you can get with a decent GPU, decent CPU, and decent battery life.
I fail to see how a re-emerging market can be considered a failure, champ. Budget x86 tablets only started coming onto the market in mid-2015, and already they get better performance than their ARM counterparts for the same price
Debatable. Intel HD 7th and 8th gens have by far superior video decoders and take less of a performance hit at higher resolutions than AMD's parts
First of all, it's performance/price, you pedants. If it were price/performance, the best chip would be a rock that cost $1,000,000,000,000,000,000,000,000
If you're going to go rambling on the internet, can you at least try not to contradict yourself?
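Since the direction of the ratio is the whole argument, a two-line sanity check with made-up numbers:

    # Hypothetical chips: higher performance/price is better, and the
    # cheaper chip can win even with the lower benchmark score.
    perf = {"chip_a": 100, "chip_b": 80}    # benchmark score (made up)
    price = {"chip_a": 350, "chip_b": 200}  # USD (made up)
    for chip in perf:
        print(chip, round(perf[chip] / price[chip], 3), "points per dollar")
    # chip_a: 0.286, chip_b: 0.4 -> chip_b is the better value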
...
Just get a portable AC
Sure thing bro
Take that garbage somewhere else.
tbh, Intel not looking so good. Intel is too heavily invested in the desktop / laptop sector while the market has moved towards mobile.
yeah, Intel has unrivaled capacity at their foundries, but if nobody buys what you make, the foundries aren't helping you.
This is why Intel is laying off ~12k employees and refocusing on IoT / mobile / VR. IoT tbh is a joke and much more software-heavy than hardware. If anything, Google has the lead there already. Likely Intel will face greater problems in the future.
AMD should focus on graphics, improving PC offerings, and high-end server offerings, while adding more cores tbh.
IF AMD were to embrace FOSS and offer better drivers, they could manoeuvre into a much stronger position in the future. GNU+Linux is increasingly relevant to the mainstream; if AMD has better support there, then they are stronger in the future. Regarding graphics, AMD support for OpenCL is superior to Nvidia's, and OpenCL will eventually make CUDA irrelevant.
Intel's successor to Skylake is predicted to be a downgrade in computational power. x86 architecture / silicon is more and more tapped out. Parallelization is the future, which AMD already has an advantage in.
IF AMD succeeds with Zen then AMD has a bright future. Honestly, given AMD stock is < $4 / share (and was < $2 / share in February), I think it's worth it (if you have money to risk) to buy say 1000 shares of AMD and just see what happens. IF AMD succeeds with Zen I can see AMD hitting the $15 - $25 range in the near term following Zen. IF OTOH Zen is a failure, AMD will probably go bankrupt.
Zen isn't looking like anything at all at the moment. The question isn't even the immediate performance jump; it's whether the architecture has the ability to evolve further.
The first step would be making a non “value oriented” CPU. Their models couldn't top Intel on total performance, single-thread performance, or energy efficiency for years.
An 8350 beat its competitor at multithreaded integer workloads when it came out, m8. Stop comparing old CPUs to current ones.
Not the same magnitude, friend.
3dfx was the king among first 3D accelerator makers.
Ironically, for a long time it was the opposite for CPUs. Because of their design, AMD processors need different optimizations to work optimally than Intel's and unlike GCC, MSVC didn't even have the capability to do this for a long time. Which is probably why in benchmarks with FOSS software, AMD and Intel often come out noticeably closer than with proprietary software.
I remember seeing some 7zip and WinRAR benchmarks; the 7zip results were almost the same, while with WinRAR, Intel magically gained a big lead.
Found the author of the comment trying to justify his shit with even more nonsense and a lot more irrational mad. Go cry alone instead of spilling your shit here
nice effortpost gagfriendo
Intel is actually bothering to tackle smartphones/tablets at least; AMD didn't even try. Even now there are plenty of Intel-based tablets flooding the market. Where the fuck are AMD's tablets?
AMD for a long time hasn't been able to fund significant development to move big into mobile, but they have a much stronger custom chip market for things like the PS4 / Xbox / IoT devices, and are licensing out x86, which undermines Intel's position while helping AMD financially through licensing revenue.
AMD is better positioned in graphics and VR (and IoT through the custom chip market), will gain considerably in desktop / laptop with Zen, and has invested in ARM while Intel has pursued x86 near to its limit. AMD is more flexible right now and is growing, while Intel has arguably been mismanaged for ~4 years.
Investing in ARM will prove to be a foolish idea. The ARM ecosystem is way too oversaturated right now; they'll have to go up against Qualcomm, whose profits are comparable to Intel's and who has the highest-performing chips in the ARM market (for perspective, a mid-range Intel Atom Bay Trail-T competes with the Snapdragon 805 yet is significantly cheaper for board makers)
I can see AMD taking niche x86-64 markets like custom chips for consoles/arcades, specialized realtime hardware, and the automotive sector, basically everything Intel isn't focusing on right now
I'm not saying ARM doesn't have competition, I'm saying it gives AMD the flexibility to respond to market demand and makes them less vulnerable in any one sector.
Just saying, random ramblings like this are just hurting your point even more.
but why?
Because their drivers don't fucking work
ATI used to be a completely different company, newfriend.
your OS doesn't fucking work
AMD made me go back to fucking Windows, because after one kernel update my OS wouldn't boot anymore thanks to AMD's shitty drivers. Way to fucking go, champ
Everything involving Bulldozer (Excavator isn't even close to the performance of Nehalem jesus christ)
Neglecting the Cat cores
Shitty Marketing Team
Not pushing DDR4 APUs hard enough
Shit Linux Support (until recently)
That being said, I seriously hope for Zen to at least keep AMD (nearly) up to speed.
The disappointing thing about the Cat cores is that they actually have decent GPUs weighted down by really bad CPUs (even for the price/market) and really slow memory I/O
this
They're loads more socially responsible than Intel and Nvidia. Their products have in fact caught up performance-wise, yet are still cheaper and made with excellent quality. The 'No drivers' meme is now just a meme thanks to AMDGPU, Omega, and Catalyst. Yet normies throw money at Intel and Nvidia because they're stupid fucking normies. What went wrong? Nothing. They're doing everything right. What needs to happen is there need to be more PC (not console) products to champion their tech.
You're a fucking idiot. I won't even explain why.
...
If you can't take the time and effort to acknowledge some of the valid points user made, then you're just as impaired.
...
He's a fucking idiot for not reading the thread.
(checked)
I'm sorry, but while Bulldozer was shit, Piledriver was (and still is) pretty good.
take it to /g/, Holla Forums doesn't need shitpost threads
you
Bulldozer was only bad because it couldn't hit its desired clock rates and because software depended on single-thread performance. Piledriver fixed the clock rate issues and added some IPC. But the thing is that Piledriver has aged so well because software has been getting better and better at using more cores.
Take a look at Anandtech FX 8150 review
archive.is
Notice how Cinebench is compiled with ICC, and the FX 8150 gets crushed in it even in multi-thread. Meanwhile in the open-source projects like 7zip and x264 (at least the multi-threaded part), the FX 8150 does well.
The gaming benchmarks are mostly single-threaded. You can tell because there are situations where the 3.6GHz quad-core Phenom beats the 3.3GHz hex-core Phenom.
archive.is
Some of the games are well multi-threaded; you can tell because the FX 8150 does pretty well in them. Crysis Warhead doesn't scale to many cores; you can see how the Phenom X4 beats the Phenom X6, and the FX 8150 is behind all of them. I assume Dawn of War 2 would have been the same, but Anandtech downplayed all of this by not benching an i3 or even the X4 in the DoW2 bench.
Dirt 3 is another one that clearly isn't scaling to many cores as the X4 is beating the X6. Dragon Age is another game missing X4 for some reason.
Metro 2033 doesn't look like it scales very well, but you see some signs of it scaling slightly as X4 drops behind X6 a bit and FX 8150 is no longer in last.
Rage is clearly scaling to multiple cores very well; you can see how the X6 is tied with the 8150, beating the 2500k, and the X4 is in last.
Starcraft 2 is notoriously poorly threaded and has big ties to Intel. No doubt there's some foul play there. Same with WoW, as there's no difference between the 2500k and 2600k, and even the 2400 is close.
Meanwhile, while nuDoom is shit, here are some CPU benchmarks. Notice how the FX 8150 is doing very well compared to the 2500k. Then go back and look at the non-GPU-bound benches at Anandtech, where the FX 8150 gets creamed most of the time.
Software changed; even the original Bulldozer was just ahead of its time. Another case of AMD marketing losing massively to Intel's and Nvidia's marketing.
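If you want to check for yourself how well a workload scales with cores, here's a minimal sketch; the workload is a toy CPU-bound function, not any of the actual games benchmarked, so the exact numbers mean nothing beyond the scaling trend.

    import time
    from multiprocessing import Pool

    def work(n):
        # Toy CPU-bound job: sum of squares.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [2_000_000] * 8
        for workers in (1, 2, 4, 8):
            start = time.perf_counter()
            with Pool(workers) as pool:
                pool.map(work, jobs)
            print(workers, "workers:", round(time.perf_counter() - start, 2), "s")

A well-threaded workload shows time dropping close to linearly with worker count up to your physical core count, which is exactly the behavior that separates the Rage-style benchmarks from the Crysis Warhead-style ones above.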
So, garbage?
wew
Go back to tumblr.
Upper management and marketers made Bulldozer (originally designed to be a server chip) a desktop part.
They cut every corner on the Bulldozer architecture by releasing it before it was finished and before it could even be considered compatible with most Intel-centric compilers used throughout the industry, and they have nobody to blame but themselves
How much were you paid to spread this bullshit? Intel deliberately fucked AMD (and VIA): agner.org
One thing I notice about AMD's design philosophy is that they have a very RISC-like approach to architecture design, such as their Heterogeneous System Architecture, which tries to combine the roles of the GPU and CPU in a very elegant, RISC-like fashion
They need to understand they're not in ARM-land, they're in x86-town, and their approach to architecture design is only good for media consumption and undermines the strengths of CISC design in the first place
The first iteration of Bulldozer was half-finished shit though, they cut something like 30% off the die size just by machine-optimizing the layout properly like they should've done in the first place.
m8, amd64 is basically a cisc->risc on-the-fly translator.
So are Intel chips, I was talking more about AMDs design philosophy
From when? '07? AMD/ATI's driver support has never been much worse than Nvidia's, but when Nvidia burns down houses, it somehow gets spun into a marketing campaign about how powerful they are, VS AMD catching shit every time a skybox doesn't render correctly due to dev error.
fixing an overheating card is a lot easier than fixing a GPU that can't render a skybox
Yeah, I've got the R9 290X, and it plays everything great in Windows. Perfect 60 FPS framerate on all games at maximum settings.
But then I decided to be a hacker and switch to Linux, because my favorite game was on Linux.
Too bad for me. AMD drivers were so shitty, I was getting less than 30 FPS on mid or low settings. It was pathetic.
AMD's shitty Linux drivers are the main reason my one AMD PC is on Windows at the moment. There was a small fiasco a while back where AMD users got a kernel panic on boot after upgrading to kernel 4.0+, all because of AMD's shitty drivers
...
Wasn't AMD going to make the official drivers FOSS?
Intel's GPU drivers are open source, just sayin'
are there any good GPU drivers on linux? seems like they're all shit in some way.
I refuse to use them if they're merely "open source". Either it's libre or it's not: gnu.org
short answer: no
long answer: varies by card, kernel version, x.org version, driver version, general luck, and current position of the stars
unless you want intel graphics, but those are shit for gaming anyway
This isn't 2009 anymore fam, the Intel HD series destroys the embedded GPUs in AMD's APUs
For now. They only have process on their side.
If only AMD hadn't spun-off their chip foundry as a separate company
their pre-Zen designs would still suck compared to Intel's. I think it was a good move
Intel and proprietary Nvidia are nearly on-par with the Windows drivers. Recent AMD is passable.
AMD has officially worked on both libre and proprietary drivers for a while. The difference now is that they are putting even more effort and emphasis on the free driver. I guess they realized they can't afford work duplication.
that's not the driver m8; but you are right. Intel iGPUs used to work with free software down to the firmware, now they are just as bad as AMD GPUs.
What makes you think Zen won't suck?
Get a real GPU like NVIDIA.
...
God-Emperor Jim Keller created the architecture.
Get a real 4GB card.
...
if that were real then holy shit
meant 2GB
HBM.
Yeah, no, I'm a relatively sane human being.
You mean another piece of "technology" AMD hypes the fuck out of only for it to be a massive let down?
then he realised what a fuck-up it was and left to work for Tesla
jesus christ
Clarify something for me. Does AMD make pure CPUs nowadays or is it all APUs?
They still make AM3+ CPUs.
I thought all their opterons were CPUs
I think pure CPUs have been relegated to embedded and specialty markets; between ARM SoCs, AMD APUs, and Intel's chips, having a general-purpose processor with advanced floating-point capabilities is expected in this day and age
Zen will first release as a pure CPU with no GPU. The Zen APUs will come later.
Zen should be very cost-competitive with Intel, especially Intel's mainstream products, and probably the enthusiast 2011 socket too. Zen won't have a GPU, so the die size will be much smaller than Intel's. And smaller dies mean more dies per wafer, and you pay a flat rate for a wafer (pic related, a bunch of Haswell CPUs on a wafer before they are cut out and soldered to the PCB you put in your motherboard).
Most of Intel's modern CPUs on the mainstream segment (quads with HT and lower) all have GPUs, and they usually take up 50% or more of the die itself.
Even if Zen is 5% or 10% slower than a 6700k, it should cost significantly less because it has no GPU.
And, there are rumors that there will be no quad core Zen CPUs. Which has me thinking that they won't leave the $150 to $400 price points alone, as Piledriver only really fits in the $150 and lower price point. Which is why I'm optimistically thinking we will see Zen six cores with similar Intel single thread performance for the cost of a typical Intel quad, and then 8 core Zen for the cost of typical Intel 6 core.
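The dies-per-wafer arithmetic behind that reasoning, as a quick sketch; the formula is the standard first-order approximation and the die areas are illustrative guesses, not official figures.

    import math

    def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
        # Usable wafer area divided by die area, minus an edge-loss term
        # for the partial dies wasted around the rim.
        d, s = wafer_diameter_mm, die_area_mm2
        return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

    print(dies_per_wafer(300, 122))  # CPU+GPU die, roughly Skylake-quad sized: ~519
    print(dies_per_wafer(300, 90))   # hypothetical GPU-less die: ~715

Same flat wafer cost, roughly 38% more sellable dies, which is where the "smaller die means better performance per dollar" argument comes from.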
AMD actually made some really good choices with Zen, from what I know. It has the idea of Bulldozer, where you make the core smaller so you can fit more dies on a wafer to deliver more performance per dollar. Except it's not doing that retarded CMT core sharing that can bottleneck.
Bulldozer was not that bad in theory. But it took too many risks, and CMT was way too early; games and other software didn't thread so well back then.
You're wrong, they're not doing that. But they are going to get rid of having dual sockets with FM and AM, and just have one unified socket that can run APUs and CPUs.
He's an x86 engineer, AMD is making ARM after Zen.
Complete bullshit m8, there's no fucking way a GPU would take up more die space than a CPU with that many cores
Source? I'm looking at basing my next workstation on Zen.
That would be suicide for them. The ARM market is way too overcrowded, and Intel has proven x86_64 can compete very well with mainstream Cortex-As in both battery life and heat output with its Atoms; the only place AMD has there is racing to the bottom with embedded chips. Why the fuck don't they make a mobile Zen processor and do what Intel did for some low-end Atoms: use a Mali GPU to reduce costs and get a slight boost in battery life? Why are they making such shitty decisions?
Do modern programs and OS use the gpu in the cpu? Will you miss it at all if you have a gpu card?
a dedicated GPU would make the iGPU redundant, however you could in theory take advantage of both for more processing power. I think that's what AMD's HSA was supposed to sort of be about
Here is a desktop Skylake quad core. I think you owe me an apology. This is what you're paying for when you buy a 6700k.
No. That Intel GPU just sits there, almost entirely useless. It's only really there because a quad-core CPU die would be too small to transfer heat from the silicon to the IHS and the CPU would overheat. Instead of Intel giving you 6 or 8 cores for the same price on the same die, they give you a big, useless GPU to help transfer heat to the IHS. Zen is just one big 8-core die with no GPU. It should be about the same size as that Skylake with the CPU and GPU.
Zen will basically be double the CPU cores without the GPU, which, as you can see from what I'm saying, means they should be around the same price.
I forget where I read it, so take it with a grain of salt. But it makes no sense for them to release a quad Zen when they have APUs. The FM2 chips with no GPU sell like shit. Especially when they are releasing Excavator-based APUs on AM4, the socket which will also work with Zen.
I'm also making an assumption on price, because it gives AMD the opportunity to massively undercut Intel for CPU performance per dollar while still making very good profits. I imagine AMD will get really aggressive with pricing to get people off of Intel. AMD has been aggressive with 300 series pricing and they've been gaining market share and doing pretty well compared to their previous quarters. Though their main problem with GPUs was that they were going to go to 20nm SOI for the 300 series, and that completely fell through. If it succeeded, AMD would have been able to destroy Nvidia for the last quarter or so because they would have been on a much smaller node. And smaller node means more GPU cores per square inch, which means more chips per wafer, which means better performance per dollar.
But 20nm SOI failed and AMD had to do some bullshit with trying to get Fiji to work, and then a bunch of rebranding.
iirc doesn't intel have some memory related stuff offloaded onto the North/South bridge while AMD still has the same components on their CPU die?
according to the die image he posted, memory IO controllers are on the die on Intel as well
Dude, the north/south bridge is Core 2 Duo-era technology. For years now everything has been on the CPU except for irrelevant shit like SATA and USB, which are usually in the PCH, a unified north/south bridge for everything that didn't fit in the processor.
Buying consumer line of Intel on a desktop after C2D is pants on head retarded. Only legit options are AMD or cheap used Intel server CPUs
I've seen footprints where the GPU is 75% larger than the CPU
Sometimes I think I should sue Intel for emotional damages dealing with their shitty mobile Linux drivers on my laptop.
If the judge doesn't throw it out (they prolly will) you would win in small claims court; it's cheaper for them to just settle this shit for $1000 or whatever than actually spend money on lawyers to win.
glue it back on you retard, no need to buy new fans
I fucking swear I had it for more than one year in high school. There's just no fucking way I only got the card a few months before graduating.
The card I had before it was a 5970, or at least I think it was, might've been a 6770. The reason I got a new one was because the liquid cooling system for my CPU, or more specifically the coolant reservoir, dumped itself all over the inside of my case and fried the power supply and graphics card. I'm never going with an open loop again.
lol
Yeah nb/sb is lame. That's why I have a BTX mobo!
Are you saying you don't want a reasonably-priced CPU with hundreds of times more on-die cache than a $5digit POWER?
AMD is going after the ARM server market. Cloud computing bullshit where 1 vCPU per watt will drive the profit margins into orbit.
If only they'd focus on RISC-V after Zen
RISC-V is still a meme architecture though, AMD is not in the position to throw caution to the wind and try to entertain hardware makers with an arch nobody has heard of yet
Again, like most shit AMD hypes up, it will look great on paper but will make minimal difference at best in real-world applications.
Not true, Intel GPUs are damn powerful for iGPUs, and a lot of users rely on them: Atom notebook/tablet owners, people with cheap minimalist builds who don't want to pay extra for a dedicated GPU when the one in their processor is more than good enough to play the latest shit at low-to-mid settings and older shit at max, and prebuilt plebs. In fact, it keeps the prices of prebuilts low as well, because OEMs don't need to add an extra card anymore
plus there are ways of taking advantage of them; it's pretty much giving you several ALUs and FPUs for free, and the iGPU can be used as an extra OpenCL device
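A quick sketch of what that looks like, assuming the pyopencl bindings and a working OpenCL runtime are installed; on a box with an iGPU plus a dedicated card, both should show up as separate devices you can queue work to.

    import pyopencl as cl

    # Enumerate every OpenCL platform and device on the system.
    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print(platform.name, "->", device.name,
                  "|", device.max_compute_units, "compute units")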
i want to build a sub-$5k gaming rig primarily for scientific computing. i do programming competitions on sites like Kaggle dealing with large datasets, and these tasks are stupidly easy to parallelize. having more cores will be great, having a high-FLOPS gpu will be great.
so i'm waiting for zen and the next graphics card line-up to build a pc around.
especially for scientific computing zen should beat out intel easily. and amd graphics cards have better support for opencl compared to nvidia and don't lock me in with cuda bullshit.
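For what it's worth, the vendor-neutral part is real: the same minimal OpenCL kernel runs unchanged on AMD, Nvidia, or Intel hardware. A sketch using pyopencl (assumes a working OpenCL runtime; the array size is arbitrary):

    import numpy as np
    import pyopencl as cl

    # Minimal OpenCL vector add: the kind of embarrassingly parallel job
    # that maps to any vendor's GPU with no CUDA lock-in.
    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)

    mf = cl.mem_flags
    a_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_g = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prog = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    prog.add(queue, a.shape, None, a_g, b_g, out_g)
    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_g)
    assert np.allclose(result, a + b)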
Admittedly AMD has other priorities right now, namely staying alive. I just want my RISC-V
I'll believe it when I see it fam
Although I can see Zen being very successful for building supercomputers if AMD keeps power consumption minimal, because no doubt they'll be less costly than Intel Xeons, which makes them ideal processors for large-scale parallelism
I can see this happening fairly well, given ARM is a potential untapped market for servers. Although ARM is nowhere near the performance of its main CISC rivals, modern ARM chips are more than enough for the average-joe small business server guys who don't need massive fucking mainframes just to display a company web page
It may be the one thing that saves them
what do you expect ?
RIP in peace
AMD RISC-V Architecture Codename: Meme Machine
meme it
I was referring to the fact that most people building HEDT systems with things like 6700k don't use the GPU. Some of Intel's GPUs are decent, but compared to even a low end Nvidia or AMD they're still shit.
When talking dedicated cards, sure, but that's not really a fair comparison. Intel iGPUs destroy the iGPUs in AMD's APUs in a lot of cases; you can't really compare with Nvidia because they don't make x86_64 SoCs with embedded GPUs
Yeah, obviously they do. Intel is on 14nm and AMD's APUs are still 28nm. Not to mention AMD APUs are bottlenecked by memory bandwidth.
AMD (and Nvidia) both got fucked because 20nm TSMC got canceled. AMD has been stuck competing with 14nm CPUs and iGPU for a while now when all AMD has is 28nm.
I don't think you see the whole picture though. Intel is competing with brute force and taking advantage of their process node. When AMD hits 14nm with APUs, the gap will go back to how it was before. Especially if AMD starts using some sort of high-speed cache like Intel.
But you should probably reconsider your knowledge on this subject if you're talking about a 14nm GPU beating a 28nm GPU as a huge sign of a better GPU architecture. Unfortunately Raven Ridge 14nm AMD APUs won't be around for about a year since AMD is focusing on a real CPU instead.
But it's funny how discussions used to be about AMD needs to make a better CPU and their APUs don't have a good enough CPU. And now AMD makes a better CPU and people complain about their APUs.
Keep it up, I'm sure you'll get your shill check soon enough. The strong womyn in the $300m diversity fund need that money more than you though.
Intel claims it will ship 10 nm chips in the second half of next year.
I just wonder how much smaller it can possibly go. Soon they will have to start using graphene and nanotubes to get things any smaller. I suspect the transition away from silicon will be slow, and everyone is going to be stuck on the same size for quite a few years. Could be the perfect opportunity for someone to overtake Intel through ingenuity.
I can't believe that in the current year Intel still gimps its lowest-end chips to only support 2GB of RAM. Total scumbags, they need to fall.
I'm pretty sure silicon won't go under 10nm (or maybe 7) without massive problems.
From what little I've read, going below 10nm is where the real problems start
Intel was already running into some problems back at the 22nm node, which they eventually overcame, and the transition from 22 to 14nm was supposedly relatively painless
According to Intel's engineers, it's unlikely we'll see nodes smaller than 7 or 5nm unless they switch to a material other than silicon
on the plus side, Atoms do have out-of-order execution now, which does tend to take a lot of die space but is why Atoms actually don't suck anymore
isn't this why Broadwell was delayed and had an unusually short life in the market?
Supposedly Intel is ditching their tick-tock release model now, since it's expected to take more time and R&D to push smaller nodes. Now it's tick-tock-refresh, where tick is the die shrink, tock is the new microarchitecture, and refresh is the same node and microarchitecture but more refined and streamlined to increase efficiency
[citation needed]
AMD is an inferior product
gr8 b8 m8
Google uses highly parallel workloads so it is of benefit for them to switch
If you have 192 threads on a desktop, you're kind of fucked, because most desktop applications are still single-threaded. Parallelism with Windows threads or pthreads is another layer of complexity that most programmers don't want to deal with, so POWER chips are comparatively underpowered there.
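To make the "layer of complexity" point concrete: even a trivial shared counter needs explicit locking once threads are involved, and this bookkeeping (plus the bugs when you get it wrong) is exactly what most application programmers would rather avoid. A minimal Python sketch:

    import threading

    counter = 0
    lock = threading.Lock()

    def bump(n):
        global counter
        for _ in range(n):
            # Without the lock, the read-modify-write races between
            # threads and increments get silently lost.
            with lock:
                counter += 1

    threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)  # 400000, but only because every increment took the lock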
Sick of the "x86 chips are actually RISC chips" meme.
"CISC" and "RISC" refer to the instructions that are exposed to the compiler/programmer (count, etc.), not the CPU's microarchitecture.
what did you expect? this is Holla Forums
besides, there are worse offenders: like that guy who thinks NT has a microkernel inside and believes "hybrid kernel" isn't just a marketing ploy
What makes NT (or linux if you prefer) a "hybrid" kernel and not a monolithic kernel?
1.) Linux has always been known as a Monolithic kernel
2.) Hybrid kernels are distinct from monolithic kernels, as they use microkernels but present themselves as monolithic
Fuck me, sorry. I completely misread your post. I thought you subscribed to the "it has part of the 3D driver in user space so it's totally not monolithic" idiocy. My bad.
x86 is RISC with a proprietary transpiler in microcode :^)
...
fuck off, shill
welp, makes sense why they are rivals now, but I don't know which is worse...
My HD 6850 just died after five and a half years of faithful use. Fucking ROBLOX put it on the fritz, somehow, and updating drivers ended up being the final nail in the coffin. I'm going to order a new card, but I don't know if AMD will be my choice in the future. My brother is loaning me a GTX 950 right now and it's working pretty nicely.
That's so much bullshit it's not even funny.
Microkernel is a kernel that only contains the most vital components (memory unit, ...) and defers everything else unto userspace (filesystem, ...).
Monolithic kernel is a kernel that also integrates components that make sense to run in kernelspace -- mainly as a performance benefit since microkernels are known to be slow from all the message passing and context switching needed for simplest of tasks.
Hybrid kernel is a stupid fucking marketing term.
You wanna know how I know?
Well, first and foremost, there's no clear definition of what the fuck a hybrid kernel is. The closest I got to unraveling this mystery is "monolithic kernel that defers 'some' components to userspace". That's a very broad category, however, and Linux kernel also fits it. Which is kind of stupid when you think about it.
Whatever the truth is, the NT kernel fits exactly NONE of the possible definitions. Not only does it contain everything the Linux kernel does, it managed to absorb a bunch of clearly user-space stuff like font rendering, the GUI, and an HTTP server.
If anything, NT kernel is Super-Monolithic.
Furthermore, the notion that "hybrid" kernels contain microkernels is just ridiculous. Whether a kernel is monolithic or micro is determined by the total sum of its parts, NOT by cherry-picking what you like and what you don't.
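The message-passing cost mentioned in the microkernel definition above is easy to ballpark from userspace, even though it only approximates real kernel-level IPC. A rough sketch comparing an in-process call against a pipe round-trip to another process:

    import time
    from multiprocessing import Pipe, Process

    N = 10_000

    def echo(conn):
        # Child process: bounce every message straight back.
        for _ in range(N):
            conn.send(conn.recv())

    if __name__ == "__main__":
        parent, child = Pipe()
        worker = Process(target=echo, args=(child,))
        worker.start()

        start = time.perf_counter()
        for _ in range(N):
            parent.send(1)
            parent.recv()
        ipc = time.perf_counter() - start
        worker.join()

        f = lambda x: x
        start = time.perf_counter()
        for _ in range(N):
            f(1)
        call = time.perf_counter() - start
        print(f"pipe round-trips: {ipc:.3f}s, direct calls: {call:.4f}s")

The round-trips come out orders of magnitude slower than the direct calls, which is the performance argument monolithic designs lean on.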