MUH GRAPHIX!

Have there been ANY games released in the past 2 years that have really set any benchmarks in graphical fidelity? A lot of normalfags will cite nuDoom but that honestly doesn't look that great to me. Is it really diminishing returns? And if yes, is the bottleneck seen through diminishing returns a hardware issue OR is it simply incapable artists? And if it is incapable artists, how would one learn and train artists to properly create photorealism that takes advantage of their hardware in a meaningful way? Do we simply not have enough disciplined graphical artists in general?

Pic related was the last set of tech demos that blew me away, I think: the Unreal Engine 4 Room demos. And they are a couple of years old now

Other urls found in this thread:

en.m.wikipedia.org/wiki/Clipping_(computer_graphics)
youtube.com/watch?v=P7A8VYw_ncM
www-01.ibm.com/common/ssi/cgi-bin/ssialias?htmlfid=POV03212USEN
tech.slashdot.org/story/99/10/21/0848202/risc-vs-cisc-in-the-post-risc-era
ark.intel.com/products/75283/Intel-Xeon-Processor-E5-2697-v2-30M-Cache-2_70-GHz
www-03.ibm.com/systems/power/advantages/smartpaper/memory-bandwidth.html#hl-power8s-12-cpus-versus-xeons-18-cpus

RE7?

Looks impressive, yet not much better than nuDoom, as far as graphics go. I guess I was expecting something much closer to photorealism by now. Where are muh unlimited details?

It's incapable artists and programmers. The artists for being unable to capture a realism that requires imagination to finish it for immersion. The programmers for sucking at their jobs so badly that they fail to use the hardware to its full advantage.

In the same place as consumer quantum computers

Game engines are typically expected to pick up the slack for lazy programmers, but then you still end up with lazy artists who never use these engines to their full potential, never diving that deep into the engine's dev tools. Are there any game engines out there that really take advantage of their given hardware that also have enough well designed tools to properly assist artists? It seems like all we're stuck with is mainly Unreal and Unity

I guess nuBattlefront, but that game was shit.

As long as programmers don't use multi-core to its full advantage on architectures other than x86, they will never achieve the best graphics fidelity they could. Disk R/W speeds and the x86 processor are holding it back on the technological side. It would require switching processors used for vidya over to something like PowerPC or RISC-V, then optimizing the hell out of them and teaching parallelization to the programmers. Along with making something like an SSD and a unix-based system mandatory for vidya for the sake of R/W speeds. That and GPU manufacturers letting Vulkan get more widespread to get native ISA calls for better optimizing for corner cases.

Which means a whole new game engine that is not Unreal, which caters to x86, or Unity, which is just a piece of shit. But you don't need graphics fidelity to make fun games with Unity or Unreal Engine, though.

The problem with graphics in games like Lootboxfront is that it's plain to see it was mainly done with ready-made effects from some third-party mass-produced toolkits. Already-made shaders, foliage shaders, water shaders, etc.

Go away, you have 0 idea what you're talking about, please stop pretending you know more than billion dollar industrialists. It's embarrassing and cringey

Sonic mania.

They don't seem to know much to begin with

P.T. was looking great

They get their knowledge actually working in the industry, you get your knowledge from spergs on Holla Forums. Please just stop. You can only reinvent the wheel so many times before you get your own diminishing returns. And both Intel and AMD have done plenty of reinventing their own wheels

Corporate goons do what makes and saves shekels. It would cost a lot of money to make a new game engine and possibly a compiler from the ground up using parallelization on an architecture other than x86, and most are bribed by Microsoft to use Windows instead of unix for game deving, on top of not being familiar with unix, being too lazy to learn, and it costing them more time/shekels.

Then there's the digital rights management side, which slows down dev time, time that is wasted keeping little kids out of a game for like twenty minutes with their very best DRM schemes. Next they are going full retard by not allowing processors other than x86 because of hardware DRM soon.

THEN there's the huge variety of GPUs and (((license))) agreements most devs that know what they are doing have with GPU makers in order to learn the ISAs of their GPUs better. Which means more money upfront to develop an SDK that is built effectively.

It wouldn't cost them shit because the majority of supercomputers run on some flavor of Linux. Your arguments are the definition of Appeal to Novelty Fallacy. Go away, just stop please

How hard would it realistically be to be just a programmer who wants to make vidya and then find people who like video games and are skilled in creating assets and animations and talk them into setting up their own indie studio for shits and giggles and then make a good game with legitimate photorealistic graphics on a budget of pizza, drinks and electricity?
A studio made by a bunch of autistic hobbyists who just like vidya and have a lot of free time and work in their free time.

Who could have guessed

You aren't re-inventing the wheel switching to a RISC-based processor. You are using a piston instead of jerryrigging twenty wheels to run in its place. x86 is a CISC-based processor, which means that it produces high amounts of heat to pipe many instructions for processing at a time. This is because of fault correction, out-of-order execution schedulers, misc timing schedulers, and possibly a CISC-to-RISC converter being on the silicon die of today's x86 processors.

With RISC though you can eliminate the out-of-order execution schedulers, the fault correction, and the possible CISC-to-RISC converter by accounting for them at compile time, in object files with commands in them accounting for such. Which means programmers accounting for parallelization, the order of execution, and timing it all and predicting all states of the entire software stack to reduce processor ISA-level bugs or lock-ups.

Stop now while you aren't too far behind.

shake my head too be quite honest family

People have jobs for a reason.

Go away goon

Very hard, it's like an art and not the liberal faggot kind.

no.avi

Which most RISC processors end up having already. The paradigm is bullshit. Just like how modern CISC designs are somewhat RISC in nature, modern RISC designs are becoming more CISC in nature by piling SIMD instruction sets on top. And CISC is preferable to engineers anyway simply because of its denser instruction set and more execution pipelines. It's partially why x86-64 is typically seen as the top performing architecture. And TDP is something that is already being solved simply by natural scaling. Intel Atoms run fanless and beat the shit out of ARM. Stop LARPing, you see the names of these shiny new processors and your limited knowledge of the industry tells you it must be good because it's "modern".

nerd rage, aspergers, and fighting about what to make would kill that project real quick. then they would blame it on external sabotage like someone was out to get them.

I'm not the one who brought up the idea of moving over to an entirely novel ISA with little actual benefit just to make video games. It's a solved game. Last generation everyone used PowerPC; today, all consoles use x86-64. There's a good reason for this.

No, there isn't

So why do you disagree with me? This is yet another reason why modern gaming is shit. The vidya company would need to design a piece of silicon without any of the botnet or hand-holding features of things like x86-64 or PowerPC

Not sure why you would expect any fanless processor, be it x86 or ARM, to somehow spit out high-end graphics. That's not the point; the point I was trying to make was that x86-64 scales down to ARM's level just fine.

And you still fail to assert exactly why reinventing the wheel will somehow create miracles. Or even why RISC-V is a "step in the right direction"

Read . It's not reinventing because making instructions execute more locally on a processor will reduce heat, since signals won't have to travel so far, i.e. CISC vs RISC.
It's better because it specializes instructions in certain places on a processor's die. Which means less shifting of atoms across the processor. Which means less heat. But by still holding the programmer's/compiler's hand with schedulers and out-of-order execution, it wastes space on the die that could be dedicated to better parallelization.

Textures remind me of Saints Row 3, looks awful

And modern x86-64 processors don't? Are you implying there are no dedicated SIMD instruction blocks on modern processors? Do you have any idea how small the actual CPU cores on modern processor dies are? Protip: look at how small a single CPU core is on this Skylake die compared to the GPU

They make it so small because they don't want to leave much space between instructions if data has to transfer across the die. Did you know there are still 8086 instructions on the modern x86 die, or emulated on it? Why do we need such old, general, never-used, slow, and hot instructions on the die or even emulated? The answer is backwards compatibility with the IBM PC. Which means more heat and more space on the die the instructions have to move across. It could get even smaller if all these old legacy and unused instructions were eliminated.

How? Are you talking about the chromatic aberration? That can be disabled.

There's nothing wrong with backwards compatibility user, and the actual 8086 portion of modern x86-64 cores is astronomically small. The original 8086 processor only had 72 instructions. The total footprint is negligible

OP wanted the best graphics. Get rid of the 8086 instruction set, the i186, the i286, the i386, the i486, the i586, the i686, and then switch to a more energy-efficient/power-efficient/heat-efficient processor, and that will help OP "negligibly", compounded over seven generations of cruft. On top of the other suggestions in this thread.

What little improvements you hope to get will be worth far less than you think, if anything at all. Case in point, modern ARMv8 chips: their performance is ass until you start building them up for higher TDPs, which require active cooling, in which case there's absolutely no benefit over just using x86-64. The advantages ARM had are now all gone. And ARM was designed with RISC in mind from day one

>(((ARM v8)))
>(((ARM kikezone)))
>proprietary everything that you have no control over and have to pay out the ass for the ISA manual which you won't even get the full version of because of (((reasons)))
I already called this shit. Why is this even an example? It's a laughing stock if anything.

Code monkeys get their knowledge from worthless degrees in university, which the vidya industry chews up and spits back out. None of them are skilled enough in code to actually accomplish anything substantial outside of doing what they're told to do by their boss, and the ones who do know are so specialized in their assigned job they'd get royally fucked if they tried to do anything else in the company they work for. All good coders and programmers learn by studying computer science by themselves and practicing the coding on their own. Incidentally, everyone who is competent at coding doesn't actually want to be part of the vidya industry and avoids it like the plague.

This
/thread

nuDoom did. It looked amazing on strong hardware after disabling the postblur and was the first real graphical advancement we'd had in years. I was playing that at release with everything cranked up at 1440p at around 100-120fps and it was glorious.
Other than that, Shekel Yiddizen is gorgeous but has no gameplay and likely never will have any meaningful gameplay.
No, it's poor people. I work in vidya and the target hardware these days is just depressing, as everyone's chasing after Chinabux and their systems are like eMachines.

I remember pirating nuDoom and cranking everything to max, and to be honest, while it looked fantastic it didn't really look much better than shooters from previous years

nerds

what i found most impressive about nudoom was how smooth it looked, how good the reflections looked (the muted reflections of light on the barrel of your super shotgun looked exactly the way it should), and how well it ran on average hardware. that said, the engine limitations vis a vis only 12 monsters can spawn at once etc probably had a lot to do with the performance

I have a tablet with Intel Atom. It runs too hot and eats battery like crazy. For this sort of use, ARM is better. For PCs, though, you don't need to go fanless.

I totally understand this videogame discussion

uncharted 4, uncharted 4.5 and horizon zero dawn. it's hard to make these games on PC any more because of how expensive it can be to make a full game that takes advantage of the latest graphics hardware. So you get highly specialized games built for under powered consoles, but because of how highly specialized they are, they can push quite a few graphical boundaries.

...

Wait, is that an actual dynamically generated screenshot from gameplay? It's fucking gorgeous.

No, it's a tech demo

It wouldn't be the first time that happened and they shit out an awesome game.

...

Seriously nigger?
Seriously?

Add, add with carry, logical and? THESE are the instructions that are holding back modern gaming? What the fuck is wrong with you? How are you capable of typing with this level of retardation? What the fuck? How can society function with you pulling all the intelligence from around you like a black hole? how??

We have a system wars cuckchan infestation.

can you stop throwing buzzwords around and just talk normally? this isn't some secret club you dumb retard.

Board has been filled with Sonygaffers for a while.
Only got worse when NeoFag got taken down and 4cuck mods deleted all discussion of it which brought a shitton of cuckhanners here.
They started getting out of the woodwork here around PSX 2016 though, even earlier than that event.

You either had PS4 owners who bought it for 3 games and were ok with it but weren't biased, guys who bought it but thought it was shit and nor worth it and the extremely rare Sonygger shill who would sometimes be derailed and get banned.

Nowadays we have faggots shilling the PSVR and the Pro.
No, you own and like the PS4?
Nothing wrong with that
You own and like the Switch?
Nothing wrong with that

Blatantly shilling and defending kike tier shit associated with it, defending mediocre games because they're exclusive and pretending like they don't have massive issues is being a fanboy and should be dealt it by either the users or vols.

Go back >>>/cuckchan/

The Division has unmatched volumetric lighting/fog

quads

this image works both ways

Do us a favor google this pics name then fuck off there and never return. You fucking mongoloid.

it's hard because not enough people buy the games to justify making expensive games that push graphical boundaries. that audience is already really small.

Odd when devs have said the PS4 is a pain in the ass to work with.

It's easier if you start in your early twenties, and even easier than that if you start out as a modding group in high school. You could still theoretically get a group together, but the lack of spare time and similar work schedule tends to screw it up.

did they? I think there's a few reasons why they wouldn't really care about difficulties with working on the platform. It's odd they would say that because I've often heard the opposite argument, due to the x86 platform it would be really easy to work with because it's essentially a PC environment. Are you mistaking it for the PS3?

...

I haven't seen what high-end PC's are capable of the last few years but Uncharted 4 has some of the best graphics I've seen in a while. It really has that "wow" effect in a lot of places where you just stop and look at the environments for a while.

Good graphics and high-poly models take a lot of time and money with skilled artists. The only way you can make a game look REALLY good and have it be possible to run for over 15% of the PC market is if it's extremely linear and small scale, or has settings that go so low that its graphical edge is just gone, like Crysis, which hardly anyone could run at the time at high settings but whose lower settings were basically potato mode. Lack of decent artists is an issue too; they pay little attention to action = reaction. In most games shooting a gun at a wall is pointless and unsatisfying; shoot at a wall in FEAR and there's an insane amount of dust that blows up and some very good particle effects, physics and destruction.

There are lots of nice little touches too, like how under certain lighting conditions the cartilage in people's ears will be illuminated by light, or the miniature rock slides. I think those little details have a much bigger impact than 4096x4096 textures on a small brick.

For some reason I've found that you're most likely to find groundbreaking graphics in driving games over any other genre at the moment; Driveclub, the NFS reboot alongside its recently released sequel Payback and GTA V with the NaturalVision shaders applied look the most impressive IMHO.

What do you think a PS4 (Pro) is? It's just outdated notebook hardware running a modified FreeBSD with a bit of DRM on top.

I would agree, until you look at the bystanders around the race.

Huh, forgot to attach the other two photos. My bad. Have another one for NFS.

Yeah, that and the inflatable sign/punching bag or whatever it is that separates them, shading is completely wrong for the weather conditions and time of day.

its not PC (windows) is what it is.

Cry more, you fucking bitch. Go back to reddit and suck liberal cock there.

I wanna fuck those cars

...

you people are just a part of the normalfag cancer killing video games.

BLOATED COST OF PRODUCTION IS KILLING ==fun==!

What's your point? Doesn't that mean it's even harder for them to make impressive-looking games? Why is that a negative for them?

Phantom Pain is pretty impressive for its quality and its optimization combined.

Quantum Break was pretty nice graphically and stylistically… But I s'pose Remedy have always done "real world" textures pretty well.

Yes, it's Microsoft- but it can still fry a decent GPU trying to get Ultra settings on a beefy PC, so my 1050Ti has no fucking hope.

Console warrior >>>/out/

Quantum Break is full of cut corners, has a shitty filter and low-res textures, and most levels are static as fuck to cut down on processing power though, just like Uncharted 4.
It doesn't look that great no matter where you play it.

You're confusing that with linear, right? Because there are constantly shifting aspects of levels because of time- that's the whole premise of the game… There's a whole section with the hull of a ship constantly crashing off its drydock attachments over and over again, as well as a train constantly smashing through the Monarch HQ building etc. that can be used as giant environmental weapons.

So although the game is ''linear" as fuck, I'd argue it's not static.

thats not a complaint you autist

...

Do you even know what clipping is?
en.m.wikipedia.org/wiki/Clipping_(computer_graphics)

Why do people use the term linear like it's a dirty word?
I'd rather have a linear game like Quantum Break than a "non linear" game like Assassin's Creed.
Plus it wasn't that linear, you could make choices that actually affected the story and kind of affected the gameplay just a little.

Yeah, I understand that- but in the other user's post (>>13971808) he was referring to the linear level design…

At least I think he was…

Never said clipping and textures are related.
What I was pointing out was the fact that everyone gives a shit about insane textures but ignores when objects clip through each other constantly.
I see these 4K games but then the main character will cross their arms and they end up inside each other, or their realistic looking hair goes inside their eyes.

Isn't that a Z-buffering issue though? Unless it's a physics based mesh which is another thing- can't remember the term for that…

You're talking about a completely different kind of clipping (culling) than , who's talking about model clipping.
Completely getting rid of model clipping takes a mix of good animations, good collision meshes, and whatever the fuck people use for animating hair and fabric (soft body physics could work but is probably overkill).

Phantom Pain looks like shit. Why are there so many blind people fellating TPP's graphics? Have you never seen anything but an Xbox 360 game before?

Good point, there's so many techniques used in rendering these days it's getting hard to keep up with them.

They're just Kuckjima shills, TPP has only decent graphics, and they only got there by having an empty ass world to compensate.

Graphics would be much better if we would stop focusing on superfluous (((4K))) shit.

There's nothing jewish about having pixels too small for the eye to discern, user. What's jewish is going BEYOND the resolution that gives you that effect. You want your UI to be as smooth as possible. If you can't see a difference between a regular resolution screen and, say, an iPad or iPhone's screen (or the laptops and desktops too, these days), you need your fucking eyes checked. Once we have hardware that can reliably push retina graphics, they'll be forced to improve textures, framerates, and polygons.

The cost of production isn't what's driving up the cost of actually putting out the product. You have shareholders demanding every possible penny be squeezed out of a studio. You have SJW morons demanding 50% of developers have a vagina, driving up costs and driving down game value to make it happen. You have piss-poor management hired because he's the brother of a shareholder. You have said management refactoring halfway through, abusing employees, and generally getting in the way of the people trying to get shit done.
ONE
bad manager will cost more than the cost of a full time expert graphics technician for the same amount of time.

You have unoptimized garbage being pushed through and causing bugs three stages down the pipeline. You have shit APIs. In the massive garbage fire that is modern gaming development you're ignoring everything to focus on… high quality graphics. No. Fuck you. FUCK YOUR IGNORANT BULLSHIT.

It has a lot to do with texture size, which has mostly stayed the same.

Really makes ya' think.

This man speaks truth.
Exploring graphics nowadays is the last thing you could reach out to. With most gaems being complete lootbox/early access crap (or console shit releasing outright unfinished), just making a good/above mediocre game (with actually inventive gameplay) sets you apart from the abundance of toxic waste.
It just baffles me to think that there are so many "artistic" styles that could be done in a game today that are being neglected because "muh realism".

It takes waaay more time and money than you would think if your team is not made of people all over 150 IQ
Think about No Man's Sky, I'm pretty sure they started like this and ended up wherever they are now.
Also, Time=Money in case you forgot

crysis

...

Kill yourself, Kike.

Kill yourself, Kike.

wait is that nigger shill still around who wants games to become cheaper to produce so they can focus entirely on marketing

The next step in graphix is going to be "multilayer modeling," where the internal organs and systems of a character are both modeled and functional.

Realistic healing, reactions, and survival based mechanics will emerge from this.

Eventually when AI reaches an event horizon you'll be able to have full conversations with in-game characters, and events will generate themselves based on NPCs' individual motivations, ambitions and goals, though all NPCs will in actuality be driven by a singular AI that controls them all through a complicated series of functions.

Maybe in an ideal world

It certainly feels like that now, but one day a turbo autist will make a break through because he wants to fuck his perfect god ai waifu and we will all benefit from the struggles of Autismstein. Just like lightbulbs, airplanes and the internet.

Glory to Mankind

sounds like a remarkably stupid waste of resources.

Yeah, and we'll never have the capability of flight. Have some imagination, user. Or at least the capacity to predict based on prior events

describe some practical use that will benefit game play.

You're missing the point, faggot. Emulating a heart, a spleen, and a liver is a massive and utterly wasteful idea. Even emulating a human brain for the sake of creating AI is enormously wasteful. You can use the same silicon to just emulate the things the player interacts with for 1 millionth the cost.

No user, it is you that is missing the point.

You want better models, when they can't even use the ability of the Geo-Mod in most games, because they have no idea how to tell the story in such an environment?

look at all those good reasons for wasting resources on things players will never see or care about.

Nevermind, you faggots will never get it

You do realise that RISC-V is already being used on nvidia graphics cards?

Not to mention that unless you go 4K, upper-tier cards don't give a flying fuck about modern games (unless of course it's a shitty console port), but since we are in a place where it is now common to see one-shade-painted characters that look like they have been in an oil plant, one can hardly expect to suddenly get handed a technical masterpiece

Pure truth right there. When banks want to clean house, they kick VPs out to the curb first as they not only have higher salaries than entire branches, but also jeopardize the institution the most because of their position and quickly get replaced by sharper, younger and more jewish successors. I can only dream of this happening in vidya, minus the jewish part, since leadership and retarded managers are partly to blame with the shit that's happening nowadays.

Our world isn't the Hogfather movie, where things become real just because you believe in them.

filtered multistage specular lighting, shadowed 3D micromesh, and high quality body hair are what's needed to make realistic human skin. That shit's expensive.

But that's literally how it works

The things that I wished for when I was young have been falling into my lap over time on a consistent basis since as far back as I can remember.

Usually on a several year lag though, results may vary

Wow, faster drives lead to faster "R/W" speeds? I'm surprised you didn't also find a way to stretch "lower latency" across another four sentences.

Lower latency is different from the throughput of a drive. If you have a 40MB/s R/W from a drive you need to define what bus it will travel across to give latency times. With an SSD the bus can become a bottleneck, but you then have to worry about everything on the CPU and how efficiently the GPU is receiving commands, i.e. the throughput of such.

With modern PCI-E 3.1 buses, and eventually 4.0 with a throughput of around 31.5GB/s on an x16 link, you aren't going to see much of a problem until the CPU stops being a bottleneck for the latency of transferring commands from disk to CPU; this is where the latency starts affecting visible performance to the GPU and then to the screen.

I honestly don't know how they would make buses have less latency and more bandwidth. Does someone else know better?
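For what it's worth, the bandwidth numbers work out roughly like this for an x16 link with 128b/130b encoding (back-of-the-envelope, per direction, so treat them as ballpark figures):

\text{PCIe 3.0 x16: } 16 \times 8\,\text{GT/s} \times \tfrac{128}{130} \div 8 \approx 15.75\,\text{GB/s}

\text{PCIe 4.0 x16: } 16 \times 16\,\text{GT/s} \times \tfrac{128}{130} \div 8 \approx 31.5\,\text{GB/s}

So the 31.5GB/s figure is per second over a full x16 PCIe 4.0 link, not per clock cycle, and latency is a separate question entirely: it's mostly protocol and traversal overhead, and adding lanes widens the pipe without making a single request arrive any sooner.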

Tachyons

The user who is arguing to get rid of the x86 architecture is correct, IBM POWER architecture would actually improve gaming quite a lot if and only if the IBM shills are correct. Fact is x86 is not the most powerful architecture and there are legit reasons to stop using it. Granted any kind of switch to ARM for gaming is retarded, but ARM is not the only alternative architecture. The three big architectures are ARM, x86, and POWER right now. ARM is designed for low power consumption for its performance, which is lower than x86, so obviously it is useless for anything other than mobile, which will never be cutting edge. POWER however is a serious contender for replacing x86 as a gaming platform, as it gets much better performance than x86. The reason that we aren't on POWER is because consumer level hardware does not exist for it yet, and even then it will take a lot of adjusting to write software that can "catch up" to the POWER architecture's real potential.

I can't speak for any other claims made.

Name every single one.

No because of 2 reasons:
And 2 years is too short of a timeframe, more like this decade.

I still remember the CoD W@W campaign, and the soldiers had minute, gritty fabric textures and beads of sweat on their pored faces. I couldn't wait to see where graphics technologies would end up in ten years. Fucking nowhere since then.

FUCKING LOL
this is complete and total retard thinking along the lines of "if GPUs are faster than CPUs then why do we even have CPUs".
POWER architecture is not designed for client workloads, it is for server and HPC use NOT for desktop machines and it is NOT faster than x86 on any tasks which are sensitive to instruction latency (i.e. how long it takes for an instruction to enter the pipeline and then complete).
the fact is that x86-64 ISA plays host to the fastest CPUs in the world in terms of latency. even AMD's potato Zen core is many times faster than POWER in terms of latency and while ARM also has a low-latency design it is lacking in compute throughput even in the biggest ARM cores which IS important – take the poor performance of current-gen game consoles which are based on the extremely low-latency but also low-throughput Jaguar CPU core design. Like it or not, Intel's x86-64-based "big cores" ("Lake" family) are the best fusion in the world of low latency and high throughput. this is a FACT.
anti-x86-64 shillery is just down to ignorance and/or trolling at this point, same as "devs can't make use of multi-core".
LEARN ABOUT AMDAHL'S LAW YOU GOD DAMN RETARDS.
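For reference, Amdahl's law: if a fraction p of the frame's work can actually be spread across n cores, the best-case speedup is (textbook formula, not a measurement from any particular game):

S(n) = \frac{1}{(1-p) + p/n}, \qquad S(\infty) = \frac{1}{1-p}

\text{e.g. } p = 0.75: \quad S(4) = \frac{1}{0.25 + 0.1875} \approx 2.3, \qquad S(\infty) = 4

So even if three quarters of your frame parallelizes perfectly, infinite cores only ever buy you 4x, and the jump from 4 to 8 cores is already small (roughly 2.3x to 2.9x).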
in a machine that has a dedicated massively parallel processor like a GPU, a multi-core CPU is not particularly useful beyond a couple of extra cores to handle complex subsystems. games are already making solid use of quad-core CPUs and anything further is just going to be adding load to the game to make it seem like the game is "omg so multicore optimized". case in point Crysis 3, almost impossible to run on "Very High" settings on a dual-core CPU because the "Very High" setting makes the game run physics on ""individual leaves and blades of grass."" The "high" setting looks identical in screenshots and a Core i3-6100 isn't CPU-limited even with a 1080 Ti.
I'm so fucking sick and tired of kids who don't know a god damn thing about computers or programming talking about this game being poorly optimized or that hardware being superior to this hardware, just
SHUT THE FUCK UP AND STOP POSTING LIKE YOU FUCKING KNOW SOMETHING

I really want IBM LARPer retards to leave

This

Nono user you don't understand! Power is different and exotic, thus it's somehow better! My friends on Holla Forums told me so, aren't I smart?!

reminds me of back in 1994 when I had a falling out with an upper-class "friend" (I was middle-class and we were only "friends" because we were both more into computer gaming than console gaming, unlike the other teens) who got REALLY BUTTHURT that my "slow" and "outdated" Am386DX-40 machine could run Doom while his dad's amazing and brand-new mac quadra with the same clock speed, 40 MHz, couldn't.
later marathon came out and I sneered at it because I was already loading custom wads from the internet and hacking the game myself using dehacked. now as an oldfag I can admit that marathon was pretty cool but goddamn if he wasn't such a fucking smug shithead about his "superior" mac. wherever you are, eat shit cody.
for xmas that year my dad upgraded my machine to 8MB of RAM and added a cyrix fasmath 80387 FPU that he managed to nick from his job, heh. incredibly it could run quake at a pretty solid 40 FPS in 320x240 mode X, I knew people with 486es that didn't run that well. I wish I still had that machine

x86 shits all over power8 for gaming as x86 has far better single thread performance and gamedevs don't have the time or experience for parallelism. God help us if a current year gamedev attempts to make non-trivial use of threads, like what's happening now with Shekel Yiddizen and its 400+ spinlocks.

First two sentences make no sense, so I don't have a response to them.
Do you have any statistics to back that up?
Also, I'm not sure why the rest of this post goes onto a tangent about Crysis 3, which I didn't mention. Anyway, in modern times we can expect 8-core optimization to be important since both the PS4 and Xbox One have 8 cores. The problem of multicore being the new frontier for speed increases is not something that is unique to POWER and its much superior threading with 8-way SMT (compared to Intel's 2-way (((hyperthreading)))). Also please don't pretend that HPC hardware isn't viable for consoles or gaming; have you forgotten the Cell processor that the PS3 used?


This is a great story, yes I'm sure that just by bringing up something other than IBM PC hardware I am a macfaggot who drinks coffee - also DOOM not running on it in 1993 is a software problem; it has absolutely nothing to do with the hardware, so you might as well be bragging about being able to play Mario on a Nintendo system… today that machine can probably run Doom just fine. consider youtube.com/watch?v=P7A8VYw_ncM
proving your point about doom to be just total bullshit

Please, refer to my video, both of you. Also, game devs not knowing how to use threads is expected; the real elite are the people who have to write CryEngine and Unreal Engine, because they have to be competent while building a sandbox for legions of retarded game developers. This is why games all use the same middleware today.
www-01.ibm.com/common/ssi/cgi-bin/ssialias?htmlfid=POV03212USEN

Sorry that I can't compress it, I only have a Core 2 Duo CPU right now so it's not fast enough to re-encode the video

Look out lads we have a real hardware expert in here

In the first 30 seconds of your video they state that the POWER8 system they're using is clocked nearly 1GHz higher than the Intel equivalent. Per clock the POWER8 is absolutely awful. I don't think IBM is the most unbiased source here

I really think you need to leave

Yeah, it's really weird there is no progress.

Graphics generally take second place to marketing, which can bloat to equal the entire cost of the game itself or more. That said, when you cut out marketing, graphics are generally the biggest thing that takes up development, and devs have admitted this comes at the cost of everything else.

There is nothing wrong with graphics when it isn't the only main selling point of the game
Excluding the last few generations of consoles developers would work hard to find every last bit of horsepower instead of slapping together pre-made bullshit in UE4
Always was.
Forced by who?

I know this is bait and you're (1) and done. Get a picture that isn't for ants next time you come baiting like this is cuckchan.

So we're talking over a decade ago, things change.

By the developers you fucking dingdong.

Are you fucking retarded? These are the only correct answers. Holla Forums will always go full autism (muh RISC-V, muh optimization).

That's not to say there haven't been improvements; there have been tons of improvements over last gen, primarily PBR and dynamic area lights. Some games even have realtime GI/indirect lighting. But Holla Forums doesn't care, because MUH SHADERS. All you faggots care about is high resolution models and textures.

This reads like that "6GB? Try 500GB!" youtube comment.

Yeah, in a sense. The big graphical upgrades in the last fifteen years have been higher-res textures and models with higher polygon counts. Those, where done well, have basically reached very realistic levels. This is why shit's looked the same for the last few years. Devs have been lazily adding shitty filters and effects on top of it but those don't do much.

The secret to graphics improvement moving forward is 100% lighting. That's why OP's screenshot looks so good. Of course the problem here is that great lighting requires knowledge of mathematics and physics, which modern devs obviously don't have. So we're fucked for now.
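For anyone wondering what "knowledge of mathematics and physics" actually means here, even the simplest textbook direct-lighting model already involves vector math per pixel; a toy Lambert-plus-Blinn-Phong term looks like this (modern PBR engines use microfacet BRDFs that are considerably more involved, this is just the classic classroom version):

L_o = I \left[ k_d \max(\mathbf{N}\cdot\mathbf{L}, 0) + k_s \max(\mathbf{N}\cdot\mathbf{H}, 0)^{\alpha} \right], \qquad \mathbf{H} = \frac{\mathbf{L}+\mathbf{V}}{\lVert \mathbf{L}+\mathbf{V} \rVert}

where N is the surface normal, L the direction to the light, V the direction to the camera, and alpha controls the tightness of the highlight. Global illumination is what you get when you have to account for light bouncing between surfaces instead of just this single direct term, which is why it's still so expensive.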

What the hell are you sperging about?

I care about gameplay, which is the main thing lacking from common games, but graphics have stagnated now too and there is no excuse for it. It's a symptom of a larger problem in the industry.
"Things change" isn't an argument. How about we explore the core of the issue: large companies like EA have swallowed everything in their path and outsourced all development to code monkeys in sweatshops. Same thing happened to animation in both the States and Japan. Everything is shit for a reason: people keep buying shit.
Replace "developers" with "publishers" and you'll have a better argument. That said, they are no authority, and again the reason for lack of innovation comes down to normalfags buying shit and even defending it.

The core of the issue of visuals not improving within the generation is that anything previous to this gen was not x86, i.e. you had a lot of room to expand within the hardware as you went along because the platform was mostly an unknown quantity. This gen, with everything being pretty much x86, there's nothing to grow into; it's a known quantity.

Yup, you're retarded.

Why taunt me.

Could you rephrase


Go back to ResetEra with your industry shilling please, thank you

That sounds like a tool problem to me, and a need for even faster render rigs. Render the scene for the developer in real time as he edits and he won't need a degree in math

all my this

You forgot to call them a pedo lover. You shame us all.

Seriously? We know what we're working with and that's why progress is slow? Do you, like, put your head down and run full speed into a wall before each post?

My nigger

not him, but the game looks like the opposite of painkiller. they've forgotten how to make a fast fps.

You can tell that from 10 seconds of pre alpha gameplay footage?
Nigger' I'll hit you
Painkiller movement was slow as shit, what was fast was your reaction times to deal with the enemies and their attacks.

yes, I can tell from 10 seconds of pre alpha gameplay footage.

player is constantly using irons for what is probably a dinky starter pistol. there are dumb useless "powers" like that shockwave thing. dashing everywhere.

if it was painkiller, he would bunnyhop around the map and shoot that hairy snail thing in the face with a shotgun. maybe you think painkiller is slow because you didn't hit the space bar more than once in a row?

Could you elaborate? Isn't it a golden opportunity to be the next Carmack? Or make a game so well optimized that it runs on any rig, making the selling range way higher and unexpectedly pleasing the customers, giving you a fantastic word-of-mouth advantage over the competition?

Nigger you haven't see what the game offers yet, its a showcase and nothing more.

The great graphics push has slowed down a lot because of 4K hardware. Everyone wants 4K for some retarded reason, and modern consoles can't handle it, so every other improvement is effectively on hold. Even PC is going to take a while to effectively catch up to this, currently only 1070+ fags can handle it, and most of the current gen machines are running 1060 / AMD 580, and that's not even counting the people who haven't upgraded yet.

Honestly though, I'm not even mad. Graphics faggotry is shit and needs to die.

Imagine retroactively becoming a faggot by defending mac computers past and present. Amazing.

Not featured in this pic but world shadows project upon smoke.

You completely avoided my point about all the shit that makes all development expensive to focus on which kind of development is relatively most expensive. Meaning, once again, you're using a magnifying glass to look at one tree in a burning forest.

Blame reviewers for only running 4k and only judging games and cards by their 4k performance.

Who gives a shit?
Massive could make a nigger taking a shit look good.
This is a 2007 game
Fuck Ubisoft fuck it to hell along with their shitty fucking games.

it's not promising when every shot they're previewing showcases shitty mechanics. I'll reserve final judgement for when I finish the game, obviously. but you're delusional if you think such glaring issues don't exist just because all they have to show is a vertical slice. who is to say those mechanics aren't final?


my 760 could pump out 90% of what I asked from it, and was only replaced (with a 1070) for non-performance related reasons.
this

PT still has the most impressive photorealistic graphics I've seen.

of course it looks good, it's a fucking walking simulator

Yeah, you can probably get something that looks like modern games on a PS2 if you make an indoors walking simulator like that. Why am I even talking reason to a consoletard

because they weren't so prominent here 4 years ago

There's also the VR meme to take into account.

And Company of Heroes from 2006 had fully destructible environments.

If you know exactly what the platform can do from the start, how do you expand out from there? There is minuscule difference in quality between launch games and current games, because there is nothing extra to tap unlike something like the Cell or Emotion Engine where no one knew what the fuck they were doing starting out. How is that hard to understand?

A good example is Killzone: Shadow Fall which has IBL, dynamic area lights and volumetric lights yet is still a launch title. I think it even has lit particles, but probably no GI other than SH.

...

go back to slashdot. this retarded debate is 20 years old.
tech.slashdot.org/story/99/10/21/0848202/risc-vs-cisc-in-the-post-risc-era

When will graphics drivers be updated to start drawing at the center of the Foveal region and spiral outwards from there, ditching the "draw from top left to bottom right" methodology? A timer could be used so that after about 16 milliseconds or so, the frame is committed to the video output and the next one begun, regardless of how far out into the peripheral was actually rendered. Honestly, you only need top fidelity inside the foveal region anyway, middling fidelity in the blend region, and blurry shite for the peripheral. It's how your eyes/brain work, so why over-render areas the human player isn't going to look at anyway?

Only SJWs use the singular they.

Someone wasn't bunny hopping

Bull fucking shit. The human eye moves up to 900 degrees per second, with an unpredictable stopping location. 90 FPS isn't nearly fast enough to cope with that.

This is already being done, but instead of changing the fundamentals of rendering, you just reduce LOD in the peripheral region, which is much easier. Lots of research is being done to combine it with eye tracking for dynamic handling of LOD within your field of view.

That's cool, and smart, but the idea I had in mind would also guarantee a solid frame rate. You start by drawing the #1 most important region of the screen and render until you run out of time (16ms for 60 FPS). You commit the frame regardless of how far out it made it. You never go over time.
It would be a more fundamental shift than aggressive LOD, so harder to implement, but it seems clever.
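A minimal sketch of what I mean, assuming the screen is already split into tiles and that renderTile()/presentFrame() are stand-ins for whatever a real driver would expose (made-up names, not an actual API): sort tiles by distance from the gaze point, draw until the 16 ms budget is spent, then present whatever finished.

#include <algorithm>
#include <chrono>
#include <vector>

struct Tile { float cx, cy; };            // tile centre in screen space

void renderTile(const Tile&) { /* stub: record/submit draws for one tile */ }
void presentFrame()          { /* stub: flip to the display */ }

// Render tiles nearest the gaze point first and stop when the frame budget
// runs out, committing whatever was finished. Illustrative sketch only.
void renderPrioritized(std::vector<Tile>& tiles, float gazeX, float gazeY)
{
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::milliseconds(16);   // ~60 FPS frame budget
    const auto start  = clock::now();

    // Closest-to-gaze tiles first.
    std::sort(tiles.begin(), tiles.end(), [&](const Tile& a, const Tile& b) {
        auto d2 = [&](const Tile& t) {
            float dx = t.cx - gazeX, dy = t.cy - gazeY;
            return dx * dx + dy * dy;
        };
        return d2(a) < d2(b);
    });

    for (const Tile& t : tiles) {
        if (clock::now() - start >= budget)
            break;                         // out of time: skip peripheral tiles
        renderTile(t);
    }
    presentFrame();                        // commit regardless of how far we got
}

The obvious catch (besides needing eye tracking to know the gaze point) is that GPUs submit work asynchronously, so "stop when the clock says so" is nowhere near as clean in a real renderer as it is in this loop.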

Are you fucking retarded?
Why expand when they barely use an inch of what it can do and consumers will lap it up regardless?

There hasn't been a game to really set a benchmark in graphical fidelity since Metal Slug 3.

Serious Sam did that shit a million years ago

That's a terrible idea. Your idea of the most important part of the scene has almost nothing to do with where the player is looking at any one moment.

Fuck off, retard.

I fucking love Metal Slug and its God-tier spritework, but come on.
KoF XIII is the peak of spritework and no one will likely invest in something like that ever again.


Hello cuckchan normalfaggot

It's fast enough that you can't see flickering and can't tell if the tracking beam lags, and that's all you really need. With eye-tracking you can do selective motion blur so that objects you're tracking are clear and things that move across your retina get blurred - they do not naturally do that in modern VR due to low persistence, you get stop-motion effect all around if you go without blur.

That is some seriously retarded 4chan logic you're using there.

Not an argument.
I'd be interested in some real-world eye-tracking data. I highly doubt the player is looking at the periphery of a computer monitor (even in the traditional setup) and probably even more so with VR. The foveal region is real, and it makes sense (to me) to prioritize it in every draw call.

you are objectively retarded. The problem of bloated production cost go back to 2001. Its the reason why Shenmue 3 art is being outsourced to street shitters. Yes that can all be nebulously blamed on "bad management" but that does not refute the fact that the FUCKING BLOATED COST OF PRODUCTION IS A LARGE PART OF WHY THEY ARE TAKING LESS RISK. If developers worked with a middle market budget they could potentially be more profitable but that also is more risky. Spending money on marketing and catering to SJW, and Graphic Whores is a sure thing. At the end of the day making a good video game is more of a gable then pumping money into graphics, shallow virtue signaling and advertising.
>>>/reddit/

Never liked XIII's rotoscoping on 3D models personally.

Are you pretending like it's not what we're seeing here?


It's not production costs
It's marketing costs that are driving up the costs of these shitty fucking games.
They cut corners on everything, completely disregard attention to detail, use proprietary engines so they don't have to pay for a license.

You think Destiny 2, Call of Duty WWII and NuFront 2 cost a lot to make?
I'd argue Bethshit like NuDoom and NuPrey cost more to make but had less marketing costs.
It's all marketing
You know how much Modern Warfare 2 cost to make?
50 Million
Do you know how much it cost to market?
200 Million

They want the normalfag audience and will push out garbage if they can market it and make people aware of it, that's how it works.

Jesus, are you a game journo?
Shit was amazing, and it has full air control so you can do crazy things. I just finished replaying the original and the booh expansion on trauma. It's still a great game today but is probably too hard for today's FPS audience, especially the difficulty spike on Leningrad.

Adjusted for inflation, production costs have been consistently dropping for the past decade across all developers. There isn't any valid rationale, they're just money hungry kikes, is all.

Fair enough, building sprites from scratch is indeed different and the sprites would indeed look different if not based on polygon models.

What the hell is bunnyhopping
t. someone who has never played many FPS

There's no PC hardware that can run 4k at good frame rates yet. 4k@60 is a console pleb target, no one should be playing @60 in 3 CYE.

It's a bug whereupon you don't lose momentum if you jump on the same frame as you landed.

also an afterthought,

This meme that its the "Overplayed Management/CEO" is true as far as its symptomatic of the nepotism of "Jewry" qua "Jewry". But companies would not rationally spend billions on a useless CEO if profit was the soul motive. In other words, it cannot be logically both be "Muh profit motive" and "Muh greedy CEO" at the same time. If its the profit motive the bloated cost of production still influenced the decision making process. If its "Muh greedy CEO" then it still does not address the issue of the bloated cost of production. For example Rareware (see embedded Jewtube) sold-out because they could not afford to remain independent after the 6th generation because of the INCREASE COST OF DEVELOPMENT.


nigger you mean to tell me that a team of 5 guys in 1990 can do the same work done by a modern game development team? Don't you even understand why "Indie" games became popular to produce? Because the Bloated Cost of developing "AAA" games left a vacuum in the market. Too bad it's being filled by shitty SJW and their pretentious "art" games. But that is just how the world works.

The minigun is still fucking trash with its weird spread and hit detection
I don't think I ever finished Battle Out of Hell, should get to it
See

I don't really dabble much in technology. A friend of mine who programs on the side and is an electrical engineer says that games are exceedingly unoptimized. He says that with a decent multi-core CPU we should be getting games with far more advanced everything, running exceedingly well. Is this true?

Many FPS games prior to the console apocalypse of 2005 copied the bugs of Quake-era games because they formed a 'skilled movement' system everyone had gotten really good at. Bunnyhopping was one of those bugs that was intentionally added to games, where you'd gain speed if you jumped immediately after a jump. And air control lets you turn in mid-air which people would use to do obvious things like jump around corners but also to change the distance of a jump with careful wiggling.
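In code the trick is tiny. Roughly this, as a simplified sketch of Quake-lineage ground movement (not any specific engine's actual code): if the jump is held on the landing tick, the friction step never runs, so horizontal speed carries straight into the next hop.

struct Player {
    float velX = 0, velY = 0, velZ = 0;   // velY is vertical
    bool  onGround = false;
};

// One movement tick; jumpHeld is the jump key state this frame.
void movementTick(Player& p, bool jumpHeld, float dt)
{
    const float friction = 6.0f;      // per-second friction factor (illustrative)
    const float jumpVel  = 270.0f;    // launch velocity (illustrative units)

    if (p.onGround) {
        if (jumpHeld) {
            // Jumping on the same tick we landed: the friction branch below
            // never runs, so horizontal momentum is preserved hop to hop.
            p.velY = jumpVel;
            p.onGround = false;
        } else {
            // Grounded and not jumping: bleed off horizontal speed.
            float scale = 1.0f - friction * dt;
            if (scale < 0.0f) scale = 0.0f;
            p.velX *= scale;
            p.velZ *= scale;
        }
    }
    // ...gravity, air acceleration and position integration would follow here,
    // and the air acceleration step is where the actual speed gain comes from.
}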

this is not true. In fact there is some stupid video by Tarmack floating around the internet that makes this claim and actually refutes itself with its own evidence provided.

Speaking of miniguns. Every dev that makes miniguns inaccurate or puts long windup times should be lined up and shot with 30 mm vulkan cannon. Which would be most of them.

The minigun is great, has almost no spread, and is hitscan. I don't know what to say, son. Maybe you played the fuckawful remake rather than the real thing?

Yes, most AAA companies are PR and marketing.
See

Titanfall 2 art and animations were done by one or two guys.

Does it count if I'm working on a turn-based game where triple barreled miniguns are so inaccurate from recoil that the best strategy is dual wielding?

^sorry wrong video. But Rare did make a public statement once as to why they sold to Microsoft and the cost of development was the prime motivator.


And I am saying the Bloated Cost of devilment is a large reason why it is rational to spend money on PR and marketing.

Because:

Higher Cost = More Risk = Need to sell more = Large Demographic of Potential Consumers = Catering to the lowest common denominator = Catering to SJW because controversy can hurt sales

fuck, it's like talking to a brick wall.

Miniguns are basically extreme rapid fire sniper rifles, complete with long sturdy barrels and all. Also they fire between 30 and 80 rounds per second, and no, bigger miniguns are not slower. That said, you have no hope of wielding a minigun barehanded. Not only does the piece of shit itself weigh more than you do, its ammo box is triple that weight yet.

development*

I'm an AA gamedev and they are absurdly unoptimized today. I'm cleaning up a Unity game for release at the moment and the code is horrifying. A lot of the code is written like you'd write bad jQuery code, as no one knows how to even use data structures anymore.

You've obviously never played the game; using the minigun at anything but close-mid range is like pissing against a fully armored Medieval Knight.
You have to wait for the Bayonet WWI Skellie soldiers to line up and be close before you fuck them up, and hope the bigger ones don't fuck you up.

It's a pain in the ass

I forgot it's not a minigun, just a triple barreled automatic rifle found as a rare drop from a joke enemy.

Back in the day, you couldn't hire pajeet for a bag of curry, because he wouldn't be able to make an assembly program that actually runs. Few years later, that would be C program but it would still crash immediately with plentiful memory violations provided. Now that managed is a thing, even the most incompetent codemonkeys can take a jab at it and produce something that doesn't crash, even if's exceedingly ass-backwards and slow.

...

There is no higher cost; they reuse assets, sounds and animations constantly while making money out of lazy shit DLC that modders could do in 1 month's time, and now they are trying to force microtransactions into it.
Fucking GTAV was profitable on its first 10 hours of release and 2K now wants Microtransactions on all their games.

High production costs are a ResetEra-tier meme so the normalfags they pander to with overblown marketing budgets will actually defend this shitty industry

I literally just finished painkiller black on trauma this morning. I replay this game every few years.
Maybe you just can't aim? Painkiller's minigun is unusually accurate at distance, although if you aren't a shitter you'll arc stakes at range.
… why are you using a non-AoE weapon against groups? Circle them to round them up into a pack and hit them with explosives.

No surprises there, the barrier of entry to Unity is so low and people use it for the wrong kind of games.

The devs of the game I'm working on aren't Pajeets, they're almost all WHITE MALES from SF. Gamedevs are not what they used to be. They're basically webdevs thanks to extremely sophisticated engines letting them get away with awful code.

4K 60FPS < 2K 144FPS
Unless you want to buy a fucking Titan V for 3K, for not only vidya but 3D modelling and editing/actually developing shit that requires that level of processing power right now.

You know what actually, at this point hiring pajeets would be a better option.

This. Hell I'd gladly run 1080p at a higher refresh than 4K at 60 (or less…). Getting better frame rates absolutely will improve the way games feel, and in fast paced games will improve your play.

4K is a meme anyway.

this is what happens when compsci programs are a joke and c and c++ are "too hard"


don't want to increase dependence on that shithole of a country. we should be hiring vets, not pajeets.

You know that's not gonna happen. That extra $50 grand that you spend hiring a veteran is a $50 grand not going into your pocket. You know full well the kikes that run the industry will not stand for that.

Yeah I have a GTX 1080 and an overclocked 1700X, and he was telling me that with a properly coded game I should be running everything excellently due to how much they could offload processing onto CPU cores. I have no idea how any of it works. I'm a chef, not a technical guy.

They do hire vets near the end of development to come in and clean up. I do that contract work.

This doesn't make my point any less valid. If anything, it proves it.

No
No, WWI shit is the worst fucking thing because you have to rely on the minigun for everything in it, including the tanks, which just make you waste even more explosives

I feel like I need to explain how to play cuphead..
Ok, so in painkiller, you shouldn't be using the minigun on mobs of enemies at long range, that's dumb. You use the minigun w/cards against bosses, to pick off lone enemies at some range that you don't have time to stake, and sometimes to pick off enemies at closer range that are too close to explode.
For large numbers of mobile enemies, you corral them into a group and use explosives (rockets, stake grenades, that sniper bouncy ball thing, etc.) or fire (booh).
For packs of enemies that aren't very mobile, you stake them while jumping around as sanic. You also stake things as they approach and stake things you can get away with staking because it's fun. If you jump and stake their feet to the ground you can harass their corpse by mashing control and it sprays money like a loot pinata.
For 'big guys', you freeze them and one shot them.
I literally did not use the minigun on Leningrad trauma.

Hey AA dubsman, tell us about some of your work. I'm eager to hear about it.

Can you read at all? That's mid-sentence.
"This is a great story, yes i'm sure that just by bringing up something other than IBM PC hardware I am a macfaggot who drinks coffee"


Retard, it's not "defending mac computers". The point that the guy was making was that because he could run platform exclusives like Doom, that it somehow was saying something about the hardware's capabilities. When in reality, modern source ports prove that he's just wrong and mac hardware at the time could have run Doom, it's just that it's a platform exclusive and was never ported.

Let's put it another way: do you think Nintendo has better hardware because only a Nintendo system can run Mario?


At least this is a legitimate criticism, but it's easy to do further research and find the basis for comparing these two CPUs: the 12-core POWER8 at 3.1 GHz is $2500, and the Intel E5-2697 v2, which has 12 cores at 2.7 GHz, is $2614. So it's completely reasonable to compare the two, since they cost roughly the same amount of money. These are the sources:

ark.intel.com/products/75283/Intel-Xeon-Processor-E5-2697-v2-30M-Cache-2_70-GHz

www-03.ibm.com/systems/power/advantages/smartpaper/memory-bandwidth.html#hl-power8s-12-cpus-versus-xeons-18-cpus

Static room demos stop being impressive when you change to dynamic instead of baked lighting. We're still a generation out from fully dynamic global illumination.

I don't want to get into specifics as it'd not be a good career move to be associated with hatechan. But right now it's just the usual Unity cleanup: massive garbage generation causing hitching, heavy code running per-frame for no reason (with Unity, that's often the DOM-like Component searches and spherecasts, which get expensive), and the lack of anything resembling event-driven code.
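
To give a feel for it without posting anything identifiable, here's the shape of what I keep finding, as a rough sketch in generic C++ (the real thing is Unity C#; every name here is made up):

#include <string>
#include <unordered_map>

// Stand-in engine object, purely illustrative.
struct Component { std::string name; };

// Hypothetical scene-wide lookup, comparable in cost to walking a whole hierarchy.
static std::unordered_map<std::string, Component> g_scene = { {"HUD", {"HUD"}} };
static Component* FindComponentByName(const std::string& name) {
    auto it = g_scene.find(name);
    return it == g_scene.end() ? nullptr : &it->second;
}

// What the cleanup usually turns up: a search plus a fresh allocation, every single frame.
void BadUpdate() {
    Component* hud = FindComponentByName(std::string("HU") + "D"); // temp string + lookup per frame
    (void)hud;
}

// The fix: resolve the reference once, reuse it every frame, allocate nothing in the hot path.
struct HudController {
    Component* hud = nullptr;
    void Init()   { hud = FindComponentByName("HUD"); } // pay the search cost once
    void Update() { /* use hud directly; no search, no garbage */ }
};

The bad version is one line of innocent-looking code, which is exactly why it ends up scattered across the whole project.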

No shit, the spread and hitreg is fucking awful
No, it's not; not only is this not how a minigun works, but you also have to avoid the long-range hitscan fire of 25% of the mob enemies.
So you have to bunnyhop and rocket-launcher fire until you run out so you can circle around the mob and waste 30 boolets to kill a single enemy.
Shit like that isn't fun or skillful, it's a boring chore.

It's like pretending the Toad flood sections in Serious Sam are fun, especially the Second Encounter, where you can Rocket Jump and then only have to shoot rockets at them for 3 hours.

You the guy who worked with Warren Spector?

No.

...

Tesseract has realtime global illumination but no one gives a shit about it because the attached instagib game isn't as good as Cube 2 Sauerbraten.

It's a common issue with Unity games, as it's much easier to just check everything per frame in Update/FixedUpdate. Same thing with timers. Gamedevs can't into asynchronous code anymore, so their code is generally either checking Time every frame or using coroutines, which have their own issues.
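
If the timer point isn't clear, the difference is roughly this, in a generic C++ sketch (made-up names, not Unity's actual API):

#include <functional>
#include <queue>
#include <utility>
#include <vector>

// Polling style: every object re-checks the clock every single frame, forever.
struct PollingBomb {
    double fuseEndsAt = 0.0;
    bool   exploded   = false;
    void Update(double now) {                    // runs every frame whether anything changed or not
        if (!exploded && now >= fuseEndsAt) exploded = true;
    }
};

// Event-driven style: schedule the callback once; nothing runs until it's actually due.
struct Scheduler {
    using Event = std::pair<double, std::function<void()>>;
    struct Later { bool operator()(const Event& a, const Event& b) const { return a.first > b.first; } };
    std::priority_queue<Event, std::vector<Event>, Later> pending;

    void ScheduleAt(double when, std::function<void()> fn) { pending.push({when, std::move(fn)}); }
    void Tick(double now) {                      // only pops events whose time has come
        while (!pending.empty() && pending.top().first <= now) {
            auto fn = pending.top().second;      // copy out before pop invalidates the reference
            pending.pop();
            fn();
        }
    }
};

The polling version scales with the number of objects alive every frame; the scheduler only does work for the things that actually fire.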

...

Which is odd, since most of them are just web developers in disguise, and in webdev pretty much everything is event-driven today.


The Tomorrow Children has real-time GI.

that's a great example of zero artistic competence being carried by technology.

I don't know about that. There's a lot of work behind the looks of this game, even if you might not like it.

I dunno if I'd grant something credibility just because it took a lot of work to make. I'm sure Battleborn took a lot of work to make, and it looks awful.

So you can get paid peanuts, the game director gets the credit for everything you've done and then you're fired and replaced by a less competent coder?

My cousin refuses to play older games. He's been wanting a Legend of Dragoon remake for a while now but won't play it because the graphics are shit. It's weird. He has pretty good taste in vidya, except he won't play a game without next-gen graphics.

resources will be far, far greater, but they will also still be fucking finite; maybe spend them on something more meaningful.

contradictory statement

...

...

On this board: hipsters

>>>/reddit/
>>>/cuckchan/
>>>/ResetEra/
Bye

What a ground-breaking statement.

...

A platitude is a truth you're tired of avoiding.

Because the eye often flicks around to other areas of the screen without the mouse actually moving, so now you're looking at a part of the world that sits in the blurry area of the screen. People don't spend 100% of the time with their gaze locked to the centre of the screen; we look around because our eyes are faster and more precise than our hands. Your mouse movement is analogous to head movement. Yes, you need it to make a large change in viewing direction, but you don't swing your head around for every little movement of the eyes.

Same reason DoF doesn't work in video games and makes them basically unplayable.

eye-tracking dof might be interesting. the current implementation of it is awful, though.

The only way to successfully use eye tracking is to check the eye's movement and focus every few milliseconds and render everything in an oval around where the person is looking.
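
Per pixel (or per tile) the decision is something like this — toy math in plain C++ rather than a shader, and the oval sizes are numbers I made up, not anything tuned to a real tracker:

#include <cmath>

// Toy foveation test: given the gaze point and a pixel, decide how much detail it deserves.
// Returns 1.0 (full shading rate) inside the foveal oval, falling off toward the periphery.
float FoveationWeight(float px, float py,        // pixel position (pixels)
                      float gx, float gy)        // gaze position (pixels)
{
    const float radiusX = 480.0f;                // oval is wider than it is tall,
    const float radiusY = 270.0f;                // roughly matching how the eye scans a screen
    float dx = (px - gx) / radiusX;
    float dy = (py - gy) / radiusY;
    float d  = std::sqrt(dx * dx + dy * dy);     // 1.0 at the edge of the oval
    if (d <= 1.0f) return 1.0f;                  // inside: render at full quality
    return std::fmax(0.25f, 1.0f / d);           // outside: drop the shading rate, never below 1/4
}

The hard part isn't this math, it's latency: if the tracker plus the render pipeline can't keep the oval under your gaze within a frame or two of a saccade, you notice the blur and it's worse than no foveation at all.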

DoF is absolutely retarded because we already have natural DoF. It's like layering the same effect twice. Why devs do this and why blur in general exists is fucking beyond me.

that statement is so objectively stupid.

From 7th to 8th gen?
Barely any difference
Both the PS4 and Xbone were sold at a profit due to their weaksauce, already-outdated hardware, whereas the PS3 and Xbox 360 were both sold at a loss to the company. Nintendo games always have outdated visuals, and most of the money goes towards marketing.

There's barely been any improvement in visuals, textures or otherwise, and some of the most popular games out there look like 2004 releases.

You're fucking wrong
"Higher production costs" is a meme, its gospel spread on the likes of Reddit, Gamefaqs and NeoFag/ResetEra as an excuse and while it was actually relevant during the firsf half of 7th Gen it sure as fuck isn't now.

...

No. x86 is now microcode running on a RISC core. This microcode is largely undocumented. RISC won. x86 hangs around as a compatibility layer in microcode to keep the Intel shitbox running.

That's a supreme car for a supreme gentleman.

available street shitters and build tools are not good reasons.

The CPU is a red herring. The GPU is where the work gets done. Rasterization is on the way out, so the role of marshalling vertex data will be taken away from the CPU. Sphere (path) tracing, ray tracing, physics systems: all running 100% on the GPU. Which is a massively parallel RISC floating-point pixel fucker.
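
If it's not obvious why that maps onto the GPU so well: sphere tracing is basically just this loop run independently for every pixel. Toy single-sphere version in plain C++ rather than a shader, just to show the shape of it:

#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a)          { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Signed distance to the scene; here just one unit sphere at the origin.
static float SceneDistance(Vec3 p) { return len(p) - 1.0f; }

// Sphere tracing: step along the ray by the distance to the nearest surface.
bool TraceRay(Vec3 origin, Vec3 dir, float maxDist, float* hitT) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxDist; ++i) {
        float d = SceneDistance(add(origin, mul(dir, t)));
        if (d < 0.001f) { *hitT = t; return true; } // close enough to the surface: call it a hit
        t += d;                                     // safe step, can't overshoot the nearest surface
    }
    return false;                                   // ray escaped or ran out of steps
}

Every ray only needs the distance function and its own counters, no shared state between pixels, which is exactly the kind of embarrassingly parallel work GPUs eat for breakfast.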

It has no footprint. It is a microcode compatibility layer. It's literally software. It can be patched via firmware updates. It's shit on top of shit, in an impressive shitpile reaching to the sky.

I think having actual documentation for the hardware and the ability to license and produce not-ABC-backdoored silicon is a pretty fucking huge benefit.
IBM Power9 nigger. Intel pee-pants shills doing chan damage control. lol.

they have a limited mind user.
let the poor fools be.

What part of
Did you fucking miss?

I know this, y u no real-time ray trace yet? GPUs are already fucking blazing fast. The real problem is the time it takes for the main CPU, usually x86, to transfer instructions to the GPU to be processed, once disk R/W and the GPU software have been optimized. Hence my incessant faggotry about CPUs being such shit.
What is this meme? I said possibly above because I have no way to confirm this myself. Is there some way to test the RISC layer underneath x86 processors? Or to recompile an entire system a la Gentoo and use the straight-up RISC-based architecture underneath? I already have my hands on a deblobbed-to-the-max and HAP-bit-flipped x86 system. How do I remove even more botnet or even test for it?

You say shit on top of shit. What is the shit underneath the x86 layer? Does it have an architecture name? I read somewhere that processor architectures have been pre-planned for the past 20-ish years. Is there any truth to that?

You and your boyfriend have shit taste.

That has literally always been the case, though.