The world would be a better place if processor development never went past 130nm

Just think about it

The worst mistake when it comes to bloat was RAM and garbage collection, not CPUs.

Yeah, I forgot how RAM wouldn't have gone past about 256 MB per stick so you wouldn't be able to put more than 1GB of RAM on most motherboards.
phys.org/news/2004-09-industry-mass-production-dram-90nm.html

Fuck off NSA

b-b-b-b-b-b-b-b-bu-but muh gemmen
t. Holla Forumseddit

...

Even most gamers on this site would agree that the early to mid 2000s are considered a golden age, and games from that era like CS are still very popular. The improvement in graphics is a joke when you consider that the top-selling games since then were Minecraft, Wii Sports and Angry Birds.

Better graphics has only resulted in polished turds they get to charge more for.

To be fair, ever since the Commodore 64, video games in the PC space have been treated with a lot of respect, given their importance in driving hardware sales and hardware development at the same time. Games have also been shown to be a great demonstration tool for new hardware, even for non-gamers: people who need high-end workstations get shown a demonstration of the latest graphics, which makes games excellent marketing tools.

Of course I wouldn't expect anyone in this thread to care about that stuff, I mean, I'm in a thread literally titled "the world would be a better place if processor development never went past 130nm"

That was a pack-in, you double nigger.

Even so, he still has a point, just with shitty examples. He could've said New Super Mario Bros., because that wasn't part of a bundle, or hell, he could've cited the entire Pokemon franchise as the ultimate example.

Sometimes I wonder what it would be like if computer components still came as through-hole DIPs. Building a PC would literally involve soldering RAM, EEPROM, RTC, CPU and other chips onto perfboard, and wiring it up. If you want something upgradeable you just put a ZIF socket in. We'd have computers running DIP CPUs, like the Amiga or ZX Spectrum.
Then again I just like the look of a nicely laid out set of DIPs.

this is good?
lack of user choice is good?
you think people don't know about privacy breaches?
why is this good?
why is this good?

this is a pretty bad post op delete it now

And then virtualization would have never taken off and if you wanted a home lab you would need a rack full of pizza boxes rather than just one box. No thanks.


you do realize DIP sockets exist?

Yes, but ZIF sockets look cooler and are easier to work with. They are bigger, though, so a regular DIP socket would work if you need to save space.

This sounds like a really bad idea.


Have fun bending pins trying to insert a 10" long DIP CPU into a DIP socket and god forbid you ever have to take it out because it will fucking snap in half if you do

A 10'' long DIP would be silly, but something a little smaller would be neat. Biggest DIP I know of is the Motorola 68k CPU, which is what the Amiga 500 uses.

The world would be a better place if processor development never went israel.

The problem would be fitting enough pins onto it. Intel's Slot 1, used for the Pentium II and III, has 242 contacts and is about 5" long, with the contacts spaced a lot tighter than they are on a DIP.

I sincerely hope you're not implying that Palm and Windows Mobile were good in any sense of the word.

Yes because source code optimization can magically overcome hardware limitations and give us modern day performance on shit hardware.

You assume that "everyone" would be intelligent enough to understand what causes it and why, much less care.
Protip: You're talking about the kind of people who quite literally buy a new computer because they installed a virus.

Ah yes, the good old days of pastel and rampant malware because everything you ran had root permission to do whatever the hell it wanted.

Because websites looking like a Microsoft Word document is truly the epitome of what modern web design should be aiming for.

user, try to understand. These people have no jobs.

I guess you're pretty young but they make DIP insertion and extraction tools

...

Quake 3 never needed more than a single 800 MHz core with 256 MB of RAM and some reasonable GPU.
Other decent games need even less than that.

Oh and did I say that crap games don't matter?

You are just pretending CSS doesn't exist?
Or just being dumb?

boy, do I hate luddites.

Not really, no.

What if the course of history was like "either choose better processors or space travel/time travel/flying cars/holograms n shit", a la Road Not Taken?

No, but modern smartphones have done irreparable damage to the internet.

You can actually do quite a bit with old hardware if you cut out the bloat. There's also the option of multi-processor workstations if you really need more power.

There are plenty of people who understand how software spies on them today but ignore it because they have "nothing to hide". If the situation also involved their machine slowing down noticeably then a good portion of those people would start to care.


You say websites that look like a Microsoft Word document, I say websites that don't make my browser hang, try to track me everywhere with JS based browser fingerprinting techniques, or potentially try to run malicious JS.

Go fire up a single core Athlon or Pentium 4 and tell me how great it is.

I'd love to go back to the time when the interface thread and the threads doing all the hard work ran on the same CPU! I really miss the days of interfaces locking up because you're encoding MP3s!

There was a study done years ago taking a 400 MHz Pentium II, Windows 2000 and Office 2000 and comparing it to a top-of-the-line Pentium 4, Windows XP and Office XP. They ran some industry-standard benchmarks on both machines and, as it turns out, the Pentium 4 machine was slower despite being massively more powerful. Sure, the days of rendering 1080p video in minutes as compared to rendering sub-DVD video in hours would not be possible without modern hardware, but for most tasks, the POZ and bloat have basically enabled normies to use computers and degraded the user experience.

They used to say that what Intel gives, Microsoft takes away. In my personal experience, going from a Pentium 75 with dialup to a Pentium 150 with a cable modem, still in the Web 1.0 days, was an absolutely shocking speed and UX transition. Games ran smooth, webpages loaded INSTANTLY. Then Windows XP happened, along with the Pentium 4 and, soon after, Web 2.0. The experience has only degraded since the heady days of 1997/1998.

Here I am on a 3.4 GHz i7 with 32 GB of RAM and a 200 Mbit symmetrical connection, and typical webpages load as frustratingly slowly as they did 10 years ago. Pages are shit, JavaScript is shit, browsers are shit, OS development is shit.

The Pentium 4 was the first POZ LOAD. AMD developed a kick-ass alternative to the P6 architecture in their K7, so Intel decided to play on normie ignorance by kicking up clock speed and shitting up execution efficiency. No one fell for it of course, and AMD fucking assraped them until the Core 2 came out, but because Intel was such a dominant player, and faggots like Dell just rolled with it, and normies were all like "dude i got a dell", the Pentium 4 survived for years, and with it the philosophy of bloat and flashy bullshit, because who cares? We have tons of clock cycles.

I really miss 90s technology. Hell even Apple's fruity G3 and G4 hardware was infinitely more interesting than the Windows 10 dogshit we are getting.

I was just given a 2000 IBM and a badass AOC CRT

currently shopping for voodoo2

I can hear the deus ex theme song already

This is so true; the ecosystem of bloat needs to end. We're throwing hardware at software problems, and we're never going to get anywhere if we keep going "throw more shekels at it to make it run faster". Why the hell shouldn't my 2GB i5 laptop run everything fine? There are piles of bloat at every level and they're adding up; it's death by a thousand cuts. The only way we're going to deal with this is by going ahead and optimising every open source project out there with needlessly slow code, starting with the foundations: standard libraries and interpreters.

The bloat needs to end or we'll just be paying more and more, throwing out hardware that's perfectly fine in order to buy something we shouldn't need.

Lol just wait til you see that Ryzen presentation from mid-December. That nigger is full of bloat you couldn't imagine, built right into the hardware.

As of now, the web is the safest (because of the same-origin policy and isolation), fastest (because of tons of optimisations in JS implementations) and easiest-to-learn technology stack.

Of course modern ``apps'' are slower than programs written in compiled languages, but this overhead is not for nothing. You can't get this level of abstraction any cheaper.

Also, imagine if there were no JS and ``appers'' had to write their shit in C/C++. Pretty bad scenario, isn't it? That's why I think it's good that JS exists. At least it can't break out of the sandbox and it never segfaults.

CY+2 in one image. The 90s did it better.

That's photoshopped, you know...

It's kind of hard to imagine a female person of poo accounting for the reflection in both the frame and her own reflection in the screen.

Kind of. The mobile ecosystems are ABSOLUTELY DOMINATED by the makers (owners) of the systems, unlike the desktop, where you at least have the option to pretend to exercise control and ownership over your computer.
No, widespread awareness of bloated programs -> pressure to keep programs slim is what is good.
If device performance were at stake, people might give a fuck.
Because XP was awesome. Imagine if MS had to adhere to all the good design principles that went into XP.
Because closed-source (((((Web 2.0 Apps))))) would not have become a thing.
Although I would argue that the Web 2.0 App thing is nice because it allows closed-source code to run in a sandbox.

CAN I GET A MOTHERFUCKING AMEN.

However. Conversely.

Mint XFCE + Adblock means my Pentium 4 rig is performing about as well as most new computers (including some Le Gaymen rigs) I've seen with Win 7/10

I have this fridge. If you plug in a USB stick with images, it'll display them in a slideshow. She just loaded a .PNG of a fakey loading screen onto her fridge.

why would you do this?

I'm renting, it was already here.

No one's stopping you from using 20-year-old technology on your quad core CPU.

doesn't that take all the minerals out of the water?

adding minerals is a fancy way for companies to say they did the bare minimum to pass inspection plus the daily dose of chlorine

Go to Whole Foods and pick up a few different fancy mineral waters (Evian, Voss, Acqua Panna, Gerolsteiner, Perrier, etc.) and taste them at room temperature. There's a yuge difference in taste. Also, you get minerals from water that you don't get in your regular diet.

IF the world could adopt the Smalltalk or LispOS approach to computing then several layers of bloat would be eliminated.

I sure like to agree, OP.

However:


Just think about it

nah, they just would look different

Weebs are good, retard.

...

k

KOREA GO TO HELL!

That implies that browser-based bloatware is barely acceptable with today's computing power.

But it isn't. Therefore, they would have found a way to bloat/botnetify those lower resources.

And indeed that happened. Remember QuickTime or RealPlayer?

...

Not korean, but I'm pretty sure I'm already in hell, with mentally ill -> weebs

no fam, it works like this:

'96 to 2001 had so many incredible titles released that it boggles the mind. Some examples, many of them overlooked as well:
Quake
Duke Nukem 3D
Shadow Warrior
Half-Life
Thief
Deus Ex
System Shock 2
Sin
Unreal Tournament
Fallout / Fallout 2
Arcanum
Planescape: Torment
Starcraft
Descent: FreeSpace
Wing Commander

The decade before too with all its point & click adventures, RPGs, simulators etc.

I think it started to go to shit around 2005/6, with only the occasional truly quality title. The mid-late 90s had comparatively stone-age technology, and yet the 3D positional audio was far better, the somewhat basic 3D graphics have aged gracefully, sometimes even in software rendering, the funny games had attitude, action games were fun and tense, and the serious games had great storytelling.

Now we have water effects, a bukkake of particles, bloom and other meme rendering tricks, and paper-thin stories, garbage-tier voice acting and, worst of all, mass normie appeal. Hate to be the stereotypical 'back in my day...' grandpa, but things really were better back then, at least on the gayming front.

Peripherally related: From the purely aesthetic standpoint, even industrial design has gone to shit in large part, pic related. You did have some wild and ugly shit in the 90s, but it was the exception, not the rule like it is in CY+3

Ever heard of ZIF sockets? They work wonders.

You're just trolling right? Web interfaces the fastest? No way.... but then you're probably excluding the truly fast interfaces for emotional / marketing reasons. I remember 15 - 20 years back when they "upgraded" the local county library system. We had perfectly good dumb terminals that talked to a central server, they were always up unless someone physically damaged the terminal, and they were as fast as the server was. Then.... someone wrangled a federal grant. They installed top-of-the-line (for their time) Dell desktops with nice monitors, and they were locked into doing nothing but providing an IE / JS based web interface to the exact same server. Extremely slow, often down, librarians couldn't figure out how to use them, sometimes they'd get stuck in spic language only mode, and each station had a nice stone monument with a plaque explaining about the grant money (probably cost as much as one of the computers). What did I have to do just to find a book? Go to the children's section, where they still had the text terminals.


Implying more abstraction is a good thing. Don't you want to know where and how your bits and bytes are dancing? I do.


I suppose it's good for shit apps written by shit "appers", but I don't see those as a good thing really. Maybe it's ok for certain uses... maybe for the masses to write the latest crap clone of some game... but for me, personally, the closer I can get to the hardware, the better. Yes, I'm talking assembly (at least on PICs and Atmels), although for PC I'll concede that having a slightly higher-level language makes dealing with a GUI easier (C preferred, C++ begrudgingly for compatibility).

I see what you did there. Ha!

About once every six months I hit up Metacritic and the top 100 games on BitTorrent just to see if there's anything worth playing. Year after year, there never is.

The last single-player game I enjoyed was STALKER: Call of Pripyat, and even that one was uninspired compared to the previous two. Nearly everything is multiplayer now anyways, which is an immediate disqualification because I hate gamers as well.

I remember how bad Pentium 4s were during their actual era. Every fucking business from mom & pop to IBM was loaded up with these exploding-capacitor shitboxes. I'd sit there all day waiting for a series of .NET updates to install or uninstall so I could get whatever industry-specific business shitware to run. Then god forbid you run into one running Vista...

If most of these hadn't died with 30 leaking capacitors I'd probably still see them everywhere. It's like every motherfucker in the world upgraded during the worst fucking series of processors possible, then decided to sit back and not upgrade for 15 years.

Losing argument dude. Developer time costs the company money / execution time costs the user money

..I've used mine for gaming since Xmas 2011

No hardware issues other than the shitty integrated audio grounding

I only retired it in late 2016 when I got a new rig

Now wait just a damn motherfucking second

I was one of the people who thought this thread was fucking retarded.

But I will be damned if people say the Pentium 4 was anything but the processor with the greatest longevity of any processor. People rocked those things as late as 2012. I don't think we will ever see a processor with such longevity again (maybe Skylake and Zen, now that CPU performance gains are hitting diminishing returns). But having a high stock clock speed + hyperthreading + the eventual Intel 64-bit extensions meant that the Pentium 4 managed to remain relevant well into the multicore era, since the majority of programs still used only one core, so for most people the Pentium 4 didn't start becoming a real bottleneck until at least 2010.

Computers running a terminal emulator, or actual terminals? The libraries I went to around that time were still using card catalogs for everything, and the one that had a computer for searching kept it behind a desk; you had to ask someone who worked there to look stuff up for you.


Last or newest? Fallout: New Vegas is pretty good and has great mod support. Fallout 3 is absolute garbage though, and I can't even stand to play it after playing New Vegas. Haven't played CoP yet, but I gave up on SoC rather quickly due to the combination of low gun damage and inaccurate guns forcing me to play more up-close run-and-gun.

Nah, they were fucking terrible in their time. You can get by with them if you use a light Linux, but I remember damn well how consistently painful they were simply running XP or Vista, Outlook, and whatever loadout of shitty .NET business software you'd always run into.

And the most common boxes had insanely high death rates due to bad capacitors and overheating problems. Or more likely they would just randomly flake for a few years before dying completely.


I can't fucking stand Fallout 3 and hate it so much I doubt I'd like New Vegas.

SoC is potato guns until you get the AK-74, or to be more honest, NATO rifles. It's one of the most modded games ever, with new mods still being released, so that can all be fixed dozens of different ways. Also, the firearms can simply be modded yourself in the plain-text .ltx files. Also, play on the most difficult setting, because the only thing the easier settings change is that they nerf guns in a really stupid way for both player and NPCs.

And the longevity had more to do with general economic conditions and the fact that everybody chose to upgrade during that product cycle due to that general push that you got with XP and other Wintel business software. Then everybody decided to never upgrade again so they kept using it.

Speaking of diminishing returns

Wouldn't it make more sense to make specialised processors and pack them together like SoCs?

One for sound, another for video, a third for input and a fourth as the main one?

Kind of like cores, but not as shitty.

There's a large difference between the two. I have a couple hundred hours into New Vegas while I struggled to put 30 hours into Fallout 3 over a period of several months while telling myself it has to get better.

Might have to try that. I've had a feeling that I shouldn't skip further ahead in the series without beating the earlier games.

I've heard that, I was playing on master.

You're referring to a known incident that involved a Chinese company that supplied rejected capacitor batches to certain motherboard manufacturers like HP. But I believe they only affected computers sold between like 2003 and 2006

That's exactly what APUs are, since modern GPUs are pretty much just highly parallel general-purpose CPUs. Nvidia GPUs are literally ARM chips with many integrated cores, for example. Intel's GPUs are something similar, just many small floating-point computation units with a controller wrapper that exposes APIs like DirectX and OpenGL to the programmer.

Don't forget that websites wouldn't be bloated clusterfucks devoting 99.9999% of bandwidth to appearance and 0.00001% to content. Imagine webpages and even streaming videos loading instantly, like zero lag.

Web 2.0 would never have happened.

Yeah, titles like Daggerfall, Deus Ex, Thief II, System Shock, HL2....

All before triple A cancer.

Oh yeah, and another thing: most smartphone SoCs DO in fact also pack the audio chip (in Qualcomm's case it's a DSP that is also used as a DAC for audio), GPU, input controller, and even the 4G/3G/WiFi radio all on the same exact chip package as the CPU. ARM CPU cores actually take up very little die area.

web 2.0 refers to user-generated content, chucklefuck. you're shitposting on a "web 2.0" website right now.
quit using buzzwords you do not understand.

Source on Nvidia's CUDA cores being ARM based?


Yes, it would have. Any website with any client side or server side scripting is "Web 2.0". Every website that users can post on in any way is "Web 2.0" including forums, imageboards, and Wikipedia. The first "Web 2.0" sites came out in the late 1990s.

...

Coined in 1999, you fucking retard.
>>>Holla Forums

Nigger can you not read? The first poster described the outcome while the second poster gave a more technical description of the changes that made that outcome possible. I don't use Google.

guru3d.com/news-story/nvidia-maxwell-to-be-first-gpu-with-arm-cpu-in-2013.html

Coined in 200 BC
>>>/cuteboys/

The generally accepted textbook web 2.0 definition is completely different from the derogatory use of it across the internet.

urbandictionary.com/define.php?term=Web 2.0

Before acting like a jackass, ask people what they meant by an often-misused term.

That's just another one of their Tegra SoCs for phones/tablets/whatever, in particular the Tegra K1, which has 2 or 4 ARM cores and 192 CUDA cores. I don't know what that author is smoking that they think it's the first SoC Nvidia has put out, since they've been making SoCs with various ARM cores since 2008, when they were using an ARM11 core for the CPU.


You mean the definition used by retards who don't know what they're talking about and just latched onto it as a way of complaining about things they don't like?

You just described 100% of the people using the term web 2.0 for the fifteen years since its invention.

Oh please
Devs are writing shittier code because they're not forced to live under hardware constraints anymore. In fact, writing shit code actually INCREASES the number of people who will buy high-end devices, so those two big corps are teaming up with each other to make their big shit-pile intensify.

WHO THE FUCK WOULD SUPPORT THIS

This da truth.

It's a vicious cycle of "yay, we can add another abstraction layer" and planned obsolescence

...

If you're doing unrealistic alternate histories why not just say the world would be a better place if normalfags never warmed up to computers post dotcom-crash?


And you don't have to sacrifice a legitimately useful technological advance either. Your points are kind of weak too:


IMHO the issue is that people are being pragmatic and not idealistic. When a device or software is shit, but does one useful function, many agree to suffer the shit just for the useful function. They don't have the discipline to say "I don't care about the incremental improvement in usability if it is harmful software", and thus create a market for products that can get away with all sorts of shady bullshit so long as they do one thing right. There are small groups who are exceptions to this, but there is a large segment of compromise-junkies who will support any shitty product without thinking.

You have the same problem with consumer products in general, say clothes. Everybody loves bitching about muh sweatshops, muh made in Bangladesh, muh low quality, etc. But when they go to the store all that is forgotten. They won't pay 50% more for a shirt made in the USA, even when the shirt is dirt cheap to the point of irrelevance either way. And these faggots have reached critical mass, so no matter how responsible a consumer you are, you will never attract much interest from the industry simply because they can make more money from retards.

Abstraction layers are not a bad thing. What is bad is how they are used in modern times. A good abstraction layer allows you to eliminate the code below it, thereby reducing total bloat. But in the modern world it's layer of shit upon shit upon shit.
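To make that concrete, here's a toy Python sketch (names made up for illustration, not from any real codebase). The first function is the good kind of abstraction: it replaces the ad-hoc fetch-and-parse code every caller used to copy around, so the code below it disappears. The ones after it are the modern kind: wrappers around wrappers that add a layer without removing anything.

import json
import urllib.request

# Good: one abstraction that replaces the duplicated fetch/parse code
# callers used to carry around. The code "below" it goes away.
def fetch_json(url, timeout=10):
    """Fetch a URL and decode its JSON body."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return json.load(resp)

# Bad: layer upon layer that adds nothing but indirection.
def get_data(url):
    return fetch_json(url)

def get_data_wrapper(url):
    return get_data(url)

def get_data_manager(url):
    # Three calls deep and no new work has been done.
    return get_data_wrapper(url)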

It would also mean that RISC-V and other alternate ISAs would have a chance at competing in performance.

This. A good example of this is consoles, where the PS2 from 2000 had 32 MB of RAM and 4 MB of VRAM, the PS3 from 2006 had 256 MB of RAM and 256 MB of VRAM, and the PS4 in 2013 jumped to 8 GB of shared RAM/VRAM. How much did games really advance from that last hardware jump? I know Fallout 4, when it came out, couldn't even maintain 30 FPS throughout normal play on the PS4 and Xbox One because the developers just stopped caring about optimizing the game to run well on the hardware they were making it for, despite games nowadays having budgets over $100,000,000.

Ya

I mean, I love me some Python, and do appreciate the architecture- and OS-agnosticness of HTML/JS

But I do sympathize with this vicious cycle of abuse relying on Moore's law

Black Isle games are fucking garbage, blow your brains out D&D scum.

normalfags ruined technology.

youtube.com/watch?v=sdSSsuSssg0

All those things would still have happened to some extent; it's just that it would be even worse than it is now.

Developers would simply git gud and optimise the ever-loving fuck out of their code. The issue, though, is that this would make most programs even more locked to a particular platform (x86), since they would utilise as much of the specialised instruction sets as possible.

You don't understand what fuels advancement in tech: it's not the consumers, it's industry. Businesses that are constantly looking for an edge over the competition are what drive advancements. You think the GPU in your computer is what it is today because consumers demanded it? Fuck off, it's because industry demanded it and it eventually filtered down to you. It's for this reason that the most advanced Nvidia GPUs are their enterprise cards (e.g. the P100) and their consumer cards are always one step behind on features (ignoring things like the Titan).

Just because the hardware stops advancing doesn't mean that businesses will stop demanding better performance from their systems or more elaborate websites. It's just going to mean that code will become more and more platform-specific (x86).


Processors being faster is what is allowing alternate ISAs to gain traction, you fucking idiot. As I said above with the instruction sets, imagine if every program relied so heavily on specific x86 instructions that half the code was highly optimised ASM just to get the speed. Doing a port to a different ISA would be so costly that no developer would even consider it, and so no one would bother buying alternate-architecture chips because lolnosoftware.

FOSS or proprietary software that was written in C or other languages and didn't rely on platform-specific ASM would be completely awful in comparison, and few would actually develop such software.

Compare this to today, where developers can create C# or Java programs that are superior in functionality to many C or C++ programs of 10 years ago. Despite often being poorly coded, they still run well on modern systems and can be released on alternate ISAs without much effort.
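As a toy illustration of that lock-in, in Python (the fast_x86_dot module below is hypothetical, standing in for any hand-tuned, x86-only fast path): every such path is one more thing that has to be rewritten before the program runs well on a different ISA, while the portable fallback just works everywhere, only slower.

import platform

def dot(a, b):
    """Portable fallback: runs on any ISA the interpreter supports."""
    return sum(x * y for x, y in zip(a, b))

# Hypothetical hand-optimised, x86-only extension module (think SSE/AVX).
# Nice for speed, but porting to ARM or RISC-V means rewriting it.
if platform.machine().lower() in ("x86_64", "amd64"):
    try:
        from fast_x86_dot import dot  # shadows the portable version
    except ImportError:
        pass  # module not built: stick with the portable fallback

print(dot([1, 2, 3], [4, 5, 6]))  # 32 either way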

Also, RISC-V is garbage; if it were any good, manufacturers would be all over it like flies on shit, because it would mean no more having to license IP from ARM.


You have objectively shit taste.

I could have the same system I'm getting in February; it'd just be EATX instead of uATX.

No.


No, you wouldn't, because going to that extent would be ridiculously expensive compared to what similar performance costs nowadays.

This is wrong. Enterprise-level equipment is the exception in the GPU world, not the norm. The GPU industry ultimately relies on consumers wanting to play new videogames as a source of billions of dollars of revenue.

Of course enterprise GPUs will be more powerful; their unit price is much higher. However, don't believe for a second that if consumers magically stopped purchasing GPUs overnight the industry would carry on like normal. That would wipe out ~90% of the funds for research and development.

Forgot picture

The problem is that we advanced too fast. When computer development was slow, people had time to innovate with the technology we had at the time. There were constraints, you had to work with the hardware. Now we have resources so superfluous software developers can make the file size of programs 5x what they should be to prevent piracy. Software is still developed in the same way it was (maybe even) two decades ago.

Dude, I forgot that for all these years until now.

You motherfucker ass-faggot.

Not responding to your post, just that pic
I fucking despise idiots who see PC shipments going down and say "The PC is dead!"
No, you stupid fucks, the PC isn't dead; PC sales are just normalizing, which means the PC has no more room to grow. "Post-PC era" is a play on "post-industrial era": it doesn't mean industry is dead in the first world, it just means industry is not expected to experience any more significant growth in the first world, prompting investors to focus on information and services instead. People completely misunderstood both of these buzzterms.

Let me rephrase that:
That would wipe out ~90% of the money for the shareholders.

GPUs have been a scam since 2010+, imo.
See this post, which has been going since 2013.

Until we have manuals and the freedom to do what we want with them, there is no point changing GPUs... well, except when one dies.
I have a GeForce 6200 and it's more than enough to play any video (even HD) and basic games.
(Plus some functions in the driver weren't completely implemented on it.)

Forgot link
eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/

Please fuck off back to reddit where you belong.

You don't know shit about business. The only reason the lower price brackets are so cheap is that they were able to bin the same wafers into the higher price brackets.

If Nvidia decided to stop this practice and make new silicon for the higher price brackets, the prices of the lower-end would have to go up as a result. Would you rather have that?

go do a cum tribute to your call of duty disc you dumb fucking nigger pleb.

I am very mad about this new web design. With Google they at least give you the traditional design when they detect you've disabled scripts, but other websites will just force you to scroll down to the end if you want to see all the items. Then your browser will freeze and crash.

It's like they want the goyim to see all the products so they can cash in on spontaneous buying behaviours.

You're blaming the medium rather than the idiots who misuse it.

The Pentium 4 was utter crap even back when it was new. High clocks meant shit when terrible IPC had it soundly beaten by Athlons with 30% lower clock rates (and prices). The only reason a glut of people held onto them for this long is the economic crash in the West leaving people with little spare money for not-absolutely-necessary upgrades.

As far as longevity is concerned, Sandy Bridge is likely what will last the longest. Every subsequent iteration of desktop chips has been a tiny bump in IPC with no improvement in clocks, cache size, core count, or whatever else. If you bought a quad-core Sandy five years ago, you're set today and will likely be fine-ish five years into the future.

Note that CPUs will likely start improving faster from now on, once AMD is back in the game and Intel is forced to step it up.

t. eight-core Sandy Bridge Xeon owner.

Not sure they're perfectly analogous, though.
Post-industrial societies really have permanently lost industrial/manufacturing jobs.
Whereas PC upgrade cycles just keep getting longer and longer, but my time spent using the PC remains very high. Phones/tablets have replaced close to zero of the utility of my PC.

They're too slow for the modern world

Smart home devices are the worst shit that has yet been invented and accepted by consumers. Worse than gaming laptops.

To be fair, it's mostly what you have described, but you have to take into account the modern teaching that we have.
I've got family, and one of the young ones showed me how they were learning to code.
The boy is 14 and goes on this:
code.org/
How the fuck do you expect people to be able to make software correctly if they only have garbage teaching?

Also:
see the sponsors

Phones/tablets DO IN FACT replace almost EVERY task one could do on a PC, with the exceptions of gaming and coding.

(This is depressing but true nonetheless)

If a kid has even a sliver of initiative, docs.python.org/3/ and [email protected] will teach him everything.
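For a sense of scale, the tutorial on that site gets a beginner to roughly this level within its first few chapters. A minimal sketch of the kind of first script it points you toward (my own example, not taken from the docs):

# Read numbers from the user, then print some simple statistics.
def read_numbers():
    numbers = []
    while True:
        line = input("Enter a number (blank to stop): ")
        if not line.strip():
            break
        try:
            numbers.append(float(line))
        except ValueError:
            print("Not a number, try again.")
    return numbers

def main():
    nums = read_numbers()
    if nums:
        print("count:", len(nums))
        print("min:", min(nums), "max:", max(nums))
        print("mean:", sum(nums) / len(nums))
    else:
        print("No numbers entered.")

if __name__ == "__main__":
    main()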

I honestly can't tell the difference in a lot of cases

Games do not need anything more advanced (graphically) than Quake 3 or Half-Life 2 had. These things don't really add value to the gameplay at all.

This is true. I had more fun playing Mario than Grand Theft Auto or The Witcher.


What's wrong with that? What would you teach a 14-year-old?

SICP

GTA:SA was really good, though

OK, this guy for Trump's computer science tsar.

The Little Schemer is a perfect book for teachers teaching Scheme.
Its approach to problems is extremely basic and the difficulty ramps up gradually, so you must have fully understood (or at least guessed) the previous pages.
Someone with no skill can learn from it.
So a teacher teaching kids with it is more than feasible.
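The book itself is in Scheme, but to give a feel for its style, here is the same flavour of exercise transcribed into Python (my own transcription, not from the book): everything is built out of tiny recursive questions like "is the list empty? if not, what about the rest of it?"

# Little Schemer-style recursion, transcribed into Python.
def length(lst):
    if not lst:                   # Is the list empty? Then its length is 0.
        return 0
    return 1 + length(lst[1:])    # Otherwise: one more than the length of the rest.

def member(item, lst):
    if not lst:                   # Nothing is a member of the empty list.
        return False
    if lst[0] == item:            # Is it the first thing?
        return True
    return member(item, lst[1:])  # Ask the same question of the rest.

print(length(["a", "b", "c"]))       # 3
print(member("b", ["a", "b", "c"]))  # True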

Better lighting can make a better stealth game. Better physics and more powerful hardware mean that shooters can be built with destructible environments. The problem is that it is too expensive to fully push the graphics hardware these days, unlike 15 years ago. Plus, no one optimizes or has a QA department anymore, because of frameworks and cost.

Is the constant devaluation of the dollar by the bankers the cause of companies axing QA departments? The purchasing power of the dollar is stripped out so companies start axing things like the QA people. Sales slow down because the costs are too high and people do not have enough money.

If the game industry used free/libre game engines, they would get optimized each time a game was made, instead of us getting shit like Bethesda and their pitiful game engine that hasn't been updated since Fallout: New Vegas.

It's a bit of everything.

Do you really believe they do this to reduce costs just to survive? Any investment is calculated with ~12% ROI. If the ROI is not met, the investment will not be made. If the company survives its first few years, it usually gets the ROI it calculated. Then the company gets more known for its products; they improve and it gets better known still. After all, you cannot survive as a newcomer if you produce crap anyway.

Then at some point, managers and consultants show up and say you need to (((optimize))) things to get a better ROI for your investors. What happens next is they fire all the best engineers, who had all the know-how, because their wages were the highest in the company, and instead they import pajeets who work for 60% of the wages. Then they change the raw materials to cheaper ones or just cut the materials down to 85% of the former use. Then they fire all their test engineers, because today you as the customer are the beta tester. In short: quality sinks heavily.

Meanwhile the prices for their products stay the same and they increase their ROI from 12% to 18% for a few years. Then all the customers who bought from them again after experiencing a good buy the first time feel betrayed and leave for the competition. Then they go bust. And all this shit only because some tech-illiterate managers were too greedy and had no idea that you don't fire your best people.

It appears you used Ixquick or Startpage, user; that engine sucks balls these days.

That post smells like it was written from experience.

Who were you working for before being fired and replaced by pajeets?

That should be hardware- and typing-intensive tasks, though a convertible tablet or a tablet with a decent Bluetooth keyboard (not one of those keyboard cases) can do a decent job for the latter.

It's not like you're forced to be public. Just start a private company, borrow funds and go small in the beginning unless you have capital sitting around.

In software you don't even need any expensive equipment or materials. Just people capable of seeing a vision and accepting little or no pay until a project gets off the ground.

This is true

Many nice toys (for adults and kids) would not exist or would cost you a fortune.
Not to forget: no programmable or electric helpers in your household.

ITT: DA JOOOOS

that and mergers gobbling up the innovative/creative companies that make what most of us consider 'good'.

Fuck you. Multiple times my computer has hung for minutes at a time when I open some abortion of JS and CSS on some god-awful news website.
The "design" and "aesthetics" are fucking worthless. Sites made by designers don't look pretty, or cool, or anything noteworthy; they look completely mundane and pedestrian. All this for nothing.
I hope Holla Forums takes power and sends all designers into the ovens. If not Holla Forums, then I hope we have a communist revolution and they get sent to gulags.

I forgot to add, this is on my desktop with 8 GB of ram.

Yeah, wouldn't want the poor kid to play with something like Legos now, would we? Worse yet, he could get some unpowered woodworking tools or something and pick up a crafts hobby. Or (gasp) go outside and play with his friends.


What's wrong with how MS Word looks anyhow? For centuries books have worked great for disseminating information. If it were such a bad format, you'd think they'd have switched to something else by now.

The TMS9900 was about 80mm long.