The world would be a better place if processor development never went past 130nm

Levi Johnson

Just think about it
handheld mobile devices never advance past early-mid 2000s PDAs and early smartphones
dual core processors don't take off meaning bloated software will cause your computer to hang completely, which will be obvious to less knowledgeable users pushing even them away from bloated software
programs that send information about you to companies in the background would have an obvious effect on performance and would be rejected by everyone
Windows doesn't advance much past XP
javascript never advances past being used to add fancy effects for users who choose to enable it

All URLs found in this thread:
http://phys.org/news/2004-09-industry-mass-production-dram-90nm.html
http://www.guru3d.com/news-story/nvidia-maxwell-to-be-first-gpu-with-arm-cpu-in-2013.html
http://www.urbandictionary.com/define.php?term=Web+2.0
https://www.youtube.com/watch?v=sdSSsuSssg0
http://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/
http://code.org/
https://docs.python.org/3/
Grayson Sullivan

The worst mistake when it comes to bloat was RAM and garbage collection, not CPUs.

Christopher Thompson

Yeah, I forgot how RAM wouldn't have gone past about 256 MB per stick so you wouldn't be able to put more than 1GB of RAM on most motherboards.
http://phys.org/news/2004-09-industry-mass-production-dram-90nm.html

Eli Jackson

Fuck off NSA

Cooper Reyes

b-b-b-b-b-b-b-b-bu-but muh gemmen
t. /v/eddit

Cameron Cooper

implying games weren't better then than they are today

Mason Powell

Even most gamers on this site would agree, the early to mid 2000s were considered a golden age, and games from that era like CS are still very popular. The improvement in graphics is a joke when you consider the top selling games since then were Minecraft, Wii Sports and Angry Birds.

Better graphics have only resulted in polished turds they get to charge more for.

Oliver Kelly

to be fair, ever since the Commodore 64, video games in the PC space have been treated with a lot of respect, given their importance in driving hardware sales and driving hardware development at the same time. Games have also been shown to be a great demonstration tool for new hardware, even for non-gamers: people who need high-end workstations will be shown a demonstration of the latest graphics, making games excellent marketing tools.

Of course I wouldn't expect anyone in this thread to care about that stuff, I mean, I'm in a thread literally titled "the world would be a better place if processor development never went past 130nm"

Anthony Hill

Wii Sports
That was a pack in you double nigger.

Christian Evans

Even so, he still has a point, just with shitty examples. He could've said New Super Mario Bros., because that wasn't part of a bundle, or hell, he could've cited the entire Pokemon franchise as the ultimate example

Adrian Stewart

Sometimes I wonder what it would be like if computer components still came as through-hole DIPs. Building a PC would literally involve soldering RAM, EEPROM, RTC, CPU and other chips onto perfboard, and wiring it up. If you want something upgradeable you just put a ZIF socket in. We'd have computers running DIP CPUs, like the Amiga or ZX Spectrum.
Then again I just like the look of a nicely laid out set of DIPs.

Gabriel Rodriguez

handheld mobile devices never advance past early-mid 2000s PDAs and early smartphones
this is good?
dual core processors don't take off meaning bloated software will cause your computer to hang completely, which will be obvious to less knowledgeable users pushing even them away from bloated software
lack of user choice is good?
programs that send information about you to companies in the background would have an obvious effect on performance and would be rejected by everyone
you think people don't know about privacy breaches?
Windows doesn't advance much past XP
why is this good?
javascript never advances past being used to add fancy effects for users who choose to enable it
why is this good?

this is a pretty bad post op delete it now

Jaxon Kelly

And then virtualization would have never taken off and if you wanted a home lab you would need a rack full of pizza boxes rather than just one box. No thanks.

you do realize DIP sockets exist?

Benjamin James

Yes, but ZIF sockets look cooler and are easier to work with. They are bigger, though, so a regular DIP socket would work if saving space is needed.

Jack Taylor

We'd have computers running DIP CPUs
10" long DIP CPUs just to have as many pins as one of the old Slot 1 CPU sockets
This sounds like a really bad idea.

you do realize DIP sockets exist?
Have fun bending pins trying to insert a 10" long DIP CPU into a DIP socket and god forbid you ever have to take it out because it will fucking snap in half if you do

Landon Nguyen

A 10'' long DIP would be silly, but something a little smaller would be neat. Biggest DIP I know of is the Motorola 68k CPU, which is what the Amiga 500 uses.

Aaron Powell

The world would be a better place if processor development never went israel.

Connor Ortiz

but something a little smaller would be neat
The problem would be fitting enough pins onto it, Intel's Slot 1 that was used for the Pentium II and III has 242 contacts and is about 5" long with the contacts spaced a lot tighter than they are on a DIP.
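To put the pin-count problem in numbers, here's a quick back-of-envelope sketch (assuming the standard 0.1-inch DIP pin pitch; the helper function is purely illustrative):

```python
# Back-of-envelope: how long a DIP package would need to be for a given
# pin count, assuming the standard 0.1" (2.54 mm) pin pitch with the
# pins split evenly across the package's two rows.
def dip_length_inches(pin_count, pitch=0.1):
    pins_per_row = (pin_count + 1) // 2
    return (pins_per_row - 1) * pitch

print(dip_length_inches(242))  # Slot 1's 242 contacts: ~12 inches of DIP
print(dip_length_inches(64))   # the 68k's 64-pin DIP: ~3.1 inches
```

So even before the mechanical issues, a Slot 1-class pin count on a DIP really would land in the foot-long range the thread jokes about.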

Christian Myers

handheld mobile devices never advance past early-mid 2000s PDAs and early smartphones
I sincerely hope you're not implying that Palm and Windows Mobile were good in any sense of the word.

dual core processors don't take off meaning bloated software will cause your computer to hang completely, which will be obvious to less knowledgeable users pushing even them away from bloated software
Yes because source code optimization can magically overcome hardware limitations and give us modern day performance on shit hardware.

programs that send information about you to companies in the background would have an obvious effect on performance and would be rejected by everyone
You assume that "everyone" would be intelligent enough to understand what causes it and why, much less care.
Protip: You're talking about the kind of people who quite literally buy a new computer because they installed a virus.

Windows doesn't advance much past XP
Ah yes, the good old days of pastel and rampant malware because everything you ran had root permission to do whatever the hell it wanted.

javascript never advances past being used to add fancy effects for users who choose to enable it
Because websites looking like a Microsoft Word document is truly the epitome of what modern web design should be aiming for.

Dylan Martin

user, try to understand. These people have no jobs.

Daniel Lee

I guess you're pretty young but they make DIP insertion and extraction tools

Isaiah Brooks

hurr durr new things are bad

Alexander Ortiz

Quake 3 never needed more than a single 800 MHz core with 256 MB of RAM and some reasonable GPU.
Other decent games need even less than that.

Oh and did I say that crap games don't matter?

Nathaniel Reed

Because websites looking like a Microsoft Word document is truly the epitome of what modern web design should be aiming for
Are you just pretending CSS doesn't exist?
Or just being dumb?

William Richardson

boy, do I hate luddites.

Ethan White

programs that send information about you to companies in the background would have an obvious effect on performance and would be rejected by everyone

Not really, no.

Evan Hernandez

What if the course of history was like " either choose better processors or space travel/time travel/flying cars/holograms n shit" a la Road Not Taken?

Aaron Butler

I sincerely hope you're not implying that Palm and Windows Mobile were good in any sense of the word.
No, but modern smartphones have done irreparable damage to the internet.

Yes because source code optimization can magically overcome hardware limitations and give us modern day performance on shit hardware.
You can actually do quite a bit with old hardware if you cut out the bloat. There's also the option of multi-processor workstations if you really need more power.

You assume that "everyone" would be intelligent enough to understand what causes it and why, much less care.
There are plenty of people who understand how software spies on them today but ignore it because they have "nothing to hide". If the situation also involved their machine slowing down noticeably then a good portion of those people would start to care.

Ah yes, the good old days of pastel and rampant malware because everything you ran had root permission to do whatever the hell it wanted.
implying security fixes couldn't still be added as time went on

Because websites looking like a Microsoft Word document is truly the epitome of what modern web design should be aiming for.
You say websites that look like a Microsoft Word document, I say websites that don't make my browser hang, try to track me everywhere with JS based browser fingerprinting techniques, or potentially try to run malicious JS.

Christopher Torres

Go fire up a single core Athlon or Pentium 4 and tell me how great it is.

I'd love to go back to the time when the interface thread and the threads doing all the hard work ran on the same CPU! I really miss the days of interfaces locking up because you're encoding MP3s!
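The single-core lockup being described can be sketched in a few lines of Python: park the heavy job on a worker process (standing in for a second core) and the "interface" loop keeps ticking instead of hanging. The `encode` function and tick counter are hypothetical stand-ins, not anyone's real code:

```python
# Sketch: why a second core (or at least a second process) mattered.
# The heavy "encode" runs in a worker process while the main thread,
# playing the role of the interface, keeps responding instead of hanging.
from concurrent.futures import ProcessPoolExecutor

def encode(n):
    # Hypothetical stand-in for a CPU-bound job like an MP3 encode.
    return sum(i * i for i in range(n))

def main():
    ticks = 0
    with ProcessPoolExecutor(max_workers=1) as pool:
        job = pool.submit(encode, 2_000_000)
        while not job.done():
            ticks += 1  # the "interface" stays free to tick while waiting
        print("result:", job.result(), "interface ticks while encoding:", ticks)

if __name__ == "__main__":
    main()
```

On one core with everything in one thread, that tick counter would simply stall until the encode finished — which is exactly the lockup being complained about.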

Christopher Hughes

Yes because source code optimization can magically overcome hardware limitations and give us modern day performance on shit hardware.

There was a study done years ago taking a 400 MHz Pentium II, Windows 2000 and Office 2000 and comparing it to a top-of-the-line Pentium 4, Windows XP and Office XP. They ran some industry standard benchmarks on both machines and as it turns out, the Pentium 4 machine was slower despite being massively more powerful. Sure, the days of rendering 1080p video in minutes as compared to rendering sub-DVD video in hours would not be possible without modern hardware, but for most tasks, the POZ and bloat has basically enabled normies to use computers and degraded the user experience.

They used to say: what Intel gives, Microsoft takes away. In my personal experience, going from a Pentium 75 with dialup to a Pentium 150 with a cable modem, still in the web 1.0 days, was an absolutely shocking speed and UX transition. Games ran smooth, webpages loaded INSTANTLY. Then Windows XP happened, along with the Pentium 4 and soon after Web 2.0. The experience has only degraded since the heady days of 1997/1998.

Here I am on a 3.4GHz i7 with 32GB of ram and a 200mbit symmetrical connection and the typical webpages load as frustratingly slowly as they did 10 years ago. Pages are shit, javascript is shit, browsers are shit, OS development is shit.

Pentium 4 was the first POZ LOAD. AMD developed a kick-ass alternative to the p6 architecture in their K7 so Intel decided to play on normie ignorance by kicking up clockspeed and shitting up execution efficiency. No one fell for it of course, and AMD fucking assraped them until the Core 2 came out, but because Intel was such a dominant player and faggots like DELL just rolled with it, and normies were all like "dude i got a dell" the Pentium 4 survived for years, and with it the philosophy of bloat and flashy bullshit because who cares? We have tons of clock cycles.

I really miss 90s technology. Hell even Apple's fruity G3 and G4 hardware was infinitely more interesting than the Windows 10 dogshit we are getting.
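The "what Intel gives, Microsoft takes away" line is essentially Wirth's law, and the arithmetic behind the benchmark anecdote above is simple. A hedged sketch with made-up but plausible numbers:

```python
# Wirth's-law arithmetic with hypothetical numbers: the effective speed
# a user sees when hardware gets faster but the software stack gets heavier.
def effective_speedup(hw_speedup, bloat_factor):
    return hw_speedup / bloat_factor

# e.g. roughly 6x the raw clock going from a 400 MHz P2 to a 2.4 GHz P4,
# but if the newer OS/app stack does ~7x the work per task, the ratio
# drops below 1 and the "faster" machine feels slower.
print(effective_speedup(6.0, 7.0))
```

The exact factors are guesses; the point is only that a sub-1.0 ratio is enough to reproduce the "newer machine benchmarks slower" result.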

Hudson White

I was just given a 2000 IBM and a badass AOC CRT

currently shopping for voodoo2

Nathaniel Clark

I can hear the deus ex theme song already

Blake Wilson

This is so true, the ecosystem of bloat needs to end. We're throwing hardware at software problems, and we're never going to get anywhere if we keep going "throw more shekels at it to make it run faster". Why the hell shouldn't my 2GB i5 laptop run everything fine? There are piles of bloat at every level and they're adding up; it's death by a thousand cuts. The only way we're going to deal with this is by going ahead and optimising every open source project out there with needlessly slow code, starting with the foundations: standard libraries and interpreters.

The bloat needs to end or we'll just be paying more and more, throwing out hardware that's perfectly fine in order to buy something we shouldn't need.

Joshua Johnson

Lol just wait til you see that ryzen presentation from mid december. That nigger is full of bloat you couldn't imagine, built right in to the hardware

Ayden Howard

javascript never advances past being used to add fancy effects for users who choose to enable it
As for now, the web is the safest (because of the same-origin policy and isolation), fastest (because of tons of optimisations in JS implementations) and the most easy2learn technology stack.

Of course modern ``apps'' are slower than programs written in compiled languages, but this overhead is not for nothing. You can't get this level of abstraction cheaper.

Also, imagine if there were no JS and ``appers'' wrote their shit in C/C++. Pretty bad scenario, isn't it? That's why I think it's good that JS exists. At least it can't break out of the sandbox and never segfaults.
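The abstraction-overhead point can be made concrete with a rough micro-benchmark (Python here standing in for any interpreted stack; the exact ratio will vary by machine): the same sum computed once through an interpreted loop and once through the C-level builtin.

```python
# Rough measure of interpretation overhead: the same sum computed by an
# interpreted Python loop vs. the C-level builtin sum().
import timeit

def slow_sum(n):
    total = 0
    for i in range(n):  # every iteration pays interpreter dispatch cost
        total += i
    return total

n = 100_000
loop_t = timeit.timeit(lambda: slow_sum(n), number=20)
builtin_t = timeit.timeit(lambda: sum(range(n)), number=20)
print(f"interpreted loop: {loop_t:.4f}s  builtin sum: {builtin_t:.4f}s")
```

The builtin wins by a healthy multiple, which is the overhead the post is claiming you pay for — and the optimisations JS engines make are about closing exactly this gap.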

Jason Baker

CY+2 in one image. The 90s did it better.

Kevin King

That's photoshopped, you know...

Benjamin Lee

It's kind of hard to imagine a female person of poo is going to account for the reflection in both the frame and the reflection of themselves in the screen

Brody Kelly

this is good?
Kind of. The mobile ecosystems are ABSOLUTELY DOMINATED by the makers (owners) of the systems, unlike the desktop, where you at least have the option to pretend to exercise control and ownership over your computer.
lack of user choice is good?
No, widespread awareness of bloated programs -> pressure to keep programs slim
is what is good
you think people don't know about privacy breaches?
If device performance were at stake, people might give a fuck
Windows doesn't advance much past XP
why is this good?
Because XP was awesome. Imagine if MS had to adhere to all the good design principles that went into XP.
javascript never advances past being used to add fancy effects for users who choose to enable it
why is this good?
Because closed-source (((((Web 2.0 Apps))))) would have not become a thing.
Although I would argue that the Web 2.0 App thing is nice because it allows closed-source code to run in a sandbox.

Levi Rodriguez

The experience has only degraded since the heady days of 1997/1998.
Here I am on a 3.4GHz i7 with 32GB of ram and a 200mbit symmetrical connection and the typical webpages load as frustratingly slowly as they did 10 years ago. Pages are shit, javascript is shit, browsers are shit, OS development is shit.
CAN I GET A MOTHERFUCKING AMEN.

However. Conversely.

Mint XFCE + Adblock means my Pentium 4 rig is performing about as well as most new computers (including some Le Gaymen rigs) I've seen with Win 7/10

Kevin Lopez

I have this fridge. If you plug in a USB stick with images, it'll display them in a slideshow. She just loaded a .PNG of a fakey loading screen onto her fridge.

Hunter Gomez

buying a """smart""" fridge
drinking tap water out of the dispenser
drinking tap water at all
why would you do this?

Ethan Murphy

I'm renting, it was already here.
implying I would drink it if it wasn't piped through a reverse osmosis filter first

Blake Bennett

No one's stopping you from using 20-year-old technology on your quad core CPU.

Adam Long

doesn't that take all the minerals out of the water?

Hunter Cox

adding minerals is a fancy way for companies to say they did the bare minimum to pass inspection plus the daily dose of chlorine

Luis Robinson

Go to whole foods and pick up a few different fancy mineral waters (evian, voss, aqua panna, gerolsteiner, perrier, etc.) and taste them at room temperature. There's a yuge difference in taste. also, you get minerals from water that you don't in your regular diet.

Luke Gray

IF the world could adopt the Smalltalk or LispOS approach to computing then several layers of bloat would be eliminated.

Jack Garcia

I'd sure like to agree, OP.

However:

the world would be a better place if comic development never went past cave drawings

Just think about it
no mongos and animuh
no weebs jacking off to it and posting on anonymous online boards about it
no weebs migrating to technology boards and claiming it's legit because the origin was some degenerates nippon board

Lucas Smith

IF the world could adopt the Smalltalk or LispOS approach to computing then several layers of bloat would be eliminated.
nah, they just would look different

Logan Lewis

Weebs are good, retard.

Aiden Adams

weebs
good

Dylan Miller

I think they are fine because they like a country with an aging, ultra-conservative (to the point of dying off and still using IE) society that happened to be arch masters at genocide.
k

Isaac Fisher

KOREA GO TO HELL!

Ian Turner

That implies that browser-based bloatware is barely acceptable with today's computing power.

But it isn't. Therefore, they would have found a way to bloat/botnettify those lower resources.

And indeed that happened. Remember Quicktime or Realplayer?

Kevin Williams

implying

Easton Cooper

Not korean, but I'm pretty sure I'm already in hell, with mentally ill -> weebs <- allowed to spam me with animuh pics.

William Morgan

no fam, it works like this:
pre-1982: Prehistory
1982 - 1987: Golden Age
1988 - 1993: Silver Age
1994 - 2001: Bronze Age
2002 - $CURRENT_YEAR: Age of Shit

Caleb Myers

the early to mid 2000s were considered a golden age

96 to 2001 had so many incredible titles released it boggles the mind. Some examples, and many overlooked as well:
Quake
Duke Nukem 3D
Shadow Warrior
Half-Life
Thief
Deus Ex
System Shock 2
Sin
Unreal Tournament
Fallout / Fallout 2
Arcanum
Planescape:Torment
Starcraft
Descent:Freespace
Wing Commander

The decade before too with all its point & click adventures, RPGs, simulators etc.

I think it started to go to shit around 2005/6, with only the occasional truly quality title. The mid-late 90s had comparatively stone-age era technology, and yet the 3D positional audio was far better, the somewhat basic 3D graphics have aged gracefully, sometimes even in software rendering, the funny games had attitude, action games were fun and tense and the serious games had great storytelling.

Now we have water effects, a bukkake of particles, bloom and other meme rendering tricks and paper thin story, garbage-tier voice acting and worst of all, mass normie appeal. Hate to be the stereotypical 'back in my day...' grandpa but things really were better back then, if even in the gayming front.

Peripherally related: From the purely aesthetic standpoint, even industrial design has gone to shit in large part, pic related. You did have some wild and ugly shit in the 90s, but it was the exception, not the rule like it is in CY+3

muh silver painted plastic guangdong gutter oil street shitter aesthetic
muh gloss black aluminum oil and scratch magnet normietricker
muh bukkake of small low contrast tn panels and useless buttons
muh everything by wire meaning everything has latency
muh disposable cheap shit
muh michael bay school of vehicle design and tons of stupid buttons
muh curves and jutting triangles everywhere
muh normalfag prole infestation

Mason Turner

Have fun bending pins trying to insert a 10" long DIP CPU into a DIP socket

Ever heard of ZIF sockets? They work wonders.

Logan Richardson

As for now, the web is the safest (because of the same-origin policy and isolation), fastest (because of tons of optimisations in JS implementations) and the most easy2learn technology stack.

You're just trolling right? Web interfaces the fastest? No way.... but then you're probably excluding the truly fast interfaces for emotional / marketing reasons. I remember 15 - 20 years back when they "upgraded" the local county library system. We had perfectly good dumb terminals that talked to a central server, they were always up unless someone physically damaged the terminal, and they were as fast as the server was. Then.... someone wrangled a federal grant. They installed top-of-the-line (for their time) Dell desktops with nice monitors, and they were locked into doing nothing but providing an IE / JS based web interface to the exact same server. Extremely slow, often down, librarians couldn't figure out how to use them, sometimes they'd get stuck in spic language only mode, and each station had a nice stone monument with a plaque explaining about the grant money (probably cost as much as one of the computers). What did I have to do just to find a book? Go to the children's section, where they still had the text terminals.

Of course modern ``apps'' are slower than programs written in compiled languages, but this overhead is not for nothing. You can't get this level of abstraction cheaper.

Implying more abstraction is a good thing. Don't you want to know where and how your bits and bytes are dancing? I do.

Also, imagine if there were no JS and ``appers'' wrote their shit in C/C++. Pretty bad scenario, isn't it? That's why I think it's good that JS exists. At least it can't break out of the sandbox and never segfaults.

I suppose it's good for shit apps written by shit "appers", but I don't see those as good things really. Maybe it's ok for certain uses.... maybe for the masses to write the latest crap clone of some game.... but for me, personally, the closer I can get to the hardware, the better. Yes, I'm talking assembly (at least on PICs and ATmels), although for PC I'll concede that having a bit higher level language makes dealing with a GUI easier (C preferred, C++ begrudgingly for compatibility).

Jacob Brooks

Peripherally related:
I see what you did there. Ha!

Adrian Rogers

About once every six months I hit up Metacritic and the top 100 games on BitTorrent just to see if there's anything worth playing. Year after year, there never is.

Last single player game I enjoyed was STALKER: Call of Pripyat, and even that one was uninspired compared to the previous two. Nearly everything is multiplayer now anyways, which is an immediate disqualification because I hate gamers as well.

Bentley Parker

I remember how bad Pentium 4's were during their actual era. Every fucking business from mom & pop to IBM was loaded up with these exploding capacitor shitboxes. I'd sit there all day waiting for a series of .NET updates to install or uninstall so I could get whatever industry specific business shitware to run. Then god forbid run into one running Vista...

If most of these didn't die with 30 leaking capacitors I'd probably still see them everywhere. It's like every motherfucker in the world upgraded during the worst fucking series of processors possible, then decided to sit back and not upgrade for 15 years.

Grayson Murphy

Don't you want to know where and how your bits and bytes are dancing? I do.
Losing argument dude. Developer time costs the company money / execution time costs the user money

Bentley Brown

..I've used mine for gaming since Xmas 2011

No hardware issues other than the shitty integrated audio grounding

I only retired it in late 2016 when I got a new rig

Thomas Clark

Now wait just a damn motherfucking second

I was one of the people who thought this thread was fucking retarded

But I will be damned if people say the Pentium 4 was anything but the processor with the greatest longevity ever. People rocked those things as late as 2012. I don't think we will ever see a processor with such longevity again (maybe Skylake and Zen, now that CPU performance gains are hitting diminishing returns). But having a high stock clock speed + hyperthreading + eventual Intel 64-bit extensions meant that the Pentium 4 managed to remain relevant well into the multicore era, since the majority of programs still used only one core, so for most people the Pentium 4 didn't start becoming a real bottleneck for software until 2010 at the earliest

Levi Sanchez

text terminals
Computers running a terminal emulator or actual terminals? The libraries I went to around that time were still using card catalogs for everything and the one that had a computer for searching had it behind a desk and you had to ask someone who worked there to look stuff up for you.

Last single player game I enjoyed was STALKER: Call of Pripyat
Last or newest? Fallout New Vegas is pretty good and has great mod support. Fallout 3 is absolute garbage though and I can't even stand to play it after playing New Vegas. Haven't played CoP yet but I gave up on SoC rather quickly due to the combination of low gun damage and inaccurate guns forcing me to play more up close run and gun.

Tyler Parker

Nah, they were fucking terrible in their time. You can get by with them if you use a light Linux, but I remember damn well how consistently painful they were simply running XP or Vista, Outlook, and whatever loadout of shitty .NET business software you'd always run into.

And the most common boxes had insanely high death rates due to bad capacitors and overheating problems. Or more likely they would just randomly flake for a few years before dying completely.

I can't fucking stand Fallout 3 and hate it so much I doubt I'd like New Vegas.

SoC is potato guns until you get the AK-74 or, to be more honest, NATO rifles. It's one of the most modded games ever with new mods still being released, so that can all be fixed dozens of different ways. Also, the firearms can simply be modded yourself in the plain-text ltx files. Also, play on the most difficult setting, because the only thing difficulty changes is that on easier settings it nerfs guns in a really stupid way for both player and npc

James Gutierrez

And the longevity had more to do with general economic conditions and the fact that everybody chose to upgrade during that product cycle due to that general push that you got with XP and other Wintel business software. Then everybody decided to never upgrade again so they kept using it.

Andrew Anderson

Speaking of diminishing returns

Wouldn't it make more sense to make specialised processors and pack them like SoCs?

One for sound, another for video, third for input and fourth as the main?

Kind of like cores but not as shitty.

Brayden Rodriguez

I can't fucking stand Fallout 3 and hate it so much I doubt I'd like New Vegas.
There's a large difference between the two. I have a couple hundred hours into New Vegas while I struggled to put 30 hours into Fallout 3 over a period of several months while telling myself it has to get better.

Also, the firearms can simply be modded yourself in the plain-text ltx files
Might have to try that. I've had a feeling that I shouldn't skip further ahead in the series without beating the earlier games.

Also, play on the most difficult setting, because the only thing difficulty changes is that on easier settings it nerfs guns in a really stupid way for both player and npc
I've heard that, I was playing on master.

Luis Nelson

And the most common boxes had insanely high death rates due to bad capacitors and overheating problems. Or more likely they would just randomly flake for a few years before dying completely.

You're referring to a known incident that involved a Chinese company that supplied rejected capacitor batches to certain motherboard manufacturers like HP. But I believe they only affected computers sold between like 2003 and 2006

Carson Hughes

That's exactly what APUs are, since modern GPUs are pretty much just highly-parallel general purpose CPUs. Nvidia GPUs are literally ARM chips with many integrated cores for example. Intel's GPUs are something similar, just many small floating-point computation units with a controller wrapper that exposes APIs like DirectX and OpenGL to the programmer

Noah Ortiz

Don't forget that websites wouldn't be bloated clusterfucks devoting 99.9999% of bandwidth to appearance and 0.00001% to content. Imagine webpages and even streaming videos loading instantly, like zero lag.
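For a rough sense of the "loading instantly" claim, a back-of-envelope transfer-time sketch (hypothetical page weights; the 200 Mbit/s link matches the connection mentioned upthread):

```python
# Back-of-envelope transfer time (hypothetical page weights): a lean
# text page vs. a script/ad-heavy page over the same 200 Mbit/s link.
def transfer_seconds(page_bytes, mbps):
    return page_bytes * 8 / (mbps * 1_000_000)

print(transfer_seconds(50_000, 200))     # ~50 KB 90s-style page: ~0.002 s
print(transfer_seconds(5_000_000, 200))  # ~5 MB modern page: ~0.2 s, before any JS even runs
```

Raw transfer alone is a 100x difference, and it's the script parsing and execution on top of that transfer that makes heavy pages feel slow even on fast links.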

Web 2.0 would never have happened.

"hmm i need a new phone"
go to amazon
search: "phone"
select: "by price - lowest to highest"
scroll through page
"this all seems to be accessories, not phones"
bottom of page
no option to move through multiple pages
have to manually click through 60 pages of garbage
each page taking 20 seconds to load due to bloat
FUUUUUUUUCK
lose interest in what i want

Even most gamers on this site would agree
Yeah, titles like Daggerfall, Deus Ex, Thief II, System Shock, HL2....

All before triple A cancer.

Asher Cox

Oh yeah, and another thing: most smartphone SoCs DO in fact also pack the audio chip (in Qualcomm's case it's a DSP that is also used as a DAC for audio), GPU, input controller, and even the 4G/3G/WiFi radio chip all on the same exact chip package as the CPU. ARM CPU cores actually take up very little die size

Aiden Scott

web 2.0 refers to user-generated content, chucklefuck. you're shitposting on a "web 2.0" website right now.
quit using buzzwords you do not understand.

Jace Evans

Nvidia GPUs are literally ARM chips with many integrated cores for example
Source on Nvidia's CUDA cores being ARM based?

Web 2.0 would never have happened.
Yes, it would have. Any website with any client side or server side scripting is "Web 2.0". Every website that users can post on in any way is "Web 2.0" including forums, imageboards, and Wikipedia. The first "Web 2.0" sites came out in the late 1990s.

James Jenkins

the most controversial term means EXACTLY what i just googled it to mean!
two posters manage to give two different definitions
because google gives different results according to location

Aaron Taylor

World Wide Web websites that emphasize user-generated content, usability (ease of use, even by non-experts), and interoperability (this means that a website can work well with other products, systems and devices) for end users. The term was popularized by Tim O'Reilly and Dale Dougherty at the O'Reilly Media Web 2.0 Conference in late 2004, though it was coined by Darcy DiNucci in 1999
Coined in 1999, you fucking retard.
/v/index.html

William Davis

two posters manage to give two different definitions
Nigger can you not read? The first poster described the outcome while the second poster gave a more technical description of the changes that made that outcome possible. I don't use Google.

Angel Long

http://www.guru3d.com/news-story/nvidia-maxwell-to-be-first-gpu-with-arm-cpu-in-2013.html

Hudson Turner

cybernetic - skilled in steering or governing a boat
Coined in 200 BC
/cuteboys/index.html

Adrian Howard

The generally accepted textbook web 2.0 definition is completely different from the derogatory use of it across the internet.

http://www.urbandictionary.com/define.php?term=Web+2.0

Before acting like a jackass, ask people what they meant by an often-misused term.

David Cox
David Cox

That's just another one of their Tegra SoCs for phones/tablets/whatever, in particular the Tegra K1 which has 2 or 4 ARM cores and 192 CUDA cores. I don't know what that author is smoking that they think it's the first SoC that Nvidia has put out since they've been making SoCs with various ARM cores since 2008 when they were using an ARM11 core for the CPU.

from the derogatory use of it
You mean the definition used by retards who don't know what they're talking about and just latched onto it as a way of complaining about things they don't like?

Robert Gutierrez
Robert Gutierrez

You just described 100% of the people using the term web 2.0 for the fifteen years since its invention.

Bentley Green
Bentley Green

Oh please
Devs are writing shittier code because they're not forced to live under hardware constraints anymore. Worse, writing shit code actually INCREASES the number of people who will buy high-end devices, so those two big corps are teaming up with each other to make their big shit-pile intensify

WHO THE FUCK WOULD SUPPORT THIS

Grayson Turner
Grayson Turner

This da truth.

It's a vicious cycle of "yay, we can add another abstraction layer" and planned obsolescence

Thomas Roberts
Thomas Roberts

accepted textbook definition
urbandictionary.com

Joseph Gray
Joseph Gray

If you're doing unrealistic alternate histories why not just say the world would be a better place if normalfags never warmed up to computers post dotcom-crash?

computers/internet are seen as purely business-related, boring things
autists and hobbyists support a niche market similar to the status of ham radio today
pozzed retard uis never a thing because of weak market demand
social media never gets big and cancerous because normies see offline interactions as richer
online shopping never grows too much because normies prefer a human cashier
online spying is never done seriously since too few interesting people are online for the govt to care
no sjw poz in tech because computers are lame and boring and capitalist
no fake nerds and gammergrls because technology isn't fetishized so there is no incentive to fake it

And you don't have to sacrifice a legitimately useful technological advance either. Your points are kind of weak too:

mobiles have always been poz with just enough useful features to con you into accepting a ton of bullshit
you can have bloat regardless of how many cores there are; deliberate bloat (as opposed to a busy-beaver bug) usually doesn't eat all the resources for that precise reason, just enough to make everything slow
phoning home has always had minimal performance impact; this has been a retard-friendly meme used by privacy advocates to argue with advertisers. In reality the problem is not performance but privacy (which is harder to defend to the general public, i.e. retards)
windows was always shit compared to linux, although xp was less shit yes
this is also somewhat legit

IMHO the issue is that people are being pragmatic and not idealistic. When a device or software is shit, but does one useful function, many agree to suffer the shit just for the useful function. They don't have the discipline to say "I don't care about the incremental improvement in usability if it is harmful software", and thus create a market for products that can get away with all sorts of shady bullshit so long as they do one thing right. There are small groups who are exceptions to this, but there is a large segment of compromise-junkies who will support any shitty product without thinking.

You have the same problem with consumer products in general, say clothes. Everybody loves bitching about muh sweatshops, muh made in bangladesh, muh low quality etc. But when they go to the store all that is forgotten. They won't pay 50% more for a shirt made in the USA, even when the shirt is dirt cheap to the point that the difference is irrelevant. And these faggots have reached critical mass, so no matter how responsible a consumer you are, you will never attract much interest from the industry simply because they can make more money from retards.

Cooper Parker
Cooper Parker

Abstraction layers are not a bad thing. What is bad is how they are used in modern times. A good abstraction layer lets you eliminate the code below it, thereby reducing total bloat. But in the modern world it's layer of shit upon shit upon shit.
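The "eliminate the code below it" point can be sketched in a few lines of C. This is a hypothetical illustration (the function names and constants are made up, not from any real API): the platform mess lives in exactly one place, so every caller gets to delete its own copy of it.

```c
#include <string.h>

/* The layer: one narrow entry point hiding the per-platform mess.
 * A good abstraction means this #ifdef exists ONCE, here, instead of
 * being copy-pasted into every caller. */
long path_limit(void) {
#ifdef _WIN32
    return 260;   /* classic Win32 MAX_PATH */
#else
    return 4096;  /* typical PATH_MAX on Linux */
#endif
}

/* Callers are identical on every platform; no duplicated #ifdef bloat. */
int path_fits(const char *p) {
    return (long)strlen(p) < path_limit();
}
```

The modern failure mode the post complains about is stacking a `path_fits` wrapper on top of another `path_fits` wrapper without ever deleting anything underneath.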

Jordan Barnes
Jordan Barnes

Would also mean that RISC-V and other alternate ISAs would have a chance at competing in performance.

Ian Rivera
Ian Rivera

Devs are writing shittier code because they're not forced to live under hardware constraints anymore.
This. A good example of this is consoles, where the PS2 from 2000 had 32 MB of RAM and 4 MB of VRAM, the PS3 from 2006 had 256 MB of RAM and 256 MB of VRAM, and the PS4 in 2013 jumped to 8 GB of shared RAM/VRAM. How much did games really advance from that last hardware jump? I know Fallout 4 at launch couldn't even maintain 30 FPS throughout normal play on the PS4 and Xbox One because the developers just stopped caring about optimizing the game to run well on the hardware they were making it for, despite games nowadays having budgets over $100,000,000.

Angel Wood
Angel Wood

Ya

I mean, I love me some Python, and do appreciate the architecture- and OS-agnosticism of HTML/JS

But I do sympathize with this vicious cycle of abuse relying on Moore's law

Chase Miller
Chase Miller

Black Isle games are fucking garbage, blow your brains out D&D scum.

James Ramirez
James Ramirez

normalfags ruined technology.

https://www.youtube.com/watch?v=sdSSsuSssg0

Leo Wilson
Leo Wilson

All those things would still have happened to some extent, it's just that everything would be even worse than it is now.

Developers would simply git gud and optimise the ever loving fuck out of their code. The catch is that this would make most programs even more locked to a particular platform (x86), since they would utilise as much of the specialised instruction sets as possible.

You don't understand what fuels advancement in tech: it's not the consumers, it's industry. Businesses constantly looking for an edge over the competition are what drive advancements. You think the GPU in your computer is what it is today because consumers demanded it? Fuck off, it's because industry demanded it and it eventually filtered down to you. It's for this reason that the most advanced Nvidia GPUs are their enterprise cards (e.g. the P100) and their consumer cards are always one step behind on features (ignoring things like the Titan).

Just because the hardware stops advancing doesn't mean that businesses will stop demanding better performance from their systems or more elaborate websites. It's just going to mean that code becomes more and more platform (x86) specific.

Processors being faster is what is allowing alternate ISAs to gain traction you fucking idiot. As I said above with the instruction sets, imagine if every program relied so heavily on specific x86 instructions to the point where half the code was highly optimised ASM to get the speed. Doing a port to a different ISA would be so costly that no developer would even consider it and so no one would bother buying alternate architecture chips because lolnosoftware.
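The lock-in pattern described above looks something like this in practice. This is a made-up sketch (the function is illustrative, not from any real codebase): a hot loop with an x86-only SSE2 fast path next to a portable fallback. Imagine half a program written this way, and what an ARM or RISC-V port would cost.

```c
#include <stdint.h>
#if defined(__SSE2__)
#include <emmintrin.h>  /* x86-only header: SSE2 intrinsics */
#endif

int64_t sum_i32(const int32_t *v, int n) {
#if defined(__SSE2__)
    /* x86-specific path: four 32-bit adds per instruction, but every
     * such block must be rewritten for each new ISA. */
    __m128i acc = _mm_setzero_si128();
    int i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_epi32(acc, _mm_loadu_si128((const __m128i *)(v + i)));
    int32_t lane[4];
    _mm_storeu_si128((__m128i *)lane, acc);
    int64_t s = (int64_t)lane[0] + lane[1] + lane[2] + lane[3];
    for (; i < n; i++)  /* scalar tail for leftover elements */
        s += v[i];
    return s;
#else
    /* Portable C: correct on every ISA, slower than hand-tuned code. */
    int64_t s = 0;
    for (int i = 0; i < n; i++)
        s += v[i];
    return s;
#endif
}
```

In the no-process-shrink timeline, the pressure is to make the `__SSE2__` branch ever larger and drop the portable branch entirely, which is exactly what would strand alternate ISAs.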

FOSS or proprietary software which was written in C or other languages and didn't rely on platform specific ASM would be completely awful in comparison and few would actually develop such software.

Compare this to today, where developers can create C# or Java programs which are superior in functionality to many C or C++ programs of 10 years ago. Despite being often poorly coded they still run well on modern systems and can be released on alternate ISAs without much effort.

Also RISC-V is garbage, if it was any good then manufacturers would be all over it like flies on shit because it means no more having to license IP from ARM.

You have objectively shit taste.

Cameron Sanders
Cameron Sanders

130nm in 2020
Zen architecture
1C 2T per socket
5 GHz liquid cooled
2/4/8/16 socket motherboards are mainstream
8 sticks of RAM for 4GB, 16 sticks for 8GB, 32 for 16GB
multi-GPU would be the norm

I could have the same system I'm getting in February, it'd just be EATX instead of UATX.

Kevin Lee
Kevin Lee

processors no longer being able to shrink means that companies will just try to further optimize software for current architectures and that no one would ever consider developing a more optimized architecture that didn't have to worry about decades of backwards compatibility
No.

2-16 of the highest end processors on the market to hit that level of overclock
all the liquid cooling equipment to support all those processors at that level of overclock
8-32 of the highest capacity RAM sticks on the market
MOAR GPUs
implying this wouldn't be minicomputer size for any configuration with more than 4 processors and be using a larger motherboard than eatx past 2 processors
not realizing how much electricity a beast like this would consume
No, you wouldn't, because going to that extent would be ridiculously expensive compared to what similar performance costs nowadays.

Wyatt Stewart
Wyatt Stewart

You don't understand what fuels advancement in tech, its not the consumers, its industry. Businesses who are constantly looking for an edge over the competition are what drive advancements, you think the GPU in your computer is what it is today because consumers demanded it? fuck off, its because industry demanded it and it eventually filtered down to you. Its this reason why the most advanced Nvidia GPUs are their enterprise cards (eg, the P100) and their consumer cards are always one step behind on features (ignoring things like the Titan).

This is wrong. Enterprise-level equipment is the exception in the GPU world, not the norm. The GPU industry ultimately relies on consumers wanting to play new videogames as a source of billions of dollars of revenue.

Of course enterprise GPUs will be more powerful; their unit price is much higher. However, don't believe for a second that if consumers magically stopped purchasing GPUs overnight the industry would carry on like normal. That would wipe out ~90% of the funds for research and development.

Liam Peterson
Liam Peterson

Forgot picture

Brayden Wood
Brayden Wood

The problem is that we advanced too fast. When computer development was slow, people had time to innovate with the technology of the time. There were constraints; you had to work with the hardware. Now resources are so superfluous that software developers can make the file size of programs 5x what it should be to prevent piracy. Yet software is still developed the same way it was (maybe even) two decades ago.

Isaac Hill
Isaac Hill

"dude i got a dell"
Dude, I forgot that for all these years until now.

You motherfucker ass-faggot.

Thomas Lewis
Thomas Lewis

Not responding to your post, just that pic
I fucking despise idiots who see PC shipments are going down and saying "The PC is dead!"
No, you stupid fucks, the PC isn't dead. PC sales are just normalizing, which means the PC has no more room to grow. "Post PC era" is a play on "post industrial era": it doesn't mean industry is dead in the first world, it just means industry is not expected to experience any more significant growth there, prompting investors to focus on information and services instead. People completely misunderstood both of these buzzterms.

Nolan Myers
Nolan Myers

That would wipe out the ~90% of the funds for research and development.
let me rephrase that
That would wipe out the ~90% of the money for the shareholders

GPUs have been a scam since 2010+ imo
See this thread, which has been going since 2013

Until we have manuals and the freedom to do what we want with them there is no point changing GPUs... well, except when one dies.
I have a GeForce 6200 and it's more than enough to play any video (even HD) and basic games.
(Plus some functions in the driver were never fully implemented on it.)

Dylan Taylor
Dylan Taylor

Forgot link
http://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/

Angel Parker
Angel Parker

normie
Please fuck off back to reddit where you belong.

Austin Johnson
Austin Johnson

You don't know shit about business. The only reason the lower price brackets are so cheap is because they're able to bin the same wafers into the higher price brackets.

If Nvidia decided to stop this practice and make new silicon for the higher price brackets, the prices of the lower-end would have to go up as a result. Would you rather have that?

James Nguyen
James Nguyen

go do a cum tribute to your call of duty disc you dumb fucking nigger pleb.

Connor Jackson
Connor Jackson

I am very mad about this new web design. With Google they at least give you the traditional design when they detect you've disabled scripts, but other websites just force you to scroll down to the end if you want to see all the items. Then your browser freezes and crashes.

It's like they want goyim to see all the products so they can cash in on spontaneous buying behaviour.

Henry Murphy
Henry Murphy

You're blaming the medium rather than the idiots who misuse it.
The world would be a better place if fire had never been discovered.
Think about it
no pollution
no traffic jams
no loud cars
no car accidents and car deaths

Tyler Parker
Tyler Parker

Pentium 4 was utter crap even back when it was new. High clocks meant shit when terrible IPC had it soundly beaten by Athlons with 30% lower clock rates (and prices). The only reason a glut of people held onto them for this long is the economic crash in the West leaving people with little spare money for not-absolutely-necessary upgrades.

As far as longevity is concerned, Sandy Bridge is likely what will last the longest. Every subsequent iteration of desktop chips since has been a tiny bump in IPC with no improvement in clocks nor cache size nor core count nor whatever else. If you bought a quad-core Sandy five years ago, you're set today and will likely be fine-ish five years into the future.

Note that CPUs will likely start improving faster from now, once AMD is back in the game and Intel is forced to step it up.

t. eight-core Sandy Bridge Xeon owner.

Michael Powell
Michael Powell

Not sure they're perfectly analogous though.
Post industrial societies really have permanently lost industrial / manufacturing jobs.
Whereas PC upgrade cycles just keep getting longer and longer, but my time spent using it remains very high. Phones/tablets have replaced close to zero of the utility of my PC.

Nolan Brown
Nolan Brown

They're too slow for the modern world

Blake Powell
Blake Powell

Smart home devices are the worst shit that has yet been invented and accepted by consumers. Worse than gaming laptops.

Michael Mitchell
Michael Mitchell

Devs are writing shittier code because they're not forced to live under hardware constraints anymore
To be fair it's mostly what you have described but you have to take into account the modern teaching that we have.
I have family, and one of the young ones showed me how they're learning to code.
The boy is 14 and goes on this
http://code.org/
blocks (kinda interesting for younger children)
javascript
HTML- CSS
java
python
How the fuck do you expect people to make software correctly if they only get garbage teaching?

Also
see sponsors
cancer everywhere

Hunter Adams
Hunter Adams

Phones/tablets have replaced close to zero of the utility of my PC.
Phones/tablets DO IN FACT replace almost EVERY task one could do on a PC, with the exceptions of gaming and coding.

(This is depressing but true nonetheless)

Jayden Martin
Jayden Martin

python
How the fuck do you expect people to make software correctly if they only get garbage teaching?
If a kid has even a sliver of initiative, https://docs.python.org/3/ and [email protected] will teach him everything

Nolan Long
Nolan Long

polished graphics

I honestly can't tell the difference in a lot of cases

Mason Collins
Mason Collins

Games do not need anything more advanced (graphically) than Quake 3 or Half-life 2 had. These things don't really add value to the gameplay at all.

Aaron Anderson
Aaron Anderson

This is true. I had more fun playing mario than grand theft auto or witcher.

What's wrong with that? What would you teach a 14yo?

Jason Rogers
Jason Rogers

What would you teach a 14yo?
SICP
I had more fun playing mario than grand theft auto or witcher
GTA:SA was really good, though

Carson Green
Carson Green

OK, this guy for Trump's computer science tsar.

Christian Lopez
Christian Lopez

The Little Schemer is a perfect book for teachers teaching Scheme.
Its approach to problems is extremely basic, and the difficulty ramps up gradually, so you must fully understand the previous pages before moving on.
Someone with no skill can learn from it.
So a teacher, teaching kids is more than feasible.

Grayson Ward
Grayson Ward

Better lighting can make a better stealth game. Better physics and more powerful hardware means that shooters can be built with destructible environments. The problem is that it is too expensive to fully push to the graphics hardware these days unlike 15 years ago. Plus, no one optimizes or has a QA department anymore because frameworks and cost.

Is the constant devaluation of the dollar by the bankers the cause of companies axing QA departments? The purchasing power of the dollar is stripped out so companies start axing things like the QA people. Sales slow down because the costs are too high and people do not have enough money. </ramble>

Parker Howard
Parker Howard

The problem is that it is too expensive to fully push to the graphics hardware these days unlike 15 years ago
Plus, no one optimizes or has a QA department anymore because frameworks and cost.
If the game industry used free/libre game engines, they would get optimized each time a game was made, instead of shit like Bethesda and their pitiful engine that hasn't been updated since F:NV.

Is the constant devaluation of the dollar by the bankers the cause of companies axing QA departments? The purchasing power of the dollar is stripped out so companies start axing things like the QA people. Sales slow down because the costs are too high and people do not have enough money.
It's a bit of everything.

Nathaniel Brown
Nathaniel Brown

do you really believe they do this to reduce costs only to survive? Any investment is calculated with ~12% ROI. If the ROI is not met, the investment will not be done. If the company survives its first few years, it usually gets the ROI they calculated. Then the company gets more known for their products, they improve and get better known. After all, you cannot survive as a newcomer if you produce crap anyway.

Then at some point, managers and consultants show up and say you need to (((optimize))) things to get a better ROI for your investors. What happens next is they fire all the best engineers who had all the know how because their wages were the highest in the company and instead they import pajeets who work for 60% of the wages. Then they change the raw materials to cheaper ones or just cut down the materials to 85% of the former use. Then they fire all their test engineers because today you as the customer are the beta tester. Short: Quality sinks heavily.

Meanwhile the prices for their products stay the same and they increase their ROI from 12% to 18% for a few years. Then all the customers who bought from them again after experiencing a good buy the first time, feel betrayed and leave for competition. Then they go bust. And all this shit only because some tech illiterate managers were too greedy and had no idea that you don't fire your best people.

Jayden Collins
Jayden Collins

It appears you used ixquick or startpage, user. That engine sucks balls these days.

Aiden Taylor
Aiden Taylor

That post smells like it was experienced.

For who you where working before being fired and replaced by pajeets ?

Andrew Ross
Andrew Ross

with the exceptions of gaming and coding
That should be hardware- and typing-intensive tasks, though a convertible tablet or a tablet with a decent bluetooth keyboard (not one of those keyboard cases) can do a decent job for the latter.

Caleb Miller
Caleb Miller

Then at some point, managers and consultants show up and say you need to (((optimize))) things to get a better ROI for your investors.
It's not like you're forced to be public. Just start a private company, borrow funds and go small in the beginning unless you have capital sitting around.

In software you don't even need any expensive equipment or materials. Just people capable of seeing a vision and accepting little or no pay until a project gets off the ground.

Evan Allen
Evan Allen

This is true

Joshua Bell
Joshua Bell

microprocessors
Many nice toys (for adults and kids) would not exist or would cost you a fortune.
Not to forget: no programmable or electric helpers in your household.

Dylan Hall
Dylan Hall

ITT: DA JOOOOS

Lincoln Harris
Lincoln Harris

that and mergers gobbling up the innovative/creative companies that make what most of us consider 'good'.

Liam Perry
Liam Perry

Because websites looking like a Microsoft Word document is truly the epitome of what modern web design should be aiming for.
Fuck you. Multiple times my computer has hung for minutes at a time when I open some abortion of JS and CSS on some god awful news website
The "design" and "aesthetics" are fucking worthless. Sites made by designers don't look pretty, or cool, or anything noteworthy; they look completely mundane and pedestrian. All this for nothing
I hope /pol/ takes power and sends all designers into the ovens. If not /pol/ then I hope we have a communist revolution and they get sent to gulags.

Oliver Fisher
Oliver Fisher

Multiple times my computer has hung for minutes at a time
I forgot to add, this is on my desktop with 8 GB of ram.

Henry Lee
Henry Lee

Yeah, wouldn't want the poor kid to play with something like legos now would we? Worse yet, he could get some unpowered wood working tools or something and pick up a crafts hobby. Or (gasp) go outside and play with his friends.

What's wrong with how MS Word looks anyhow? For centuries books have worked great for disseminating information. If it was such a bad format you'd think they'd have switched to something else by now.

Michael Cooper
Michael Cooper

The TMS9900 was about 80mm..
