Every single 10-year-old who had a home computer in the house knew how to...

Justin Rodriguez

30 years ago
every single 10-year-old who had a home computer in the house knew how to program in BASIC
today
an adult in his early 20s who uses a computer for almost everything doesn't even know what a variable is

what happened?

Attached: moves.gif (2.56 MB, 300x424)

Other urls found in this thread:

people.eecs.berkeley.edu/~kubitron/cs252/handouts/papers/symbolics.pdf
cosm.sfasu.edu/gharber/353/notes/Unix_philosophy.pdf
youtube.com/watch?v=AWXP_Ao-JIk

Christian Harris

Computers got too easy to use.

Adam Wilson

no one has a computer nowadays

Attached: gameoflife.mp4 (4.68 MB, 1280x720)

Tyler Johnson

30 years ago
Computers were hardly useful for more than work, so they were mostly used by professionals
today
Computers are basically a necessity for modern life and everyone has one or several of them

Luis Russell

people have smartphones, computers are for the ancients

Julian Thompson

Attached: whatisacomputer.jpg (69.59 KB, 1280x720)

Anthony Hughes

In 1988 the computer gaming industry was already big, and the hardware was good enough to do music and pixel art (esp. on Amiga and Atari ST). But even on PC, lots of dudes made things, even if it was just ANSI art, and the BBS scene was pretty strong.
But overall it was still more of a hobbyist environment, where you learned to program games in BASIC or Turbo Pascal and did shit on your own or in small groups, without any corporate/political influence or a bunch of random faggots telling you what you're supposed to like and do.

Attached: A-Talk.jpg (291.78 KB, 1199x930)

Kevin Clark

That's a Facebook machine, not a computer.

Jace Long

Subhuman normalfags ruin everything.

Luis Howard

Was Jobs a mistake?

Attached: ClipboardImage.png (480.59 KB, 1280x720)

Jaxon Anderson

what happened?
Eternal September, plus ease of use implemented in the dumbest way it could possibly have been done:
IT JUST WORKS
There was never a good pedagogical system built for the masses, and there still isn't one. Using a computer should have required a license, but no, let's just let any random normie drive the car with no license.

Easton Miller

Reminder that Jobs sniffed his own farts so hard that it killed him.

Colton Turner

We must have many children and teach them the lost art of proper computing and steer them away from the wide road to hell that is mobile platforms.

Leo Morales

Holla Forums
having children
lol

Gavin James

Only well-to-do families had computers 30 years ago.

Nolan Morris

BASIC
That's the problem. Corps were hiding computers from users under layers of abstraction even back then.

Matthew Gray

You could still use assembler if you wanted to, and many did, because BASIC was too slow for more ambitious projects.

Zachary Sanders

Obligatory

Attached: Terry-Davis---Where-It-All-Went-Wrong.webm (1.39 MB, 1280x720)

Aiden Anderson

every single 10-year-old who had a home computer in the house knew how to program in BASIC
t. not me
an adult in his early 20s who uses a computer for almost everything doesn't even know what a variable is
t. me
OP, teach me how to be good at computers like in the good ol' days

Leo Bell

I don't expect everyone to be able to code, but people should at least understand the basics of using a computer. Like those people who save everything to the goddamn desktop because they don't understand how directories work. That's simply not acceptable.

Nathan Myers

Millennials are truly a wasted generation.
At least Gen Zyklon B are using SBCs at the age of 10.
/thread

Benjamin Russell

Not so much wasted as lost. Unfortunately for Gen Z, the economic effects of an entire generation being utterly destroyed will be felt long after we are dead. The boomers are crashing this plane with no survivors.

Cameron Morgan

They made the machines stupid so they wouldn't scare the nigger cattle, that's what happened. Every baboon and retard can have a pooter nowadays.

Mason Powell

There wasn't much abstraction. You didn't get all kinds of libraries and frameworks; you had to write your own stuff. BASIC even gave you direct access to all the computer's memory and ports, and you could overwrite the OS if you wanted to (pretty much the equivalent of ring 0 on modern hardware). Mostly that was useful for POKE'ing machine language instructions into memory and calling them like a subroutine: BASIC itself was good enough for turn-based or slow-paced games, but if you wanted a fast action game, you needed some machine language.
Terry mentions in one of his videos that a very common beginner's program was a memory dumper. You could do something like that in a dozen lines of code, then expand it into an actual hex editor, and later modify it to read/write a floppy disk instead of memory. There weren't any big hurdles in the way like a nanny OS or a complicated filesystem. And best of all, if you fucked up, hitting the reset button brought you back to a sane state in one second (assuming your machine had BASIC in ROM or on cartridge); otherwise it took a few more seconds.
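For a rough modern equivalent, here's that beginner exercise sketched in C: a minimal hex dumper. It's only a sketch; a modern OS won't let you PEEK arbitrary addresses the way home-computer BASIC did, so it dumps a buffer it owns instead, and the function name and sample data are made up for illustration.

#include <stdio.h>
#include <stddef.h>

/* Minimal hex dumper: 16 bytes per row, offset on the left.
   On a 1980s home computer you'd PEEK raw addresses; here we
   dump a buffer we own, the closest safe modern equivalent. */
static void hexdump(const unsigned char *data, size_t len)
{
    for (size_t i = 0; i < len; i += 16) {
        printf("%08zx: ", i);
        for (size_t j = 0; j < 16 && i + j < len; j++)
            printf("%02x ", data[i + j]);
        putchar('\n');
    }
}

int main(void)
{
    const char msg[] = "10 PRINT \"HELLO WORLD\" : GOTO 10";
    hexdump((const unsigned char *)msg, sizeof msg);
    return 0;
}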

Attached: CoverTRS80News.jpg (402.42 KB, 508x657)

Jonathan Gutierrez

Do you even need to ask?

Attached: RMS-on-Steve-Jobs'-Death.jpg (55.91 KB, 800x450)

David Price

install gentoo fgt

Zachary Adams

The need to learn how to program is far less today than it was before

Hunter Collins

I don't agree, and not everyone needs to be a programmer, but people don't even know what an operating system is. Even when I was a kid in the 90s we were at least taught basic stuff about computers, dos, and windows. Now they just 'teach' nigger-cattle to use microsoft office, and every aspect of the computer itself is just voodoo.

Zachary Rivera

IQ curve. When computers were only available to top researchers, the users were smarter. As more people get access to computers, of fucking course the quality of the average user will go down.

Jaxon Foster

If you think a variable in computer science and a variable in grade school mathematics are substantially different things, then you don't know shit about computers.

Michael Nguyen

t. doesn't know shit about computers

Blake Torres

variable
in math
user, in 1 = x + 2, x is not a variable. It's a constant.

Tyler Howard

but in y = x, x is a variable

Levi Sullivan

Well actually, it is an equality.

Joshua Baker

No, it's a constant with a value of y. If you have f(x) = x then x is a variable but I don't think you see that in grade school.

Michael Baker

3rd world shithole of a country
had to learn "informatics" from cardboard print-outs of a keyboard
still knew more about Holla Forums than today's kids drowning in pocket computers, gaming battle stations, flying toy robots and government-issued school laptops.

Isaac Gomez

They were dumbed down for lower-middle class whites, fag.
In what dimension did Nigs have money to buy computers en masse before the mid-2000s?

Josiah Bennett

y=x is only used for basic algebra. After Year/Grade 8, you use f(x).

Lincoln Cooper

As long as the point gets across.

Attached: 2b4aa324da0089e8c7245734bed4f5c611d1067e852a00546aeb23d8947205b7.jpg (261.22 KB, 1200x900)

Xavier Long

Both x and y are variables, x being independent (taking any value from the function's domain) and y being dependent on x (and thus taking values from a range that depends on both the domain of x and the relationship between x and y established by y = f(x)).
After Year/Grade 8, you use f(x)
The derivative expressed as dy/dx or y' is hardly "basic algebra". Even in differential equations, y is used because it's much simpler to write y (when it's obvious what y is and what it depends on) than f(x).
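Spelled out in standard notation (with f(x) = x^2 as a made-up example), the point is just that y and f(x) are two names for the same thing:

y = f(x), \qquad \frac{dy}{dx} = f'(x)

\text{e.g. } f(x) = x^2: \quad y = x^2, \qquad \frac{dy}{dx} = 2x = f'(x)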

Jonathan Garcia

BASIC came with many computers on just one floppy, while C compilers and the like were nowhere near as easy to get back then.

Jacob Taylor

BASIC was simple. I remember finding it on the Windows 98SE CD, running it, and being able to start programming right away. These days, a guy who wants to program has to choose a language, install a compiler, install an IDE, and install libraries. Then he gets to deal with dumb shit like declaring variables, semicolons, or, God forbid, forced fucking indentation, instead of just coding.

Owen Cruz

Seems incredibly parochial to imagine that every non-brainlet dedicating their time to learning to program would be constructive. What about physics, electrical engineering, biology, etc.?

Jackson Ward

No one demands that everyone become an expert hacker; basic computer literacy is enough.
What about physics, electrical engineering, biology, etc.?
Fields of science where even a little programming know-how can help immensely.

Parker Price

To be fair, people are already taught the bare basics of physics and biology. IMO more about electricity should be taught; I've sometimes felt the lack of it. The basics of coding should be taught too, but CS should not (if there's time, add some more math or logic instead). Coding is something anyone with a three-digit IQ can grasp (with effort, I'm not saying it's easy); don't forget even MBAs manage it with Excel.

Ayden Taylor

Programming is too focused on UNIX languages like C, C++, Java, JavaScript, and PHP, which all suck. BASIC is simple and easy to learn.

I feel compelled to submit the following piece of C code:

switch (x)
    default:
        if (prime(x))
            case 2: case 3: case 5: case 7:
                process_prime(x);
        else
            case 4: case 6: case 8: case 9: case 10:
                process_composite(x);

This can be found in Harbison and Steele's "C: A Reference
Manual" (page 216 in the second edition). They then remark:

This is, frankly, the most bizarre switch statement we
have ever seen that still has pretenses to being
purposeful.

In every other programming language the notion of a case
dispatch is supported through some rigidly authoritarian
construct whose syntax and semantics can be grasped by the
most primitive programming chimpanzee. But in C this highly
structured notion becomes a thing of sharp edges and loose
screws, sort of the programming language equivalent of a
closet full of tangled wire hangers.
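For anyone who doubts that monster is even legal C: it compiles and runs. Below is a self-contained harness; prime(), process_prime() and process_composite() are stand-in helpers I've invented, since the book only shows the switch itself.

#include <stdio.h>

/* Stand-in helpers so the Harbison & Steele switch can run on its own. */
static int prime(int x)
{
    if (x < 2) return 0;
    for (int d = 2; d * d <= x; d++)
        if (x % d == 0) return 0;
    return 1;
}

static void process_prime(int x)     { printf("%d is prime\n", x); }
static void process_composite(int x) { printf("%d is composite\n", x); }

static void classify(int x)
{
    /* The case labels jump into the middle of the if/else;
       the default label catches every value not listed explicitly. */
    switch (x)
        default:
            if (prime(x))
                case 2: case 3: case 5: case 7:
                    process_prime(x);
            else
                case 4: case 6: case 8: case 9: case 10:
                    process_composite(x);
}

int main(void)
{
    for (int x = 2; x <= 12; x++)
        classify(x);
    return 0;
}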

Eli Lewis

And they grew up to post stories on HN about how they used BASIC as kids, and they still suck. Seriously, everyone there is retarded, and they spend their entire lives coming up with new memes like "le abstraction doesn't exist" and "security is hard". They don't even know how to do basic concurrent programming, or how to sanitize input to their retarded database queries.
an adult in his early 20s who uses a computer for almost everything doesn't even know what a variable is
Nope, all the kiddies know JS now. But yes, unfortunately they don't know what an integer is, and all their programs spew undefined and NaN everywhere.

Noah Flores

Computers went from extremely niche items that only autists would seek out to being extremely common items that are in every single household no matter what.

Hunter Bennett

30 years ago

every single 10-year-old who had a home computer in the house knew how to program in BASIC

All 100 of them. kek

Gavin King

Wat. 30 years ago was 1988. There were tons of home computers in homes at that time. What you're talking about is more like the mid-'70s, when you had to build your own computer from a kit.
I personally knew well over a dozen other kids with various computers, everything from the Apple II and TRS-80 to the Amstrad CPC and Atari ST. They all knew how to operate them and write some BASIC. The manuals that came with the computer taught you everything.

Attached: 2.png (2.34 KB, 384x272)

Chase Phillips

They used to teach BASIC to every kid in school who wasn't at the lead-paint-chip-eater tier.

Dartmouth pioneered the idea of teaching it to liberal arts majors, and this spread far and wide; some people even credit it with sparking the home computer revolution.

These days the funding and energy is directed toward <insert non white male person> studies departments so the Chinese can dominate us. Yes, it was commies on the Long March Through The Institutions who ruined education in the USA.

Lincoln Hughes

The manuals that came with the computer taught you everything.
Not my manuals, man. I want a single page. It should only contain surface material. I don't need to know the internals of programs. I don't need to know how the outputs work with our programs. Just a single page, preferably with one sentence on it. Also, everything is electronic data now, no more physical copies, ever.

Xavier Gonzalez

When HP started selling RPN calculators without a full printed manual, I knew we were fucked.

Camden Nguyen

Attached: watch-out.jpg (44.48 KB, 639x419)

Carson Smith

A reality that universities and faggots on the internet refuse to accept is that LOWER-LEVEL LANGUAGES ARE EASIER FOR BEGINNERS TO LEARN; and because they mimic the operation of actual hardware more closely (i.e. you can convert a statement into a rough clock-cycle estimate in your head), PEOPLE WHO LEARN WITH LOWER-LEVEL LANGUAGES ARE SUPERIOR PROGRAMMERS.

The reason crap like C# and JavaScript keeps getting taught as a _first_ language is that the people who use it in their jobs hope novices will be baby-ducked into using it too, just like them. The other reason is that the people who spend their time developing these frameworks and abstractions simply don't want to see something they put work into get ignored.

However, the actual experience for a complete novice using one of these is

"Oh hey, let me try doing something that wasn't explicitly stated in the tutorial"
Get some error message that is either too vague to be useful or generates some barf about inheritance and abstract virtual yadda-yadda-yadda
"Hey guys, how do I make this small change?"
"Well first of all what you want to do is make a class with a constructor/destructor, overload this set of operators, and select which members are public and private. Then you make some friends of that class which extend the ............
....... But really, why are you implementing it yourself? Just use ThisFadLibrary instead".

Leo Martin

Do you think computer classes should start with something like an Altair 8800?
That would be rad.

Brody Moore

and because they generally mimic the operation of actual hardware better
C only "mimics" the hardware on hardware made to run C in the first place. RISCs mimic the operation of C because RISCs were designed to run C and UNIX programs. Any instruction not used by a C compiler is considered CISC by RISC weenies. Lisp machines are about efficiency and productivity. They were invented to make dynamic typing and GC and bignums faster because they were slow on most other hardware. Having these features on the lowest levels makes them more efficient and more productive.

The reason crap like C# and Javascript keeps getting taught
C# and JavaScript look like C and they were designed for C and Java programmers. Hating C# and JavaScript means hating UNIX.

The other reason is that for the people who spend their time developing these frameworks and abstractions, they simply don't want to see something they put work into ignored.
You're starting to understand the problem, but this applies even more to C and UNIX. That's why UNIX-Haters is still relevant. That's why they're still shilling Plan 9, an OS that was bad for 1991.

Subject: Hating Unix Means Hating Risc

Date: Fri, 22 Mar 91 21:34:47 EST
From: JW

Hey. This is unix-haters, not RISC-haters.

Look, those guys at berkeley decided to optimise their
chip for C and Unix programs. It says so right in their
paper. They looked at how C programs tended to behave, and
(later) how Unix behaved, and made a chip that worked that
way. So what if it's hard to make downward lexical funargs
when you have register windows? It's a special-purpose
chip, remember?

Only then companies like Sun push their snazzy RISC
machines. To make their machines more attractive they
proudly point out "and of course it uses the great
general-purpose RISC. Why it's so general purpose that it
runs Unix and C just great!"

This, I suppose, is a variation on the usual "the way
it's done in unix is by definition the general case"
disease.

Luis Brooks

Do you know how I know you are full of shit?
Any instruction not used by a C compiler

Robert Gomez

dynamic typing and GC
I wouldn't pay even a dollar more for the extra silicon needed to implement that bloat. Especially when you consider the inevitable increase in power consumption.
more efficient and more productive
Cool buzzwords. Are you some sort of corporate manager?
C# and JavaScript look like C and they were designed for C and Java programmers. Hating C# and JavaScript means hating UNIX.
W and X look like Y, and Y is associated with Z, therefore hating W and X means hating Z
ebin

Luis Long

I wouldn't pay even a dollar more for the extra silicon needed to implement that bloat. Especially when you consider the inevitable increase in power consumption.
You would end up saving money, silicon, power, and memory. We're already paying more for x86 bloat and wasted silicon than any extra silicon that would come from a Lisp machine.

people.eecs.berkeley.edu/~kubitron/cs252/handouts/papers/symbolics.pdf
Hardware can process the tag in parallel with other hardware that processes the rest of a word. This makes it possible to optimize safety and speed simultaneously.
Automatic storage management is simple, efficient, and reliable. It can be assisted by hardware, since the data structures it deals with are simple and independent of context.
Data use less storage due to compact representations. Programs use less storage due to generic instructions and because tag checking is done in hardware, not software.
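To make the tag-checking point concrete: on stock hardware a dynamically typed runtime has to test tags in software on more or less every primitive operation, which is the cost the Symbolics hardware hides by checking the tag in parallel with the data path. A crude C sketch of the software version (the tag names and layout here are invented for illustration; real runtimes typically pack tags into spare pointer bits):

#include <stdio.h>

/* Software-tagged value: roughly what a Lisp runtime on stock hardware
   must carry around, with an explicit tag check before every operation. */
enum tag { TAG_FIXNUM, TAG_CONS, TAG_STRING };

struct value {
    enum tag tag;
    union { long fixnum; void *ptr; } u;
};

static long add_fixnums(struct value a, struct value b)
{
    /* This explicit check is the per-operation software cost being argued about. */
    if (a.tag != TAG_FIXNUM || b.tag != TAG_FIXNUM) {
        fprintf(stderr, "type error: + expects fixnums\n");
        return 0;
    }
    return a.u.fixnum + b.u.fixnum;
}

int main(void)
{
    struct value x = { TAG_FIXNUM, { .fixnum = 40 } };
    struct value y = { TAG_FIXNUM, { .fixnum = 2 } };
    printf("%ld\n", add_fixnums(x, y));
    return 0;
}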

more efficient and more productive
Cool buzzwords. Are you some sort of corporate manager?
No, but it is more efficient and more productive. UNIX "academic" handouts use those buzzwords too.

cosm.sfasu.edu/gharber/353/notes/Unix_philosophy.pdf
Thompson and Ritchie's innovation was that speed can be traded off against utility and portability. Their reasoning was that it didn't matter if the machine performed somewhat slowly, if it could offer portability and productivity tools to offset the loss of efficiency.
UNIX exploded the notion that machine efficiency was more important than human productivity.
While it's true that UNIX is slow, it's also not productive because of C.

W and X look like Y, and Y is associated with Z, therefore hating W and X means hating Z
W and X suck because of Y and Z.

> There's nothing wrong with C as it was originally
> designed,
> ...

bullshite.

Since when is it acceptable for a language to incorporate
two entirely diverse concepts such as setf and cadr into the
same operator (=), the sole semantic distinction being that
if you mean cadr and not setf, you have to bracket your
variable with the characters that are used to represent
swearing in cartoons? Or do you have to do that if you mean
setf, not cadr? Sigh.

Wouldn't hurt to have an error handling hook, real memory
allocation (and garbage collection) routines, real data
types with machine independent sizes (and string data types
that don't barf if you have a NUL in them), reasonable
equality testing for all types of variables without having
to call some heinous library routine like strncmp,
and... and... and... Sheesh.

I've always loved the "elevator controller" paradigm,
because C is well suited to programming embedded controllers
and not much else. Not that I'd knowingly risk my life in
an elevator that was controlled by a program written in C,
mind you...

Noah Sanchez

Nigger, I use LISP and I can tell you're full of shit.

Carson Cooper

This is great.

Attached: laughing-bird.mp4 (949.63 KB, 480x480)

Benjamin Roberts

That crowd still exists and is probably bigger than ever. They just aren't the majority anymore. Instead, the largest group is the consumers, who mostly wouldn't have had computers back then.

Attached: 2018-hey-son-i-found-a-picture-of-your-grandpa-31653661-1.png (324.33 KB, 500x514)

Chase King

1978 Autist -- someone of incredible mental capacity, who could develop code in their head, convert it to octal/hexadecimal, type it into the microcomputer's monitor program, and have it work on the first go. Got their first job at 15 when a university professor noticed their prodigious abilities and introduced them to the MAINFRAME.

2018 Autist -- whacks it to fanart of cartoon characters, still lives with their parent(s) at 26, and their favourite hobby is posting Nazi pics to the Internet because pissing people off gets them excited.

Attached: OPvt220.png (2.5 KB, 449x337)

Mason Jones

Dumb question: can't you just make an FPGA Lisp machine?

Tyler Peterson

30 years ago
Not targeted at common people
today
Targeted at common people

Lincoln Parker

Yes. It won't necessarily be fast, though (compared to modern CPUs).

Bentley Evans

In theory you can; in practice FPGA tools are a bitch to work with, or so I've heard.
LoperOS when?

Angel Morris

FPGA tools are easy to work with

Leo White

We're already paying more for x86 bloat
<hurr my bloat is acceptable because that other architecture used today is also bloated
Compelling argument, chap.

Hardware can process the tag in parallel
Or better yet, it can eschew "processing the tag" altogether.
Automatic storage management is superfluous
Fixed.
Data use less storage due to compact representations
Less storage than what? Data in a program written in a garbage-collected (lol) dynamically-typed (lol) language executed on a conventional architecture?
generic instructions
In other words, more complex and wasteful decoding logic.
tag checking is done in hardware, not software.
How about: don't use a program that does "tag checking" if you care about performance.

UNIX "academic" handouts use those buzzwords too.
<t.. tu quoque!
And I wasn't even defending UNIX, just pointing out your bullshit.

W and X suck because of Y and Z.
That's not what you said :^)

Nolan Lewis

Most kids I knew had a Sega or a Nintendo, and later a PlayStation and Windows/Mac shit.

Nathaniel Perry

Hello /g/

The 'original' Lisp machines would've -loved- FPGAs, as they needed to do things with registers and memory that normal computers weren't great at.

Connor Howard

From what I've been told, reasonably priced FPGAs with an open SDK are shitty and can't do anything useful. The open SDKs are usually in a barely implemented state and can't do anything useful either. The cool and powerful FPGAs are prohibitively expensive, their proprietary SDKs are prohibitively expensive, and said SDKs only work on Windows.

It would be nice to be wrong here, though.

Chase Sullivan

Muhellinals

Attached: 1521913079896.webm (3.98 MB, 448x252)

Jaxon Walker

cringy as fuck, assume they are being ironic
get to the end. they aren't being ironic
We need a war.

Attached: 1467935694790.gif (301.72 KB, 290x705)

William Barnes

war
No. Concentration camps for rehabilitation.

Liam Nguyen

This fucking garbage takes up 4 megs. Meanwhile 30 years ago you could fit pic-related on an 880K floppy.

Attached: Neuromancer.png (13.56 KB, 320x256)

Austin Reyes

No. Concentration camps for rehabilitation.
Proven to be ineffective. The best solution is gas chambers.

Ryder Cooper

Where did it all go wrong?

Attached: dae.jpg (172.54 KB, 1280x720)

Carson Peterson

The chief reason people get cucked by companies and salesmen is that they just don't know how a computer works on a basic level, and they don't know how to use one.

That video made me so very angry. Also, if I ever meet the fag who said "anti-social" in the video, I'll say: "I'd just like to interject for a moment. What you're referring to as anti-social is, in fact, asocial!"

Landon Nguyen

youtube.com/watch?v=AWXP_Ao-JIk

"Micro Kid", Level 42

Jose Phillips

I might just kill myself after watching that.
