Will Lisp Machines ever come back...

Will Lisp Machines ever come back? Will we ever see a move away from the dominant C & von Neumann orientation of computation?

I hope so but that's not likely.

We are moving away from C as the dominant low-level language to Rust as the dominant low-level language.

Keep dreaming hipster.

I hope so. The most surefire way of seeing a LispOS is to try to make one yourself. Terry Davis made his own language and operating system. Just remember that Terry did it if you need inspiration.

That'd be a cool project to do with some anons, maybe. And I hope the hardware component comes someday too.

What do you mean hardware component?

Like the old Symbolics Lisp Machines. Here's a little excerpt from something I was just reading:

"the hardware is built to suit kernel/system software written in lisp, and all subsystems/programming environments/etc available to the user are written in lisp"

Like, them old Symbolics boxes were Lisp from the top down and from the ground up. Unlike most computers now - where there's the language of whatever application you're using, the language of the OS, the language of the hardware - it was just all Lisp.

Then how come Terry could do it with TempleOS? HolyC is JIT'ed and the OS is written in HolyC.

You could write a LispOS in the same way, on top of normal hardware, yeah. I'm just saying it'd be cool to see not only something like LispOS get developed, but a revival of actual lisp hardware for a complete system (like the old Symbolics)

I don't see why Holla Forums doesn't try to do it on its own with normal x86 hardware. Linux is overrated.

I agree, plus a Holla Forums OS that we collectively had a part in would be sweet

relevant
web.mit.edu/~simsong/www/ugh.pdf

What would the point of a Lisp machine or Lisp OS be? Do you want to write device drivers in Lisp?

you don't know Rust, do you?

What are the practical benefits?
If there are none, or hardly any, this won't happen.

All you ever need to know about LISP

It runs on modern devices!
www.jnode.org/

Because x86 sucks big fat donkey dicks.

Jews might be evil but they products use to have a high degree of quality.

Nice try CIA rabbi.

There was Microsoft LISP. Did you know that? Everyone was trying to push Lisp, and it all fell apart because Lispers were scamming the governments in the US and UK and they finally had enough of that bullshit.

en.wikipedia.org/wiki/AI_winter

Besides that, how many people who talk about Lisp machines actually used one? It's all hearsay. "I never actually used one, but I heard they were good."

Of those who did use them, how many were involved in making them? All of them worked for a Lisp machine company, were involved in making or designing them, worked at MIT on AI research, etc. You could buy a house for what one of those machines cost. That cost was passed onto the taxpayers as AI research. They had no incentive to make them cheap because the users were closely connected to the designers. Both groups profited more from the high price and many of the same people were in both groups.

The Lisp vs C false dilemma is another thing they have been pushing because they want you to think the only alternatives are unsafe buffer overflows and dangerous undefined behavior or special microcoded hardware to run a garbage collected language slower than commodity hardware did. Most computers back then didn't rely on a single line of C or Lisp. C was not the dominant language until the late 80s and didn't even have a standard until 1989. They don't want you to know that you can build secure and productive systems on plain old x86, ARM, etc. hardware without any C or Lisp.

I thought the AI winter also suffered from there not being enough data. The internet is what has made machine learning feasible, because of all the data being generated. Neural networks fell out of favor because they didn't work until huge internet data sets became available.

Systems not based on C or Lisp would be the Xerox Alto and Oberon. I can't think of any other past 1970. I am excluding mainframes. Maybe the original Macintosh? Was it written in assembly?

Imagine this: until we started using GPUs to train networks, it would take 40 days to train one.
Do you think a computer from 1950 could train itself to do anything of value with its limited memory and processing power?

McCarthy is very much Right Wing. Back in the late 80s-early 90s, he was flaming the shit out of Leftist Activist types on Usenet. He was Alt-Right before it was Cool.

Right, but hindsight sure is easy.

...

Surely (((Marvin Minsky))), (((John McCarthy))), (((Herbert A. Simon))) and (((Allen Newell))) would never be unscrupulous with money?

Holla Forums is always right.

We've had entire threads on Jnode on here before. Don't be retarded.

Even in the late 80s, the C compiler situation was pretty dire.
anotherworld.fr/anotherworld_uk/page_realisation.htm


CP/M
AmigaOS, until version 2 (released in 1990)
A lot of the 8-bit micros were probably written outright in assembly, especially the ones with very little memory. The Jupiter Ace is the only one that really stands out (it ran Forth).

I'm new to Holla Forums
Pls no bully

It was never so much a problem of AI research being useless as people projecting onto it their desires for a kind of computational panacea. I would argue that the same thing is happening today with the new wave of AI research.

AI is good at generating adequate, but not perfect solutions to certain kinds of NP-complete problems. It's a tool, just like any other. The public didn't get robotic assistants, but DARPA's investment did eventually bear fruit. en.wikipedia.org/wiki/Dynamic_Analysis_and_Replanning_Tool

Unless someone rich is willing to blow a lot of money on it (and doesn't expect to make profits off it since Lisp machines have a very niche audience nowadays), no. Chip fabrication is insanely expensive. You might be able to do some shit with an FPGA, but I wonder how well this would really work; I have my doubts.

Not anytime soon, but this area isn't completely dead, thanks to Haskell et al. At least we have Lisp machine emulators. I remember there was an emulator for some Symbolics stuff as well, but I can't find it right now. There was definitely a way to run Genera on current hardware.

It's called emacs.

that's just top-layer software though, I'm talking Lisp down to the metal, ya dig

...

It's like trying to tell node.js faggots in 2013 or whenever that was that their shitheap wouldn't revolutionize server programming.
You just wanted to bump this thread, admit it.

You'll have to present a proof of concept on an FPGA anyway before producing it.
It actually sounds like a fun project, and I might take it up after I learn a bit more Lisp and have enough context to read about the Lisp machine architecture.
Remember that Lisp machines were implemented in the olden days. You could incorporate new and interesting things like out-of-order execution.

I'm asking again (>>728267), what would be the point of Lisp machines? I'm not trying to stir shit, I'm genuinely curious as to what the advantage would be compared to what we have now.

I want to know as well. What advantages are there to running lisp on bare metal?

So when did the Holla Forumstards start shitting up Holla Forums? The average IQ here has dropped 50 points since they showed up. Where are the mods?

I guess it's cool to be trampled over by your jew kings, peasant.

A high-level CPU is a CPU that can be programmed directly, without an insanely complex compiler on top. While Lisp machines usually didn't execute Lisp itself, their machine code was very close to it, usable by humans, and it directly provided features for HLLs such as type tags and bounds checking in hardware, which eliminates entire error classes. On a C machine, a buffer overrun is a programmer error, and a very common (the most common?) and expensive one at that; on a HL machine with bounds-checked arrays, a buffer overrun is impossible. Well, unless you do it on purpose by reimplementing the C machine; a general-purpose computer can't protect you from acting intentionally stupid. I really can't overstate this: a buffer overrun doesn't become "unlikely" or "require a software fix"; it's simply not possible if your hardware works as intended, the same way you don't concern yourself today with the possibility of a transistor failing on certain operations.
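To make the bounds-checking point concrete, here's a minimal sketch in plain Common Lisp (assuming something like SBCL at its default safety settings - this is the check done in software, whereas a Lisp machine did the equivalent in hardware on every memory reference): an out-of-range index signals a condition you can handle; it never silently touches whatever happens to sit next to the array the way a C buffer overrun does:
(defparameter *buf* (make-array 3 :initial-element 0))
(setf (aref *buf* 1) 42)          ; in range, fine
(handler-case
    (aref *buf* 5)                ; out of range: signals an error...
  (error (e)
    (format t "caught: ~a~%" e))) ; ...instead of corrupting adjacent memory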

If you wonder why a complex compiler is a problem, download the compilers for your favorite language (and all languages it depends on!) and run a simple (find -name '*.[ch]' ; find -name '*.cxx' ; …) | xargs wc -l | grep total on it. The C and C++ parts of GCC v4.9.4 measure ~4.7M lines. Do you think it's possible for you to understand all that? Do you think it's possible for anyone to understand all that? If not, it is pretty much guaranteed to have bugs. Whenever those bugs show themselves, you'll have to go below the level of your language - and that's assuming your language doesn't leak in the first place! If your language of choice is leaky like C, you'll be unable to debug even a simple program without understanding what actually runs on the machine, let alone a compiler bug. And I haven't even considered sabotage yet, which is best defended against by understandability.

Another common counterargument is that HL architectures are too slow, and indeed this was a major reason why Lisp machines died (another being the AI winter). However, we have reached a point where you can perform most of your work on a relatively old CPU - and, if you use the right (non-bloated) software, even a really old one. A Lisp machine would already be a realistic option if it could match those old chips, and with current technology this is certainly possible, provided you can get someone who has a fab to build it, because CPU production is ridiculously expensive if you don't have one.

gcc is a GNU production, so it's a monster.
Tiny C is only about 36K lines. I guess HolyC is probably about the same.

Do you use Tiny C?

No because I couldn't get gcc to compile it. ;-(

GCC is feature complete. Tiny C doesn't aim to be feature complete.

The point is not so much the difficulty of understanding your tools, though that is an issue, but rather raising the base level of abstraction so that the programmer does not need to concern themselves with a machine representation that is almost entirely divorced from the code they're writing. On a Lisp machine the basic unit of data in hardware was the cons cell, the same as in the language itself. The language implementation moves into the realm of just working - not in the 'just werks' sense, which is really ignorance, but in the sense that you can trust the hardware to correctly perform calculations without the user having to manage carry bits or anything like that.
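For anyone who hasn't touched Lisp, here's a tiny sketch in ordinary Common Lisp of the data model being described - in a stock implementation the type information lives in the object representation in software, whereas on a Lisp machine the tag bits sat in the hardware words themselves:
(defparameter *cell* (cons 1 "two")) ; one cons cell: car is 1, cdr is "two"
(car *cell*)            ; => 1
(cdr *cell*)            ; => "two"
(consp *cell*)          ; => T (the object itself knows it's a cons)
(type-of (car *cell*))  ; => some integer type (spelling is implementation-
                        ;    dependent) - no declarations or casts needed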

We already have to do that with x86, which is not a C machine. x86 segmentation is designed to prevent exactly that kind of error. The developers of mainstream x86 OSes like Windows and Linux are acting intentionally stupid by ignoring an important security feature, and that's why there are so many exploits.

GNU = features, features, features, features!
(that's what Ballmer would say if he was RMS)

Sometimes I want to write to a negative index in an array. Don't judge me. And if you try to stop me I will simply boycott your system, or worse, I won't - I will simply rewrite it to run slower on your system and tell everyone how slow your shit is, or how much extra RAM it takes to code for your shitty OS.

I don't know if you replied to the wrong post or if you misunderstand something.

Negative indexes are not a problem in any way when working with segments. Unlike in C, it's not a hack but an intended feature of arrays. Most programming languages that aren't based on C or Lisp can have negative lower bounds.
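To give a flavor of that, here's a rough sketch that fakes an Ada/Pascal-style declared index range in Common Lisp with an offset (Lisp arrays themselves are 0-based, and make-ranged-array / rref are made-up names for illustration); the point is only that the "negative" index stays range-checked instead of becoming a raw pointer offset:
(defun make-ranged-array (lo hi &rest args)
  ;; array indexed from LO to HI inclusive, e.g. -5 .. 5
  (cons lo (apply #'make-array (1+ (- hi lo)) args)))
(defun rref (ra i)
  (aref (cdr ra) (- i (car ra))))   ; still bounds-checked after the shift
(defun (setf rref) (val ra i)
  (setf (aref (cdr ra) (- i (car ra))) val))
;; (defparameter *a* (make-ranged-array -5 5 :initial-element 0))
;; (setf (rref *a* -3) 42)          ; a "negative" index, checked, no overrun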