Your opinion on Common Lisp

What is it useful for? Is it gay? Is it worth learning? What makes it superior/inferior to other languages?

rewrite it in rust

ebin :-----------------------------D

It expands your mind, but I wouldn't use it as a day-to-day programming language. It's useful for symbolic computation, e.g., the kind of math they have you do in high school, like calculating derivatives: (e^(e^x))' = (e^x)*(e^(e^x)), and for studying program transformations. Once it gets under your skin you'll be better at structuring programs, though.

paulgraham.com/rootsoflisp.html
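To give a flavour of what I mean by symbolic computation, here's a toy sketch (nothing rigorous, just assuming expressions are plain s-expressions built from numbers, the symbol x, +, * and exp):

;; Toy symbolic differentiator: returns d/dx of an expression.
(defun d/dx (e)
  (cond ((numberp e) 0)
        ((eq e 'x) 1)
        ((atom e) (error "don't know how to differentiate ~S" e))
        ((eq (first e) '+)
         `(+ ,(d/dx (second e)) ,(d/dx (third e))))
        ((eq (first e) '*)                       ; product rule
         `(+ (* ,(d/dx (second e)) ,(third e))
             (* ,(second e) ,(d/dx (third e)))))
        ((eq (first e) 'exp)                     ; chain rule
         `(* (exp ,(second e)) ,(d/dx (second e))))
        (t (error "don't know how to differentiate ~S" e))))

;; (d/dx '(exp (exp x)))  =>  (* (EXP (EXP X)) (* (EXP X) 1))
;; i.e. (e^(e^x))' = e^(e^x) * e^x, the example above, before any simplification.

No string parsing, no AST library: the program is already the data structure you manipulate.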

So I wouldn't be writing small scripts or anything with it? What makes it a bad day-to-day language? Sorry for my retardation, but I've never really gotten into programming and don't really understand it yet.

It's mental poison. It abstracts you away from how the machine works because math autists wanted things to work more like how math works, when what you should be doing is changing your thinking to be more like how machines work. The whole language was masturbation by people who didn't want to make software, and it shows: they made almost nothing with it over 60 years.

Try to make a stand-alone "Hello, World!" executable with it, for starters.

It's not used for anything. It's a masturbation language. Actual math work is done in languages like R.

What would you recommend that I use instead? I'm almost completely inexperienced when it comes to programming.

I start people on javascript as it's the easiest to get started with by far and can make things that you'd find worthwhile. Holla Forums will hate it, though. They'll tell you to install arch and write code in erlang or something.

Btw, languages I use at least once a week:
C, C++, python, javascript
I'm a low-level guy but python is great as glue (don't use shellscript), and javascript is often necessary because so many things depend on it.

I know Racket well, and it's very useful for quickly prototyping ideas and writing simple programs (not that they have to stay simple). If performance isn't critical, a Lisp language is what I'll turn to, because it's so quick to develop in. I use Racket for things others would choose Python for.

Please stop, you're in no position to be advising anyone if you're responsible for the idiocy in this post

Sorry arch user, but I actually write software for a living and teach people how to get shit done.

How do you handle fork and process faggotry in python?

You try not to. Multiprocessing is pretty awful: the standard code for it is written poorly, and signals and such will cause misbehavior. To be fair, it's VERY tricky code to write that virtually no one gets 100% correct, even though they all think they do.
In some cases, knowing the risks, I create process pools via Python's multiprocessing library and dispatch work across them. If you need a more industrial solution, run one Python process per worker and design the application like a cluster (this is a general solution that works for almost any shitty language).

Do you even fucking get the point of functional programming, ya dumb fags?

Yeah, to keep people from getting anything done. I've been hearing "functional is the future!" for almost 30 years now. It continues to not be the future.

Never heard such things, and I'm getting a strong sensation of larping here.
Functional programming is, as far as I know, never used in enterprise; it's mostly limited to academic environments, because that's where it was born and because it relies on mathematical concepts.
Scientists, mathematicians and physicists use functional languages; pajeets lack the knowledge to even comprehend how they work.
Stallman is known for Emacs Lisp, and guess what, he's a physicist.

I still have a hard time understanding what functional programming means. Is it that every function evaluates to another function?

I don't know jack about Lisp, but I read this story years ago and always wondered why more people didn't use the language: paulgraham.com/avg.html
My first instinct is to believe that managers just don't like giving individual programmers that much power. The rise and popularity of Java reinforces that thought. In parallel, the same happened with Perl vs. Python. Managers are allergic to individuality and freedom. They want everything to be done by the book, in a perfectly predictable, boring fashion. That's why the story at the URL above doesn't have a "happy ending". Sure, the authors made lots of money (and so did Yahoo), but it didn't serve to create more Lisp jobs or whatever. They instead just rewrote (badly) the Lisp code into other languages.
BTW, on the same topic of "de-skilling", I recommend watching this video: youtube.com/watch?v=4rnJEdDNDsI
Not sure I'll ever learn Lisp, but I've been playing with Forth. Main advantage here is it runs *very* well on even old 8-bit computers or microcontrollers with tiny memory.

They don't use it for anything useful in academia other than wanking in CS. I used to do some supercomputing and the code was all built on top of Sun's custom CC. Even AI was a mix of C and PROLOG. Voice recognition did use some LISP but only because the early papers used it and to remain paper-compatible they continued the practice. When you run into LISP outside of academia and open sores, it's usually connected to voice recognition because that's the language those people are familiar with.
Webdev faggots use it for some things as they always want to be using something hip and new even if it's untested or shit. Like you might have heard of Discord. Functional code tends to cripple companies over time as it is extremely difficult to modify without rewriting, the talent pool is small, and the lifetime of the toolchains short.

So what you're saying is that functional programming isn't useful outside of academic environments? So it's useless anywhere else? Or is it just too difficult to be practically used in large projects?

Yes, I can tell you're a code monkey by the derision you show towards mathematics and anything abstracting your notion of low level. In academia, it is important to teach concepts, not to churn out drones who can spit out code for a particular machine type.

Functional programming means computation is done by evaluating expressions, rather than statements.
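A tiny illustration of that expression/statement distinction, sketched in Common Lisp since that's the thread topic: almost every form returns a value, so you bind results directly instead of assigning from inside branches.

;; IF and LET are expressions: they yield values you can bind or nest.
(defun describe-number (n)
  (let ((kind   (if (evenp n) "even" "odd"))   ; IF returns a value here
        (square (* n n)))
    (format nil "~D is ~A and its square is ~D" n kind square)))

;; (describe-number 7) => "7 is odd and its square is 49"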

Perl vs Python had nothing to do with managers the way you're thinking. Perl was designed to be very flexible, with a motto like "There's more than one way to do it". And yes, there were easily 10 different ways to do anything. And if you had 10 programmers working on some code, you'd have the same problems solved in 10 very different ways all over that code. So to effectively work on existing code, you'd need to have learned all these different ways to do the same thing, almost like needing to learn 10 languages. Very few programmers ever had that level of mastery of Perl, which was kinda a problem when you needed programmers.
Managers had to come down super hard on Perl projects with coding standards. The longest coding standards I've ever seen were for a Perl project. They'd, through a lot of effort, try to reduce the language to "One good way to do it". But this was very difficult to enforce and imo hopeless as you ended up having to basically retrain anyone you hired.
The lesson learned (which should have already been learned from LISP) was that unnecessary flexibility makes a language /less/ efficient.
So Python came around, recognized this problem with Perl, and mostly avoided that trap. While it wasn't as quick to use as Perl for a lot of cases, 'good' beat 'great': the reduced flexibility made it much faster to write with holistically, and much larger programs could be made with it before they collapsed in on themselves. If you do "import this" in Python, you'll see an easter egg that contains "There should be one-- and preferably only one --obvious way to do it." That's a riff on Perl's motto, and it sums up why Perl lost to Python.

I have a Masters in CS from a top UC school and several well-cited papers I'm author/coauthor on so I understand academia and how they do nothing of value. Think of a great programmer and you'll find that they were a great programmer before university.

Like Forth is good for being able to run on small memory. Lisp is good because you can write most of the language in 100 lines or something retarded like that.

Who cares if they were good before, the point is they become greater. By sixteen I was a "great" programmer if industry standards are anything to go by, but then of course you learn how much you do not know.

They don't. They all realize it's bullshit and drop out before getting a PhD. Universities panic and give them honorary PhDs later. Again, think of a great programmer and then see if they have a PhD.

I think you've lost touch with your own argument. Earning a PhD is only important if you're interested in research, of course many choose to work in industry instead.

Thanks, user. I'm going straight for lisp.

Do you think great programmers aren't interested in research?

You're talking about two totally different career paths. Some may straddle both, but that is rare. I do not think pursuing a PhD is important for all programmers, but an undergraduate degree is beneficial.

Good choice. Ignore the fools telling you it's a meme or a waste of time. Even if you never use it (you will), you'll benefit from exploring a new paradigm.

fucking lol. You deserve everything you get.

...

Common Lisp should have been just a stopgap solution in the history of Lisp. When the various Lisp dialects were dying out, a new Lisp was needed which would have the best of all the other Lisps and make porting programs easy. In this, Common Lisp did succeed. However, the end result is a horrible hodgepodge of a language. It should have remained just a stopgap until a proper modern Lisp got developed.

The real problem with Lisp (any Lisp) is that few people know it. Reddit for example was originally written in Common Lisp, but they couldn't find enough programmers so they ended up rewriting it in Python.
redditblog.com/2005/12/05/on-lisp/

The library ecosystem is something to take into consideration as well. I haven't used much of it, so I have no idea how well it compares to other languages. Really, if you intend to use Common Lisp you will find yourself with a very powerful tool, but it will be a lonely experience.

Another problem is that the standard leaves a lot up to the implementation. For example, there is no portable way of opening a file. There is a standard function, but each implementation has its own quirks on how exactly to handle the arguments. The best advice would be to pick one implementation that you expect to be around for a while and stick to that one.
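For what it's worth, the standard WITH-OPEN-FILE covers the everyday cases; the quirks mentioned above mostly show up in the edges of the argument list (e.g. which :external-format values are accepted). A minimal sketch, with "notes.txt" being just a made-up filename:

;; Write then read back a line; the keyword arguments are where
;; implementations start to differ once you stray from the basics.
(with-open-file (out "notes.txt" :direction :output
                                 :if-exists :supersede
                                 :if-does-not-exist :create)
  (write-line "hello from lisp" out))

(with-open-file (in "notes.txt" :direction :input)
  (read-line in))   ; => "hello from lisp"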

Personally, I have chosen Racket as my Lisp language. It has its own issues, and some things are worse than in Common Lisp, some are better.

Please don't. Depending on your needs, use one of the many proven task queues such as Celery (a bit involved) or a dead simple one such as RQ; if task queues are out of the question, Stackless makes threading a treat, stupid simple, and it runs circles around Go "goroutines". I've used all of them in various ways in the industrial space (telephony, energy) and never ran into an instance of "Oops! Muh app shit the bed."

Modern Python isn't tricky by any means; jump right in. Don't let a crusty old "Pythonista" scare you away.

Uhh.. user?

You seem to know a thing or two about lisps.
What do you think about GNU Guile?

Task Queue != Message Queue sweetheart.
You've never used a task queue to divvy up work to utilize all cores? Like any other grown up serious app that might need to scale down the road and on top of that, makes it easy to just throw more machines/instances at it as needed?
Who the fuck runs multiple single threaded app instances these days to utilize all cores? Oh that's right node.js niggers.

Common Lisp is a superb language, as are most of the other Lisps. It's very fun to programme in, there's a decent community, and the literature surrounding Lisp is some of the best.
There's a Lisp thread on Lainchan which is steadily active and usually has good discussion going on. The only problem is that most other languages will seem terrible by comparison.
If you want practical software to hack around with you can look at Stumpwm which is a nice window manager.

I think you're defective, user.

So you can all admire this.

Thank you. It's magical.

install frankenwm

erlang is a beautiful language

and is doubly beautiful when running on Arch

...

...

and erlang is pow(beautiful, beautiful) if running on a compute cluster running arch

Metaprogramming and the ability for self-evaluation mean that the entire runtime has to be packaged with any CL programme.
A compiler like SBCL can produce much smaller binaries if you guarantee it that you will not be using those features, but that does defeat some of the purpose of using Lisp.

(((CLISP)))

What a great language. So it's unfixable, then.

Only in an environment like Unix that is oriented towards small, standalone binaries. On a Lisp machine you would have a system-wide Lisp listener running that can evaluate arbitrary programmes as the user pleases. This is also a sensible structure for something that's constantly running, like a WM or Emacs.

It should be noted that Lisp is not slow anymore, like it was in the 70s. SBCL can generate very tight x86 assembly. Optimised CL is about a third the speed of C for a trivial programme, which makes it much faster than Java, Python, or most other high level languages.

It gives you a bird's eye view of programming. Learning only low-level languages, you won't be able to see the forest for the trees.

I say that as a guy whose first programming language was C. It was all fine and dandy until I actually needed to write something with more levels of abstraction (e.g., a program that takes the string of an elementary function and returns the string of its derivative). Then I got lost in book-keeping details and didn't know how to begin. Then I found Lisp. Now, am I able to write the program in C? Of course, but only because I learned Lisp. Granted, I wouldn't recommend Lisp as anyone's first language; that should be C. You have a point there. But Lisp should definitely be your second language to learn.


This whole thread is a mess. This is the reason Holla Forums needs IDs.

Quads confirm, we need ids. Good post too.

We can simplify this statement to: it solves a very limited set of problems. Today, every target I want to deploy software to is not a Lisp machine. That's not good news for Lisp, it seems.
Yet whenever we have a programming contest here, Lisp gets absolutely destroyed. I think in the last one the Lisp entry was literally close to 100,000x slower than the top entry, and that was run via SBCL. Until the Lisp community here can show us their language actually performing, I group these claims with those of the Java community. Maybe it only performs well if you're willing to spend obscene amounts of time tweaking the code?

I don't believe it does. I'm somewhat old for a channer and started out with BASIC and while extremely simple that was a language designed to fit hardware limitations. You learn a little bit about negotiating with the machine through learning BASIC. LISP is a very different beast. It's designed specifically to remove hardware limitations and move programs into a more pure mathematical space. You end up not learning programming, you end up learning some variant of math. LISP users usually get hooked on that purism and they end up mired in the Haskell community or whatever other pure functional language is trendy and never really get into programming. Programming isn't pure. It's dirty and violent. Languages learned along the way should acclimate a programmer to that.
I don't recommend starting with C because that's all the gore up front. But I do recommend languages that gradually increase the gore along the way. You eventually want to have mastered it, as that is a close approximation of the machine that you are expected to control. You also want to learn some assembly at the end of that path so you can debug the really hard problems rather than have to go beg your company's Steve for sage advice.
I recommend javascript as a first language despite absolutely despising javascript and hating every second I'm forced to use it. It's a language where there is some hardware pressure felt, it can scale from absolute beginner to billion dollar company, it's trivial to start with as you already have the tools, it's easy to make useful things for yourself and share them with your friends, it familiarizes you with the huge class of languages with C-like syntax, it covers many domains as it can run on the browser, server, desktop, embedded, etc. today, and you could literally write nothing but javascript your whole life if you choose to as we'll still be dealing with it for at least 100 years. All great qualities of a first language.

A machine, in and of itself, does nothing meaningful. All it does is move registers around.
The programs we make are meaningful because we impose meaning on them. The question is not 'how do I move registers around?' but 'how do I command the programming language to move registers around in a way that is meaningful to me?' Lisp makes it easier to express what you want to do, while C gets you close to the heart of the machine and the gritty details. Unless you're able to put aside the machine and think of the problem at hand from a high vantage point, you'll never be able to solve it. Sure, the code you write will be efficient, but you'll never be able to write code that does what you actually want to do. Assuming, of course, that you want to do anything non-trivial. C provides the survival gear and training for going through the Amazonian jungle by foot, and Lisp gives you the map of the whole place, so you can plot where you're going. Knowing them both, you'll be able to reach your destination.

Here's an example. Cryptography is something that would like to exist in that pure math space, but if you treat it that way and don't consider how your algorithms will be executed by the actual machine you'll end up with cache timing attacks and similar problems. While the algorithm is fine, the program is not, and it's an algorithm problem, not an implementation problem.
If you intend to write programs and not formulas, you need to learn how to think like the machine.
Untrue. The machine can solve any software problem. Math and programming are two separate things people try to combine. It's good to know both, but you don't learn one from the other.
I'm not. C stayed relevant for ~60 years and likely 60 more despite a much smaller initial boom. Javascript is everywhere and I'm quite certain people will be working with it for at least 100 years. Even if people don't want to use it, all that code out there needs to not only be maintained, but to have features added to it. We're stuck with it at this point.

By the same token, Lisp solves a subset of problems extremely well. I know I prefer to use specialised tools instead of dealing with a general but poorly suited one every time.
Lisp Machines are not the only environment where Lisp is suited, anything running long term where it would be nice to inspect and evaluate code at runtime is very well suited to Lisp. Web servers in CL are very nice to use for example.

Not obscene amounts of time, but you do have to provide directives to the compiler and tell it to optimise for speed, neither of which benchmark sites bother with. Like in all projects, you write once, profile, and optimise the performance-critical parts. Good thing CL provides a wealth of tools to accomplish this.
Here's a good introduction
blog.kingcons.io/posts/Going-Faster-with-Lisp.html
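Roughly the kind of directives meant here, as a sketch (SBCL assumed; the exact effect of the declarations is implementation-dependent):

;; Type and OPTIMIZE declarations let the compiler emit specialised
;; float arithmetic instead of generic dispatch.
(defun sum-of-squares (v)
  (declare (type (simple-array double-float (*)) v)
           (optimize (speed 3) (safety 0) (debug 0)))
  (let ((acc 0d0))
    (declare (type double-float acc))
    (loop for x across v do (incf acc (* x x)))
    acc))

;; (disassemble #'sum-of-squares) shows the machine code it settled on.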

That doesn't matter. When you're a (real) programmer, you can adapt to and use any language, even write your own if need be. It happens all the time, for various reasons. Practical example: anotherworld.fr/anotherworld_uk/page_realisation.htm

I am by no means an authority on Lisps; I dabbled in some of them before settling on Racket. I have used Guild a bit, but I cannot comment on its technical merits. The documentation in Info format was decent, and it's the official GNU extension language, so I guess if you are looking for a Lisp to integrate with your application, or one that won't be collecting dust on some repo in a far-off corner of the internet, it's a good choice. On the other hand, its maintainer is LARPing as a WW2 resistance fighter on the internet, so there is a chance he'll snap one day.


Yes, you can do that and re-invent the wheel or create a language that's so domain-specific that it will be useless for anything else. Let's say you want to parse some JSON data, in most languages you just grab a JSON library and move on. If you want to roll your own you will have to waste a significant amount of time on a side-task.

know what else expands your mind?


PCP

you don't see programmers using PCP all day either.

I don't like dynamic typing. C is shit too and it's no wonder these retards recommend both C and dynamic typing. Even Go has a better type system.

Common Lisp has type annotations you can supply. There is also Typed Racket, a variant of Racket that forces you to do static typing.
docs.racket-lang.org/ts-guide/index.html
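The CL annotations look roughly like this; how strictly they're checked at compile time is up to the implementation (SBCL is the strict one). Just a sketch:

;; DECLAIM publishes a signature; SBCL warns at compile time when a
;; visible call site violates it.
(declaim (ftype (function (fixnum fixnum) fixnum) add-ints))
(defun add-ints (a b)
  (the fixnum (+ a b)))

;; (add-ints 1 "2") draws a compile-time type warning under SBCL.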

I like to write my code in untyped Racket first to get it working and then port it to Typed Racket. Looking at the extra work you have to go through really puts into perspective how much hassle dynamic typing saves you, but I still think that having an entire class of bugs caught at compile time is worth the hassle.

Do you mean Guile? Read the creator's blog for more than 10 seconds and, if you're anything other than leftypol, you'll never want to hear of it again. That guy is a mess.

...

A small dose of LiSP can go a long way in improving your programming mindset. ;)

Druggies usually produce terrible code, but I've never personally seen code from a LSD user. We did have a pothead and he produced the most terrifying spaghetti I've ever seen. Huge sections of like 1,000 lines just copypasted with one variable changed, a single C file that #includes all other C files, 'event driven' code where there are at least 5 separate event loops on the same thread, etc..

With pot I can imagine how that might be the case.
The article is about micro-dosing LSD. I pasted it because this 'seeing the whole system at a high level' thing seemed analogous to my experience of coding in C after having learned and coded in Lisp. As a response to that 'PCP' comparison.
Sage for off-topic.

LSD microdoses have been a meme for decades. If they worked, they'd already have been abused for all the things people claim they work for. As far as programming drugs go, Adderall helps shit programmers enter the autist zone temporarily to compete with the true autists. It's quite popular with the hacks in SV.

Yeah, I meant Guile. As long as there is no other maintainer for it I guess he's better than nothing.


It's one of these memes like "Ada Lovelace was the first programmer, hurr durr" people keep repeating without thinking. The "CS work" he is referring to was not actual computer science, these women were basically human assemblers and linkers. Back in the old days when an engineer needed a program he would explain the problem to a group of mathematicians, who would then find an appropriate algorithm, write the code and hand it to a secretary who would type it onto a paper tape, splicing in routines from a (literal) library when necessary.

In those primitive days just crunching numbers was work enough. As the field advanced, the job of the secretary got automated away, replaced by the assemblers, compilers and linkers as we have them today. I remember watching a YouTube video about it a while ago, but I can't find it. It was an old black-and-white video from back in the day.

Yeah well the thing is, sometimes you want to do exactly that. There are people who somehow believe that everyone should always use the same library all the time. That's the Python mentality: do it this way only. Then there's the Perl mentality: just do it however you want. In reality, practical uses will be somewhere in between those two, and you have to be able to make the right judgement, not just blindly follow some ideology.

The only case I know about is Tran, who allegedly took LSD. pouet.net/prod.php?which=2878&howmanycomments=25&page=0

I'm just going to poke in and give my opinion.
Common Lisp is very much worth learning... but not actually using.
It has shit library support; they say it's highly portable, but as it turns out, it ain't if you want to use anything besides basic I/O. It doesn't support threads or libraries by default, and it demands you use Emacs (aka GTK bloat). Worst of all, Common Lispers love to boast about their super advanced language and will bash every other language in existence, telling you why your taste is shit.

So why is it worth learning? It does have some nice interesting things that most other languages don't, my favorite being the Metaobject Protocol. Not that you're at all likely to end up using it, but it's a very nice read indeed.
I do all my programming in languages that don't get in the way, yet I study Lisp because it's so interesting; it's like magic. Only not the kind of magic that you would actually want to use in your project.

>I don't know jack about Lisp, but I read this story years ago and always wondered why more people didn't use the language: paulgraham.com/avg.html
Because it was fucking 1995 and the better options didn't exist yet. For fast development of fast-running web applications, look into Go, since Google created that language explicitly to replace Python with better performance.

;; how to make a 50 MB hello world executable in SBCL
(save-lisp-and-die "helloworld.exe"
                   :executable t
                   :toplevel (lambda () (format t "Hello World!~%")))
It's not mental poison and it's not really meant to generate standalone executables. You are right that it abstracts away the lower levels a bit much. But I think it's a good idea to operate on 2 levels: near the hardware with C/C++, far from the hardware with Lisp.

If it is such mental poison, why does every language reimplement some feature of Lisp poorly and pass it off as a brand new innovation?

If you want to learn a lisp learn Clojure. It's a modern practical lisp and pushes you more into functional programming than clisp. It's also a small language and really easy to learn.

If you want sluggish code that depends on a mishmash of Java libraries, listen to this fool.

If there's a library available, that doesn't stop you from rolling your own. But if there's no library available, you're forced to roll your own.
If you want to go in between, you need a language with lots of libraries.

Not necessarily, if your language lets you leverage an existing library for another language. Look at SDL: it's for C, but there are bindings for many others. Same idea with Tk, GTK, and so on. You only need to write the glue parts.

I use CL somewhat regularly myself, it's a pretty good language but not without drawbacks I'll admit.
Executable size is indeed a bitch but CL works best if you just use ASDF to import libraries and create small wrappers around it. Most implementations have a script function for a reason.
Common LISP is simply the best for numerical computing. It has built-in rational (fraction) types that let you sidestep floating-point rounding error entirely when you need exact results.
Many may say LISP is very slow, and it's true I suppose but optimization is incredibly easy as you can measure function performance down to how many CPU cycles it took to complete.
LISP is a very batteries-included language. Just about anything you need is included by default, usually you don't need to import anything. While LISP has very few libraries, very few are actually needed as they tend to follow the batteries-included approach.
Overall, I wouldn't think to do any extremely performance important task in the language but it excels at simple number processing and list processing. It's also very easy and fun to write your own small libraries for small utilities here and there.
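Both points in a quick sketch (SBCL assumed for the cycle counts; the output format differs per implementation):

;; Integer division stays exact as a rational instead of rounding.
(+ 1/3 1/6)           ; => 1/2
(* 201 (/ 999 201))   ; => 999, exactly

;; TIME reports run time, bytes consed and, on SBCL, processor cycles.
(time (loop for i from 1 to 1000000 sum (* i i)))
;; => 333333833333500000, plus the timing report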

Hacker News and Reddit* were made in Lisp :^)
*originally, now it's in C++ IIRC

Suck it.
Machines are to serve their masters, not the other way around. Thinking like a machine is only a prerequisite for making better machines, or for fixing problems you can't solve abstractly (e.g. making secure crypto).

Try writing GPGPU code with that "I shouldn't have to think like a machine" attitude and see how far you get.

Lisp isn't more "math" than any other programming language (the most "exotic" math is bignums and rational numbers), and C sucks even for low level pointer-heavy programming. C is a huge problem and causes billions of dollars in damage and wasted time, but Lisp is an "Emperor's New Clothes" situation. Lisp doesn't even give 10% of the benefits they claim. There are some benefits, comparable to a scripting language, and it is probably 100 times more productive than C, but any good language is 100 times more productive than C.

That brings me to all of the C shilling in this thread. Shills do not want to replace C because C is the cause of suffering and problems, which leads to an infinite cycle of "researchers" and other charlatans. The suffering allows them to beat dead horses from the 70s like "functional programming" instead of focusing on solutions that worked for decades outside of the C/UNIX/PC ghetto. Haskell is not a C replacement and anyone who thinks it is is a moron.

So does Python. However, Python also has numpy.
What makes it better than Python?

Experienced programmers know there's nothing that can replace C today and are sick of you kiddos that just jumped on board in the last couple years due to the facebooks telling you you needed to be STEM pulling a dunning-kruger and saying we can replace C with whatever this month's popular language on HN is. The arguments are just painful as you'd expect from 'programmers' who have at best managed to write a Hello World arguing with people who have been doing this for a living for decades.

This attitude will serve you well when JavaScript and PHP are the best languages on the planet, after infinite amounts of PHD students have been ground up and used to force benefits into PHP and JS implementations.

I use python for numerical computing. Most people around me use python for numerical computing.
If common lisp is better for numerical computing I'd like to know. However, all the reasons that post listed also apply to python. They're not enough to make me try common lisp, because properly trying it is a major investment of time and effort. But if there are other reasons that don't also apply to python, or apply moreso to common lisp than to python, that might change things, so I asked.

I have a feeling people in this thread have very different ideas about what that phrase entails.

It's certainly better for expressing math, seeing as a Scheme program is built entirely out of recursive expressions-to-be-evaluated. It makes bringing concepts like Peano arithmetic and lambda calculus into computer science (=/= programming) studies natural, where they would seem out of place even in a high level derivative of Algol which has come closer to Lisp (like Python).
You can write in it as if it were any other language (albeit an awkward but powerful one), but that's missing the point of why academics and Lisp hackers love it.
Most programmers struggle with C and write buggy code because they're lazy and don't understand UNIX philosophy; besides that, C is smushed in where it shouldn't be because of tech-illiterate managers who see that C is popular and decide to use it. That much is true.
If you're getting segfaults and buffer overflows though, that's on you. It's a fine language if you learn it right. Pointers aren't that hard.
You don't even realize what a powerful language you're dealing with when it comes to Lisp, do you? Are you Paul Graham's theoretical "Blub programmer"? Have you actually tried out Lisp? Have you ever used a macro (there's a tiny sketch at the end of this post)? It's unlike anything else out there.
That's mostly because other languages have taken so many of the innovations which made Lisp special and incorporated them. Lisp is still the only one to combine them into something off-the-scales powerful, though.
What do you suggest to replace C? Pascal? Pascal probably would be better for making low-level code for OSes (Pascal microkernel when?) when considering where things have headed (high security software demands readable, hackable code a la BSD) and the decline in programmer quality (codemonkeys taught HTML, Java, and Python need simplicity shoved down their throats), but it's dead because no one uses it. Even if it is a good all-rounder of a reasonably low-level language, it has no momentum now and no calling card to gain any momentum.
Compare this to how there will always be a few Lisp hackers every generation because of its unique mix of a lack of syntax, s-expressions, homoiconicity, dynamic typing, macros, etc.
That's the problem with replacing C, whether you like C or not - the language does so much that any language which could entirely replace it would either never be noticed or be too stretched out (C's generalism plus a more powerful feature, all from scratch at a level of abstraction which no one wants to deal with because we don't need to milk a 256MiB RAM machine for performance). It's a legacy relic in critical but ugly niches.
The one instance in which something new might be built is that of programming languages for microcontrollers like Arduino and the like, a modern fork of Forth to address C's issues. There are also quantum and neuromorphic computers which need their own low-level languages (although Lisp can already be ported to the latter, and maybe even the former as well).
There's a Haskell compiler which converts it to C-- (C stripped down to portable assembly code). IDK if it's any good or not, I don't use Haskell.
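The macro sketch promised above: a macro receives the code itself as data and returns new code before compilation, which is the part you can't fake in an Algol-family language without an external preprocessor. Trivial example (WHILE is not standard CL, which is exactly why it's a two-liner to add):

;; (while test body...) expands into a LOOP skeleton at compile time.
(defmacro while (test &body body)
  `(loop
     (unless ,test (return))
     ,@body))

;; (let ((i 0))
;;   (while (< i 3)
;;     (print i)
;;     (incf i)))   ; prints 0 1 2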

this user gets it.
90% of programming advice on the internet takes the form of "you should eat that plant and get back to me on whether it's poisonous or not"

But writing buggy code because of laziness is completely in the spirit of Unix. I'm not even joking.

marc.info/?l=openbsd-misc&m=151233345723889&w=2

Think how fast tech was advancing in the 70s vs today. If you think that plays no role in the longevity of a language then you have water in your skull. Keep in mind that the entire computer industry was built on COBOL and C. The role of C has stayed the same because its design left little to be desired in its field (that'd be very low-level stuff, not some userland buggy software). In 10 years the web will be the first application your mobile computer starts up, and it sure as hell won't be written in javascript.

Maybe it was when compared to "The Right Thing", but not any more. Code was written for a group of users which included the programmers themselves. Lazy was something that kind of worked, did its one job, and didn't take up resources.
Now, you shit out as much code as possible (in whatever language your idiot boss tells you to use) to make a hackneyed bells-and-whistles solution that barely works, because otherwise some $2-a-day guy in India will do it first, and people eat it up and throw it out anyway. By comparison, UNIX philosophy looks austere, out of touch, and impractical.
Different times, different lazy.

When python programmers say they are doing numerical computing, they're using their language to call other libraries, nearly all of which are written in C/C++/Fortran/Asm and do the heavy lifting.

Python('s most popular implementation) itself is also implemented in C. Numpy is written specifically for Python and integrates very well with it. What's your point?

That it isn't strange for Python to be used for numerical computing.

Right, sorry. Your post pattern-matched to some weird arguments I've seen in the past.

I think it's because people know Python is a slow language which seems antithetical for number crunching. Yet those same people wouldn't bat an eye if someone told them they were doing numerical computing in Matlab or Mathematica, which are both (very) slow languages. Of course in all three cases, you're calling highly optimized routines, and funny enough, it always seems to be GMP, BLAS and LAPACK at the bottom.

The Right Thing is not complicated. It just means you throw away code and start from scratch every so often. That is the secret to clean, elegant solutions.

The web is UNIX philosophy. The whole incrementalist approach is favored by UNIX and C hackers. If it was made by some other group, they would have made a HTML 2 and a Script 2 based on everything they learned in the last 20 years, and completely broke backwards compatibility. These incremental things all suck because they "kind of work" instead of work, which is the UNIX philosophy (Worse is Better) at its finest.

Some posts said Lisp is a "variant of math" or "how math works" or "a more pure mathematical space". I would be more likely to believe that if it was a Haskell or Coq thread, but even those languages depend on the machine.

The creators of C are lazy and UNIX philosophy is cancer. The real UNIX philosophy is "Make something that sucks and form a cult around it so you don't have to fix it."

C is a bad language even for pointers, which are the only thing C is supposed to be good at. C continues to cause billions of dollars in damage but programming languages older than C solved these problems. Why are we using something 50 years later that is worse than what they used in the 60s on computers with 64 KB of memory? It makes no sense to me. Does that make any sense to you?

Since you're the one defending C, you sound like the "Blub programmer". Lisp increases productivity by orders of magnitude compared to C, but not compared to a good language.

Scripting languages have a lot of Lisp features anyway, so they add more Lisp features because they're the best fit. Other languages don't really copy Lisp unless some Lisp shill does it so they can say everyone is copying Lisp, like lambdas in Java and C++.

Pascal is one choice. The universities were shilling that FP meme since the 70s and what really happened? Programming got worse, education got worse, software got less secure and more bloated. None of that is FP's fault, mostly because FP didn't catch on at all, but what it did do was "steal oxygen" and researchers from more viable languages. There were a lot of known problems in CS, but the FP shills were talking about getting rid of von Neumann machines and other things that never happened, so they decided not to solve them.

There are Pascal, BASIC, COBOL, APL, and MUMPS fans.

C doesn't do much at all, and what it does do, it does poorly. Just add pointers and pointer arithmetic, and your language does everything that's special about C and does it better.

lol

106 replies in, and you're the first to mention the Ada language. Mentally ill?

Well, CL has fractions built in, not requiring another library. Fractions are also handled generally, exactly the same as standard integers and the like. According to the benchmarks game, CL is generally faster than Python with numerical calculations as well. Python has no built-in function that can load data and code from an arbitrary file; with CL, saving data and loading it is a breeze because you can easily read Lisp data out of a file.
CL can be used very efficiently to process lists of numbers as well; LISP stands for LISt Processing for a reason, you know.
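What "reading Lisp data out of a file" looks like in practice, roughly (a sketch; "config.sexp" is just a made-up filename):

;; PRINT writes any readable object; READ parses it straight back.
;; No serialisation library involved.
(with-open-file (out "config.sexp" :direction :output :if-exists :supersede)
  (print '(:threads 4 :ratio 2/3 :name "run-1") out))

(with-open-file (in "config.sexp")
  (read in))   ; => (:THREADS 4 :RATIO 2/3 :NAME "run-1")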

Python has fractions in the standard library. They implement an abstract base class, which ensures they're handled properly.
Python with numpy is really fast. It's the standard for numerical processing in Python when you care about performance.
Pickle can supposedly store and load arbitrary Python objects, but I haven't tried it. The usual data structures convert cleanly to and from json.
Numpy is vector/array-based, which is sort of like lists. Vanilla Python is famous for its list comprehensions.
Do you have experience with numpy?

...

It's usually even slower than regular python, which is already impossibly slow. Time this with and without the float32 casts.
#!/usr/bin/python
from numpy import float32
d = float32(0.1)
n = float32(0)
for i in xrange(1000000):
    n += d
print n

$ for x in *.py ocaml; do cat $x; echo; echo -n "-- timing: "; ./$x; echo; echo; done

#!/usr/bin/python
from numpy import float32
import time
before = time.time()
d = float32(0.1)
n = float32(0)
for i in xrange(1000000):
    n += d
print time.time() - before

-- timing: 0.12046289444

#!/usr/bin/python
from numpy import float32
import time
before = time.time()
d = 0.1
n = 0
for i in xrange(1000000):
    n += d
print time.time() - before

-- timing: 0.0890231132507

#! /usr/bin/env ocamlscript
Ocaml.packs := ["unix"]
--
let before = Unix.gettimeofday ()
let n = ref 0.
let d = 0.1
let () =
  for i = 0 to 1000000 do
    n := !n +. d
  done;
  print_float (Unix.gettimeofday () -. before)

-- timing: 0.00366306304932
well that's not fair. the parts that are optimized are optimized you know. you can't just expect everything to not be shittier than usual.

numpy in general is for accuracy and compatibility. Doing anything other than array operations it will be quite slow as it has to go through several additional layers of python.

The problem with Lisp is that it was just too damn slow and bloaty back in the day, and when hardware finally caught up more sensible high-level languages with easier to grasp syntax became available to use instead. There's no real reason to learn it now unless you wanna customize Emacs.

Okay, now I know you've never seen numpy code before. Here's the proper way to do it:
#!/usr/bin/python
import numpy as np
import time
before = time.time()
ns = np.linspace(0, 100000, 1000000)
print time.time() - before
numpy is based around vectorized operations. If you're using loops you're doing it wrong.
Now, imagine we do something almost useful, instead of incrementing a number without using it for anything. Let's say we want to square the numbers, and calculate their sum. This is how you'd do it with numpy:
#!/usr/bin/python
import numpy as np
import time
before = time.time()
n = np.sum(np.linspace(0, 100000, 1000000)**2)
print time.time() - before

.. that's not even remotely the same thing, though.
.. I explained that about array operations. But it's also wrong to suggest loops are 'doing it wrong' as many algorithms can't be expressed in terms of big array operations.

It doesn't make any sense to do the same thing, because the same thing is useless. It's something you'd only ever do as a step in the process of doing something useful. Numpy has a useful way to do whatever that thing is using linspace, probably.
The method using linspace calculates all the same numbers your loop calculates. It's the numpythonic way of getting a million evenly spaced numbers.
Can you give an example of a calculation that can't be expressed in numpy that way, in that case?
Numpy uses plenty of loops to implement these operations, but if you're writing your own loops you're usually doing it wrong. If the algorithm requires loops then you can still leave those loops to numpy.
Numpy derives its performance from not spending a lot of time executing python code. If you loop through xrange(1000000) you execute a million python loops. If you use a numpy function to generate a million numbers then you only execute a single python function. The bottleneck is moved out of your own code and into numpy's own compiled code.

Creating a sequence instead of printing a value and having that sequence sum to 50000000000 rather than 100000 and calling it "the proper way" made more sense?
That code is similar to what I used to graph percentage error in accumulating frame time from Unity. It's not useless. A more complex version of that was used to show some devs why things were going haywire after a few days played on 144hz monitors. I needed numpy for the float32 support.
Seems I already did, but in general, data dependencies limit array/vector operations. Something like AES might be a good example for you as while the math can be expressed in terms of arrays, those arrays are intentionally limited for various reasons and are so small that you'd likely be better off just writing a big loop than dealing with numpy's overhead.

For almost all numerical computing where I would be tempted to use that loop, yes. It looks like a common pattern for doing calculations in python. For your purpose, no, but I didn't know what your purpose was.
That makes sense. But it's not a good way to benchmark numpy, because you're not even making numpy do the heavy lifting.
I can't find a fast way to do it in numpy that keeps all the imprecision you're looking for. But if you're not deliberately trying to make your answer inaccurate, this is a good equivalent:
>>> np.sum(np.full(1000000, 0.1, dtype=np.float32))
100000.09
It's about a hundred times as fast as your code. It doesn't do the same thing, because the answer it gets is much closer to the real answer, but that's the opposite of a problem in most computations.
AES does look like a good counterexample.
But many other things with data dependencies can still be made a lot faster using numpy as long as you can express those data dependencies without using python loops. Numpy uses loops all over the place internally, but because they're not implemented in Python they're a lot faster.

the point I thought was that numpy slows down normal-ass numpy code. If the numpy-ass numpy code wasn't much faster people wouldn't be using it. I think people who use Python have bad taste and poor morals; I don't think they have an alternate experience of the passage of time.

Nobody uses numpy that way.

I've been searching for a way to write Common Lisp in nvim, but I have not come across a single good and working plugin. Does anyone know if there is a good way to write Common Lisp in nvim?

What do you mean by a "good way"? Rainbow parentheses? Auto-indenting? Auto-matching parentheses? A REPL plugin? You have to be more specific.

I have no idea what I'm doing, user.

Here is what I have installed. Keep in mind that I know how to use Common Lisp, but I have not yet written anything big in it.
< Rainbow Parentheses
I use github.com/luochen1990/rainbow mostly because it is not limited to parentheses, it also works with stuff like HTML tags.
< Auto-matching parentheses
github.com/Raimondi/delimitMate Not just for Lisp languages, this is handy for pretty much anything. Not limited to parentheses, works with anything
< Auto-indenting
No idea, it just works out of the box
< REPL
There are a couple, I use github.com/HiPhish/repl.nvim
< Something like Slime
There is SlimV github.com/kovisoft/slimv, but I haven't tried it yet. Main downside is that it's a Vim plugin, so it's subject to Vim's limitations. I wanted to try my hand at writing a proper Neovim plugin, but when I looked at the source of Swank (the server part of Slime) I threw up my hands and just gave up. I don't have the autism superpowers to dig through that mess.

Ada did nothing wrong.

80% of the pre-Rust mentions of Ada I read describe it as a giant inelegant overdesigned clusterfuck. Sometimes it's compared to PL/I.
I haven't used it, but it makes me sceptical.

Of course it's bloated; it was designed for military use and high-assurance memes. I still want to learn it so I can show off, but it's not a 'brainlet repellant' like glorious Lisp, nor does it have zero-cost abstractions.

Try 999/201*201 in Python and compare with (* 201 (/ 999 201))
I'll wait.
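For anyone who can't be bothered to open two REPLs, this is the difference being pointed at (CL side shown; the float value in the comment is the one the utop post further down reports):

;; CL keeps the division exact as the ratio 333/67, so multiplying back
;; by 201 recovers the integer.
(/ 999 201)           ; => 333/67
(* 201 (/ 999 201))   ; => 999
;; With IEEE doubles the same left-to-right expression comes out as
;; 999.00000000000011, and Python 2's integer division turns it into 804.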

So that's why lisp is so insanely slow then.

Don't get me wrong, I love Scheme, but I'm pretty sure numpy would offer that. And any language can get that functionality with gmplib.org.

any language can get that functionality with floating point:
utop # 999. /. 201. *. 201.;;
- : float = 999.00000000000011
if given machine integers you can take them and compose bignums, rationals, a CAS if you want. Scheme just takes away machine integers, giving you instead tools that are 'better' but which are no longer a solid platform for anything you might make. but python still sucks

>>> from fractions import Fraction
>>> Fraction(999) / Fraction(201) * Fraction(201)
Fraction(999, 1)
A predecessor of Python, also developed at the CWI, used fractions by default. It turned out to be a mistake - most calculations didn't need that precision, but it made them significantly slower and in some cases made programs endlessly slower and slower as the factors kept piling on. It's not something you should have by default. So Python puts it in the standard library, but makes it optional.

This works too, by the way.
>>> Fraction(999) / 201 * 201
Fraction(999, 1)
>>> sum([5, Fraction(999), 201, 100])
Fraction(1305, 1)
You don't need to make every single number you use a Fraction manually. It propagates.

What languages do they want you to use instead?

That's because there are actual similarities to PL/I. They both have fixed-point numbers, tasking, nested subprograms, exception handling, decimal numbers, region-based memory management, dynamic arrays in records, variables declared at specific addresses, etc. C does not have any of this (Blub paradox).

If someone said that about Lisp, you would call them a Blub programmer.

Honestly, give me anything with strong and/or static typing and no garbage collection. I was almost interested by Haskell until I saw the GC.

What distinguishes scheme from common lisp?

Have a webm in exchange for your knowledge.

It's a different language with similar syntax.
Common Lisp is large. Scheme is small. Scheme's standard is shorter than the table of contents of Common Lisp's standard.
Scheme used to be popular for teaching. It's the language used in well-known meme book SICP.

can you even read?

You're still missing the point. Fractions are automatic in CL but must be imported in Python.

All problems are better solved thinking like a machine, 'tard. That you can abstract 1+1 in the most dirty, inefficient way possible and have it limp across a special olympics finish line does not make it good, and if you compete for real you will be destroyed.

Defaulting to fractions is awful. It's why no one uses CL.

Python's behavior is better. Most of the time you shouldn't use them. Making them available is fine, defaulting to them is stupid.
van Rossum knew this. He worked on another programming language that did default to fractions, and lo and behold, it sucked.