Best Language

What's the best scripting language (efficiency-, weight-, and speed-wise)? I have a feeling it's Lua.
Lua is extremely easy to learn, it's pretty efficient from what I know about it, and low-end computers can run fairly beefy Lua games decently; Spintires, for example.
I know JavaScript is pretty terrible when it comes to efficiency and speed.

dumbass
The primary purpose of a scripting language is to make rapid prototyping really, really fast. That means as little typing as possible. Lua fails in that regard.
The GOAT remains Perl 5 but sadly its community is dying. Ruby is probably the second most terse language of that sort.

There is no general answer to that question. It depends on your use case.

Lua is a great language to embed in programs. That's what it was originally meant for, making programs extensible.

Shell and Perl are good for text processing and gluing programs together.

Python is a good choice if you want your code to be readable (either by other people or by you a month from now). It's general purpose, but especially useful for some particular purposes, like scientific computing. If you're looking to learn a single language I recommend Python, although learning other languages besides it is a good idea.

node.js may or may not be the best choice for web development.

From a pure speed perspective, yes, Lua is the #1 go-to.
Python is a somewhat distant #2.
Perl is also somewhat of an option, but only if you use a lot of regex.

JavaScript is rather terrible, but if you're willing to botnet it up and use the Google V8 JavaScript engine, you can actually squeeze quite a bit of performance out of it.

not so fast, Mr. Trips

benchmarksgame.alioth.debian.org/u64q/compare.php?lang=lua&lang2=python3

I would just use Golang and skip the scripting hassles.

cursive

Scheme

Only correct answer in the thread. Racket though

Scheme's problem is that its community just can't get its shit together. R5RS specified the bare minimum because it included only the things everyone could agree on, so you can imagine how little that was. There was not even a way of specifying libraries/modules, so the moment you wanted to split your code over more than one file your code became unportable.

R6RS sought to solve this issue, but many implementers thought it was too big and too convoluted, so they only adopted parts of it. R7RS-small should fix the issue by providing long-missing essential features, but it still isn't fully supported by every implementation. Then there is R7RS-large, which will be the "batteries included" language for Real World development, but that one is nowhere near being finished, and when it finally gets done you will still have to wait years until it is actually supported. R7RS-large is expected to be even larger than Common Lisp.

The only way of using Scheme right now is either to restrict yourself to what R5RS offers or pick one implementation, say "fuck portability" and stick to it.

Python 3
/thread

Bash if you're going to be chaining and combining commands.
Python if you need to manipulate more complex data or need to use some library.

Get a load of this faggot.

When I use Scheme at the moment, I am sticking to R5RS simply because I know it already. In practice, I have to choose a specific R5RS implementation (Chicken Scheme). When I choose to use R7RS large, I'll be doing the same thing anyway (choosing a specific non-portable implementation) but I'll do so with the understanding that the language will be ratified in the future and that my code will have to be eventually updated to reflect the ratified language. When the time comes, I'll be prepared to change non-portable R7RS code into portable R7RS code.

without a doubt rust.

R7RS Large is finished ( trac.sacrideo.us/wg/wiki/R7RSHomePage ). It's fully supported by Larceny, with the minor caveat that the libraries have their old SRFI names instead of the new R7RS ones. Other implementations should follow in time.

It's nowhere near the size of Common Lisp. It's similar in size to ISLisp, minus the object-orientation.

Scheme is ok, but my personal preference is for strong static typing so Oberon is my minimal language of choice instead. I know about typed Racket, not particularly interested. Some super-duper compiler could maybe typecheck with SRFI-145, but that wouldn't be minimal any more.

That's just the red edition of R7RS-large. They are going through the rainbow spectrum, from infrared to ultraviolet, so R7RS-large has only just gotten started at this point.

Lua can't into switch statements.
Lua can't into add/sub/mult/division assignment operators.
Lua can, however, into metamethods.
And metamethods are about as nice as operator overloading in Python.

Are there any important differences between the two?

Why do people keep misspelling Ada like that?

Lua's entire purpose was to be as portable, small, and efficient as possible, because it's meant to be embedded into applications, e.g. World of Warcraft, Photoshop, nmap.

Different scripting languages can have different goals and uses. "The Best Scripting Language" is a moot discussion.

lol

LUA'S A MESS
A BIG FAT MESS

Metamethods only work on tables and strings, whereas operator overloading in Python works only on classes.

There's some peculiar thing with tables in Lua, though. The functions in the standard table library all take the table as their first argument (like an implicit self parameter), but the tables you actually declare don't have those functions attached as methods. So in order to use something like:
table1 = {1,2,3}
table1:insert(4)

instead of:

table.insert(table1, 4)

you have to attach the table-library functions to table1 yourself, for example by giving it a metatable whose __index points at the table library.

When this was asked on Stack Overflow, a user said it's because Lua tables start out empty, and they're supposed to be empty both in content and in meta content. Yet strings come with metamethods already attached, and you can't change those with a plain setmetatable call.

Probably one of Lua's strongest features is loadstring(), which lets you compile a string of code at runtime and call the result like a function. That, and its fast runtime for being implemented in C. The module/require mechanism, though, is just as messy as JavaScript's require.

I could go on all day about Lua since I had to learn it in order to teach others.

Slight thread hijack: as someone who's currently working their way through learning Java, what is with the amount of hate for it?

Is it because it's bloated, used so ubiquitously, and considered symptomatic of the outsourcing of programming?

I don't really know shit; this is my first foray into programming. I'd be much obliged if Holla Forums could help me know why they dislike the language so much, so that I can keep it in mind as I work through Java towards the meme languages in this thread

Long story short:

It's garbage collected with no way of actually turning that off, which is great for beginners who like sprinkling new statements around like sugar on porridge, but not so much for veterans trying to squeeze every ounce of performance out of it. Hell, tweaking and managing the Java GC has become an art, and many organizations hire people SPECIFICALLY to do nothing but mess with GC settings all day, every day, to try and get the best performance out of whatever Java-based software they're running.

Everything is signed. Even after more than two decades there is still no proper support for unsigned types, and the devs seem no more interested in adding them than they were 20 years ago, because it "goes against what we want for Java and will confuse beginners". They encourage you to fuck around with bitwise operators instead. All of which not only makes network programming a nightmare, but also ensures that your programs use at least twice the memory they should, something which is ALSO encouraged heavily. Want to store 0 - 255? A byte you say? Fuck you, use an int like the rest of the NORMAL Java devs.
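To be concrete about the bitwise dance: here's a small sketch of reading a raw byte as its unsigned 0 - 255 value, using the classic mask and the Java 8+ helper (example values made up):
[code]
// Sketch: treating a signed Java byte as the unsigned value it arrived as,
// e.g. when pulling raw bytes off a socket or a file.
public class UnsignedByteDemo {
    public static void main(String[] args) {
        byte raw = (byte) 0xF0;                 // prints as -16, since byte is signed

        int masked = raw & 0xFF;                // classic trick: 240
        int helper = Byte.toUnsignedInt(raw);   // Java 8+ helper: also 240

        System.out.println(raw);     // -16
        System.out.println(masked);  // 240
        System.out.println(helper);  // 240
    }
}
[/code]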

Everything is an object, and Java is nothing but objects calling other objects which in turn interact with more objects. Objects objects objects. Want a plain function? Fuck you, use a static class method. Which not only uses more memory and resources, but also encourages you to write in a "black box" fashion where you're "not supposed to know" what goes on in the object you're calling, only what it accepts and what it spits out.
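Even a trivial free-standing helper has to be hung off a class somewhere; a sketch (the MathUtil class and clamp function are made up just to illustrate):
[code]
// There are no top-level functions in Java; even a one-liner helper needs a class to live in.
public final class MathUtil {
    private MathUtil() {}   // never meant to be instantiated

    public static int clamp(int value, int lo, int hi) {
        return Math.max(lo, Math.min(hi, value));
    }
}

// call site: int b = MathUtil.clamp(300, 0, 255);
[/code]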

There is no pass-by-reference. Everything is pass-by-value. So if you have this huge object that you need to pass to another object, enjoy spending extra unnecessary milliseconds waiting for it to be copied from one location to another.

It encourages you to learn and embrace shitty coding practices, such as no respect for memory (Why use a byte when you can use an int! Everyone has 16GB anyway!), no respect for the CPU (I could make this a static method... nah, let's just divide it up into several classes that need to be instantiated and called by one another, because MODULAR!), and no respect for other devs (They don't need to understand how my object works, just that it takes an int and returns a boolean, lol), among many other things.

And to top it all off, there's also the "Pajeet Situation" where the language in itself is designed around one dev being able to easily take over a project with little to no knowledge of how it actually works or how it's structured. Hell it's designed around the idea that any dumbfuck should be able to pick it up and learn it. So if you're hired by a company to create or finish a project in Java, there's a high likelihood (to the extent where it's almost guaranteed) that you're going to find yourself unemployed a week after finishing because the company decided it would be better to hire some Indian guy for 1/10th of your salary to maintain the project.


I could go on, but suffice it to say Java is nice and all for prototyping and for learning. But if you want to actually be able to do something, and actually stay employed, you're better off picking up some other language along the way.

Java is a horrid language and a horrid VM. Speaking of the latter, there are still no compound value types, so many kinds of code have to be either slow and memory-thrashing, or obfuscated as fuck.
The Oracle implementation is botnet, the free implementations (including OpenJDK) are incomplete, and good luck building them yourself from source.
The standard library is bloated and kind of impotent at the same time.

Even if you optimize the code as fuck, Cython is still faster and has less interoperability overhead with plain Python than JNI has with Java.
In many cases even PyPy is faster than the JVM.

This is actually what annoys me about Garbage Collected languages generally. To me, if you need this feature, your code was probably poorly designed in the first place.

Am I wrong about this? Or has Garbage Collection basically just made it acceptable to write poor code?

Wow, thank you both for the thoughtful responses; this gives me a very good orientation for metacognitively approaching the learning of this language. I have dabbled in C & Python (without getting too far) and Java seems like an absolute mess compared to them; as the first poster said, objects objects objects.

I'm going to be learning this through a college course, since it was either Java (terrible), Visual Basic (the worst), or intro to basic computer usage (the horror). I figure I will, as time permits, supplement that study with exploration of C, since it seems to force good practice. (Again, I don't know what I'm talking about here; that's just my understanding of things.) If anyone has tips for balancing out the shitshow of Java, or how to combat the bad practices inherent in it, I'd be much obliged. I don't want to be a bad programmer.

I am seeing that there is a C# option, actually - would that potentially be a better starting point? I'm afraid of getting proficient in a language that teaches bad practices; I'd love to take a C course, but I'll take what I can get.

For a scripting language, what's important to me is that you don't have to write a lot of boilerplate code, and you can do a lot of complicated operations easily and with little typing (which also relates to one-liners you bang out in the shell).
Perl is very good at all these tasks, so it's what I use unless it can be done easier with a /bin/sh script.

You don't *have* to use regex operators all the time in Perl. It's just that they're very convenient and clear. You could also use substr() and other such functions instead, and sometimes that's better (more efficient).
One other nice thing is that a lot of functions operate on a list, and you can chain them together to achieve a lot in very little code. I'm talking about functions like grep(), map(), sort(), and so on. The list support isn't as good as Lisp's (you have to use references to avoid flattening), but it's the next best thing.
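For anyone coming from the Java side of this thread, the closest analogue to that grep/map/sort chaining is probably the streams API; a rough sketch (numbers made up, and obviously nowhere near as terse):
[code]
// Roughly: grep { $_ % 2 == 0 } map { $_ * 3 } sort { $a <=> $b } @nums;
import java.util.List;
import java.util.stream.Collectors;

public class ChainDemo {
    public static void main(String[] args) {
        List<Integer> nums = List.of(5, 2, 9, 4, 1);

        List<Integer> result = nums.stream()
                .sorted()                   // sort { $a <=> $b }
                .map(n -> n * 3)            // map { $_ * 3 }
                .filter(n -> n % 2 == 0)    // grep { $_ % 2 == 0 }
                .collect(Collectors.toList());

        System.out.println(result);          // [6, 12]
    }
}
[/code]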

Don't you have to sit in that shitty Python? If it is more than 20 lines of code then don't use Python, no compilers, no types, so you will feel like a mekboy trying to glue shit together.

Python does have types (it's strongly typed, even, just dynamically), and there's some support for static typing too now. The post you replied to mentions two different ways to compile Python.

Milliseconds to copy and pass a pointer is literally nothing.
That is what all of proper software engineering does. There's no need to know how the method works, only that it does. If it doesn't work, you can go change it (if you have the source) or write a complaint to the library author (if you don't), but you have zero need to understand how the majority of the code works. Only when to use it.

Use a short.
Do you really think that 1 byte of memory matters? The situations Java is used for are situations that can take that hit.

You really need all that extra speed on your fizzbuzz right?

Oh you're just spouting memes. Awesome.


GC doesn't mean "poor code". GC means less time spent doing tedious memory leak hunts. In situations where you can have GC, you should. Most things done today can afford the GC overhead, because people are expensive and CPU cycles are cheap.


Learning Java won't make you a bad programmer. Java doesn't prevent you from engaging in proper design. For what you are learning, and where you stand, there is literally no downside to learning Java. Just make sure you practice proper design, and that can be done or ignored in any language.


The standard library does have a lot of deprecated objects, but that is mentioned on its page in the Oracle docs. I've never heard anyone complain that a standard library gave them too many options.

Doesn't look like it from the documentation; the types are nowhere to be seen.

I should clarify. It's impossible to work on 100% of a project. You work on small sections at a time, having to treat the rest of the project as a black box. This isn't laziness or lack of respect for devs, it's a requirement to actually be able to work. If you want to pretend you can take in 100% of a million lines of code, or ten million lines of code, go ahead. LARPing on the internet must be fun. But in real projects this is impossible. You'd quickly become so buried in the details that you'd never make decent progress. That is why we treat functions as black boxes that take input and give output.

If your project takes up millions of lines of code, it's as big as an operating system, and maybe you're just bloating things up unnecessarily.
Anyway you most likely don't have to take it all in at once. Even on big projects, you can divide the whole into parts that are individually distinct, and then you work on one piece at a time. Eventually you might look at 100,000 LOC, but not all at the same time.
It's important that you *can* actually understand all the pieces though, in case something breaks or you find out that an assumption was invalid.

Technically correct, but the rest of the line goes off script. When you instantiate a non-primitive in Java, your object exists somewhere and you've got a reference to it with the name you chose. You can pass this reference around all you want, but any assignment to it just replaces the reference without modifying the object it refers to. Therefore, passing a large object into a function is only passing in a reference, and not the object itself.
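A quick sketch of what that looks like in practice (the Box class is made up for illustration):
[code]
// Java passes object *references* by value: the callee gets a copy of the reference,
// not a copy of the object.
public class PassDemo {
    static class Box { int value; }

    static void mutate(Box b)   { b.value = 42; }                 // visible to the caller
    static void reassign(Box b) { b = new Box(); b.value = 99; }  // caller never sees this

    public static void main(String[] args) {
        Box box = new Box();
        box.value = 1;

        mutate(box);
        System.out.println(box.value);   // 42: mutating through the copied reference shows up

        reassign(box);
        System.out.println(box.value);   // still 42: reassigning the copy changed nothing
    }
}
[/code]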

I looked into how Java handled this shit when I used it to write a robotics thing on an Android phone. Among other things, I also found out the Java devs don't like operator overloading because it can be used "deceptively", so they won't let anyone but themselves do operator overloading, which they only use for their String class. I ended up just using JNI and calling C++ code.

Can you explain what you think types are?

Variables are not bound to a single type in Python, but types very much exist.

To get the type of something, run type() in an interactive interpreter. Try type(5), type(True), type("foo") and type([]).

How many lines of code do you think are in your car? Running a navy warship? Firefox has 14 million lines of code. Chromium has 15 million.
Doesn't escape the problem; it's impossible to take it all in.
Then why are you complaining when Java devs don't require you to?
This is no more or less true in Java than in C. It's not related to the language at all.

The car and ship are running operating systems on embedded systems. Firefox and other such browsers are pure bloat, and effectively try to be a second OS for running JS "apps".
But overall, most IT projects are not operating systems, and don't need to be scaled to that level. I worked on many smaller systems in the 10K's LOC size that did a very specific job.

Are you implying that this changes the fact they are too large to take in all at once?
That's very good for you. I can guarantee you that you segmented work on those projects as well.

Why can't you actually use what I said?

and

C# is the same shit, just on steroids.
+ Absolutely insane naming convention
+ Questionable legal status (it's done by M$)
+ Legal issues aside, there's Mono, but it's not exactly the same as the original .NET
+ Somehow they managed to kill backwards compatibility AND leave most of the trash in the standard library when bumping major versions. That says something about their ability to design software
(Otherwise why would users need to install .NET 2.0, 3.5, and 4 at the same time? lol)
+ Speaking of third-party stuff, .NET has no good & free networking libraries (I mean, they are worse than the ones for Java)

Think about an array of these numbers, if your brain is capable of simple math, of course.

Well you don't take them in all at once, no argument there.
But that doesn't mean you'll never have to investigate why some module is misbehaving, or even just see how something is implemented so you can avoid edge cases or other suboptimal situations.
I've had to actually fix 3rd-party libraries myself that were no longer maintained by anyone. The code was quite old, and the author (a random contractor) was long since gone.
This can become a pressing issue when system upgrades happen. You have all this old code that was working fine before, but now it needs to be revised, and much of the old team has since moved on. Well, someone has to look at all that code and make changes. If it was designed as a black box with no care whatsoever for the readability and maintainability of the code therein, then it will be much harder to deal with.

You want us to do some web searching for you, right?

For good practices, learn:
Scheme (just for solving SICP)
Haskell
Assembler (for your favorite arch, optional)
C (but don't fall for a shitty book, your goal is to either write code without UB, or not write it at all)
Erlang
Python 3

…and after these, Rust, of course.

Okay, an array of 1 million shorts will use 1MB extra memory. How often do you find yourself dying for 1MB of extra memory today? Situations where this difference is critical should be done with JNI or JNA or something similar, i.e. use the right tool for the job.
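Spelling that arithmetic out (per-array figures, ignoring the small fixed object header the JVM adds to every array):
[code]
// Rough heap cost of 1,000,000 elements per element type.
public class ArrayCostDemo {
    public static void main(String[] args) {
        final long n = 1_000_000;
        System.out.println("byte[]  ~ " + (n * 1) + " bytes (~1 MB)");
        System.out.println("short[] ~ " + (n * 2) + " bytes (~2 MB)");
        System.out.println("int[]   ~ " + (n * 4) + " bytes (~4 MB)");
        // so storing 0-255 values in an int[] instead of a byte[] wastes ~3 MB
    }
}
[/code]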


Again, nothing about Java prevents or encourages this anymore than C. It isn't a language problem.

JNI is a PITA to work with; you've probably never tried it.

For cases, think about Android, for example. Or when one has a VPS with little RAM because they don't want to spend a lot of money on it.

Most Android phones come with loads of RAM, so I don't see how 1MB of extra memory is supposed to be a no-go zone. I agree that having unsigned would be nice, but I don't think it's nearly as large a problem as you are making it out to be.

eh isn't it 3MB

So what? It is not available to apps. There are some very hard limits, way below the total amount of RAM. You never developed for Android, right?
of course you didn't, it's trivial to see

I don't think those limits are "5MB of ram" so I still don't see why this is a problem. Perhaps I should also question why your android app needs 1 million entries in a byte array.

Again, unsigned would be nice, but I can't see how this 1 extra byte is some sort of final destination no go zone that makes Java bad.

It just adds up with other issues, like no compound value types.
The order of magnitude is about 50-100MB, sometimes lower. And if the application needs to work with heavy shit like huge images, every kilobyte counts.

Also, you ignored second (non Android) example.

Maybe so, and so again unsigned types would be nice, but Android seems to be functioning fine without them, so I'm not convinced that this is some critical disaster that disqualifies Java from being a good language.
If you don't have the memory to run a Java application then write it in something else. Java is a fine language if you have some spare RAM like 95% of computers today do.

Have you ever used Android Studio? 4+ gb for an IDE and still lags like crazy.
I can run 3 VMs with less RAM.

I haven't. Can you say with confidence that this is solely the fault of Java, and not poor development practice? I don't believe you can.

I hadn't heard of ISLisp for a long time, so took another look at it. Looks good actually, OpenLisp is a nice interpreter with some useful libraries. Also, it's the only Lisp I got to work under "Bash on Ubuntu on Windows" (work laptop, I have to use Windows).

I don't really accept that Lisp has a "Unique Selling Point", I think there are no silver bullets, but it's nice to have it available all the same. Even if I prefer Prolog for prototyping these days, or Oberon for development when I know what I'm doing.

I'm in an HTML5 class as I type this. College is fucking pointless.

why not common lisp?

i use embeddable common lisp. can embed in c.

bro wtf? why bother? learn it yourself you don't need to go into debt for that.

the only reason to go to college for cs is for rigour / the actual science in computer science, the algorithms and mathematics. you gotta learn languages on your own.

also fyi, computer science is not about science
also fyi, computer science is not about computers

Why should I use two bytes to store a value that's inherently 1 byte in size?


One byte? No.
(1 * 1024 * 1024) bytes? Yes. That's the equivalent of a memory leak and is unacceptable.


Yes. Why did you formulate that question as if optimization and speed were undesirable things?


Says the person who went "Lolz fizzbuzz amirite??" two paragraphs up.
You're not exactly a shining beacon of factual statements yourself.

You shouldn't have to. That being said, it's not the end of the world that you do.
1MB of space. I highly doubt that will cause you a problem.
Why do you assume a GC will be the biggest slowdown on your system? Surely you aren't implying you've optimized your program to such a degree that only the GC could cause slowdowns. You don't have to wait to read from a disk, or send a query over the internet or any of these tasks that take orders of magnitude more time, during which a GC could run without problem?

Oh really?
It's only fine because the other alternative is a fully locked-down botnet, but the performance and battery usage are still meh.

Don't forget about realtime tasks, like outputting audio, games, etc.
Just one GC pause at an inappropriate moment can ruin the whole thing.

There are two real reasons:

1. SBCL, my favourite CL implementation, won't work for me on Bash for Windows.

2. The CL standard is huge and I never learned the whole language. ISLisp is much more manageable, yet more useful than R7RS-small at least; it may be comparable to R7RS-large. Like C, I can remember the language without constantly referring to the standard. At the same time, although it's not a strict subset, writing ISLisp feels very much like writing CL.

If I wanted to troll, I could say I'm not from the US so why would I care about an ANSI standard when an ISO one exists :-)

The fact that ISLisp isn't wildly popular might influence a decision to use it commercially, but this is only for hobby programming (trying to work through "Goedel, Escher, Bach") so I don't care.

Compared to some absolute imaginary standard you have, maybe, but Android users seem satisfied with it.

I have never heard an Android user complain about this, so I'm tempted to believe this doesn't happen, or if it does it isn't nearly as large an issue as you imply.

Stockholm syndrome.


It's not only on Android.
And point me to an audio player which is written in Java (not only the GUI isolated from the core by IPC, but the entire fucking thing in Java, with DSPs, etc.) and isn't plagued by audio dropouts caused by GC pauses.

javazoom.net/jlgui/sources.html
100% java.

did you confirm that it doesn't suck balls in practice?
also I don't see any mention of DSPs in its features overview

Yeah I've been playing music for the last 10 minutes on it.
[code]
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Control;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.Line;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.Mixer;
import javax.sound.sampled.SourceDataLine;
import javax.sound.sampled.UnsupportedAudioFileException;
[/code]
100% java.
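Those are the real javax.sound.sampled classes; for anyone curious, here's a minimal sketch of how they fit together for playback (file name made up, error handling omitted):
[code]
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;
import java.io.File;

// Minimal playback loop: decode an audio file and push it to the default output line.
public class PlayDemo {
    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("song.wav"));
        AudioFormat format = in.getFormat();

        SourceDataLine line = AudioSystem.getSourceDataLine(format);
        line.open(format);
        line.start();

        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf, 0, buf.length)) != -1) {
            line.write(buf, 0, n);  // blocks until the device has room; a GC pause here means a dropout
        }

        line.drain();
        line.close();
        in.close();
    }
}
[/code]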

bump

What is there to bump? This question is like asking on an engineering board which construction material is the best: you will be laughed out of the board because even something as shitty as plywood has its purpose.

...

oh boy ecl looks really fucking cool

10 minutes is fucking nothing.
test it with something like a continuous 20Hz sine wave for a couple of days, record the output, and check it for discontinuities/gaps. (a low-frequency sine is just the easiest thing to check for discontinuities or gaps)
all while heavily using the computer for something else too.

if it passes then I might give it a shot. but I am almost sure it'll fail the test.

finding discontinuities in the recording is easy: notch out the original sine frequency with a high-Q IIR filter, and look for outstanding peaks in the residual waveform
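A sketch of that approach (the biquad coefficients follow the standard RBJ cookbook notch; the sample rate, Q, threshold, and the simulated dropout are all made-up example values):
[code]
// Run samples through a high-Q notch at the test-tone frequency, then flag samples
// where the residual jumps; a jump means a gap or discontinuity in the recording.
public class DropoutDetector {
    // Direct Form I biquad state
    private final double b0, b1, b2, a1, a2;
    private double x1, x2, y1, y2;

    DropoutDetector(double sampleRate, double toneHz, double q) {
        // RBJ Audio EQ Cookbook notch coefficients, normalized by a0
        double w0 = 2 * Math.PI * toneHz / sampleRate;
        double alpha = Math.sin(w0) / (2 * q);
        double a0 = 1 + alpha;
        b0 = 1 / a0;
        b1 = -2 * Math.cos(w0) / a0;
        b2 = 1 / a0;
        a1 = -2 * Math.cos(w0) / a0;
        a2 = (1 - alpha) / a0;
    }

    double filter(double x) {
        double y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
        x2 = x1; x1 = x;
        y2 = y1; y1 = y;
        return y;
    }

    public static void main(String[] args) {
        double fs = 48_000, tone = 20, q = 10, threshold = 0.1;
        DropoutDetector det = new DropoutDetector(fs, tone, q);

        // Stand-in for the recorded output: 3 seconds of clean 20 Hz sine...
        double[] samples = new double[(int) (3 * fs)];
        for (int i = 0; i < samples.length; i++)
            samples[i] = Math.sin(2 * Math.PI * tone * i / fs);
        samples[100_000] = 0;  // ...with one sample zeroed out to fake a dropout

        for (int i = 0; i < samples.length; i++) {
            double residual = det.filter(samples[i]);
            if (i > fs && Math.abs(residual) > threshold)   // skip the filter's warm-up second
                System.out.println("possible gap near sample " + i);
        }
    }
}
[/code]
With a clean recording the residual stays near zero once the notch settles, so anything poking above the threshold is worth inspecting.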

Perl is still the best all-around scripting language.
However, since Holla Forums is full of millennial babbies, they all think Python is better.

You wouldn't happen to have an argument, would you?

any evidence to back that up? besides "too much readability is bad"

Best performance out of all scripting languages
raid6.com.au/~onlyjob/posts/arena/

best regex implementation by FAR

very terse so less typing required

very readable (see example)

example
my %s;s,^.+?;,,;while (m,(\d+),g){ if (++$s{$1} > 1) { print "$1\n"; last; }}

this is horse shit. look at neighbor thread

The one with the made up statistics?

Nice to see there are other JAPHs still out there.
Perl shall never die

C is the only language worth learning.

arbitrary-based > 1-based > no arrays >>>>>>>> 0-based

1 is the only LBound where UBound(a) = length(a).

Only Lua, Fortran, COBOL, ALGOL, ALGOL 68, Simula, Smalltalk, BASIC, Visual Basic, PL/I, Pascal, Modula, Modula-2, Modula-3, Ada, APL, Haskell, Eiffel, SAS, MUMPS, MATLAB, Maple, MAD, Mathematica, Julia, JOVIAL, and JOSS programmers (sorry if I forgot anyone) will get this.