Aside from the fact it bloats the web experience and adds endless ways to both track the user and exploit their machine...

What are the technical reasons to hate JavaScript? I have no idea why JavaScript should be swallowable in small amounts.

Other urls found in this thread:

destroyallsoftware.com/talks/the-birth-and-death-of-javascript
dorey.github.io/JavaScript-Equality-Table/
html5rocks.com/en/tutorials/speed/v8/
usenix.org/system/files/1403_02-08_mickens.pdf
docs.racket-lang.org/guide/numbers.html
youtube.com/watch?v=yWDAw2nsIjw

*shouldn't be swallowable

As in, small scripts should be okay, right?

destroyallsoftware.com/talks/the-birth-and-death-of-javascript

web-asm is the future

For me, it's nothing personal: All weakly typed languages need to be destroyed.

DOM manipulation is slow, the standard library is garbage, and you're constantly fighting a type system that's undergone a botched lobotomy.

The type system isn't much different. It suffers from syntactic diabetes though.

I don't know whether to laugh or cry as I imagine that future

Purely as a language, it's not bad. It has a really, really fast interpreter, allows functional programming, is fairly simple, and has a fairly sane basic syntax. The bad parts are: arrays are broken (i.e. they are actually objects), the type system is weird for legacy reasons (functions are also objects, but typeof returns "function"), and there is no standard way to do OOP.
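
For anyone who hasn't poked at it, the array/function weirdness is easy to see in any browser console:

typeof [1, 2, 3]          // "object": arrays are just objects with numeric keys
Array.isArray([1, 2, 3])  // true: the reliable way to check
typeof function () {}     // "function", even though functions are objects too
typeof null               // "object", a legacy bug kept for compatibility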

The really bad things are: the module system; the incredibly baroque and ever-changing toolchain (wasted a day of my life figuring out what gulp was and why it sucks if you want to do something out of the ordinary in your build system); and weird interactions with the DOM that lead to memory leaks (I think the main issue is that if you create a DOM node, attach a callback, then remove the node from the main document, that callback will never get garbage collected). React is cool because it solves the last problem (and many other DOM-related problems), but then we get back to the baroque toolchain.
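
Rough sketch of the leak pattern being described (hedged: modern engines collect most of these, the classic offender was old IE's DOM/JS reference cycles):

var node = document.createElement('div');
var bigData = new Array(1e6).fill('x');   // large structure captured below
node.addEventListener('click', function () {
    console.log(bigData.length);          // closure keeps bigData alive
});
document.body.appendChild(node);
document.body.removeChild(node);
// As long as anything still references `node` (this variable, a detached
// subtree, an old-IE COM cycle), the listener and everything it closes
// over stay reachable and can't be collected.
node = null;                              // dropping the last reference frees it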

At the end of the day, web apps are a valid option for people looking to build a cross-platform (including mobile) application. But the whole experience will be far less pleasant than, for example, building a Windows-only app, or building a desktop app with Java.

I would check out Elm or PureScript; they should give you many of the advantages of JS without the weak typing.

The worst part is the automatic type coercion.

dorey.github.io/JavaScript-Equality-Table/

Not really, because you can avoid it by always using === (for some reason I find !== even more annoying to type, but that's just me).
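
Some of the fun rows from that table, for anyone who hasn't seen it:

0 == ''             // true: both sides coerced to the number 0
0 == '0'            // true
'' == '0'           // false: coercion isn't even transitive
null == undefined   // true, yet null == 0 is false
0 === ''            // false: === skips coercion entirely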

JavaScript is like heroin.
A tiny bit of it can make your day fantastic.
Too much will just fuck your shit up.

Dynamic typing.

Doesn't JS have only floats? Because if that's the case, that's really retarded.

Yes, JS only has a double type for numbers, although there are ways to implicitly coerce to 32-bit integer operations. Web asm can do true 64-bit integer operations. But in practice I think the JIT can figure out what data type you are really using and will use the corresponding integer type. In terms of performance it's no worse than dynamically typed languages that have both floats and ints.
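
The coercion trick being alluded to: any bitwise op runs its operands through a 32-bit integer conversion, which JITs treat as a hint that you mean integers.

var x = 2147483647;       // largest int32
(x + 1) | 0               // -2147483648: wrapped around like a real int32
3.7 | 0                   // 3: truncation, not rounding
Math.imul(65535, 65535)   // -131071: a true 32-bit multiply, overflow wraps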

Prototypical inheritance _is_ OOP you retarded fucker.

I'll just give you one example:
JavaScript makes no distinction between floats and ints. Every number is a 64-bit float, so for any numerical calculation you have unnecessary overhead by design.
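
Easy to see at the edges; this is plain IEEE 754 double behavior, nothing engine-specific:

Number.MAX_SAFE_INTEGER                 // 9007199254740991, i.e. 2^53 - 1
9007199254740992 === 9007199254740993   // true: integers above 2^53 collide
0.1 + 0.2 === 0.3                       // false: binary fractions can't represent 0.1 exactly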

Right, I'm a retard, just like the ES6 authors who added classes and the creators of TypeScript, Google Closure and React. All these retards found JavaScript's prototypical inheritance insufficient for doing OOP in practice, and felt the need to extend the language or add frameworks/conventions to do OOP. Or maybe you are the retard.

Javascript is a bad language, always has been always will be.

Even python is not a "great" language, but it's still several times better than shitty JS.

That's the one that has always bugged me the most.

the JIT won't necessarily represent your number as a 64-bit float, see html5rocks.com/en/tutorials/speed/v8/

None of that changes the fact that OOP doesn't require classical inheritance, you bootyblasted retard.
Protip: the new class syntax is just sugar on top of the old JS prototypical inheritance.
It's the same thing under the hood, faggot.
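
The sugar claim is easy to verify; the two definitions below (Dog/OldDog are made-up names) produce essentially the same prototype chain:

class Dog {
    constructor(name) { this.name = name; }
    bark() { return this.name + ' says woof'; }
}

function OldDog(name) { this.name = name; }           // pre-ES6 spelling
OldDog.prototype.bark = function () { return this.name + ' says woof'; };

typeof Dog                                       // "function": a class IS a function
Object.getPrototypeOf(new Dog('a')) === Dog.prototype   // true, same mechanism
new Dog('a').bark() === new OldDog('a').bark()          // true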

Most people define OOP to include classical inheritance. Feel free to read my original comment as "there is no standard way to do classical OOP".


It makes no difference what you call it. The facts are that JS was lacking something that the ES6 authors and all major library authors felt the need to add. It was a flaw because there was no standard way to do it (though this is corrected in ES6).

Calling people a retard doesn't make you right; in fact, people like you are a cancer that prevents serious discussion.

It has no paradigm; it is a bastardized attempt at functional programming without tail recursion and OOP without inheritance.

9/10 people writing it for some reason think it's OOP though, because that's all they know. So they will write 10,000 lines of shit to perform a simple task, attempting to work around a type system set up to punish that. If they happen, in rare cases, to appreciate functional programming and try to use recursion, they end up wasting a fuckton of memory.

Just because you have a workaround doesn't excuse a poor feature.

they are wrong and so are you :^)

I am truly sorry you are so frightened by things outside of your comfort zone of knowledge; that's a very bad trait for a programmer to have.


wew lad it's another one

I'm not at all frightened by prototypical inheritance. You're the one who is too pigheaded to take into account the practical experience of thousands of programmers who felt the need to reimplement classical OOP on top of JavaScript. Of course you will dismiss this by calling them retards, because that's the only way you can think about people who have different ideas than you.

Should I learn Ruby or Node.js to be a full-stack web guy?

Pic related.

Yeah. If you'd like some humorous reading, though, I recommend usenix.org/system/files/1403_02-08_mickens.pdf

var a = 205;
a = a / 100;
a = a * 100;
alert(a);

Result:
204.99999999999997

ftp://ftp.openwatcom.org/pub/devel/docs/ieee-754.pdf

holy shit the website doesn't run like shit any more

Webdevs everyone

Why don't other languages have this problem?

For example PHP (presumably the same computation):

$a = 205;
$a = $a / 100;
$a = $a * 100;
echo $a;

Result:
205

LISP

[9]> (setf a (/ 205 100))
41/20
[10]> (print (* a 100))
205
205
[11]>

As pointed out, that is a problem with floating-point arithmetic. The real shittiness here is that JavaScript doesn't have integers; it stores everything as double-precision floating point.

Change that 205 to a 205.0 and it will do the same thing.

Because they can infer the type from the literal.

I have typed the following in Racket (a LISP dialect), because I just happened to have a REPL open; the pic showed roughly this:
> (/ 205 100)
41/20
If we take a look at the Racket documentation it becomes clear why this works
docs.racket-lang.org/guide/numbers.html
Racket has stored 205/100 as literally a fraction of two integers; there was no loss of precision. We can see it directly, e.g.:
> (exact? (/ 205 100))
#t
Multiplying this number by 100 works without loss of precision as well: 41/20 * 100 = 41 * 5 = 205

However, let's try something else this time (again, the pic showed roughly this):
> (* (/ 205.0 100) 100)
204.99999999999997
This time the 205 is stored as a floating-point number due to the decimal point, and floating-point numbers are inherently inexact.

If you were to type the term in a language like C that does not support exact rational numbers, you would see a loss of information, because 41/20 would be truncated to 2 under integer division and, when multiplied by 100, would yield 200 instead of 205. Rational numbers are a good way of working with fractions without losing precision, but they cannot represent irrational numbers like certain roots, logarithms, Euler's number and of course π. And floating-point numbers will always be inexact, no matter the language. Of course there are some cases where the result will be correct, but not every real number can be exactly mapped to a binary floating-point number.

Since JavaScript lacks integers, there is no way for the language to automatically recognize rational numbers, so you will always lose precision unless you implement your own rational structure somehow.
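
A bare-bones sketch of such a structure, just to show the idea (names made up; a real library would handle zero denominators, overflow, and so on):

function rat(n, d) {                       // represent n/d in lowest terms
    function gcd(a, b) { return b ? gcd(b, a % b) : a; }
    var g = gcd(Math.abs(n), Math.abs(d));
    return { n: n / g, d: d / g };
}
function mul(a, b) { return rat(a.n * b.n, a.d * b.d); }

var a = rat(205, 100);                     // { n: 41, d: 20 }, exact like Racket
var r = mul(a, rat(100, 1));               // { n: 205, d: 1 }
r.n / r.d                                  // 205, no precision lost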

...

Sure, OOP doesn't require classical inheritance. But having no standardised way to create classes and modules, with every library doing it its own way, means you can't interop between them easily. Which is stupid.

Ruby. Node.js is a cancer for SJW dumbshits.

I dunno what you want, webdev in Clojure is pretty legit.

There is literally nothing wrong with javascript.
other than shitty syntax

C, ironically, gets this right with float but not double, at least using SSE instructions, because it does the operations in double precision and the rounding happens when it converts back. At least that's what I got.

To be pedantic, it's the JIT that makes it fast for a dynamic language; there's no magic interpreter in there.

Also, that's not a language feature per se, but rather the product of billions of dollars poured into SpiderMonkey and V8. That's how influential the web is.


JavaScript has a lean C-like syntax with cool features like first-class functions, prototype OOP and a very simple yet flexible object notation. The rest of the semantics is a hacky mess, not surprisingly.


Plain prototype-based OOP, and JavaScript is powerful enough to implement the features of classical class-based OOP on top of it.
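
For instance, a minimal sketch of classical-style inheritance built from plain prototypes, no class keyword anywhere (Animal/Cat are made-up names):

function Animal(name) { this.name = name; }
Animal.prototype.speak = function () { return this.name + ' makes a sound'; };

function Cat(name) { Animal.call(this, name); }     // the "super" call, by hand
Cat.prototype = Object.create(Animal.prototype);    // inherit the methods
Cat.prototype.constructor = Cat;
Cat.prototype.speak = function () { return this.name + ' meows'; };  // override

new Cat('Tom').speak()             // "Tom meows"
new Cat('Tom') instanceof Animal   // true: the chain behaves classically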

Isn't classical inheritance frowned upon in modern OO design best practices anyway? Inheritance always looked like a solution in search of a problem to me. Plain object members (i.e. nested objects, composition) are better IMO.


I could have never imagined I would recommend Ruby to someone, but if those are your only two options avoid node.js like the plague.

Is there any good reason to still be using dynamically typed languages? It doesn't seem like it saves much work, and mostly shifts bugs to runtime.

Or alternatively an individual who made a pact with the devil to become a superhuman compiler engineer. LuaJIT's Mike Pall is fucking unreal.

Fuck the technical reasons.
The sheer amount of JS abuse and webdevs who pretend that it's necessary justifies all the hate several times over.

Learn java if you want to work and be ok with yourself... learn python or ruby if you don't like money. Learn php if you don't have self-esteem and don't want money.

Learn C# and the .NET stack if you want to work for the government, make money, and have no regard whatsoever for end users.

Learn js, the DOM, css and be familiar with major js libs if you want to be cool and maybe make some money depending on luck and stuff.

Be able to work with JSON no matter what.

You know there is a problem with a technology if there is actually a book called "JavaScript: The Good Parts" and the book is not a joke.

I am trying too, but I keep getting discouraged because I know I have absolutely no passion for it at all. How long does it take to learn?

I tried the HTML, CSS, JavaScript and DOM stuff from W3C, went through that in like a week or two. Now what?

no fucking shit

how has no one posted this yet?

what the fuck?

string + value coerces the second value to a string and concatenates them;
string - value can only be interpreted as subtraction, so it tries to coerce both operands to numbers.

If you want a language that doesn't pull this shit then use Perl 5.
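
For the confused, the canonical examples (all real behavior, try them in a console):

'3' + 1    // "31": the number is coerced to a string and concatenated
'3' - 1    // 2: both sides coerced to numbers
[] + {}    // "[object Object]": both coerced to strings first
[] + []    // "": an empty string, naturally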

youtube.com/watch?v=yWDAw2nsIjw

...

It's not about the size of the books, it's about the fact that "JavaScript: The Good Parts" exists to begin with.

Hey u gais, interpreted script runs as fast as native code if it can bypass kernel overhead!