Which programming language is most detrimental to the industry as a whole?

which programming language is most detrimental to the industry as a whole?

Other urls found in this thread:

repository.root-me.org/Reverse Engineering/EN - Reversing C - Blackhat - Yason Sabanal - paper.pdf
golang.web.fc2.com
youtube.com/watch?v=tIBVQDF2YCw
youtu.be/vm1GJMp0QN4?t=2477
jwz.org/doc/worse-is-better.html
arstechnica.com/tech-policy/2011/07/should-we-thank-for-feds-for-the-success-of-unix/
blogs.msdn.microsoft.com/oldnewthing/20140627-00/?p=633
cm.bell-labs.co/who/dmr/kbman.html
github.com/kraih/mojo
clang.llvm.org/
en.wikipedia.org/wiki/Mono_(software)
embedded.com/electronics-blogs/programming-pointers/4026892/Enumerations-are-integers-except-when-they-re-not
twitter.com/CollinEstes/status/738767017843515393
twitter.com/SFWRedditGifs

Javascript. It attracts many substandard programmers coming from web design, and it has now entered a hipster stage in which even otherwise competent programmers have bought into the meme that JS should be everywhere.

Rust

C

Top left.

i don't know anything about this but you guys say java so therefore it's java

everything that fails at basic things C and C++ are capable of, and anything that hopes to gain traction while being way slower

Sepples. It parasitized C's popularity and added a whole bunch of object oriented junk which was totally unnecessary to begin with. Along with Java, it helped popularize the everything-must-be-OOP meme which is finally starting to die off after 30 years. It simply could have been a better C, suitable for applications programming, with more features like operator overloading and a bigger standard library but without straying from the procedural paradigm of C.

Perl, on the other hand, is the least detrimental of those languages in your picture because it's the one that's most likely to be used by people who know what they are doing. Or maybe Go is because no one uses Go.

Computing would have been much better if that lousy hack never got popular.

JavaScript is popular because it's the only choice on browsers. Some web developers started to think it's how programs should be made. Now these idiots are pushing JavaScript with Node, and businesses are buying into it. Le MEAN stack developers are making a killing right now.

JavaScript is a clusterfuck of poor design choices. Its major sin is fooling web developers into thinking they're software engineers. Managed languages that attempt to idiot-proof programming at the expense of runtime performance, such as Java and Go, are the biggest threat in my opinion. These languages were designed to make competent programmers obsolete.

I don't think you understand what OOP is. You write software in the OO style because you understand that the problem domain is easily modelled in terms of self-contained objects. If you don't believe that the problem is adequately solved in the OO paradigm, you're supposed to find a different language that supports your problem in a better way.

99% of jobs are writing CRUD operations or making websites for people. There is no such thing as a competent programmer

there are competent programmers in different fields.
but by definition web devs are not programmers and therefore you can't find programmers at all in web dev

x86 microcode

LARPer spotted

javascript, not even a question. Sure, php is pretty crappy but i have yet to see php on the desktop; it's kept in its niche. python is nice when used appropriately.

Curious, what is C's sin?

i don't know what he's referring to, but C strings (\0-terminated) are pretty horrible

how would you handle the issue of variable length strings if you were catapulted back in time to when C was becoming popular?

web """"""""""dev"""""""" spotted

In the case of strings with a length header: length-aware functions, or special runtime support. Of course, at the time the runtime cost would have been big, but languages can improve over time, and the C standard library should have switched to length-header strings. It'd have saved people tens of thousands of buffer overflow vulns.
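A minimal sketch of what such a length-header string could look like in C; the struct and function names here are invented for illustration, not from any real library:

#include <stdlib.h>
#include <string.h>

/* Hypothetical length-prefixed string: the length travels with the data,
   so nothing ever has to scan for a terminating '\0', and embedded zero
   bytes are allowed. */
struct lstr {
    size_t len;
    char *data;
};

/* Append src to dst; returns 0 on success, -1 on allocation failure. */
int lstr_append(struct lstr *dst, const struct lstr *src)
{
    char *p = realloc(dst->data, dst->len + src->len);
    if (p == NULL)
        return -1;                  /* old buffer is still valid */
    memcpy(p + dst->len, src->data, src->len);
    dst->data = p;
    dst->len += src->len;
    return 0;
}

Because the destination knows its own length, an append like this can size the buffer exactly instead of running past the end the way strcat can.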

i believe pascal has always had a length header for strings, and was already around in 1970

nice try, but no

Crying about webdevs is the #1 sign of a LARPer. Don't you have a fizzbuzz or todo app to write in C or another memelang?

i should fix bugs in nyaa pantsu which is written in golang (not a meme lang)

"simplicity"

why

you forgot

BCPL had counted strings, and it's a direct ancestor of C. Null-terminated strings were a deliberate (and poor) design decision.

Javascript, when written using only The Good Parts, is pretty comfy. What are you talking about?

Javascript was designed in ten days to allow non-programmers to do simple things on webpages. Because of that, if it has to choose between crashing (and doing nothing) and potentially doing the wrong thing it prefers potentially doing the wrong thing. A lot of it is superficially similar to Java not out of practical concerns, but for marketing reasons. Given that kind of origin it would be a miracle not to end up with a shitty language.

Why would anyone have a problem with Python?

Sure, but you can opt not to write the shit parts and only use what works. There's a good language buried under things like with, eval, automatic semicolon insertion, the overloaded plus sign, and other stupid stuff.

It's type coercion. Merely overloading + for string concatenation wouldn't let you do {} + {}

The plus sign is overloaded for addition and string concatenation.

JS doesn't have operator overloading in the language you fucking retard

it overloads the + operator. It doesn't permit the programmer to do any additional operator overloading.

the logo is revolting but the animu version of it almost made me pick up the language.

The + operator does two different things: addition and string concatenation, depending on the types of the variables. That's overloading the fucking operator. This is how a Pajeet would do it in Java:

public int plusOperator(int a, int b);
public String plusOperator(String a, String b);
public String plusOperator(String a, int b);
public String plusOperator(int a, String b);

In Java it makes sense because there's no automatic type casting, so there's no ambiguity. Which brings me to the main point of complaint about Javascript: it allows for too much ambiguity in order to be user friendly. That's why you can significantly reduce the amount of complaining if you stick to the Good Parts only.

It was made as a high level (at the time) programming language that could nonetheless be used for systems programming. I'd say it succeeded.

It also provides a lot more control than other trashy "high level" languages, looking at you PHP and JS. Why, because C trusts you to know what you're doing, and doesn't babysit you every step of the way. C assumes you actually know how to code and aren't just some hipster wannabe.

Yeah, processor cycles, memory and disk space are dirt cheap nowadays, while man hours are expensive, so let's focus on making the code consume 1KB less memory and be a vulnerable, unmaintainable mess; it sure sounds reasonable.

What part of modern nodejs seems maintainable and reasonable to you? Do you like callback hell?

They probably think that they're cleverly saving "function call overhead" by spaghettiing callback inside callback instead of properly organizing and chaining operations. Behind every piece of shitty code there's some dumb optimization idea that doesn't actually win you anything worth the trouble.

Python runs 100x slower than C, so for any time sensitive application (or anything where you don't want lag) it's important.

It's only vulnerable and unmaintainable if the programmer is bad. Take a look at the Linux kernel. It's written in *good* C, is maintainable and maintained after 25 years, and most vulnerabilities are logic bugs rather than problems with C itself.

Modern languages (JS, PHP) actually make it more difficult for a good programmer to write code, because automatically thrown in "optimization" functions exist to babysit you. The parser (no compilation because xcompat) converts your int to a string when it thinks you want a string, so you can't use actual binary computing without going through 5 different functions.

Granted, these languages do have some use (quick scripts, for example), but they have no place as the centerpiece of any serious project.

A hundred times slower than a fraction of a millisecond is still not noticeable for most applications. Time-sensitive applications are a small subset of the programs written every day.


Because no good programmer ever makes a mistake, right? OpenSSL didn't have a gaping vulnerability caused by a call to an insecure (but very performant!) function. It's easy to say "oh, yeah, only someone stupid wouldn't check bounds on strcpy", and not take into account that it's just human to forget things. Machines want perfect code, but our brains excel at approximation.


No, they make it harder because of the syntactic mistakes they inherited from C. You can trace most of C-like languages' mistakes to some byzantine decision that was made in the 70s, and we're still suffering from it 40 years after the fact.
It's perfectly possible to write decent code in almost all languages, really (Visual Basic and COBOL being the exceptions), as long as you know the language well enough to avoid using ambiguous, confusing, or dangerous constructs. But every arrogant codemonkey out there will write some bullshit using with or eval and think he's being clever.


Most of the time, why the fuck would you want binary computing anyway? Do you mean you want to do multiplication and division by shifting bits left and right? Makes no sense nowadays; just use the arithmetic operators.
Automatic typecasting is a problem, but it's also the symptom of another problem: having too many different types to begin with. Ironically, that's something that C both gets wrong and right: it has lots of different types, but they're all represented as ints anyway. JS also almost gets it right by having only string, number, boolean, object, and function as types, but then screws it up: boolean is superfluous (do it like C and just use a number), and it picked the wrong number type.

And here's the true sin of C. It made mentalities like this common, where lousy unsafe hacks are considered normal, unavoidable and even desirable, rather than a proper engineering approach.

Quit LARPing about, faggot.

Never mind the inapplicable arithmetic for now. What if I told you that depending on the context, TRUE could reasonably take the value of *any* number?

Do it C style: 0 for false, anything else for true.

And there's your massive mistake. Booleans are not numbers. If anything, they're a type of enumeration.

Also

type coercion isn't operator overloading. take sophomore computer science class you dumbfuck LARPers

Most apps take longer than that to run. I don't know about you but I don't like lag when I try to do work.

Heartbleed would have been prevented had the programmers enabled strict warnings. OpenSSL is a clusterfuck and should not be treated as an example of what typical C looks like.

That's why you use scripting languages on small and quick projects. Human margin of error is ~10%, so if you go over the code a lot you're going to reduce it every time (1%, 0.1%, 0.01%, etc...)

Such as? Most of the gripes I have with modern languages is the babysitting, which C makes a point to avoid.

Bitflags, storing numbers in files (write() needs a char *), etc...
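A quick sketch of both cases in C; the flag names and the surrounding function are invented for illustration:

#include <unistd.h>

/* Bit flags: each option occupies one bit of an int. */
enum {
    OPT_VERBOSE = 1 << 0,
    OPT_FORCE   = 1 << 1,
    OPT_DRY_RUN = 1 << 2
};

void demo(int fd)
{
    int opts = OPT_VERBOSE | OPT_DRY_RUN;   /* set two flags */

    if (opts & OPT_DRY_RUN) {
        /* the flag is set, skip the real work */
    }

    /* Storing a number in a file: write() takes a raw byte buffer
       (const void * in modern POSIX, historically char *), so the int
       is passed by address and written out as bytes. */
    int value = 42;
    if (write(fd, &value, sizeof value) != (ssize_t)sizeof value) {
        /* handle error or short write */
    }
}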

One not found in C

Datatypes are just different numbers of bytes. They're defined differently on different platforms, and this is why you have so many .*_t datatypes, but if you know how they work under the hood you can easily manipulate them. I think this is the beauty of C: you can do what you want with the data.

I'm not saying C is the only option that should be considered. I'm saying that it still has value today and is not a "dead" language.


Lousy unsafe hacks meaning... casting/pointers?

That picture is a pretty good joke, which is that "software engineering then" and "software engineering now" aren't software engineering at all. The "reduced instruction set" was designed for the people who wrote the compilers, based on feedback from common instruction usage. That's the only real software engineering in that picture. "Hacker" retards spreading their garbage to other operating systems they don't understand isn't engineering. Those 3-bit idiots don't know anything about computer security. Not engineering. The "now" guys would do the same thing, but instead of bits it would be JSON. That portable standard library is more lowest common denominator "portable" abstractions on top of abstractions by people who are afraid to touch the system, like JQuery and Gtk+, which eventually becomes a 48 MB Hello World. Your Unix and JavaScript "hackers" are just different species of code monkeys and it's not often you find something that shows the parallels so well.

Just like in the real world/10

The memories are coming back now, of when I used windows, skype and discord simultaneously because I had been using skype and some of my friends switched to discord, and wanted to use the internet and sometimes play a game at the same time.
You need to be stopped. I hope someone cuts off all your fingers.

Skype isn't bloated because the designers didn't use 4-byte pointers instead of 8-byte pointers. Skype is bloated because it's a fully featured surveillance suite masquerading as a chat client.

If I ever become a dictator I will hang every single one of you fags that say this pathetic and flat out wrong excuse. Grab a book on memory hierarchy and kys.

Ever tried to operate the DOM without it? I guess not.

You really shouldn't be doing a bunch of direct DOM manipulation in the first place. That's where people are using fucking scripts to replace good HTML+CSS because apparently HTML+CSS is too hard.

Pozzed community. The language by itself is great, though.

Say this were true, then it's really only a problem for the competent programmers.

eval sure is great for all kinds of attacks against javascript code

at the time it was certainly high level but nowadays you really can't call c++ a high-level language in general contexts.

Alright, so 0 is the default, uninitialized state. Length 0 means no elements, a pointer to null means uninitialised etc. Also, when talking about truth states it is generally said that 1 is true and 0 is false. So it makes sense for boolean values to have 0 for false, and everything else as true.

Now let's say we want to return the status of some operation. We want one success code and many failure codes, since there are many possible errors but only one kind of success. Why not use booleans, but inverted? So 0 becomes our success value and everything else can be an error code. This allows easy error checking: if(!example()) printf("error message"). Doesn't get more concise than that, does it? Since "if" in shells is usually used with a return value, it makes sense to use status code conventions rather than C style boolean conventions.

that's not how that works

not the original user but i think automatic bounds checking for arrays is pretty important yet unpopular among C/C++ programmers. Sure, for a few tight loops it's fine if you don't use checking but outside of that it's a massive benefit. C++ STL containers do have optional checking, but i'm guessing it's not very popular:
myvector[3] //unchecked
myvector.at(3) //does bounds checking
For your own containers, you can implement something like
mycontainer[3] //does bounds checking
mycontainer.u[3] //doesn't do bounds checking
which preserves the [] syntax

that video is so true it's painful.
I only wish I had seen this before going into software engineering

Then some guy comes along and proposes that you put an entire new socket in the wall, and everybody starts moving to the new socket, but then you're in the half-way situation where half your shit is in one wall socket and half of it is in the other wall socket, and a few of them are crossing the bridge too.

But before the migration to the new socket is done, a new PLUG comes out that's a different shape than either socket.

You do make a good point there. Problem is, just because there's bounds checking doesn't mean the exceptions are handled. It's the same in modern languages. Python, for example, will check your bounds for you, but if you don't do try except it will still crash.

the problem isn't crashing; that's the only sane solution unless the code knows how to handle it (for example by handling exceptions). The issue is it turning into a security vulnerability, which crashing prevents.

It's C with OOP slapped on top of it. The STL is very helpful for doing menial tasks for you, and it's possible (though not practical) to use C++ like you would a high level language. Just use std::string, std::vector, and boost, and you don't need to bother learning fuckall about memory management etc...

It's an example of what a high level language should be: possible to use it without knowing the standard by heart, but you can still make it do what you want and assert control. My only problem with it is OOP, which, when compiled, is essentially structs and functions that take a this pointer as an argument.

What's the problem here? It's still strictly better than having no bounds checking. If your program goes out of bounds it's definitely doing the wrong thing, and it should stop.
Forcing exception handling is not a good idea, because there's often no possibility of going out of bounds, and there's often no good way to recover from such a situation meaning that crashing is the desired way to handle it.

I have never used a good java program. The only people who think java is acceptable are people who have such good computers that they don't notice how crappy java is. If a program is coded in java i don't even download it.

You're fucking high.
What the fuck is this?
> "abc" + "def"; // String concatenation
abcdef
> 2 + 2; // Addition
4
> "abc" + 5; // Coerce 5 to string and concat with "abc"
abc5
How is concatenating 2 strings "coercion"?

Furthermore, in addition to being dog-slow, I have never used a stable big Java program. Every fucking Java program and game has crashed on me.

Fuck your brain.

ffdec is great

whoops you're right, forget about the !

On a project I was working on I had to do direct DOM manipulation to do AJAX comments. Unfortunately that's the only way you can do it. I have to agree that DOM sucks ass, many functions are missing and you have to implement hacks to work around them. (my code has "form.nextSibling.nextSibling.nextSibling" in it, because there's no nextNSibling(num) function.)

i meant to say C, not C++, whether C++ is "high level" or not is debatable i guess, it really depends on how it's used. I'm curious as to how much the OOP part gets compiled away, especially for classes i'd expect to be simple like vector

Much of the standard library is templates, so they get generated at compiletime. For regular classes, it's just a struct with the variables, and then the functions are statically compiled and take an additional argument, which translates to the this pointer.

This paper has a pretty comprehensive overview of how exactly it all happens: repository.root-me.org/Reverse Engineering/EN - Reversing C - Blackhat - Yason Sabanal - paper.pdf
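To make that concrete, here is a hand-written C approximation of what a trivial C++ class tends to lower to; the names are invented, and real compilers also mangle them:

/* C++ source, for reference:
     class Counter { int n; public: void add(int k) { n += k; } };
   Roughly equivalent C after lowering: */

struct Counter {
    int n;                    /* the data members become a plain struct */
};

/* The member function becomes a free function whose first parameter is
   the otherwise implicit this pointer. */
void Counter_add(struct Counter *self, int k)
{
    self->n += k;
}

int main(void)
{
    struct Counter c = { 0 };
    Counter_add(&c, 5);       /* what c.add(5) turns into */
    return c.n == 5 ? 0 : 1;
}

Virtual functions add a pointer to a table of function pointers on top of this, which is the part the linked paper spends most of its time on.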


The issue remains that a bad programmer will still make bad programs that shouldn't be used, regardless of how many crutches there are. You're right in that a crash is better than RCE, but that doesn't excuse bad programming with the mindset that "the language will fix it". The whole point I'm making is that C trusts you to know what you're doing and doesn't babysit you. If you're using C properly, then you make sure that there is no out-of-bounds vuln BEFORE shipping the product. If you are a novice programmer and don't want to be overwhelmed, then fine, use training wheels. But to say that C is objectively a bad language because it doesn't babysit is false.

Thanks user, this will be an interesting read and probably prove useful down the line

ffmpeg can be easily used from the cmdline and therefore easily used from whatever language you can think of. oh well. even if the idea *could* be useful i loathe the concept of feeding anything through java.

you coerce the strings into numbers which are added together
ez

Nothing as sad as people still hung up on their imagined glory days.

Everything is better now, and you just can't stand that the skills you have are obsolete. Time to recycle yourself or fade away, but stop bothering the present with your shit.

...

You can't meme yourself out of reality, obsolete fag.

I could if I tried, look at you for fucks sake.

Aw yeah. I sure do enjoy programming kewl bloated webapps or phone apps written in C#/Java/PHP on my super dank Visual Studio because that's pretty much where all the jobs are now. So much better.
Fuck you nigger.

It's not my fault your apps are bloated.

...

You're right. It's your language's fault.

stop trying to find excuses for your being too stupid to understand pointers and accept that you're a failure in life.

C without question, and I use it every day. Everyone uses C as a base for their language; everything is written in it. C did not start out as a "fast" language. I believe Pascal and Lisp beat C on performance back in the early to mid 80s. I think it became popular because of Unix. There is nothing special about C and it has held back the entire industry. Buffer overflows are still the biggest attack vector against C programs if I remember correctly.

But there is. Name one language that is as efficient as C when it comes to hardware interaction. Also, C is a relatively simple language, so new hardware architectures get a C compiler very quickly.
You'd be wrong. Most security bugs are caused by faulty logic, which is language agnostic. There are still some buffer overflow cases, but tools and techniques to prevent them are quite well developed. There is Address Space Layout Randomization, PaX, various compile time protections, static analysis tools, coding standards... The technology to protect yourself is there, too bad very few people use it.

ffdec is a flash decompiler; it has nothing to do with ffmpeg. It's a bit clunky but it does decompilation & modification of .swf files well.

Well, I like what I see.

I'm not delusional and living in the past.

Your skills are fading. Nobody cares about your stupid pointers. Things work great without them.

Things work great without you. It's sad you can't just deal with that.

No they don't. You might hide from them at high level. But as long as we use architectures we use today you can't avoid them.

That's the whole point here isn't it? Nobody needs to care anymore.

That's just what happens in Holla Forums. Things change and either you keep up, or become obsolete. There is just no room for those who dwell in the past.

Device drivers do care. Or do you think that they just magically work and that nice high level abstractions were just there from the beginning? Someone needs to write them, and to do so they need direct access to and manipulation of memory and registers.
Maybe at a high software level. When it comes to hardware we still use a 72-year-old computer architecture; yes, the protocols for how devices talk with one another have changed, but the basic principle is still there and I don't see how any of this will change anytime soon.

Obsolete, you say?


I like being obsolete, user

how does your shitty anecdote change the fact that php is outdated and should not be used anymore?

There are a fuckton of serious, active projects that are written in C/C++ (Linux, VLC, most GNU coreutils, your babysitter language interpreters, Vim, Emacs, most DEs, Source Engine, Unreal Engine, and probably a lot of FOSS and PS that I haven't named yet). C/C++ are far from dead, and with good reason.

If you want to be a good C++ dev, you need to understand C. If you want to understand C, you need to understand pointers.

Now go kys.

This is very true. C compilers did not generate good code. The original B language was a threaded code interpreter and speed wasn't part of the question at all. The array to pointer crap had nothing to do with being fast. It was a bad design that hampered decades of optimizations. The null-terminated string was also not designed for speed.

When B was a toy "interpreted" language only used by its creators, which generated different code for i = i + 1, i += 1 and i++, native code compilers for other languages could tell when it was safe to remove bounds checks and Fortran was doing optimizations and native code since the beginning. There was no reason to use threaded code other than the fact that it was a toy language and they didn't care about speed.

C and Unix held the industry back, not because they existed, but because universities greatly exaggerate their importance to the extent of leaving out more important languages and ideas. Universities today teach that Unix and C are the foundation of modern OS and language design. They ignore or downplay Fortran, Algol, Cobol, PL/I, Multics, OS/360, and other more important languages and operating systems. Unix is not the only OS that was based on concepts from Multics. Single-level store was the main Multics innovation and Unix doesn't have it, but other Multics-based OSes did. They do talk about Lisp, but only to contrast C as the language "hardware we use now" must use, even though C is a bad fit for x86 and the reason we don't use segments for security.

They call the call stack the "C stack" but the existence of the call stack has nothing to do with C. The PDP-11 already had a stack pointer because other languages, like Algol and PL/I, used the stack principle before the PDP-11 was designed. The C stack is crippled compared to Algol and PL/I because C had no variable length arrays (until C99) or nested procedures. C arrays are also crippled. People don't even know Basic or Pascal and think it's weird when arrays start at 1 or any other number that isn't 0. These are all education problems because people knew this stuff in the 80s.

How pretentious can you get? Just accept that you're a failure and go away.

uhmmm
please give us a gorillion dollars to use our operating system -IBM

Are you saying they weren't good, or are you saying they weren't important?
In the first case you're not actually disagreeing with anything he said, and in the second case you're wrong.

everything that is not go.
golang.web.fc2.com

Lua is the only language I can think of that is "popular" today that starts indexes at 1. I think what happened to computing is that when the PC came out everything reset. All the computer knowledge gained from the 50s to the 70s was "lost" because it was confined to mainframes. Desktop computers were so underpowered assembly had to be used. I think the only way to "fix" the problem is to take a page from Terry Davis and write a new OS in a higher level language.

C has been abstracted so much with tonnes of undefined behaviour that if you really wanted to find out what it's doing you would need decades of experience.

Find out what really happens with hello world
youtube.com/watch?v=tIBVQDF2YCw

An OS developer cannot fix tail
youtu.be/vm1GJMp0QN4?t=2477

You missed my entire point. user said getting left behind is bad, meanwhile people who know obsolete code bases make bank because youngfags don't know enough to debug them. Cobol is learned by no one these days, but if you know it you can find a good job anywhere. Just because something is shit doesn't mean it'll go away anytime soon. More often than not, shit is what becomes widespread or the standard.

Let's see... C, Unix, Windows, PHP, Javascript... Yep, your story checks out.

jesus fuck

I have a personal one. Actionscript 3, holy fucking shit, actionscript. Why? Because it's like it's sentient and is trying its best to be as annoying and unintuitive as possible.

Make about a paragraph of code that does something mundane like post a picture and some text to the screen and copy/paste it three times into three different documents.

One will work but not in the way you want, one will do nothing and one will give an unhelpful error message. Debugging is an exercise in having anything up to 4 people take an hour to figure out what the problem is because the error messages tell you fuck all. There are probably worse languages but this is my own personal rant.

SUN engineers are simply the best.

You got me, kys

This is true but it doesn't explain the loss of arrays starting with 1. BASIC and Pascal were much more popular on personal computers than C, especially BASIC. Everyone used BASIC. In fact, there were computers where the OS was BASIC, and it used 1 as the lower bound or let you choose. Visual Basic was a big language in the 90s too, and so was Delphi and other Pascal-based languages.

So, after the lost knowledge of mainframe languages, there was a second wave of lost 80s-90s knowledge that happened more recently, in the 2000s or 2010s.

It's not about what index is better for arrays, but the fact that pretty much every language in the 80s or earlier besides C and C++ start arrays with 1 or with any number and people who think arrays start with 0 don't know any of them.

u wot
take your own advice faggot

Or don't be dumb and just read the CERT book. Seriously, C is simple because it has few keywords, relatively sane behaviours understood by anybody (without taking UB into account) and a small or even bare standard lib. Like all great things, it's simple to start and hard to master.
So as a beginner you come to like this simplicity (you cry a little bit about pointers the first time), then you stay for the speed given by UB.
Honestly, the only things I'd want in C are native UTF-8 support and something at least as powerful as templates (maybe simpler, if I can push my luck). Lambda functions could be cool too.

tl;dr C is perfect, you little shit, just give us UTF-8, stdint everywhere and a more powerful preprocessor for C2.

I honestly just glanced over the slides; the point stands that libc is not the C language, and the linking bloat is optional.
So? It's a Unix/POSIX design problem, not a C one. The undefined behavior is in the syscall and it's not because of C. If you reimplemented a POSIX system in any other language you would have the same problem, because the undefined behavior is documented in the syscall.

C is extremely insane. C compilers are allowed to remove calls to malloc, delete code that isn't dead, and many more insane things. The insane UB is larger than the "sane" part of the language. C is as insane as Malbolge and Intercal.

So is Brainfuck. C is deceptively simple. printf isn't just printing to the screen. It parses a format string at runtime which is slow, but also has no safety. Null-terminated strings are slow and dangerous.

UB has nothing to do with speed. It was for portability to hardware that has no business running C in the first place because C didn't run on that hardware before ANSI. The standards committee wanted to expand the use of C, so they made things undefined.

The only people who say that have no experience with other systems languages.

It looks like Apple engineers were the best, they avoided the hell of truncating an mmap'd file. One day you guys will take the Apple pill.

I almost gave them credit too but I was worried people would sperg out if I said something positive about Apple. All the BSDs seem to be on point when it comes to their userland utils.

You have to drop harder truths to make Holla Forums really sperg out, like: GNU was a big fat mistake.

gnufeatures.txt

When gnutards say features, they actually mean in the emacs sense of putting unrelated shit into programs because they're lazy and have a broken "why not" reasoning entirely different from "the right thing".

They don't mean in the Apple sense, where all the bloat at least makes for something useful and sensible. They mean putting line numbering in cat and all sorts of things a bored computer science student would do, they mean "features" you never, ever touch.

That's a BSD invention.

No surprise there with GNU copying other projects and trying to take all the credit.

I don't see the problem there. Stallman was right about freedom but wrong about design. Vice versa for the muh-minimalism bsd/plan9/et al autists, who have better ideas about how things should work but all use cuckoldry licenses.
It's wrong to say it was a 'mistake' because it was needed, but its design was one.

god those skeuomorphs are disgusting. not to mention the buttons and bars for inflation fetishists.
why is old OSX so hideous, and who came up with the meme that apple looks good?

I think GNU's design is weird, but not bad. I can appreciate Plan 9 purity autism in theory, from a distance, but when I try using it myself it's just horrible.
Unix had become massively popular by having generous licensing terms and being simple enough to be easy to port. GNU provides a superset of the existing Unix features, which means that it plays well with all the existing Unix infrastructure. You only notice the extra features when you read documentation, and even a lot of the more obscure ones are genuinely useful. For example, the options for using null separators are basically a requirement for writing shell scripts that can process arbitrary filenames.
And the "bloat" doesn't even end up making GNU software slower - GNU versions tend to be much faster than the originals, because they're willing to invest some extra effort in making things fast.
And GNU values correctness. It's very telling that classic Unix man pages had a "bugs" section. Classic Unix made things just good enough to work most of the time, which gets annoying really fast when tail starts outputting junk because the lines at the end of the file are longer than the arbitrarily chosen buffer size.
It's the whole worse-is-better business. ( jwz.org/doc/worse-is-better.html ) GNU gives you the best of both worlds. Classic, "quick and dirty" Unix managed to seed the computing world, and then GNU came along and took advantage of the established compatibility to take over with programs that actually work properly.

Plan 9 is what Unix would be like if it didn't have "free labor" from competent people thanks to AT&T being forced to give it away in an anti-trust lawsuit (or was it a secret NSA deal to weaken computer security?).
arstechnica.com/tech-policy/2011/07/should-we-thank-for-feds-for-the-success-of-unix/

Java.

It enabled pajeets to infiltrate and destroy software tools made for every single productive field. I'm talking about engineering, sales, research writing, nearly everything.

I'm not a software developer by profession. But I have to deal with this broken shit daily. Java front-ends to marketing systems, 3D model repositories, secured transfer of large files between work sites, work order entry systems. All Broken Java Shit.

Java.

I didn't read the thread

this. /thread

Whatever programming language produced this shit thread

Or how about Alan Kay was right and we should all have glorious OOP systems.

Or that Sun was the greatest tech company of all time, and they're never given nearly enough credit (probably because people like to hate Java). It could have been different, it could have been SPARC and Solaris, alas Oracle stole our future.

someone has to write the languages

Wrong. The best-known example is not checking for overflow after every possible operation.

An unmatched quote character encountered by the lexer is undefined behaviour in C. Your argument is invalid.

Also, throwing overflow checks away to speed things up is an immensely costly example of premature optimisation. The correct, sensible thing to do is to keep them in and provide a way to turn them off.
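Keeping the checks in doesn't even have to be verbose in C. This sketch uses the __builtin_add_overflow extension that GCC and Clang provide (it is not ISO C), and the function names are invented:

#include <limits.h>
#include <stdio.h>

/* Checked addition: reports overflow instead of silently producing
   garbage.  A project wanting the "off switch" could hide this behind
   a macro that compiles down to a plain + in release builds. */
static int checked_add(int a, int b, int *out)
{
    if (__builtin_add_overflow(a, b, out))
        return -1;            /* overflow detected */
    return 0;
}

int main(void)
{
    int r;
    if (checked_add(INT_MAX, 1, &r) != 0) {
        fprintf(stderr, "overflow adding %d and 1\n", INT_MAX);
        return 1;
    }
    printf("%d\n", r);
    return 0;
}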

GNU is based on the idea that "the right thing" is freedom. This means if you don't like a certain feature, it is your own responsibility to remove that feature. This means if you desire a certain feature, it is your own responsibility to develop that certain feature. GNU has many features because certain developers took the initiative to write it into GNU.

That's not "the right thing". That's freetardism.

I will never understand why people feel the need to defend GNU, as if your rationale is going to change anyone's mind.

Your confusion is in the strawman that I am defending GNU.

No, it really isn't. GNU's design philosophy is separate from its freedom philosophy.
GNU was designed the way it is because Stallman chose to make a Unix clone for pragmatic reasons, didn't particularly like Unix's philosophy, and wanted it to be better than the existing Unices so it could perhaps take over by technical merit.

Not entirely sure if it's exactly the same, but I watched a presentation called exactly that, with the exact same slide, at FOSDEM this year.

That's not a problem of C itself. It's a problem with how much has to be done to just start an application that might/might not need the features you just prepared. Blame your local OS designer for that.
Any and all scripting languages will face the exact same problem, as well as compiled languages other than C, which also need at least libc's startup (crt*.o) files, and often even link to libc themselves.

Call it libc, call it libos, the OS needs a library with things almost any program does, such as starting up. Be it C, or whatever other compiled language you have, as long as it's bytecode for your target architecture and can be linked to programs without much fuss.

In my opinion UB simply makes it easy to write a C compiler. There are some UB things that affect performance, but most don't.

WTF am I reading, I just had this thread open and see this. UB allows advanced optimizations.

For ease of making a simple implementation of a language, the existence of UB doesn't change anything.

That's not its original purpose. UB was added by ANSI to allow C to run on more diverse computer architectures.

I can't think of a single UB "optimization" that can't be done by writing less code in the first place.
blogs.msdn.microsoft.com/oldnewthing/20140627-00/?p=633
This whole thing is scary and dangerous and doesn't help the programmer.
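The linked article's canonical example looks roughly like this in C (the function name is invented): because dereferencing a null pointer is UB, the compiler is allowed to assume the pointer was non-null and quietly delete the later check.

int value_or_zero(int *p)
{
    int v = *p;            /* dereference happens first */
    if (p == NULL)         /* compiler: "p was already dereferenced,
                              so it cannot be NULL"; this branch may
                              be removed entirely */
        return 0;
    return v;
}

The fix, of course, is to test p before dereferencing it.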

Everything that's not in the B manual was bolted on later. No char, no structs, no unions, no hexadecimal, no xor, no malloc and free. C++ is just continuing the C philosophy.
cm.bell-labs.co/who/dmr/kbman.html

I've known C for quite a while now, enough to know its ins and outs.
I have never touched any other systems language though.
It seems you're enlightened enough, so please teach me your ways, senpai: what do I have to learn to reach such a level of enlightenment?

what is this acronym?

a meme book for doing """"""""secure programming"""""""" in an inherently insecure by design language.

If only the openssl developers had followed the rules and recommendations in that """"""""meme"""""""" book.
What's your solution then? Rust?

OpenSSL was designed to be insecure. Notice that they implemented their own memory management, rather than using the libc? The same people would also implement their own broken stuff instead of using the standard library of a "safe" language.

All of them except maybe Ruby and Python.

Cancer shit:
Anything Microsoft created
Anything Apple created
Anything Adobe created
Anything Google created
Anything Sun created
Anything Oracle created
Anything Facebook created
Anything GNU or DE developers created
Javascript

These corporate languages are all colon cancer and do nothing other than promote vendor lock-in and reliance on implementation dependency.

None of you are writing rocket software or life support software, so just grab 3 or 4 core languages and stop following every lel trend out there.

Developers should learn 1 of each:
* High performance language (C or C++ before it adopted all the features of all languages)
* Hobby script language (Python)
* Shell scripting language (sh)
* Database query language (SQL)

Also, not a programming language, but everyone should learn to stick to the fucking standards and fix language shortcomings by using better programming idioms rather than new corporate languages that fix that one small detail for you but break a lot of other fundamental things because lel malloc is too hard.


Only spend time learning other languages if that's your autistic idea of fun or if it's a job requirement. Otherwise you're just procrastinating and pretending you're getting better when you're not.

HAHAHAHAHA OH WOOOOOOOW

cancer

What would you pick?

Perl 6

Ok I'll play this game.
asm
BASIC
Pascal
Forth
Also, eh fuck Unix, and fuck Windows, and fuck Macs. All modern OS shits are suckass, cia nigger tier.

what did he mean by this?

The existence of UB certainly changes something: if a compiler can't handle uninitialized variables, it's perfectly fine to treat those as a compilation error, or use 0 or anything else; it doesn't even have to be consistent to be standards-compliant. Similarly for signed integer overflow: if a compiler does wrapping on one architecture, no wrapping (INT_MAX+1=INT_MAX) on another, and outright invalid code on a third, that's perfectly fine according to the standard. IoW the compiler writer can implement things however they want, and it's up to the programmer to figure it out (rather than the compiler having to use some defined, consistent behaviour).

Is it readable?

Use varint string length prefixes. For the same price as one null terminator you can instantly know the length of a single string up to 127 bytes, and have embedded nulls within them.
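A sketch of one common varint scheme (the LEB128-style encoding used by e.g. protobuf), just to show what "the same price as one null terminator" means: lengths up to 127 fit in a single prefix byte, and longer lengths spill into more bytes. This is a sketch only, with no guard against malformed input:

#include <stddef.h>
#include <stdint.h>

/* 7 payload bits per byte; a set high bit means "more bytes follow". */
size_t varint_encode(uint64_t n, uint8_t *out)
{
    size_t i = 0;
    while (n >= 0x80) {
        out[i++] = (uint8_t)(n | 0x80);  /* low 7 bits + continuation bit */
        n >>= 7;
    }
    out[i++] = (uint8_t)n;               /* final byte, high bit clear */
    return i;                            /* bytes written (1 for n <= 127) */
}

size_t varint_decode(const uint8_t *in, uint64_t *n)
{
    size_t i = 0;
    unsigned shift = 0;
    *n = 0;
    do {
        *n |= (uint64_t)(in[i] & 0x7F) << shift;
        shift += 7;
    } while (in[i++] & 0x80);
    return i;                            /* bytes consumed */
}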

Pick one.

Nope Perl 6 is totally unreadable, just like Firefox 54 is totally non-standards-compliant because people used Netscape 4 in the century you got that stale meme from you fucking cuck

I was asking a question. I know, like and use Python. Most Perl code I see looks disgustingly messy. I know Perl 6 made some hefty changes, so I wondered if it improved the language in that area.

That is the "worse is better" philosophy rearing its ugly festering head. And just so we're clear, I'll spell it out for everyone: DEFINING A PROGRAMMING LANGUAGE LIKE THAT IS DROOLING PANTS-ON-HEAD RETARDED.

It's still perl but you lose all legacy support/stuff. It is easy to learn though like all perl before 6, just spend a weekend with it. No point in not knowing it.

It makes sense for these things to be undefined, but the C interpretation is completely retarded.

Uninitialized variables make a lot of sense (think about a 1 GB array), but the compiler shouldn't magically assume something is "both true and false" because of an uninitialized variable. It should use whatever data happens to be there originally. That's the behavior required by most languages because they say there are unspecified/undefined values, not that accessing them is undefined behavior. Even undefined behavior (if it's there) in most languages is saner than it is in C.

There are many different ways to handle overflow too, so there's nothing you can do unless you mandate exceptions or traps. This is why integer overflow is undefined or an exception for most programming languages, not just C.

But C "optimizations" don't make sense because they can optimize assuming one kind of overflow, while actually doing another. If it's possible for INT_MAX + 1 to be less than INT_MAX because it wraps around, the compiler shouldn't "optimize" assuming that it's impossible.
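This is the standard example of that complaint; a hedged sketch, since the exact outcome depends on the compiler and flags:

#include <limits.h>
#include <stdio.h>

/* Because signed overflow is undefined, a compiler may fold this whole
   function to "return 1", even on hardware where INT_MAX + 1 would in
   fact wrap around to INT_MIN and make the comparison false. */
int greater_after_increment(int x)
{
    return x + 1 > x;
}

int main(void)
{
    /* At -O2, GCC and Clang typically print 1 here; compiled with
       -fwrapv (overflow defined as wrapping) the same call prints 0. */
    printf("%d\n", greater_after_increment(INT_MAX));
    return 0;
}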

It comes down to fast, good, cheap. Pick two. There has always been something better right around the corner but when you need it today and not tomorrow something less than perfect will become adopted out of necessity. This is why the world will forever be in a state of running on shit and legacy shit/standards. Once something is established as the standard it's hard to convince anyone to spend money on switching to something new no matter how useful that switch would be in the long term. Thus Windows becomes the desktop, Linux/Unix for the server, Apple faggotry for everyone else, PHP on the server and javascript for the client if you build anything for the web, and C and its spin offs everywhere you look.

This is why I don't worry about Rust becoming a standard. No one is interested in replacing anything with Rust, and they do a good job of killing themselves by projecting the image of a group more concerned with identity politics than actually improving their standard. When you chase away the core autism that makes projects like this possible, you're as good as dead in the water. If they spent just half the time they spend arguing over the CoC on improving the language itself, they'd probably have something worth writing major projects in within a decade. Instead they choose to shoot themselves in the foot.

One major example I can remember of shit becoming the standard would be PHP in the early days. Before PHP really took off it seemed like everyone had decided perl+cgi was the way web applications should be written. You had people shilling ASP, autistic people writing everything in C, and people who hated perl always suggested python instead, but most people seemed content with perl. Overnight, around PHP version 3.x, everyone made the switch. Around the same time most webdevs first discovered SQL/RDBMS, so pairing php with mySQL became the thing to do. Every shitty webhost started offering mod_php/mySQL and every shitty script maker started moving over to php because perl was associated with being old/slow/using flat files to store user data. PHP was hailed as the answer to everyone's problems and mySQL was so fast you'd never need to worry about getting booted off a shitty webhost again if your DBZ forum served more than 10 users.

But we all know that was a lie, because PHP has been and will always be shit from the bottom up. A language where native function names conform to no standard, so you can never remember if a function uses underscores or CamelCase. Where OOP was shilled as the answer to every problem for many years, only for it not to really be OOP when it was finally released. Where there are countless libraries, including a standard library, for you to choose from, but none really good enough to keep around for anything serious, so you're forced to write your own libs anyway. I could go on and on; it has a lot of problems.

I'm switching from PHP to Python (Django).
What front-end languages should I use for web and Android?

*up to sizeof(size_t)

I'm learning web design any trying to decide on a server-side language. I was considering Node.JS for a while because PHP is such a piece of shit but this thread is giving me second thoughts now. What do?

I like python on the server side these days, but most any scripting language will be fine for you assuming you run your own server or have a decent web host. Even PHP, despite being shit, is pretty good compared to the old days; just avoid using anything lower than version 5.x. Avoid NodeJS like the cancer it is.

nodestrap.asm.js

Forgot to mention I'd avoid Ruby as well due to the community. They've been shilling Ruby on Rails for over a decade now and it never really took off. Given the choice between Ruby and PHP I always went for PHP. PHP is just horrible for a newbie because there aren't any standards in how functions are named and it can teach you some really bad habits. Python is great when paired with Django. Perl is good too, but I'm not sure about the frameworks over there because I just use it for quick and dirty stuff. For the database I like/prefer PostgreSQL.

Just avoid learning bad habits from hipsters and you'll be fine. Learn HTML, CSS, and a bit of javascript. JS is an evil hack you just have to live with right now. If you use it, hack together your own code instead of using a heavy framework. A lot of modern webdev is just copy/pasting and throwing stuff at a wall until something loads in a browser. Don't be like that and don't pick up those bad habits. Once you git gud you'll be able to re-use scripts you've made, with slight or no modification, over multiple projects. Learn the basics, learn how this stuff works at the lowest level; it's all entry-tier stuff compared to real programming, so there is no excuse for not learning it correctly.

You'll soon realize that 99% of webdev is hacked together bullshit and everything is shit.

check it senpai
github.com/kraih/mojo

My reply was about the effects of UB, not whether it's good or bad.

Eh, concepts are always nicer than reality. 2-3 years ago Rust might have been the next best thing. As far as I understand, the real problem is that the compiler is just not good enough yet? (Apart from the legacy reasons, which are by far the biggest issue.)

requiring clientside js is pretty shitty imo

I too recommend Python for that, it's quite good for web applications, scripts and as a glue language, but don't use it for anything that isn't i/o bound.

Thanks user I'll give it a look when I can. I've been away for a long time so I haven't kept up. Imagine how I felt when I poked my head back in a few years ago and learned of NodeJS. Used to think AJAX was nifty but we always considered it a good enough hack until something proper came along. Was pretty sad to see how bad webdev had gotten but I knew it must be getting horrible when I noticed every major website becoming endless scrolling/phone friendly bullshit. All my old /comfy/ hang outs where modifications and hacks were shared for popular scripts became wannabe app stores too. The same cancer has invaded gaming now too. It's as if no one even cares about having fun anymore and they just want money. It's sad when 15 year old scripts that were free and released with no license at all are more fully featured and have less bugs than things people try to sell for $20+ these days.

My niche was in forum software. I've told the story on Holla Forums before but I was excluded from those circles when CoC/SJW types started invading the community and securing positions of power. It was a slow process, they started coming in 2005 or so and it snow balled out of control from there. All the oldfags that still participate in those communities drank the kool-aid or pretend they did to hold on to their positions of power. They all bent over for google too and operate forums solely for ad revenue. They employ various techniques to exclude anyone that goes against the hivemind or speaks out against selling user data for shekels. A few key players own most of the popular forums now and operate them under various pseudonyms. If they can no longer turn a profit on them due to declining user participation/clicks they fake stats and sell the forum to some idiot for $5,000+. It's really bad now behind closed doors. I've thought about writing a book about the subject since I've been around since the late 90s and know most of the major players on a first name basis. I avoid visiting most every forum now mainly due to the user base but also due to the fact that I am tracked across them by this forum-mafia. If I slip up and use the same VPN in two places they know about it within a few hours at most since they all idle together in private IRC channels and/or probably on Discord now a days.

C++ is too niggerlicious, HolyC is the language of God.

You should switch drug dealers stat.

Learn to love the D

Come on, NodeJS is one of the least bad modern web technologies out there. You just need to know when to do without npm (which is almost always), and above all, never ever consider Mongo unless you know exactly what you are doing, since 90% of websites won't benefit from it in any way over relational databases.

I will give you two things: the single threaded model is retarded and could have been greatly improved with a task server system with multiple workers (I guess it isn't possible due to global state, which should have been forbidden in the first place), and crashing the whole server when one script crashes is stupid. Well, and JS itself, but at least it's not PHP.

Good image there buddy. I liked that film, also magnetic rose is better.

You write GNU software, don't you? I can tell.

Objective-C, Clang, Swift, Rust, Ruby etc.

Clang is a compiler ya dingus

are you retarded?

what are you even trying to say

"Clang" is the name of an LLVM front-end for C and C-based languages.
clang.llvm.org/


How has this thread not been hordes of people yelling "C#" from the rooftops? Java, but by Microsoft, and it really only runs on Windows, for that extra locked-in goodness.

Ummm... en.wikipedia.org/wiki/Mono_(software)

really makes you think, doesn't it?

...

All models are wrong. OOP should never be used for creating software.


That is trivial to implement yourself. Learn recursion


top kek


You're making a terrible mistake. I suggest doing some more research.

Why is django a mistake? It's way better than node or ruby.

LARPer spotted >>>/r/programming/

nodejs is the runtime, node is colloquial, you're right about ruby though, I should've said ruby on rails.
Still, what is the mistake? You give no alternatives.
I'm curious because I am converting a wordpress monstrosity to django after doing research.

Any of them used improperly

...

my programming teacher Raja Mahmood told me that java runs everything, from microcontrollers to giant server clusters.

Does it also run the time machine that brings you back to the late 90's when that was almost true?

retard confirmed

Jetbrains IDEs are great and they are java. I use them everyday.

nigger my school IS a time machine, the COBOL meme is real.

To be fair, the universities basically only teach Linux/Unix and some C-based language. There is a reason people think arrays start with 0. I really hate what C has done to the world and computing. I always thought we would have some sort of system like a Xerox Alto or a Lisp machine by now but Windows won.

I can just imagine he had a smug shitskin grin on his face as he told you that. He's really saying, fuck you white man, Indians rule the world, we empire now.

Java and Javascript.

Javascript to the extent it's being used wrongly. It shouldn't be a universal language for developers.

Use the right tool for the right job.

I think he's trying to say that an unsigned int (4 bytes on a 32-bit system) is overkill for a string.


Sorry, I tried to dereference this pointer and had a stroke.

Java is never the right tool. Java tries to be the "right tool for every job" but instead it's "the wrong tool for every job". It's clunky, artificial, and bloated beyond belief.

JavaScript, on the other hand, is the new English. Yeah, it's got some quirks, and when you really dig down into the language it's a mongrel, but it has naturally and easily settled into the position of the most important and most widely used language in the world.

English is pretty nice though, unlike javascript. Both in their respective domain of course, any programming language is going to look clean and organized compared to a spoken language.

JavaScript is a dumpster fire that still hasn't recovered from design decisions made when it was still used for manipulating don't elements. Now that modern web applications are designed to perform the same as native apps, there's no reason to keep using JavaScript or xml at all on top of a glorified jit compiler which is a modern web browser.

Dom elements *

Like I suggested, do more research.


But user it's clear you don't do research.

If you're converting a wordpress 'monstrosity' to django you're still going to get a monstrosity, just with a little bit of structure. Django is easy, monolithic, and handholds the developer. Django is a shitty meme for shitty programmers. I'm not sure if you're getting paid to do this work or you're working on a personal project, but if you are able to choose which stack to use, why not choose one that makes things simple and customizable and doesn't treat you like a retard? You'll get a lot more out of the project than a slightly less shit wp application.

You don't know what you're talking about, you can't even name one alternative.

I am getting paid. You must give a reason as to why I should do something like create an npm mess with node or roll my own user auth with flask.
Sure, with enough time I could create my own perfect system, but what is the real benefit or detriment? What are the alternatives?

Nice bait. No reason to be openly hostile as I am only trying to help. I know multiple superior alternatives, but part of the development process is doing research. I am not going to spoon-feed you.

I wouldn't suggest that.

You answered your own question. The benefit would be a perfect system and the detriment would be the time needed.

Like I said I am not going to spoon-feed you as research is something extremely important for someone getting paid to do development work.

Careful guys, this is intellectual bait. This poster acts like he always has superior knowledge, but when challenged he evades and claims it's obvious, maintaining an aura of superiority.
Don't reply, lest you derail the thread into an endless flamewar.

But user, I do. I have done my research.

I stated why I dislike django. If ease of use, opinionated defaults, and hand-holding are your thing, go for it. I was just being charitable in assuming you wanted more from your time invested.

C completely half-asses EVERYTHING and people who only know C think the original idea is bad and harmful. Nobody should teach C unless they use it as an example of bad implementations of other people's good ideas. This is only happening because people have no familiarity with PL/I, Pascal, Algol 68, Ada, and other languages that C is still badly copying from. 20 to 50 years ago, some of these languages were taught in the same universities that now teach C or C++!

It's easy to blame the PDP-11, but C designers today are still doing this. Various C features like enum, const, volatile, and void were added by the ANSI committee for C89 and most of these ideas which are from the 1960s or early 1970s were better than how C89 did them. C99 and C11 added new features that were in Ada or PL/I for decades.

Enum and const are from Pascal. Ada and a lot of other languages had these for many years before C did. When enums were added to C89, people who only know C think they are a stupid idea because of how C treats them. Const also has a lot of problems.
embedded.com/electronics-blogs/programming-pointers/4026892/Enumerations-are-integers-except-when-they-re-not
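To spell out what that article complains about, a minimal sketch (hypothetical enum): a C enum is just an int wearing a name, so nothing stops nonsense values from going in or out.

enum color { RED, GREEN, BLUE };

int main(void) {
    enum color c = RED + 42;   /* compiles without complaint: c is 42, not a valid color */
    int n = BLUE * 1000;       /* enum values silently decay to int in both directions */
    return c + n;              /* no type error anywhere */
}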

Volatile and const are also similar to the abnormal and nonassignable from PL/I. The purpose is very similar to the C features, but C still has problems with these concepts which were around before C existed. PL/I also has something called generic, which is an early way to do function overloading. C11 added _Generic (the underscore is because C has reserved words, which PL/I doesn't have) which is worse and has a lot more problems and inconsistencies than the original idea that was 46 years older. Most newer languages improved this idea into full function overloading, but C managed to make it worse.
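For anyone who hasn't seen it, a minimal sketch of C11's _Generic (the square functions here are made up): it dispatches a macro on the static type of one expression, which is a long way from real overloading.

#include <stdio.h>

/* hypothetical per-type functions to dispatch between */
static float  square_f(float x)  { return x * x; }
static double square_d(double x) { return x * x; }

/* _Generic picks an expression based on the static type of its controlling operand */
#define square(x) _Generic((x), float: square_f, double: square_d)(x)

int main(void) {
    printf("%f %f\n", square(2.0f), square(3.0));   /* calls square_f, then square_d */
    return 0;
}

Add a third numeric type and every such macro has to be edited by hand, which is part of why full function overloading is the better version of this idea.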

Union and void come from Algol 68. The Algol 68 union is a safe way to contain values of different types without losing the type information. Unions also have safe subtyping by extracting a subset of types from a larger union, or uniting a smaller union into a larger union, similar to Julia and a few other new languages. Void is a type with a single value called empty, so "statements" like loops are just expressions, similar to SML and OCaml, where loops have a unit value. None of this is similar to C except for the names.
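A minimal sketch of the contrast (made-up names): the bare C union below keeps no record of which member is live, so the tag that an Algol 68 union tracks for you has to be bolted on and checked by hand.

/* a bare C union: nothing records which member is live; reading the other one
   just reinterprets the bytes with no error */
union value {
    int    i;
    double d;
};

/* the discipline an Algol 68 union enforces automatically, hand-rolled in C */
struct tagged_value {
    enum { TAG_INT, TAG_DOUBLE } tag;
    union {
        int    i;
        double d;
    } as;
};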

Newer languages are bad because of C, too. Java and JavaScript exist because universities teach C-based languages.

You cannot claim to understand a topic if you cannot explain it simply to a layman. If this means "hand holding" through logic and data, then you should be ready to do it at all times to defend your understanding.

To be fair, I believe there is a lot of overlap with the C++ committee and they are constantly trying to bring the two languages together. I'm guessing that's why abominations like _Generic exist, or why VLAs were made optional in C11.

I fail to see how this is a problem and not just a cool fact, really. C was not intended to have these refined high-level features; it was intended to have some of them as an abstraction over assembly so it could be as portable as possible. You know your shit but have some hindsight bias, and even if you know this I have to say it: compilers were even harder to write back then considering the lack of resources, architectures numbered in the hundreds, and C was the only language that managed to get implemented on nearly all of them.
So when maintainers later copied features in a half-assed way, it was just to avoid breaking that. I don't know about you, but I'd rather code in C than in fucking Pascal any day of the week.

But most of them really only copied the syntax, which is not the worst thing about C at all.

kill yourself, numale

I don't think you have the IQ to tie your shoelaces, let alone judge good programming languages from bad.

Who the fuck calls C C-LANG?

twitter.com/CollinEstes/status/738767017843515393

But C was not fast when it came out. A lot of work was put into making the compilers emit fast code. Pascal and Lisp had better performance back in the day.


People who started programming with Golang?

I'm guessing that was on not-at-all-cheap Lisp machines, because I really doubt it held anywhere else. Still, the reason they managed to make fast compilers emitting fast code is that it is easier to write a native C compiler. Hell, even Terry managed to make one by himself.

It's a problem because the C features are all they teach. People wonder why anyone would use enums if they're just integers and think every language has the same quirks as C.

They copied the switch, the octal syntax, and the bad for loop. QBasic had better control structures than C and a lot of C-based languages today. In JavaScript, 123 == "123" and 123 == "0123" are true, but 123 == 0123 is false. Also, eval('0123') is 83, but +'0123' and Number('0123') are 123. This is all because of C's octal, but it doesn't work on strings, which makes it even less consistent.
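The C side of that, as a minimal sketch: the leading zero silently changes the base, which is the notation JavaScript inherited for numeric literals (but not for strings).

#include <stdio.h>

int main(void) {
    int a = 123;     /* decimal */
    int b = 0123;    /* leading zero means octal: 1*64 + 2*8 + 3 = 83 */
    printf("%d %d %d\n", a, b, a == b);   /* prints: 123 83 0 */
    return 0;
}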

Pascal isn't the best language either, but knowing Pascal and C is better than just knowing C. Pascal's case is better than C's switch and Pascal arrays don't have to start with 0.

Holy shit, you know absolutely nothing about OOP.

...

Switch in C is terrible, yes; there is pretty much no reason to use it (unless you want fallthrough or some other super nasty hacks with the labels), but Pascal isn't that much better. To make me use a switch you need something like Rust's match, or else I'm chaining if clauses, which at least look cleaner in C.

Aren't switches potentially much faster?

How?

Because with chained if-then-else both sides of the comparison must be evaluated each time, whereas with a switch/case the lhs only gets evaluated once. Unless the compiler knows the lhs expression is referentially transparent, but we're not talking about functional programming here.

This is worded strangely. If-else doesn't evaluate both sides, it evaluates until a true condition is reached.


There is an optimization which can lead to a switch being faster. If your cases can be turned into a jump table, say when switching over an enumeration, the compiler can just emit code to jump to the correct case as an offset, instead of sequentially checking the antecedent cases. That's also why it's always a good idea to check the bounds of anything you cast into an enum.
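Something like this is the shape of it (made-up enum and return values): a dense switch over an enum that a compiler may well lower to a jump table, with the bounds check done before anything is cast into the enum.

enum opcode { OP_LOAD, OP_STORE, OP_ADD, OP_HALT, OP_COUNT };

/* dense, contiguous cases are the classic candidate for a jump table */
static int dispatch(enum opcode op) {
    switch (op) {
    case OP_LOAD:  return 1;
    case OP_STORE: return 2;
    case OP_ADD:   return 3;
    case OP_HALT:  return 0;
    default:       return -1;   /* unreachable if callers validate first */
    }
}

static int dispatch_raw(int raw) {
    if (raw < 0 || raw >= OP_COUNT)   /* check bounds before casting into the enum */
        return -1;
    return dispatch((enum opcode)raw);
}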

The compiler should also emit jump tables for if-else chains; I don't see how it makes any difference.

Only if the if-else statements were only testing for equality (as in, it could be rewritten as a switch). Since that's a very particular case, in which a saner programmer would usually opt for a switch, I'm not aware if the compiler would make that optimization.

That's the point: switch exists in C because it's a glorified goto block, an abstraction over assembly, and it has weaker code-path guarantees than if-else (breaking is optional). That's why I only use a switch when it's absolutely needed, most of the time because of the fall-through.
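For example, a minimal sketch (hypothetical function) of what "breaking is optional" looks like when the fall-through is actually what you want:

/* falling through is the default; the break is what you have to remember */
static int count_down_from(int n) {
    int steps = 0;
    switch (n) {
    case 3: steps++;          /* falls through */
    case 2: steps++;          /* falls through */
    case 1: steps++; break;   /* only the break stops the fall */
    default: break;
    }
    return steps;             /* count_down_from(3) == 3, count_down_from(2) == 2 */
}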

Worst: Python and Java

Best: Scheme and C

Are you saying Scheme and C were the most detrimental languages? If it wasn't for Scheme, dynamic typing wouldn't be acceptable and scripting languages wouldn't be used for serious programming. If it wasn't for C, buffer overflows would still be a solved problem and programmers would be motivated to do better than the bare minimum.

It's clunky bloated shit. Sure it's fine if your computer is l88t haXX0r model 3000 but for anyone else it's shit.

C. many would say JS, but that doesn't really make sense since it's already a stupid idea to have websites that can run code in the first place
Java isn't a problem since those idiots have done the exact same shit in C++ before Java came out. Java actually prevented a huge amount of security vulnerabilities by not allowing idiots to use a non memory-safe language.

this is bait. no new programming language or framework or whatever trendy bullshit you use has contributed anything to software, aside from something trivial like memory safety. It's as if you're completely oblivious to the actual software industry and are just regurgitating a few headlines you've read. All this aside, to be a competent software engineer you most likely need to understand pointers. You use them even in high level languages to implement advanced algorithms.

Goddamit man. Have you even used a java prog on a computer that's not supar haxxor11! ?

It's completely useless. It slows down this world and takes it away from "computing should be instant" into some horrible morass. Its GUI toolkits are ugly. There is nothing good about it.

University professors should be flogged for recommending java and all you programmed drones should be flogged for not learning the Holy Truth: Never say java.

What alternative do you propose?
Requirements: must run on Win/OSX/Linux, no DLL Hell allowed, and the end user cannot be told to run makefiles

rust

Any scripting language ever

No, I was saying Python and Java are the most detrimental right now, while my favorites are the ones I called best.

All of "new" meme languages whose Manchurian Candidates always try to start language wars because the rate of industry adoption is not to their liking.

Qt
Python
most languages pretty much
just link everything statically or bundle it with your application & load shared libraries from program folder.

Blame licenses and the devs for not bundling their DLLs.

Fuck off.

Keep smoking that crack, you might even be a bigger faggot than the Rust nigger.

fuck you rust best

Apple actually used to look good back in the OS 9 and earlier days; I prefer that UI over any other (well, I do like my emerald+xfce/lxde insanity). That being said, the systems did work a bit weird under the hood.

That's where the meme came from, and Apple sent everyone over to the Linux and BSD side when they came out with OS X, that perhaps well-engineered but bloated-as-fuck UI abortion.

java programs are shit in general but you almost never need to use them. if Java didn't exist they'd make the exact same bullshit in C++, and yes it would still be slow. have you ever used C programs? Most of the Linux desktop C programs are slow as fuck as well and segfault every minute/hour/day.

dependencies are a general problem and not solved better by any language. choosing to use DLLs or not solves nothing and changes nothing

Rust, Go, D.

i never claimed that, that wasn't part of his requirement.

Half-assing leads to duplication.

C++ has half-assed enums, so C++11 added enum class for better enums. Now there are two different ways to do enums with different semantics.

There are K&R functions with no checking on arguments, ANSI functions, and now the C++ trailing return type (there's a short sketch of the K&R problem after this post).

C++ also has "using" as well as typedef.

There are C null-terminated strings and the C++ string class. C11 added all of these _s functions for "safety".

C has math.h and tgmath.h using generic functions.
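To illustrate just the C part of that list, a minimal sketch (hypothetical functions) of the K&R-versus-prototype duplication: both spellings coexisted for decades, and only one of them checks anything.

/* K&R style: the declaration says nothing about parameters, so calls like
   add(1, 2, 3) or add("oops") compile with no diagnostic */
int add();

/* prototype style (ANSI C): argument count and types are checked at the call site */
int add_checked(int a, int b);

int add(a, b)        /* K&R definition syntax, accepted up through C17, removed in C23 */
    int a;
    int b;
{
    return a + b;
}

int add_checked(int a, int b)
{
    return a + b;
}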

But like I said, C evolved along with the hardware. I've never said C is in a good state now (which is a problem now, I agree), but I really can't blame the maintainers; they did the best they could not to break each use case.
The C++ case is much much worse though, the language is a complete abomination today.

It seemed like he wanted to distribute non-server software though.

The start of the thread was someone bitching about Java being slow. If the bar's low enough for Python you might as well say fuck it and use Perl 6, save yourself 90% of development time.

...

upboated. have some Holla Forums gold.

Hardware doesn't have much to do with the evolution of C. C mostly worked around newer hardware features. They didn't add enums because there was a new hardware enum feature. They didn't start to type check function arguments because of a hardware feature. They were fixing half-assed bad designs with slightly less bad designs. People used #define, then enum, now enum class in C++. People used K&R functions, then ANSI functions, now trailing return types in C++.

It has never been in a good state but it didn't matter as much because C wasn't as popular. C used to be regarded as a badly designed language like PHP. gets() wasn't acceptable in the 70s any more than it is today. Null-terminated strings weren't faster or safer on old hardware.

Doing things the right way the first time around wouldn't break anything.

C is the same, only smaller.

using is for importing things from namespaces, while typedef is for assigning new names to a type; they have completely separate purposes AFAIK

C null-terminated strings need to be supported in C++ for backwards compatibility. But since C strings are terrible, and quite frankly exampleString += "c" is a lot more readable than strcat(exampleString, "c"), C++ added a proper string class as well.
I find the separation into safe and unsafe functions nice, although safe should be the default in my opinion.

Since C++11, you can also use it to define type aliases:
using ayy = std::vector<int>;
template <typename T> using lmao = std::deque<T>;

Unsafe in other languages means the compiler keeps track of the lengths, but there are no bounds checks. Unsafe in C means it assumes there's infinite space for reading and writing, often with no way to even provide a length.

Safe in other languages means there's bounds checking even if you calculate wrong and pass in the wrong value. Safe in C means you have to manually keep track of the length and pass it to the function.

The "safe" functions in C are less safe than what other languages call unsafe functions.

I would like to point out that PHP gave us hacks like Hotwheels and Josh.

I'm not pulling comments via AJAX. I do a PUT request to the backend via AJAX and the server returns me serialized data about the comment that I just submitted. Then I do commentElement.cloneNode() so that I have a copy of a comment, then replace the data with new ones I got from the server. Then I need to place it as a reply or as a parent comment, and putting it as a parent comment requires the nextSibling stuff.

Lisp

Lisp is why we're using XML and dynamically typed languages.

No we don't you LARPer. It's either JSON for exchange or yaml/toml for config files.

The internet and the computer world in general is literally a flaming garbage can.