Java is shit

So, Holla Forums, do you still want to insist that Java is perfectly safe server-side?

Or can you start to accept that the whole platform is shit, starting with how Oracle handles security vulnerabilities?

8ch.net/newsplus/res/21260.html

wvusoldier.wordpress.com/2016/09/05/some-extra-details-on-hospital-ransomware-you-probably-didnt-know/

Did anyone here ever?

Welcome to Holla Forums, OP! I can see it's your first time.

Yes, sadly.

Oracle was a mistake.

Last year I ran a Java service which would accept as input arbitrary zip files downloaded from the internet.
Every once in a while the piece of shit would segfault inside the JVM depending on the file given, in Oracle's java.util.zip package, which is backed by a bundled native zlib.
As you can see, Java is so "efficient" that Oracle themselves avoid using it internally. For the record, I measured a pure-Java zip implementation at about 30x slower than the native one.

Is there any programming language or OS that Holla Forums actually agrees on is good?

Even if there's no such thing as a definitively superior language, at least there exist languages that don't bind you to immensely complex and inefficient runtime environments.
In C/C++ you type make, get your machine code, and you're happy with it.

No.

If it's not written in perfect C89 or assembly, it may as well not exist.

That's because none of them are best for everything -- what you get is unequally bad shit that you have to make do with to get your shit done.

Fucking Pajeet, poo in the loo instead of my computer.

...

If you think C is best for anything, you have insufficient experience with C.

It is good, but not the best, for operating systems and similar stuff and it has many, many warts.

It is best, though. Or less worse, if you prefer.

Not for everything. In fact, I would argue that it's the worst language for many of the tasks people do in C.

You can argue that the earth is flat too. But you're still wrong.

It's not my fault that you're a drooling moron.

Only people with insufficient experience in C would argue it's not the best language for everything. Thanks for outing yourself.

Oh, so you're another drooling moron. Let me give you explicit examples of when C is the worst language to choose.

This is mostly for the benefit of non-C fanboys who stumble upon this thread.

1) Scientific computing. C has silent integer overflow. That means you cannot tell when your scientific computation overflows. This is a source of many, many bugs.

Clang has some tools to detect this now, but that's a compiler and runtime feature, and doesn't fix the problem in the language.
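
Here's what detecting it actually takes, as a sketch; this assumes GCC or Clang, which provide __builtin_add_overflow (standard C had no portable equivalent until C23):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    int64_t a = INT64_MAX, b = 1, sum;
    /* Returns nonzero if a + b overflowed; sum holds the wrapped result. */
    if (__builtin_add_overflow(a, b, &sum)) {
        fprintf(stderr, "overflow detected\n");
        return 1;
    }
    printf("%lld\n", (long long)sum);
    return 0;
}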

2) Network protocols. C has no array out-of-bounds checking, so you have to roll your own -- always, and you must never forget it. Many protocols come with a length prefix, so when you read that prefix you then have to check that every access stays within the payload. All the time. If someone maliciously gives you lengths that extend beyond the payload and you don't check, you're screwed.
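
To make it concrete, this is what every single read ends up looking like; a minimal sketch (the 4-byte big-endian prefix is just for illustration):

#include <stdint.h>
#include <string.h>

/* Returns 0 on success, -1 if the attacker-controlled length lies. */
int read_field(const uint8_t *buf, size_t buf_len,
               uint8_t *out, size_t out_cap) {
    if (buf_len < 4) return -1;   /* not even room for the prefix */
    uint32_t len = (uint32_t)buf[0] << 24 | (uint32_t)buf[1] << 16
                 | (uint32_t)buf[2] << 8  | (uint32_t)buf[3];
    if (len > buf_len - 4 || len > out_cap) return -1;
    memcpy(out, buf + 4, len);
    return 0;
}

Forget either check once, anywhere, and you've shipped a Heartbleed.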

I don't care to mention more examples right now, but if you don't get the idea you really are a drooling moron.

DON'T. WRITE. UNDEFINED. BEHAVIOUR. IT'S THAT SIMPLE. Only retarded Pajeets like you could miss this extremely basic stuff and write bad code

Actually, I am glad C allows retards to shoot themselves in the foot. This way I can be sure C code will always be quality code since retarded curryniggers like you could never leave your braindead Java nest if you shit yourselves when exposed to C's freedom of thought and choice. C is only for superior patrician, aryan minds, and I'd like to keep it that way.

scheme/lisp, C and assembly.

All other languages are useless.

Wipe the drool off your face and actually know what you're talking about.

In C, unsigned integers have defined behavior on overflow. With signed integers it's undefined.

Now, the problem is that C does not give you any way to detect overflow when it happens.

Also, when you're dealing with user supplied values, you simply cannot assume anything about the behavior - it's not "simply avoid undefined behavior."
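
Concretely (a minimal sketch): with unsigned you can even detect the wrap after the fact, because the wrap itself is defined; with signed there is no after-the-fact, because by then the behavior is already undefined.

#include <limits.h>
#include <stdio.h>

int main(void) {
    unsigned int a = UINT_MAX, b = 2;
    unsigned int sum = a + b;   /* defined: wraps modulo UINT_MAX + 1 */
    if (sum < a)                /* wrapped exactly when the result shrank */
        puts("unsigned overflow happened and we can see it");
    /* No signed equivalent: once a signed add overflows, all bets are off. */
    return 0;
}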

Go back to retard class, you drooling idiot.

C is good for what it is, low level stuff. Overusing it is just a waste of time and energy. For web applications, Python is good.

A language doesn't have to be universally the best, but since it's a tool it'd be great if it did the thing it's supposed to do well... which seems to be a real big problem for most languages, and software in general.

Modern IT is like needing a hammer and having the choice between an entire toolbox filled with various dildos.

Our company still runs Java SE 6

Python is fun. They should rename it funlang.

So stop writing code that can overflow. All those other languages that go "You're a fucking retard" when you overflow a variable are doing that because you're a fucking retard. C assumes you're not a fucking retard and that overflow or out of bounds memory access is what you want, because there's some good reasons as to when and how you can use such things. It's simple to detect overflow if it's a concern for your program, or you can use one of the many arbitrary precision libraries. For OoB memory access, use a vector or some other type that does bounds checking. There's plenty of libraries that have vectors, or you can write your own in about an hour if you're not a fucking retard.

It is simply "avoid undefined behaviour", just like C is the language of "I'm going to do what you told me to do".
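
Since you brought up "write your own in about an hour", here's the skeleton of one; a bare-bones sketch, all names invented:

#include <stdlib.h>

typedef struct { int *data; size_t len, cap; } vec;

/* Returns 0 on success, -1 on allocation failure. */
int vec_push(vec *v, int x) {
    if (v->len == v->cap) {
        size_t ncap = v->cap ? v->cap * 2 : 8;
        int *p = realloc(v->data, ncap * sizeof *p);
        if (!p) return -1;
        v->data = p;
        v->cap = ncap;
    }
    v->data[v->len++] = x;
    return 0;
}

/* Bounds-checked access: every read goes through here, no exceptions. */
int vec_get(const vec *v, size_t i, int *out) {
    if (i >= v->len) return -1;   /* out of bounds, caller must handle it */
    *out = v->data[i];
    return 0;
}

Of course nothing in the language stops the next guy from poking v->data[i] directly, which is rather the other anon's point.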

You really, really are a drooling moron. Yes, it's so simple not to write code that overflows, because
securecoding.cert.org/confluence/display/c/INT32-C. Ensure that operations on signed integers do not result in overflow

Or you know, I can just use Lisp and not have to think about this at all. There, I just saved myself a ton of headaches and error prone code.

I love you user.

I'm not the other guy, and you're an idiot. static inline functions that replace the operators that can overflow solve the problem, provided you're not too retarded to remember to use them. There are libraries that do this shit already, or if you're not a fucking retard and actually understand the underlying data types you're using, it's easy enough to write your own.
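
For reference, one of those static inline replacements, spelled out; a sketch of the portable pre-check style (the compiler builtins mentioned above do the same job):

#include <limits.h>
#include <stdbool.h>

/* Replaces a + b: returns false instead of invoking UB on overflow. */
static inline bool checked_add(int a, int b, int *out) {
    if ((b > 0 && a > INT_MAX - b) ||
        (b < 0 && a < INT_MIN - b))
        return false;
    *out = a + b;
    return true;
}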

There are compiler options and special functions that do this - that's not the point - the point is that the language sucks in this particular case, and if you forget to use any of it in even one place you leave yourself open to security holes, or to subtly wrong calculations that can be almost impossible to chase down.

And note that compiler-specific functions and switches are relatively recent - as is general awareness of these problems.

There's no such thing as "perfectly safe". But it's much safer than C/C++. Note that this security hole results from trusting user input, something that can happen in any language.

The big issue is that there's a bug in a common Java library that's used in a lot of places. It's up to admins to stay on top of their patches.

Java has it too, unfortunately.

C went this route because the designers valued performance over reliable code, and they turned niche uses prone to mistakes into the default use case.

In the real world, humans make mistakes, even the best coders, as has been proven over and over again by the endless stream of bounds-checking bugs and pointer fuckups found in C/C++ programs.

But go on, please keep spouting on about the legion of mythical C/C++ programmers that are out there and available to write all the world's software, bug free.

Sounds like bullshit. If you just did a simple translation from the C to Java, you'd likely be within 2-3x the performance of C. Thirty times is more like Ruby performance.

You have them in front of you, you fucking currynigger.
t. a professional low level programmer with 14 years of experience who has never written a bug in C

Degree with timestamp.

Going to college is a meme. I landed my jobs simply by being capable of demonstrating what I was capable of. Only losers like you think you need a degree to get a job in the industry.

wait when would you want this?
And why would you write code that does this?

sage for offtopic

And has never told a lie. Now how big is your epeen?

And where can I find a C program of any significance that's never had a bug in it? Surely among your legion you must have one open source app that you can point to?

I write all my shell scripts in C. I build websites in Lisp and I inject machine code into the fucking HTML to blow people's minds with some real performance.

Fuck yea!

Did you know Lisp is pretty much the optimal language for this task, as weird as it seems?

Hmm.. yeah that makes sense actually..

Alright then: Lisp for shell scripts and C for dynamic websites!

In case you want heartbleed :^)

Combining data with metadata is a good one for OoB memory access. Consider the struct
struct Texture {
    int width, height;
    int bpp;
    char data[1];
};
The data member is declared as an array of a single char, but by allocating sizeof(struct Texture) + width * height * bpp - 1 bytes you get the header and the pixel data in one contiguous block. The only way to access the payload is through the data member, indexing past its declared bounds. This is useful when you want to transfer the whole texture somewhere else in one shot.
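
In code, that allocation looks like this; a sketch using the C99 flexible array member, which is the standards-blessed version of the data[1] hack (function name invented):

#include <stdlib.h>
#include <string.h>

struct Texture {
    int width, height;
    int bpp;
    char data[];   /* C99 flexible array member, no -1 fudging needed */
};

struct Texture *texture_new(int width, int height, int bpp) {
    size_t payload = (size_t)width * height * bpp;
    struct Texture *t = malloc(sizeof *t + payload);
    if (!t) return NULL;
    t->width = width; t->height = height; t->bpp = bpp;
    memset(t->data, 0, payload);
    return t;
}

Header and pixels end up in one contiguous block, so the whole texture can be written out or shipped across the wire in a single call.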

If you're wanting an object to wrap around from the right side of the screen to the left, or top to bottom, and your coordinate system is integer based and centered around 0. Particularly relevant when on smaller screens.
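
Note you don't actually get to lean on signed overflow for that, since it's UB; the defined way is to wrap explicitly. A sketch (range handling invented for illustration):

/* Wrap x into [-half, half) on a screen `width` units wide, centered on 0.
   Assumes width > 0 and that x + half doesn't itself overflow. */
static int wrap_coord(int x, int width) {
    int half = width / 2;
    int m = (x + half) % width;
    if (m < 0) m += width;   /* C's % takes the sign of the dividend */
    return m - half;
}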

Java is the same as every other PL in terms of "safety": not exactly pretty good, but still much better than C/C++/Pascal. This thread is fucking retarded.


No you don't. The compiler is millions of lines of code for a reason. If you want your code to actually be portable, you can't think of it in terms of what the output assembly will be.


UB isn't easy to spot, by a long shot. Finding UB invocations automatically is as hard as the halting problem. There are probably a few thousand people who know most of the constructs that are UB in C. The rest are retards who think it's just signed overflow or out-of-bounds memory access. Also, every C codebase in existence is full of UB that people de facto expect to work the way they want (and lots are just oblivious to the fact that something they do is UB).


I don't know what OS and usermode programs you're using but most of the C code in my Linux system is bullshit and crashes if you look at it funny.

literally overflow is expected behavior for unsigned ints in most PLs. the difference between wrapping and raising an exception on overflow is negligible from a correctness perspective, and only matters in C, where it will lead to bad things.

underage b&

No, almost no other PLs are run by Oracle.

Worst aspect of Java is the dependency hell. A few years ago I worked on a Spring/Hibernate/MySQL project, and to make certain features work I needed JUST the right version of each pom artifact. Updating anything would break everything with nonsensical errors coming from deep in the build infrastructure - errors that in no way referred to the real cause.

java is shit? thank god you realised that, wtf took you so long?

inject machine code? ok... now, if you are injecting, let's say, i386 intel code, people on android couldn't open your pages - if injecting real machine code into the web is even possible

Guys I share with you a neat programming trick to create great IDEs in Java with a few easy steps.
1. obtain Eclipse source code
2. edit splash screen picture
3. edit program icon and title
4. create language plugin
5. package together
Unique and pleasing user experience is guaranteed.

What is this shit? All this shit to print "Hello world".

public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello world!");
    }
}

I would like to say I was once like you, but I can't, because I am not retarded.

Screw you

I remember an ARM M0 IDE that used gcc but was a crippled version unless you paid. Is that the same shit?

My problem with Java is that it doesn't have any of the fun stuff: no operator overloading, no functional programming (I guess they added something in 8, but it's too late), and no real generic programming.

Java is single-handedly responsible for all the hate OOP gets, because it forces its use even when it's not the best approach.

How do you feel about clojure/scala?

I am not, but to answer your question:
Clojure is good, and so, it appears, is Scala; no problem with either language.

The JVM being the only target is a problem. AOT compilation is generally best for statically typed languages.
There are few cases in reality where JIT is really useful, besides developer use cases like dynamic instrumentation, and that should be entirely optional. A JIT'ing virtual machine incurs a considerable cost and a high startup time.

One big defect of the JVM: it erases the generic type information from the source code, so generics are boxed, which makes the language awfully slow in cases like numeric computation. (C# has no such problem.)

I love Clojure but I have never used Scala. Clojure feels like one of those languages you learn and suddenly you are a better programmer in every other language, yet you never use it for anything.

The biggest issue is that because of the JVM (like the anon above said) they both have to compile down to JVM bytecode in some way, adding extra overhead and making them slower than Java itself, because, you know, nice stuff like type inference isn't resolved ahead of time the way Haskell does it, but hacked in with a hammer.

But it transpiles to asm.js-grade JavaScript, which is great: you never have to write a single line of JavaScript again, and thanks to V8 it can run better than suboptimal Java bytecode, and better than average JavaScript because of the Google Closure compiler.

Operator overloading is a terrible misfeature and should not exist. Same goes for overloading actually.
The one and only proper way to accomplish custom operators is redefinition and multiple dispatch. (cf. Julia language)

I know it's controversial, but I've recently gotten into graphics programming in C++, and man, being able to overload + - * / to work with vectors and matrices is awesome.

The code is stupidly easy to read, with clean and simple algebra operations everywhere and no performance loss, because the polymorphism is all resolved at compile time.

Java's operator overloading is fucking horrid because it's defined for some things such as strings, but you can't define your own.


What do you mean by redefinition and multiple dispatch? By my understanding, multiple dispatch is so that move(Cat& cat) and move(Dog& dog) would refer to two different functions.

C++'s way is to define a version of the operator for each pair of operand types, which seems to me like a good way of doing it, since it treats each operator as a function; with argument-dependent lookup it sounds like multiple dispatch done at compile time. Because of the ADL performed, each operator has to be redefined for each pair of operands it works on.

Of course, since this thread is Java, I may be completely missing the point and it's just that Java does things in a terrible and inconsistent way. Fuck Android for choosing Java as its language.

Ain't that plain old overloading?

If I ever need to write in Java I prefer JavaScript because it's so much nicer.

After looking a bit more into it, it sounds like it's done at run time rather than compile time. So something like:
void move(Animal& an); // 1
void move(Dog& dog);   // 2
Animal* dog = new Dog();
move(*dog);
would call 1 based on compile time type information, but 2 based on the run time type information.

This would obviously have a performance impact, probably a similar one to virtual member functions.
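
For anyone lost in the terminology: multiple dispatch just means the function body is picked from the runtime types of all the arguments, not just one. Hand-rolled in this thread's favorite language, with invented type tags, it amounts to this sketch:

#include <stdio.h>

typedef enum { CAT, DOG } Kind;
typedef struct { Kind kind; } Animal;

/* One body per *pair* of runtime types: that's the "multiple" part. */
static void meet_cat_cat(void) { puts("mutual ignoring"); }
static void meet_dog_dog(void) { puts("sniffing"); }
static void meet_cat_dog(void) { puts("hissing"); }

static void meet(const Animal *a, const Animal *b) {
    if (a->kind == CAT && b->kind == CAT) meet_cat_cat();
    else if (a->kind == DOG && b->kind == DOG) meet_dog_dog();
    else meet_cat_dog();   /* one of each, in either order */
}

int main(void) {
    Animal c = { CAT }, d = { DOG };
    meet(&c, &d);   /* resolved from both runtime tags */
    return 0;
}

Languages like Julia generate that dispatch table for you; C++ overloading does the equivalent selection purely at compile time from static types.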

I take it you haven't written extensively in any of those languages.

But it's true, though. If Hello World were a good measure of anything, then bash would be the fucking best language ever invented: shorter and faster than assembly!

If I ever need to write in JavaScript I prefer plain HTML because it's so much nicer.

If I ever need to write in HTML, I just quickly draw the website on a piece of paper and stick it to the monitor because it's so much nicer.

Actually Hello World is a fantastic indicator of language style and design.

So we've got Bash's
echo Hello, world
and then Python's version
print('Hello, world!')
C's
#include <stdio.h>
int main(void) { printf("Hello, world\n"); }
C++'s
#include <iostream>
int main(void) { std::cout << "Hello, world\n"; }

Let's see:

python (Let's grab lisp and C and do neither):

print "Hello World"

javascript (Let's do OOP in the least OOP way possible, people will love it!):

console.log("Hello World")

C++ (Let's implement a fucking operator overload for the most basic programming example and let's also never use this syntax for anything else ever!):

std::cout << "Hello World";

C ( I can't think of anything sarcastic about this one):

printf("Hello World");


I think you have a point user

If a language's hello world is more than one statement it's a shit language. Objective fact.

Good thing no fucking languages use more than one statement to print a Hello World, huh?

#include <stdio.h>
int main(void) { printf("Hello, world\n"); }

more than one statement.

>what the fuck is ::? what the fuck is <<?

Holy shit, you're dumb as fuck. You should be doing these range checks in *every* language, not just C.

Also, your use of commas is abhorrent. I will admit that my usage isn't great, but jesus christ stop filling your sentences with them in random-ass locations.

ayy lmao


That use of commas ain't incorrect, though. It may sound kinda pedantic, but you can totally write like that.

you're full of shit, but I can't blame you, what with everyone overusing shiny new stream functions where basic C string operations would suffice

Haskell
putStrLn "Hello, World!"

J/APL
'Hello World'

GNU/Linux
apt-get install hello

>scheme/lisp, C and assembly
ftfy


Then you have never written anything in C.


Every decent language designer should anticipate that their language will be used to write a Hello World, hence a slow Hello World with a big memory footprint is a good benchmark of the designer's foresight.


Avoid Scala, it piggybacks on Java in a way that you always have to work on two levels, the Java level and the Scala level.


If I ever need to draw a website on a piece of paper, I just take a photo of an existing website because it's so much nicer.

If I ever need to take a photo of an existing website, I just tell people to look at that website themselves and imagine I made it.

No my friend, you need to go back to retard class and learn that some languages have range checking built-in. Effective use of that feature means you don't have to range check manually all the time.

Just, filter, out, my, posts, if, you, don't, like, my, commas, you, fuckwit.

I work with an actual aspie coder three days a week and it still surprises me how the most blatant sarcasm goes right over your heads.

And that my friend, is C++ in a nutshell.

Stream operators have their use: they're basically a consistent interface to print_type(stream), which is a nice bit of syntax, but overuse is annoying as fuck in anything. No need for everything to be an object or a member function, but Java encourages that. No need for everything to work with the stream operators, but C++ allows it and people overuse it.