Roast me Holla Forums

Roast me Holla Forums.

I can't bc ur dead

I would, but I'm sure all the undefined behaviors he introduced to the world are already doing a pretty good job.

So are the issues caused by undefined behavior (buffer overflows etc.) not a problem in assembly then? C may have introduced "undefined behavior" in the sense that it made it possible to write programs that don't map to the same deterministic actions across different compilers, but the problems undefined behavior causes existed before C.

FUCK YOU JIM YOU FAT FUCK
YOU KILLED HOTWHEELS AND TURNED Holla Forums TO SHIT

dubs of truth

C11 sucks compared to C99 and you died too soon. Also you deserve to go to Hell for making arrays decay to pointers.

It seems like you don't understand what undefined behavior is.
Undefined behavior means, for example, that a C program that has signed integer overflow at some point is allowed to do literally anything. It could execute system("rm -rf /*"); and it would still be following the standard. A real-life compiler wouldn't go that far, of course, but it can do fairly horrifying things because it will assume for the sake of optimization that signed integer overflow never occurs.
Undefined behavior is a well-defined term that you can find in the standard.
Buffer overflows are a different issue.
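To make that concrete, here's a minimal sketch (function name and clamping behavior are made up) of how an after-the-fact overflow check can get thrown away, because the optimizer is allowed to assume signed overflow never happens:
#include <limits.h>
#include <stdio.h>

int add_clamped(int a, int b)
{
    int sum = a + b;                   /* if this overflows, behavior is already undefined */
    if (a > 0 && b > 0 && sum < 0)     /* the optimizer may assume this is never true... */
        return INT_MAX;                /* ...and delete this "safety net" entirely */
    return sum;
}

int main(void)
{
    /* may print INT_MAX, a negative number, or anything else; the check is not reliable */
    printf("%d\n", add_clamped(INT_MAX, 1));
    return 0;
}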

Stop writing programs with undefined behaviour then and enjoy a fast compile-time and execution time. I like my undefined behaviour.

Yes, I understand what undefined behavior is, and no, it's not C's fault for introducing it. C is portable assembly, so signed integer overflow cannot be deterministic: different architectures handle it differently (x86 just wraps around, MIPS and Alpha can generate traps), and C does not instrument code to check for bugs, because that would cost performance. The same goes for shifting a value by at least as many bits as it holds. Again, different processors handle this differently: x86 masks the shift count to fewer bits than PowerPC does. The only way to "define" this behavior would be to reduce C's portability.
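A hypothetical snippet to illustrate the shift case (the printed value is exactly the thing the standard refuses to pin down):
#include <stdio.h>

int main(void)
{
    unsigned int x = 1;
    volatile unsigned int n = 32;   /* shift count equal to the type's width */
    /* Undefined behavior: an x86 shift instruction masks the count to 5 bits
       (so this might print 1), while PowerPC uses more count bits (so it
       might print 0). Pinning down one answer would mean emitting extra checks. */
    printf("%u\n", x << n);
    return 0;
}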

Other undefined behavior involves actions that are obviously wrong (dereferencing a null pointer). Having this be undefined allows the compiler to optimize out certain code, so long as it can assume no such obviously wrong thing is ever done.

The rest of undefined behavior is even more obviously undefinable. How would a programming language ever define what should happen when someone reads past the bounds of an array without implementing memory safety and accepting a 200-400% slowdown and a corresponding increase in memory use? Without that, neither the compiler nor the processor has any way of knowing that an array index out of bounds error has even occurred, and so cannot produce defined behavior.


I was referring to writing past the end of an array, which is one of the more common cases of undefined behavior, and the one C gets blamed for allowing to happen.

C just generates the assembly; you're responsible for the bugs. Of course it's normal that bugs slip through, we're only human, but languages that check for all this kind of stuff (bounds checking, for example) are slower. It's a trade-off

not rust, faggot

It's slower in benchmarks, faggot.

(not true by the way)

It's almost 2x as slow.

benchmarksgame.alioth.debian.org/u64q/which-programs-are-fastest-firstlast.svgz

TANSTAAFL. If you want to emit extra code for bounds checks, your program won't run as fast.

wew

Please explain to me how your magical Rust code runs as fast as C while also emitting the necessary instructions for memory bounds checking on each memory access.

...

Do you even know what an abstraction is?

What's to roast...

The memory bounds checking happens in the language itself at compile time. It doesn't happen at run time.

amazing advice. mind giving this advice to actual c programmers?

most of the fast rust versions aren't compiling because the maintainer is a nigger

...

unsafe

Why don't C compilers do this and emit warnings?

Why doesn't the compiler optimize away all code which is guaranteed to result in undefined behavior?

This image got removed from recaptcha because nobody could correctly identify the gender

It's pretty obvious it's a man though.

That's almost what it does. It tries to remove all code which only executes if undefined behavior occurs, and it leads to weird shit. Let's say you have this somewhere in a loop:
if (n < 0) { /* Overflow! */
    do_x();
    break;
} else {
    do_y(n);
    n += some_positive_value;
}
If the compiler is smart enough it realizes n < 0 can only happen if undefined behavior (signed overflow) occurs, and reduces your code to this:
do_y(n);
n += some_positive_value;
This is perfectly valid according to the standard.
Alternatively, there's this:
int *x = NULL;
if (i % 2 == 0) {
    do_something();
    i = *x;
} else {
    x = y;
}
Because the first branch always dereferences a null pointer, the compiler can reason that it must never be taken, and remove it. Naively you'd expect your program to crash after do_something() when i % 2 == 0, but this is what the compiler is allowed to turn it into (before even further optimization):
int *x = NULL;
x = y;

And overflow doesn't always just wrap, it depends on the platform. On some platforms it might cause an exception. You might also want the wrap-around. For example, I have a table of 256 elements and I want to go through it in a loop. I can have an 8-bit index and just increment it, without checking for 255 and then setting it back to 0 and such. Also, how does rust help in a situation like this?
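Roughly like this (the table and the per-element work are placeholders):
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int table[256] = {0};       /* dummy 256-entry table */
    uint8_t i = 0;              /* 8-bit index: incrementing past 255 goes back to 0 */
    do {
        table[i] += 1;          /* placeholder work on each element */
        i++;                    /* no "if (i > 255) i = 0;" bookkeeping needed */
    } while (i != 0);           /* back at 0 means all 256 entries were visited */
    printf("%d\n", table[42]);
    return 0;
}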

Let's say you have an array of 10 elements and a function that will read an element from any index

int array1[10];

int get(int i)
{
    return array1[i];
}

You can't know at compile time whether i will be greater than the array size. What about when you memcpy stuff into a buffer and the input controls the size, which makes it possible to overflow the buffer? Can't optimise that at compile time either, unless the compiler just adds the size checking code automatically. That's convenient, I'll admit, but that's not a reason to change the whole language, especially when rust is so different. Just do it right in C/C++
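For what it's worth, "doing it right in C" for that getter looks something like this (the error convention here is made up; use whatever your codebase prefers):
#include <stddef.h>

int array1[10];

/* returns 0 and writes the element through out on success,
   or -1 if the index is out of bounds */
int get_checked(size_t i, int *out)
{
    if (i >= sizeof array1 / sizeof array1[0])
        return -1;
    *out = array1[i];
    return 0;
}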

user you are replying to was talking about signed integer overflow, which is ub. unsigned integer overflow is defined. he also hasn't mentioned rust.

most C/C++ code in production doesn't do it right

That's not the fault of the language. I understand using tools to prevent such bugs, that's fine. But it's not fair to shit on the language itself because coders write bugs

If the language encourages bad coding that's a flaw of the language, even if it doesn't remove all blame from the coders.

That doesn't make sense. How does C encourage bad coding? It does what you tell it to do. If you want bounds checking and such, implement it.

let me use the same example as before, but now I'm on PC so I can describe it better. This is a typical bug that can be exploited to do something malicious.

void parseSomeStuff(void* packet)
{
    void* headerBuffer = malloc(0x100);
    memcpy(headerBuffer, (char*)packet + 4, *(int*)packet);
}

What this code does is copy data out of the packet, using the size specified in the packet's first 4 bytes, but it doesn't do a size check: it always mallocs the same fixed 0x100 bytes for it. The programmer just assumed that the "header" will always be 0x100 bytes or less. How does RUST prevent that? If a RUST programmer makes the same faulty assumption he will introduce the same bug, unless RUST adds the size check on memcpy or something. If so, you can do it yourself in C/C++ (as you should)
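For reference, the check being described would look roughly like this; it assumes the caller also passes the total packet length, which the original snippet didn't have:
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define HEADER_BUF_SIZE 0x100

void parseSomeStuffChecked(const unsigned char *packet, size_t packetLen)
{
    uint32_t headerLen;

    if (packetLen < 4)
        return;                        /* not even a length field */
    memcpy(&headerLen, packet, 4);     /* avoids the unaligned *(int*) read too */

    if (headerLen > HEADER_BUF_SIZE || headerLen > packetLen - 4)
        return;                        /* reject an attacker-controlled size */

    unsigned char *headerBuffer = malloc(HEADER_BUF_SIZE);
    if (!headerBuffer)
        return;
    memcpy(headerBuffer, packet + 4, headerLen);
    /* ... parse headerBuffer ... */
    free(headerBuffer);
}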

what if you have code like this

int someGlobalArray[10];

int getStuffFromGlobalArray(int idx)
{
    return someGlobalArray[idx];
}

how can RUST detect that we're going out of bounds at compile time? If it does the bounds checking at run time you might as well do it yourself in C (as you should)

How would you check those bounds at compile time?

It's an NBCR military suit. You probably know it as a hazmat suit.

what about dereferencing null pointers? double freeing? dangling pointers? oh thats right: just dont do it. amazing how you can always shift the blame onto the programmer. btw why do guns have safeties? just dont pull the trigger if you dont mean to.

that is the problem. in c bounds checking is opt in. in rust it is opt out.
if you want to check that a program written in c doesnt do out of bounds stuff, you're fucked. if you want to check a rust program, simply search for "unsafe".

But it's a good thing it's opt-in. With great power (speed) comes great responsibility. One of C/C++'s greatest strengths is that you have so much control over your program. And again, because nobody wants to respond to it, what about these?


C++ has smart pointers for the double-free, use-after-free, and null pointer issues, but regardless it comes back to my point at the beginning of this post. If you want safeguards for this stuff, fine, use a language that does that, but it will be slower and you won't bullshit us that it's not.
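And on the plain C side, about the closest thing to a built-in safeguard against double frees is discipline like this (a trivial sketch, not a language feature):
#include <stdlib.h>

/* free the pointer and immediately null it, so an accidental second
   free through the same variable becomes a harmless free(NULL) */
#define FREE_AND_NULL(p) do { free(p); (p) = NULL; } while (0)

int main(void)
{
    char *buf = malloc(64);
    FREE_AND_NULL(buf);
    FREE_AND_NULL(buf);   /* no-op instead of a double free */
    return 0;
}
It only covers the one variable you freed, of course; aliased pointers are still on you.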

cvedetails.com/product/1820/GNU-Zlib.html?vendor_id=72 (just an example)


bounds checking

i would check them at runtime

what about c?
i am
marginally slower. if this is unacceptable you can disable it.

What's the point of disabling safeguards in a language? I might as well use C then. I sort of put C and C++ at the same level. Most people use C++ with all the new cool stuff like smart pointers and std::array (whose .at() does bounds checking). People that use C usually do it for some embedded system and whatnot.

not an argument
but your answer to everything is: lol just dont do it bro

I'm saying there's no point in switching to a totally different language for something that you can do in the old one. And C++ keeps adding these kinds of features. Check out C++14 and 17. If rust was at least similar to C... but it's not. If someone is about to learn a new language then I guess I don't have anything against them using RUST, but you won't convert old C/C++ coders with that. It's just not worth it.

you cant do memory safe c.
rust was never intended to be similar to c.
have fun with your arbitrary code execution

Ok friend. One more thing I forgot to mention. We're talking about software that can be exploited remotely, right? Like, let's say you have some software on your dedicated server and someone can exploit it. What about offline software?
Let's say I'm making an offline game and it has a bug that lets you overwrite the code. So what?

I actually like self-modifying code and that type of stuff, but that's beside the point.

how is this relevant?

What do you mean? You want people to use RUST to mitigate exploits, but if the software is offline (a game, a painting program, a music player etc.) the user himself would have to exploit it (or a hacker, but he'd already need access to the computer, at which point what's the point of exploiting some program). I mean, it's fine to write these kinds of programs in C/C++, right?

i mean that the argument isnt about where you can safely use unsafe languages, but that c/c++ are partially at fault for bugs that could be prevented, if the language wasn't shit.

C is like a macro assembler, but more robust. It just does what you tell it to do, and often that's what people need. Then you have C++ with more abstractions while being almost the same and very compatible, and most importantly the abstractions have zero or almost no overhead. I said it's ok if you really need the safeguards, but there's a cost to it, and how tiny is it really? All the benchmarks so far show it is slower. It depends on whether you really need all that speed, but people who don't really need the speed already use different languages.

that applies to rust too
it is tiny

Like your cock. Oh snap

XD

Glad you enjoyed my quality joke

your syntax is retarded and my ass could produce a better one even after my first year of programming when I only knew assembly language.
int (*(*(*faggot)())[123])[456]; /* a pointer to a function returning a pointer to an array of 123 pointers to arrays of 456 ints */
i could actually roast you for hours but i dont really care to make a list