The construction of a single C++ binary at Google can open and read hundreds of individual header files tens of...

what the fuck

I've just tried feeding my most bloated cpp source into the preprocessor and I get a high score of 668x only.

they were too stupid to use header guards, right?

These are the kind of people Go is designed for. Let that sink in.

Geez how many lines per file did they have? Less than 100?

Muh search algorithm is just soooo complex

who are you quoting

The devs at Apple made a big deal of optimizing, then bragged about how much faster their Clang compiler can parse garbage input like this than GCC.

And PHP has a really fast parser for exactly the same reason. Who needs a washing machine when you can just put a speedy elevator in the dirty laundry storage shed?

No, header guards combined with headers including headers is the problem.
#include <stdio.h>
#include <stdio.h>
will cause ALL of stdio.h to be included twice. Notice they didn't say they'd preprocessed it, just that they expanded the #includes. Now of course most of those 8G will be thrown away but that's not the point. The point is that you're processing 8G of shit but only have maybe 10M of code in total.
This is why when it came to Plan 9 they decided to just nix the header guards and force you to include things in order and only once, as parsing all that shit was genuinely the slowest part of compilation.

Well allow me to quote for you a relevant excerpt of Google's own C++ guidelines.
It's incredibly retarded and contrary to all good C++ programming practice.

Forward Declarations: Avoid using forward declarations where possible. Just #include the headers you need.

I bet they thought about making that rule too, but then realised who they were working with, so they made Go.

An even better solution would have been to ditch the C trainwreck and use a language with a proper module system.

What language should replace C? It would have to be developed from scratch.

Rust, obviously

There is no hope. Switch to C++ and use it forever.

I think you mean Rust

Keep dreaming. Rust will be long dead when C "finally" bites the dust.

Hipster garbage. I'd sooner use PL/I, which, by the way, both predates and is better than C.

Fortunately Ada exists, so I don't have to go that far.

I don't think you've written a single program in Ada. You just think it's cool because it's the current meme. We both know that Rust is cleaning up the street that C/C++ shat all over and there's nothing either of us can do but accept and embrace it.

There is only one good statically typed programming language and that language is Dylan.

I don't think you've written a single program in your life. You're just a cheerleading mozilla fanboy hipster.

kek, Ada is one of the most anti-memetic languages around. Anyone recommending it is highly likely to be doing so because it's, y'know, actually really good.

I've been coding far longer than you have, know more languages, and have contributed more to open source than you ever will; that much I can tell from our conversation alone. Otherwise you would know why C and C++ are such shit and need to be put to rest.


There you go, needlessly labeling everyone's gender. How typical, C master. You know if you ever left your mom's basement you'd see that not everyone fits into your nazi worldview. But to imply you'd ever leave your dark sheltered world is just too much.


Anyone recommending it here is just doing so because it's a Holla Forums meme. You aren't in the military designing planes or submarines that need Ada's security, so what difference does it make if your fancy fizzbuzz in C is secure or not?

Put down the alcohol m8.

Looks like someone has lost himself on his way to reddit. C++ is superior to Rust, get over it. What good is a "bare-metal" programming language if one cannot enjoy shooting himself in the foot, and doing so FAST? Bounds checking is for babies.

holy fuck i hope you're both trolling

Hey, C++20 is only 3 years away, it will probably have modules standardized, maybe.

D comes after C

...

Enjoy having backdoors in your "secure" code.

Yep, and don't forget about concepts, and also GC.
Tbh I don't blame the committee; they've done a great job with modern C++. 2011 was the standard that made my programming great again.

...

Do you like D?

D has some good and bad.
It has a very advanced interpreter in the compiler, and what's better is it can be used to generate code.
On the other hand it has more null-pointer problems than C++, much like Java. In C++ one can limit the flow of null pointers by using references, or pointer-wrapping objects like string, vector... In D you can't do as much to prevent the propagation of nulls; for instance, you can happily assign null to a variable of type string.

They never learn.

Truly a worthy addition to the C language family!

I test my programs for inputs that are conformant to my personal expectations of what they should be. Therefore I consider my programs to be, from my personal point of view, secure :^)
Anyway it does not matter, all that matters is that it runs fast, like sanic levels of fast.
It's called that only if it's intentional, it's a vulnerability otherwise. I would never do such naughty things.

I'm not being hard on them either, they have to satisfy a lot of interested parties. It's just, do you ever use a different language for a while, and then come back and realize oh yeah, headers ARE terrible. That's where I am right now.

Perhaps a naive question, but could a library not be created in C/C++ to give the ability to instantiate a bounded data-type? Something akin to smart pointers, but for boundaries?

My beef with all these other languages is that:
1. They're bloating the language space (there are too fucking many now, making it a fucking pain to interact with libraries made in these other languages)
2. They force things upon you that might not be necessary.

I wish there was only one language that could be both compiled to asm and ran as a script in a VM.

What do you mean by a bounded data type?

I'd imagine it'd be a data type that's got a fixed size, similar to a std::array or []. The problem with that is then you have to do bounds checking for all accesses of it that rely on outside input.

Really all you need to do is use arr.size() as the upper bound of your access for iteration, and verify your inputs if you're doing random access. Basically: don't be retarded, document how long your data lives, and read the documentation for how other people's data lives.

So it's retard friendly, how is that bad?

For one, your community will be filled with retards.

What level of retardation are you ?
You can't prove anything, and neither can the other user.

Mozilla sjw shill detected

Like all retards who can't work without mixing in politics.
Go back to your safespace.

Reminder that these are the sort of people who are killing the hacker community:
esr.ibiblio.org/?p=6918
esr.ibiblio.org/?p=2122

Anyway the only real languages worth spending time on will always be:

Assembly
C
Lisp fam (Scheme/Guile)

this is trivial to optimize btw, are all C compilers that stupid?

What about Pascal/Object Pascal, Ada, Fortran

you forgot python.

its main point is that it's expressive, doesn't force any coding style, is usable for everything (remember NumPy?), and even works as a bash replacement, because of its fast start-up time and because it's preinstalled pretty much everywhere

Technically that's the domain of the preprocessor, not the compiler. C doesn't have modules. :^)

He doesn't seem like the type to acknowledge that a language that's both easy and popular could be good.

...

Python belongs in the same bag as JS, PHP, and countless others. It's what happens when one software engineer without scientific knowledge believes he knows enough to design a programming language.
Its only supposed advantage is improved development speed, but that is nullified by the time it takes to test the program with sufficient coverage, since errors are undetectable until runtime.

C is a naive language, and that's present in pretty much all tools a C compiler uses. The preprocessor handles all of that with header guards or #pragma once, assuming the programmer who made the header isn't a retard.

I only use #pragma once; I have never encountered a single compiler that doesn't handle it.
Besides, I dislike that some libraries use very generic names for their header guards, and if a conflict happens in some obscure header, I wish you good luck discovering it.

All the major compilers handle it, and they all have special handling for file wide include guards. Google's issue is that their coding style is to #include every file that has something they need even if they only need a class declaration. If you look at the Clang source code they've got sometimes dozens of forward declarations and their coding style explicitly says "Do forward decls instead of #include if possible, because it's fucking stupid to do otherwise". Yeah, modules and precompiled headers are a pretty fucking great idea, but when you're running a build in parallel each compiler instance is still gonna have to read in that module or PCH that was imported for a single source file. There's caching tools to handle that sort of shit but Google apparently doesn't want to hire competent devops.

that's an implementation detail

a compiler takes code and produces some executable shit; the user should not assume anything about how exactly it's done

therefore it's a bad excuse; a good compiler would just do these things together, allowing for some optimizations, even if some elder men said the preprocessor should be separate.

proofs?

it has static type annotations and type checker, are you living under a rock?

I'm pretty sure it's required by the standard. It's not an implementation detail, it's a requirement of the language. I do agree that that's shit.

I don't live under a rock, I just know what I am talking about.
"Static typing" or even classes in Python aren't worth a shit. There is no structure or enforceable typing discipline.
Nothing prevents anybody from instantiating Cat, replacing its legs with tentacles, or creating all sorts of "chimera" objects, and still calling it a Cat nevertheless.
Take Lisp, a proper, structured dynamically typed language. The objects actually have the properties described by their class, because a cat is just a fucking Cat, period.
And it's also one of the most expressive languages there is, therefore disciplined typing and expressivity are not incompatible concepts.

So what? Sometimes it's useful when you have to deal with some third party code which is impractical to change (because you don't have sources or whatever) and you need to change it because.

accidentally pressed

That is what inheritance is for.

10/10 user

inheritance is stupid

Try calling $1000000 in inheritance stupid.

...

It was originally about Lispers. Rustfags and Haskellfags copied those tactics from the Lisp community.

Memory boundaries to prevent buffer overflows.

Sorry, I realize I wasn't too clear there.

Basically though, I would've thought that a library could be created that would check that memory isn't overflowing.

For example, the size of the memory allocated could be stored in an int before the data itself. Every operation on that piece of memory could then be checked to make sure it stays within the bounds of the allocation.

You've already got that in C++'s standard library type vector. You just have to use the at() member function instead of the [] operator. C++ provides both because of its principle of "you don't pay for what you don't use". std::array also provides the same thing.

That sounds similar to a PL/I area.

Lots of languages implement bounds checking. Ada, Java, Lisp, Pascal, fucking ALGOL 60... On some architectures (but not amd64) it can even be enforced at the function/module level via memory segmentation.

Areas are most similar to Ada storage pools out of anything in the languages you listed. They're not strongly typed arrays like you have in Algol and Java. They're blocks of bytes which can be used for allocating other things inside of them (including other areas).

So is that like obstacks in GNU libc?

Are you braindead?
Look at the shitload of 0days discovered every day in mainstream software. Think for a minute or two.

Don't worry so much. The hackers won't get past our firewall, so it's fine to use C everywhere, and even better in things like OpenSSL. ;^)
ranum.com/security/computer_security/editorials/dumb/

waving_dick_so_hard_it_breaks.jpg
Oh our mighty lord please tell us. Protip: you can't, because you're just dicksucking.
POZilla please leave.
I'm asking the same question: you aren't doing anything useful in society, so what difference does it make if your fancy fizzbuzz in Rust is secure or not?

Dlang or FreePascal. The two sane and "popular-enough" languages.

What do you mean exactly? Do you want typeless byte stream with fixed length?

D is hard to swallow at first, but once you've loosened up and get into the rhythm it's a fun ride.
Don't try forcing it or you're in for a world of hurt. That applies in general but I guess tetanus impairs people's judgement

Very subtle, no one will ever notice...

Sorry you messed up

What the fuck is wrong with people here?!? Not one of you tested, replicated, and compared code to check the validity of the article, not one!
I assumed at least one of you would compile and test the C++ and Go source code, and reply back with the sizes and the difference, but instead you've masturbated to your ideal language of choice!

Fucking hell not one line of source code shared. Holla Forums why are you so shit!

FOR CHRIST'S SAKE, NOT EVEN OP CITED THE ARTICLE IN QUESTION:
talks.golang.org/2012/splash.article
IT'S GOTARDS WANKING!