Why C is shit

Why are people advocating this broken piece of outdated shit again?

Other urls found in this thread:

gofy.cat-v.org/
github.com/redox-os/redox/issues/303
port70.net/~nsz/c/c89/c89-draft.txt
developer.gnome.org/glib/stable/glib-Memory-Allocation.html
muen.sk/
templeos.org/Wb/Doc/HolyC.html
quellish.tumblr.com/post/126712999812/how-on-earth-the-facebook-ios-application-is-so
cplusplus.com/reference/cstdlib/qsort/
englisch-hilfen.de/en/exercises/tenses/do.htm
wiki.theory.org/YourLanguageSucks
en.cppreference.com/w/c/language/generic
artlung.com/smorgasborg/Invention_of_Cplusplus.shtml

Kill yourself.

Use C++ then, Pajeet.

I don't understand your issue with your first two complaints. They are both pajeet-tier coding. Logically and Mathematically, it makes sense they would be undefined. Allowing them to be undefined and not enforcing some arbitrary order in the spec ensures more freedom for the compiler to produce better optimizations.

Reading through the rest, yeah, you just seem kind of stupid. go back to java or python.

All of the things you complain about being undefined are things you shouldn't be doing unless you really know what you are doing. C won't waste time checking shit for you at runtime, get good or get out.

C doesn't hide the truth from you. On the hardware everything is ultimately a number and how that number is interpreted depends on the context. C lets you change the context at will because the context is purely imaginary in the programmer's mind, so it stands to reason that the programmer should be allowed to change the context at will. Of course if you do something stupid in the process it's your fault.

not an argument

Go back to JavaScript if you want a language to hold your hand through every statement, line and problem.

C isn't for Pajeets, nu-males, or "diversity hires."

Stop it OP, you're triggering people!

In addition to PEMDAS, stuff is evaluated from left to right in most cases. Formal logic uses similar systems. (a, ++a, ++a) would therefore logically be (0, 1, 2). Second one is a bit more tricky due to the =, but should still be left to right.

If I criticise your favourite language, would you die?

C is for low level development and performance and the language assumes you'll work with it to achieve that performance. If you can't understand why those things have intentionally been left as undefined behavior, it's not the right language for you.

It would be extremely triggersome

Because it makes sense. Basic hardware level programming cannot be completed in Lisp, Rust, or Go. Why do you think there is an international standard when it comes to C? Talk back to me when you have a decent kernel written in a language other than C. C will always be relevant because of GNU, BSD, and Linux. Stop being a code LARPer.

This too.

C is for kids who are too retarded for Assembly but want to feel hipster

any = is evaluated right to left, which is why
var1 = var2 = var3 = 0; works
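
Since = associates right to left, that line groups as:

int var1, var2, var3;
var1 = (var2 = (var3 = 0)); /* each = yields the value it just stored, so all three end up 0 */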

Here's your argument, Pajeet.

void kill_yourself(Person *op)
{
    delete op;
}

It strikes a good balance between bare-metal access and human readability, and can accommodate inline assembly where it's truly unavoidable.

C was created in a time where programmers were expected to be intelligent.

Anyone who does this deserves to be punched in the face numerous times

likewise

tl;dr you're dumb

the fuck is PEMDAS

something most of us learn by 3rd grade

holy fucking shit Holla Forums has actually been taken over by 6 year olds.

the fuck is google

oh, BEDMAS in merica.
how are you supposed to pronounce it?

Shit, I'm a Murrilard and I knew PEMDAS.

Let me guess, you're here to sell us your political ideology thinly disguised as a cloudweb2.0meme language

Found the LARPer

You are a fucking idiot who doesn't understand either C or the actual reasons why it sucks. Neither do you understand that words have meanings, apparently. "Technically undefined" is semantic nonsense: something is either well-defined (by the standard or the implementation), undefined (explicitly or by omission) or ill-defined (no such thing in C, thank dog). If something "works" under an assumption, such as certain implementation quirks, and said assumption doesn't actually universally hold, that something doesn't work. Addition is not "working" multiplication just because 2*2 = 2+2. In (6), you don't get "working memory", you get "something". Maybe. Who knows, it's not defined by the standard.

Most of your complaints about undefined behaviour are pointless since the operations in question don't admit a sane definition independent of the implementation and therefore cannot possibly "work" on some platforms but not on others. (5) is especially retarded because your alleged workaround doesn't do anything. You memcpy'd an int to the memory pointed at by a float*, but you still can't dereference it without invoking UB!

(1) and (2) are shitcode that I won't even comment. (8) makes up for (7) — the real question is why you want to mix data and function pointers because this always ends in tears. (9) is wrong. (10) is unrelated to the language. (11) is due to unwarranted assumptions on your side and contradicts itself. (12) void pointers are for implicit types, which should have been obvious. You also don't know what "by definition" means. The casts in (13) are all explicit, so you'd have to fuck things up on purpose, unlike in C++.

Sage because this is a shit thread by a moron. Actually try to think before you complain about something, this display is fucking embarrassing.


C is mostly written by people who pretend to care about performance and don't even know the language. I wonder if anyone still remembers g_free.

Sorry, I meant that what I call BEDMAS is what it's called in merica. I am not merican.

you're a big coder

People wrote operating systems in Lisp 40 years ago and Redox OS is written in Rust.

I've never heard about this, and i really want to know what 4/g/ did to free().

I really want to know what 4/g/ did to free()

Fucking hell, I thought we were done with doubleposting.

It's a retarded rule for niggers who can't keep the order of operations in their brain. It's also wrong, which is why you get those same morons thinking that 30 / 2 * 3 is 5 instead of 45.

Order of operations mnemonics are ambiguous and should be considered harmful.

g_free is a function from glib that for the longest time was defined as "like free, except it ignores NULL pointers".
free already ignores NULL pointers.
These are the people writing C.
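
For the record, such a wrapper boils down to roughly this (illustrative sketch, not glib's actual source; my_free is a made-up name):

#include <stdlib.h>

void my_free(void *mem)
{
    /* the check is redundant: free(NULL) is already defined to do nothing
       (C89 4.10.3.2, C99 7.20.3.2) */
    if (mem != NULL)
        free(mem);
}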

This is stupid. The moment you mention how C may not be perfect and could have some errors, you get thousands of C fanboys coming in to defend their favourite Fizzbuzz language, claiming that those errors are exactly what make the language good because (((le pajeeeeeeeeets))) can't understand them.

Not defining order of operations is optimization? Have we gone full retard?

Not only that, but one would expect the "portable assembly" language not to rely too much on implementation, which likely relies on hardware platform for muh performance. Stuff like integer overflow being undefined was probably done to avoid putting additional checks in hardware that didn't comply with whatever rule they came up with, but in turn, they killed their language's portability. Good job!
I know everyone in here knows the C spec by memory (wink), but some courtesy compiler checks for most of these caveats for the Pajeets who deal with C (you really thought there were no poo-in-the-loo C programmers?) wouldn't have been a bad idea. It seems they simply didn't give a fuck about it, which is weird considering we have ANSI and ISO standards for C. Guess the experts were too lazy to actually standarize the language.


The average "pointers make C a language for geniuses such as myself" Holla Forums poster, everyone.

I almost forgot about the Go operating system from the kind people at cat-v.
gofy.cat-v.org/

wtf I hate C now

He said "Decent kernel" not "Operating system", a fucking lisp machine doesn't hold a candle on modern kernels, and the hardware was specialized as fuck.
There is no fucking way in hell you are going to make a decent kernel in Java, no matter how much bytecode an ARM CPU can run.
The only promising? modern kernel I know of is Redox but only because Rust can basically do almost everything C can using unsafe clauses, but then problems start appearing: github.com/redox-os/redox/issues/303
When you develop a kernel, you appreciate the simplicity of C. At some point you start debugging and see there is basically no name mangling whatsoever, symbols you defined are exactly where you expect them, code basically maps 1:1 to asm in -O0, there is no behind the scenes mumbo jumbo that would happen even in a related language like C++, you can pretty much link the object files by hand if you really want, rename symbols at link time, and other stuff you may need when making a kernel.
This is without mentioning the obvious like not needing dynamic memory at all, or the standard library itself.
C is far from perfect, but it is one of the best choices when you are close to metal, because it reduces the clash between you, the language, and the hardware. There are no surprises from the language (If you are cautious about UB), it pretty much steps aside, you may only need to fight the hardware itself.

Yes and it is a horrible choice for just about everything else.
Unfortunately for you systems programming jobs are far less than web related ones, whether you like it or not.

Not him, but one could argue that while there are less systems programming jobs available vs web development, systems programmers will typically be paid a great deal more than web devs

It's just got different names in different places, over here in Britbongistan it's BIDMAS

Which is totally relevant when you start looking for a job fresh out of college.

Technically a mnemonic, and often taught to schoolchildren who are in the stages of learning basic arithmetic, but I'm sure you feel like a big man for dismissing mathematical pedagogy.
No, you're a retarded "nigger" who doesn't understand it: parentheses, exponentiation, multiplication/division, addition/subtraction.

Honestly, this is why most people learn Javascript and C, because they both at least resemble each other and one can be a web dev before moving up to serious high-profile shit

How do you "move up" to being a systems dev when you work at a webdev hipster joint? Take into account that most webdevs don't have a formal programming education and are therefore not very well versed in algorithms and data structures which are required for higher level programming.

you git gud

It's really not that hard to understand

back to reddit with you

I don't know pal we seem to be in deep waters over here... at least enjoyed the ride, right?

Please don't remind me of my age, thank you.

does age cure autism? does tfw no gf get better?

Was it a good guess?

I wouldn't know as I don't have autism. I'm also happily married.


A big high.

bit*, excuses.

why are you even here? why do you have standards for women so low?

Why do you care so much about him faggot? This is not /r9k/ for fucks sake.

The vast majority of people don't look like a supermodel so chances are high you will settle down with someone average looking.

that's because they had machines designed for working with lisp. once RISC/CISC machines emerged they dropped lisp machines like an ugly baby

same with java bytecode CPUs, nobody supports this thing anymore since java code gets recompiled into native cpu code anyway

I'm going to regret saying this, but what language do you use?

We never learned PEMDAS or any other weird acronym in school. We were just taught the following hierarchy of rules:

1) Go from left to right
2) Multiplication and division go before addition and subtraction, overrides the first rule
3) Parentheses override both previous rules

PEMDAS makes it look like multiplication has higher precedence than division and addition higher than subtraction, which is why you get people thinking that 30 / 2 * 3 = 5. By trying to make it simpler with a cute acronym you have actually made it more complicated, and that is why PEMDAS is wrong.

i'm not even talking about supermodel shit, all women nowadays have something wrong in their head be it just being insane or completely empty in there

C

...

I meant instead of C because OP obviously dislikes it.

I never said C is a good language for kernels or stuff that requires low level code (in fact, it's one of the three possible applications I can see for C). However, one of the main reasons C is good for kernels is because of assembly compiler extensions and compiler intrinsics. For any kernel, at some point you will have to touch assembly (or let your compiler on steroids handle it for you) to directly interface with hardware, but C isn't the only language that has inline assembly. Ada, for example, also has inline assembly, but other languages may support inline assembly through LLVM extensions, and those that don't (although I admit they probably would be less than optimal) also have FFI.

The other thing that makes C useful for kernels is that it's unsafe as fuck. Safer languages may require more thought in the design phase to better structure the kernel architecture, whereas C can be easily patched over time until it is transformed into something else.

Overall, C is a quite good language for kernels, but it's not the only one, and possibly not the best, depending on what we are aiming for. C is popular for kernels because it's an old and compatible language with tons of support, and because it's relatively one of the easiest languages for these applications, not because it's special or unique.

That said, I would rather see people stop using unsafe low-level languages in places that don't require unsafe low-level languages. We could have stopped years of vulnerabilities if we didn't let people (sometimes with decades of experience in living on the edge with unsafe languages) program critical stuff in C or C++ at 5:30 AM.

Which language do you advocate for kernels?

Why are you assuming it was ever any different?

Depends on what you are trying to accomplish. Do you need compatibility? How critical are performance and efficiency (especially memory efficiency)? How much time are you willing to spend adapting not-so-capable compilers, standard libraries or runtimes? Is security a priority?

Technically, you could write a kernel in any AOT language implementation, and you could write a kernel in any language if you spent some time adapting VMs or interpreters to run on bare metal. Some languages require more effort than others.

A small list of not so crazy languages for OSes:
>Lisp :^)

I would personally pick C or C++, or Ada. Rust could be interesting, but it's still too young.

You don't have to like a tool to use it, even exclusively. I have a list of complaints for damn near everything I use daily. I defer to Sturgeon's revelation.


most jej

1. Define some arbitrary behaviour as The Standard™ (sign-magnitude implementations don't exist after all)
2. Complain that a language that allows sign-magnitude implementations doesn't mandate behaviour that would be nonsensical on such implementations
What the fuck. If you didn't implicitly redefine portability to some nonsensical concept, you'd notice that C is actually more portable this way.

Have fun trying to write a multiqueue network driver with languages that don't have any way to deal with volatile memory and barriers.

Oh. I thought it was a /g/ meme.

free() (as far as I can remember) does not have defined behaviour when passing a null pointer in C, so I can see why they did that.

port70.net/~nsz/c/c89/c89-draft.txt
4.10.3.2

The description makes g_free sound retarded, but this explanation makes sense:
developer.gnome.org/glib/stable/glib-Memory-Allocation.html

This part does make sense and is actually good design. My point is that the people who originally wrote it didn't even know how the memory allocation functions work, yet they are writing a major library. This sort of pseudo-knowledge is exactly the reason why so much C code sucks.

free(NULL) does nothing since ANSI C.

...

It seems to me that those that speak so highly of C seem to have little reason to use it, and when they do it's just for fizzbuzzing.

It makes sense when I see people praising its speed in domains that require it, like for processing-critical and embedded environments, but it irks me to see stuff like the BCHS stack, and posters here who advocate for its use in literally everything.

Alright then, learn something new every day, I guess :P

You're mistaken, the problem is that the order of parameter stacking is not standardized, not the "order of operations" (even if it's the same, in the end).

You should not confuse people saying that C is the best for everything with those saying that it's the "less worse".

i can't take this anymore

It could be that users were checking for NULL a lot in the wild and they added it as a reminder.

And the people who didn't know free() did this would still not know that g_free() does it.


why would your wrapper do anything except call the implementation? It's a terrible design decision.

It is not, go see my post again. You don't even need that at all, you can have a bunch of nasm compiled objects everywhere and link them where needed.
The main reason is because it is simpler, if there is a problem with your kernel you don't need to fight the language, C will not do anything behind the scenes that fucks you over, you can easily debug the assembly generated, and move around the symbol map.
You definitely never made a kernel, you wouldn't be LARPing this much if that were the case.

Have fun with the unreadable name mangling and side effects. It can be done but you will not gain much over C, since it is still very unsafe, and you will lose all the simplicity that makes debugging easier.
I don't know much about this language but I know it is the stricter of those, so it is going to be boilerplate: The kernel. This can be good if you have some special hardware and want the kernel to be secure, but I'm not seeing it for a portable kernel like Linux.
Oh you are the same forth shitposter that shits up every C thread JK. These are REPL by nature, compiling these will end up in either bytecode or some bloated hack that will fuck up your icache, not good for kernels. Lisp is GC'ed by design so it is out of the question, I can see a simple kernel in Forth though, but I don't know how well it will scale.
D is nice and you can also use it without GC as far as I know, but it is too bloated and too complex for the task. It was also made with GC in mind so I'm not really sure how well it behaves without it.
Making a kernel is pretty much the opposite of functional programming, so making a kernel in Haskell is a terrible idea. I know Haskell wizards will do anything they can to prove their language is the alpha & omega, but really, don't make a kernel in Haskell.
Rust is an interesting option, I coded with it, you lose the simplicity but you gain the safety, you can also use unsafe clauses to corner everything that can go very wrong, making debugging easier.
Still in the case of a kernel this may mean using unsafe everywhere like I said before, making pointer structures with Rust is a total pain in the ass. Doing something like a cache friendly high performance scheduler will be tricky as fuck.

Instead of trying to put the square peg in the round hole, we should upgrade C to reduce the UB (like having separate logical and arithmetic right shift operators), maybe modernize its syntax a little, whatever is needed to make it better suited to modern hardware.

N-n-not an argument :^)

Apparently SeL4 has an executable specification in Haskell. I don't know if it can run on actual hardware, but it supposedly can run actual programs.

autism

What kind of response are you expecting?

You don't need to taste shit to speak against its taste blah blah blah.

Nigger, as shit as the NT kernel is, C++ has been proved to be plenty capable. It is a fully functional kernel, which is already way further than what many others have achieved.

That said, I am not exactly sure what C++ features would bring to the table, other than namespaces. Objects are cool, I guess, but you can do fairly well without them. That's it.

Ada is certainly not the most expressive language out there (but it's not like C wins in that aspect), but it has compilers for many platforms.
Yeah, which is why I said that it depends on what you are going for. We wouldn't be the first ones with that idea, tho.
muen.sk/

Nah, I actually dislike Forth. I would still consider it an option for simple kernels, though.
Lisp was more or less of a joke. I bet you can do it, but reducing Lisp to a low level DSL would be pretty much heresy

I know they added an option to disable GC, but did they fix their standard library shitting itself whenever you did it? I lost track of that long ago.

Something like Cyclone on steroids?

What's wrong is your brain tbh.

I did some toy kernels. Terry is on a completely different level though, being able to predict what the linker is going to do is a blessing.
Most kernel developing is drivers drivers drivers, most of the time you are reading specifications, even for non-hardware stuff like filesystems. That's why C is a blessing, if something goes wrong you are not wondering about the language, you can debug the assembly and follow the spec easily.
I'm not following D lately.
I completely forgot about it, something like that would be cool. But really it doesn't need GC or any of that shit, just add whatever makes sense from a C perspective. The fat pointers from Cyclone make a lot of sense to me, for example.

That's up to the programmer.
So you'd break the whole tooling ecosystem, for what gains exactly? Making the type come after the variable name? Will you do it again when the next stupid fad comes along?
It's already completely suited to modern hardware.

C comes from an era where you couldn't assume architectures had some specific feature.
I gave the simple example of right shifting. Right shifting a negative number is implementation-defined (and left shifting one is outright UB), why? Because if you mandated one behavior and the architecture only had logical shifts, it would have to generate bloated code, maybe unnecessarily, since you only needed a logical shift.
If you want to rotate bits, you have to either use an intrinsic, or pray the compiler understands that your complex expression means rotate (GCC managed to do it, clang not).
Popcount, lead zero bits, all also non-standard intrinsics.
Bitfields don't have an order defined within units, a necessity when making kernels, so you have to pray the compiler doesn't fuck you over.
So no, it's not simply "Up to the programmer".
The asm directive is ancient with a horrible syntax, and I always have to look it up just to make sure I didn't fuck up somewhere.
Vectorized operations are completely alien to C, the same for multithreading. The volatile keyword was never a viable option for multithreading; until C11 came out with atomic types, you had to use asm "memory" to ward shit properly.
Other languages have a lot of things a C cousin could use:
A swap operator would reduce a lot of boilerplate, good for compare and swap situations.
The else clause on for loops in Python, so you don't have to use a goto for no reason (I think these were proposed for C++). And a better way to break out of nested loops.
Hell, just shit from Holy C alone is enough to make me wonder about the current status of C. templeos.org/Wb/Doc/HolyC.html
The good thing about C is that most tools work on the object level, they don't give a shit about the code. So no, you aren't going to "Break the whole tooling ecosystem", you will be able to link and call other libraries just fine.
I love C, but it is not perfect, it can be improved.
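
To put the rotate point in code, this is the usual UB-free idiom (a sketch; modern GCC and Clang can generally collapse it into a single rotate instruction, but no promises about any particular version):

#include <stdint.h>

/* rotate left by n; the (-n) & 31 form avoids shifting by 32, which would be UB */
static inline uint32_t rotl32(uint32_t x, unsigned n)
{
    return (x << (n & 31)) | (x >> ((-n) & 31));
}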

I forgot to add, the best argument against C being "completely suited to modern hardware" is this one:
Linux is 100% ANSI C (C89) code, yet only compiles on GCC, when there are infinite fully compliant ANSI C compilers. That's because it is completely filled with non-standard things out of spec. So, if C is really suited to modern hardware, then why does Linux fully depend on GCC behavior?
Because it isn't, even Linux needs to use GCC extensions to make it work.

This.
Besides what you named, proper strings (and proper string-management functions), generic data structures and namespaces would be cool.

for you

What are your thoughts on Go?

The volatile keyword simply says "Don't make assumptions about the value of this variable", if you want to ward shit properly, use a different method because it is completely the wrong tool for the job.

The thread appeared to have 404ed, but that was a bug. It's this same old problem again.

It seems the last post didn't work. I'm trying something different now. If it works I'll make a guide for that too.

The thread will be archived and a new one will be created to bypass the problems, the new thread will link to the archive so that it can be continued.

Yes I know, a lot of people don't know this, they completely miss the point about volatile, you need something like the asm(:::"memory") trick, and even that doesn't work in all cases.
Today C11 offers you atomic fetches and atomic types, so you don't need any of that anymore.
The thing is, that trick was GCC specific, so C literally never offered a solution to that problem up until C11.
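
With C11 the same thing looks roughly like this (sketch, assuming a toolchain that ships <stdatomic.h>):

#include <stdatomic.h>

atomic_int counter = ATOMIC_VAR_INIT(0);

void bump(void)
{
    /* sequentially consistent read-modify-write, no asm(:::"memory") tricks needed */
    atomic_fetch_add(&counter, 1);
}

int snapshot(void)
{
    /* acquire load, pairs with release stores elsewhere */
    return atomic_load_explicit(&counter, memory_order_acquire);
}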

Let me sum up OP


There are safeguards in place to prevent the most reckless type casting and memory handling, but if you tell the compiler you still want to do it, it assumes you know what you're doing. If you want to be safer but still have the option in parts of your program, use C++.

The rest of your points are stupid except for the first one. If you don't want to deal with memory directly use another language, it's the price to pay for fast code because that's how hardware is built. Memory doesn't have types, just random sets of blocks set by the compiler that you can use how you want within your own program.

i wish larpers would leave

...

It's simple, it doesn't even have generics, and it's good to have a high level language like that, even if it has GC. It goes without saying that if you really need generics or memory management, don't use it for your project.
Other than that I never coded anything with it, so I can only assume it is a good choice if you have to do some concurrency-heavy application. Since Google is behind it you at least know it is not going to die tomorrow, so it may be worth learning.
Saying any more than this would be larping though, since I really don't know much more.

C is shit because you are too lazy to learn pointers and algorithms faggot, fuck off

...

Can you show us your low level C Fizzbuzz algorithm, please?

The one about evaluation order of function arguments?
I'm honestly not really sure why it should be defined.
Imagine you have a function call foo(a+1, b - 5, a + c);
If the eval order is defined, the compiler will produce code that evaluates a + 1, then b - 5, then a + c, touching a, then b, then a again in exactly that order.
If you're out of luck, a and b might conflict in cache, and you'd get two cache misses*.

Now, if the order is undefined, the compiler can just evaluate a + 1 and a + c back to back before touching b - 5,
which, if you're out of luck, would produce only one cache miss*.
Only a retard would do such a thing as foo(a, a++, ++a);

* misses caused by a and b conflicting, of course, there could be an additional miss because of a not being loaded in cache in both cases

out

I'm talking about sanitizers, static analyzers, auto-formatters, text editors with syntax highlighting. Tools that operate on the source code. Changing all the syntax for the sake of "modernization" throws out all that hard work, and doesn't actually gain you a damn thing.

It's going to be yet another brace language, almost no change, like how you can use clangformat in Java.
Wow, new-language.vim, we are doomed.
Things like valgrind should work though, and the changes over C are not big enough to require super big changes in sanitizers/static analyzers. The language can be implemented over LLVM, and then implement support in the clang static analyzer which already supports C/C++/ObjC.
Nobody said changing all the syntax. Besides, it was only a thought, no need to sperg all over it faggot.

What's your recommended toolset for C static analysis and formatting? I've been wondering about what I should be using.

Even then, it's easy to predict when it's going to require a strict order of operations and when it's not, and translate it accordingly.

Do you even understand what cache is and how it works?
That means no defined evaluation order.

Yes, but apparently you don't know what registers are. Protip: it's not the cache you are talking about.

It means "no undefined behaviour". If the spec said "in case of more than one ++ or -- operator over the same variable appearing in a function call, the expression must be evaluated from left to right, otherwise, it's up to implementation", while the latter would be "undefined", it would matter jack shit to the programmer because the result will be the same.

So cases that only matter in pajeet code should shit up the whole standard with their extra treatment? I hope you never have to write a standards document.
The best part about this is that it would still be undefined because of sequence point rules.

Standards documents are autistic shit that define everything to the smallest detail. Only C seems to get away with defining very little because the community loves it keeps (((le pajeeeeeeeeets xD))) away from their sacred language.

Muh memory barriers, if you are that worried about it.

Pure autism.

Why bother arguing when you can just define yourself to be always right, huh?

Just use clang tooling.

C is only for elite pro hackers like myself who have completed over 3 challenges from the daily programming thread. I've never been paid for my work but that's because programming is about the love of software and not feeding myself(not that I have to worry about that anyways, mom takes care of that). If you don't appreciate crawling through asm of a million line project(I've only ever done 30 lines though) then you just aren't hardcore enough to be a real C hacker like myself. In fact, you're probably one of those le pajeets that uses Java. Oh, speaking of java, did you know C structs are actually better object oriented programming than every OO language in existence? Now you know. Why? Because C structs are objects without the bloat that makes java run 10000x slower than C on my bit shift fizzbuzz.

I hope you unenlightened """""programmers""""" will one day see the beauty of undetectable OOB access, impossible networking and undefined behavior powering code worth billions of dollars.

no please, just no.

it might not be perfect, but it's the best we've got :^)

Explain.

He's just a retard

C-Fanboys 2 - Electric Coogalo

I legitimately want to hear his reasoning on how networking in C is impossible.

...

Enjoy your latency.

Really? Is that the greatest fecal buddy you could shoot out of your mouthanus? Try harder next time.

not an argument

not an argument either

C networking isn't anywhere near impossible, I partially wrote an IRC server about a year ago (oh Score_Under, come back to me!), in C++, using the basic POSIX sockets, and the only things that really caught me off guard were SIGPIPE, select(), and reads returning too little data.

I then found out, when trying to rewrite it in Rust(since I liked the idea of C++ being redone) that rust apparently doesn't have non-blocking sockets. That was my breaking point with Rust, since the whole safety thing was pissing me off.

I find it funny that people advocate a language that's missing non-blocking sockets. How?
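
For reference, in plain POSIX C it's a couple of fcntl() calls (sketch, error handling trimmed):

#include <fcntl.h>

/* switch an already-open socket to non-blocking mode;
   after this, recv() returns -1 with errno set to EAGAIN/EWOULDBLOCK instead of blocking */
int make_nonblocking(int fd)
{
    int flags = fcntl(fd, F_GETFL, 0);
    if (flags == -1)
        return -1;
    return fcntl(fd, F_SETFL, flags | O_NONBLOCK);
}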

Why not poll()?
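
It drops select()'s FD_SETSIZE limit and you don't have to rebuild the fd sets after every call. Minimal sketch (sock is assumed to be your connected socket):

#include <poll.h>

/* returns >0 if sock became readable within timeout_ms, 0 on timeout, -1 on error */
int wait_readable(int sock, int timeout_ms)
{
    struct pollfd pfd = { .fd = sock, .events = POLLIN };
    return poll(&pfd, 1, timeout_ms);
}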

I didn't know about it, and Score_Under used select() in the IRC server.

Wait, nevermind. I'm going crazy I guess. It still handles user iteratively?

The fuck?

That probably stems back from when Rust was still trying to implement green threads. Ideally with green threads you wouldn't need to know that they are implemented with nonblocking sockets behind your back. Still, this isn't really a great excuse.

What do you think it gets written in? I write networking code in C
(and sometimes C++) for a company you might have heard of, and traffic that isn't purely routed (which is an increasingly large amount of traffic) has to be handled very efficiently on the CPU. On a 10Gbit network with 9000 byte packets that means processing nearly 140,000 packets per second and ensuring no more than 7 microseconds is taken per packet.
Think about all the language features you take for granted and how they'd fit in a 7us world. Go claims to have 'super fast' garbage collection at 10ms. That's over 1400 times slower than the maximum delay you can tolerate. You'd have dropped a thousand packets and caused network stacks to panic and go into recovery. What about other shit meme languages? Rust doesn't even have a way to specify thread affinity so packets would end up on threads they were not queued on requiring the CPU copy them between the cache or RAM. There's just not enough bandwidth for this and you'd be throttled. And all the buffer checking they do would be too heavy, anyway.
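(Checking the arithmetic: 10 Gbit/s is 1.25e9 bytes/s; divided by 9000-byte packets that's about 139,000 packets/s, which leaves roughly 1 / 139,000 s, i.e. about 7.2 microseconds per packet.)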

LOL

what

The design is entirely mine. A lot of the code is, too. Teams are much smaller than you'd expect at this level.

Why is it that all Holla Forums anons are professional low level programmers?

I'm a right-wing low-level programmer. Where else can I go?

Don't mind him. Programming closeish to the metal is no work for a code monkey.

Is this a sleek new variation of the seal pasta?

No, that's basic math showing you why we use C you hopeless webdev child.

Sure you do bud. In your company that "we might have heard of" right?
You might have heard of my company. It is called facebook. We use C++ because it is superior to C.

Why we use C is an entirely different discussion than is C good? Network programming is a pain in C regardless of whatever you LARP as.

Facebook isn't a performance-oriented company. It's easy to just throw a ton of servers at their problem. And if you actually worked there you'd be familiar with their ruined tower of Babel and not say "we use C++" like it means anything in a company that is crippled by use of over a dozen languages.
We usually don't use C++ as it complicates sharing/moving code between userspace and kernels. There aren't many benefits to C++ with this kind of software and a lot of drawbacks.

Okay, now I KNOW you are just larping.
Thanks for the dead giveaway.

...

WHAT THE FUCK ARE YOU DOING

Network programming in C is about the same as in C++. There's nothing it provides to ease the pain.

sockaddr_storage error_socket;
char error_packet[IP_MAXPACKET];
struct iovec msg_iovec;
(void) memset(&msg_iovec, 0, sizeof(msg_iovec));
msg_iovec.iov_base = error_packet;
msg_iovec.iov_len = sizeof(error_packet);
struct msghdr msghdr;
(void) memset(&msghdr, 0, sizeof(msghdr));
msghdr.msg_name = &error_socket;
msghdr.msg_namelen = sizeof(error_socket);
msghdr.msg_iov = &msg_iovec;
msghdr.msg_iovlen = 1;
char control_buffer[64 * 1024];
msghdr.msg_control = control_buffer;
msghdr.msg_controllen = sizeof(control_buffer);
ssize_t error_bytes_read;
if ((error_bytes_read = recvmsg(sock, &msghdr, MSG_ERRQUEUE)) < 0) {

...

>quellish.tumblr.com/post/126712999812/how-on-earth-the-facebook-ios-application-is-so
I really hope for Facebook it's not "performance oriented". Pretty sure it's just like MS: streetshitting on an industrial scale.

Facebook is like if you took the Google philosophy of throwing hardware into a magical hopper that feeds a cluster but instead of hiring old experienced engineers you hired young programmers who are likely to work just as quickly but not as well with that cluster. If HHVM doesn't prove they're not performance oriented I don't know what to say; they ran into such a performance problem that they had to write new tools just to deal with how inefficient their code is. I'd wager most people there now are writing glue to wire all their projects together since they are for sure not just a C++ company.

The dirty little secret is almost every networking company that handles 10/100Gbit links is shipping proprietary modifications to Linux. As much as possible is kept in userspace in case they ever are forced to release their changes. It's just not possible to get the necessary performance otherwise.

Jesus, do you kids know nothing about how the world works?

This smells like bullshit. Unless they are selling/giving away their software or hardware containing their software, they shouldn't worry about it.

I would expect a big company to know how the fucking GPL works.

The performance-oriented teams at MS are very small. Some years ago the compiler team was 6 people. Their Pajeet army is used instead for writing those shitty and broken apps like the new Alarms & Clock app.

You'd be in for a surprise. Want a real shocker? When networking companies merge, rather than go through a lengthy integration process as in the '90s, they slap both firmwares in a box running vmware, add some minimal glue between the two, and ship it. That's the duct-taped pile of shit you likely have in a rack at your company. Docker is increasingly used for this, and I've recently been asked to convert our entire product into a Docker image.

Good companies are very restrictive about the number of languages used. If you've worked at Google you'd have experienced this even though Google is one of the sloppier companies.
Facebook on the other hand just let people write in whatever during their formative period. They have everything from PHP to erlang over there. This created their biblical disaster where no one programmer has any hope of being able to cut a path across all projects. Making a cross-cutting change to their site requires a lot of team meetings and interfaces, a worst-case of siloing. It paralyzed them and is why they weren't able to do anything as twitter stole attention.
The engineering side of their company is the sloppiest mess you'll ever run into in software development. Example: almost everything they've written is stored in a single massive git repo.

Yeah, okay.

They dropped their erlang project already.

For example, I fire anyone I catch speaking mexican.

You are a true american.

Git gud babby faggots.

Here is a pity (you)
try to work on your baiting techniques

Not him, but I do think those in Holla Forums that hate C are doing a fox & grapes thing. Being stuck in webdev means you're in the SJW + Pajeet software shithole and you never get to work on the fun stuff.

...

C is fun, I enjoyed my data structures & algorithms classes at college, but it is not the best language for everything.
It's also not as hard as people claim, but OP is just an obvious troll either way.

That's not what I said nor implied.

I use C, C++, sh, python, lua, and javascript. There is very little reason to use anything else unless you have to make an android app, and each language you add to that pile increases the difficulty of finding people who can work on all of them.

I would never say C is the best language for everything, but the problems with a poor use case for C mostly amount to "this is hard" whereas other languages misused end up being much more of a pain in the real world.

It still bugs me that Ruby is now seen as some kind of language suited to network performance (when it was practically made for scripts) or that people are writing enterprise-like software in languages with dynamic typing like Python. Good luck with your libraries in the future.

>look at me i use all these complicated syntactically identical languages
You sure are special and of course you are right, finding people who can work with ALL of these exotic languages is probably nigh impossible.

The only people complaining about C being "hard" are the self-taught webdev cancer hipsters.
Everyone else knows how to deal with it. Wouldn't you agree that people who know C are allowed to point out its weaknesses?

This really sums up why noobs use C. They were taught JavaScript/Lisp/PHP/Python/Ruby/Perl in school so they think data structures without type tags and overhead is a C thing.

I have no real attachment to any one language, although the things I dislike about C are usually more to do with GCC than with C proper, like disabling strict aliasing.

a company should only use my ass

kill yourself, you spicy-fizzbuzzer anime poster

Why? Unless you mean for work in which case
kys

But that's wrong, you fucking retard.

u srs

the fuck are you even doing with your lives, lads

Acceptable tier. High tier if Lisp machine assembly.

JavaScript is functional, fag.

lel m8, next you are going to tell me that Eich thought of Scheme when he shat out JS as if that counts for anything

We use lua mostly for custom wireshark dissectors for our protocols and as an embedding language in one tool. sh gets used everywhere (unfortunately). Python is used to replace the shellscript plague and for UI work.
I'm not a "full stack" dev, I'm a full product dev.

You only have to disable strict aliasing if your code is shit.

All the things you mention require the language to program things for you in your stead.

That means that in certain cases (many, many cases) the best option will be out of your hands, because going out of your way to evade the "default" solution will be slower.

If you have any kind of work that TRULY requires performance, then you'll have to live with the counterintuitive stuff.

If you don't? c and c++ are still pretty good. If you have so many problems with them however, use other stuff.

Just use one of the assorted meme languages done for baby paheets like you.

:^)

I think you mean C.

...

Binary signals all depend on how they are interpreted. A uint8_t is still 8 bits long, as is an int8_t. Since they are the same length, I can add the two and have a byte of data. I can choose to interpret it as a two's complement number (signed), or as unsigned, for whatever application I need. C deliberately does not define this behavior, it's defined by the architecture, which, with the few old oddball exceptions that use one's complement, goes with two's complement notation for signed values.

If I shift an int8_t left, it shifts the same as a uint8_t. The gotchas are the same as in raw machine language: shift left acts like multiplication and shift right like division. For signed values, an arithmetic right shift copies the msb (the sign bit) into the vacated high bits, while a left shift fills the vacated low bits with zeros. Right shift of a non-negative value is defined in ISO C99 as division by a power of 2 (6.5.7); for negative values the result is implementation-defined.
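
A quick demo of the difference (sketch; the signed result shown is what the usual two's complement, arithmetic-shift implementations give):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint8_t u = 0xF0;   /* 240 */
    int8_t  s = -16;    /* same 0xF0 bit pattern on two's complement machines */
    printf("%d\n", u >> 2);   /* 60: defined, same as 240 / 4 */
    printf("%d\n", s >> 2);   /* usually -4: implementation-defined for negative values */
    return 0;
}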

Multi-threading is handled by the kernel scheduling features of an operating system, and is therefore not portable. Multi-threading features are therefore handled by libraries, for example pthreads for POSIX. As for the timing of your reads and writes, threading solutions place the power and responsibility of properly scheduling your resources on you, the hacker.

Not C language applicable. If a compiler doesn't have optimization toggle features, it is trash. If you aren't properly making a debug binary to debug, you are an idiot.

For a fink who cares about undefined behavior, this is in C99:

Pointers contain an address, that's it. Pointers don't have a 'type', only the data the pointer points to does. Yes, even that pointer that points to another pointer. You even said it here:
If you even play with just a little bit of assembly (of any architecture), there is no escaping pointers, as they are a critical part to ensure the machine keeps branching. (6.3.2.3 #2)

Just like above. This is deliberate behavior. A good example of a proper use case is the qsort function of stdlib.h: (cplusplus.com/reference/cstdlib/qsort/) which permits "generic and typeless" programming. (6.4.2.2 #1, 6.4.2.3 #1)
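
A minimal usage sketch: the comparator receives const void* and casts back to the real element type, which is exactly the "typeless" part.

#include <stdlib.h>

static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);   /* avoids the overflow a plain x - y could hit */
}

void sort_ints(int *values, size_t n)
{
    qsort(values, n, sizeof values[0], cmp_int);
}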

The words of ISO99 6.5 #2 were chosen carefully to make this behavior undefined, and let compilers decide how to handle it. As you said, is it incrementing then executing? Or executing then incrementing? Even if there were defined behavior, this would be bad practice.

It would be better to use
int a = 0; f(a, a + 1, a + 2); a += 2; /* if needed */
rather than let others guess the intent while you autistically try to reduce line count.


Holla Forums, you are consumed by /g/ cancer.

Not every Holla Forumsfag went to a 'murrkin school, you retard. Anyone for whom English is a foreign language probably never heard of this mnemonic.

then google it you stupid mongoloid

CAN YOU FUCK OFF WITH THAT REDDIT FORMAT SHIT

Point at this retard and laugh.


Thanks, doc.

What the fuck am I reading. Try casting a pointer to T to a pointer to T2 and back if T and T2 have incompatible alignment.

not an argument

(you)

Keep silent, ignore the posts, let them figure it out through osmosis. They need to lurk more.

gas yourself

Nice goalpost moving, Chaim.

If it helps, even in England it's not PEMDAS. It's either BIDMAS or BODMAS depending on who's teaching it.

...

Who else would you expect in a thread about a professional low level language?

So shitposting and trolling aside, what's a recommended book to learn C?

I see a lot of people recommend C Primer Plus.

...

Assembly is fundamentally not cross platform. If you only target one specific platform then it's OK to use it, otherwise you should stick to C.

The correct form is "doesn't implement".

= is assignment, not equals. I don't think the rules for logical = signs from math apply here because I don't think math has something like an assignment. And assignments are defined as right to left for a ton of obvious reasons.

thank you for your valuable contribution grammar trump

I will probably get leaped on for saying this, but K&R's book can be quite difficult if you don't come from a programming background.

At the same time, though, once you feel you "get it", it really sticks.

Take this while you're at it
englisch-hilfen.de/en/exercises/tenses/do.htm

ITT: proof that you should never take this board too seriously

this

Was my first intro to programming and it was pretty easy. I think you need to be actually intelligent, though.

FTFY

How is this a design flaw? Isn't it how it's supposed to work?

Javascript programmers who have never touched assembly get confused as to why C is designed the way it is.

Just use Lisp.

problem solved

/thread

OP is a retarded diversity hire who should stick to Python.

Programmers who aren't suffering from stockholm syndrome get confused as to why C is designed the way it is.

loose spec is the basis of C
if you are unsure about something you shouldn't use it
edge cases often wouldn't be worth supporting due to differences in cpu arch
unspecified just means read the compiler docs and don't expect immediate portability
for better or worse, this is "worse is better"

It's a thin layer on top of assembly that leaves as much room as it can for an optimizing compiler to choose the code generated. There's nothing wrong with the language. The only thing I'd change is getting rid of header files.

Well, keeping pajeet away is a big plus tbh

Don't be a pleb, K.N.King's book is for true patricians

You don't need header files, you can just extern everything (a function declaration is extern by default). They are just a practical way to group externs.
The really good thing would be having modules.
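
To illustrate the extern point, a two-file sketch (file and function names made up):

/* add.c */
int add(int a, int b) { return a + b; }

/* main.c */
int add(int a, int b);              /* hand-written declaration instead of a header */
int main(void) { return add(2, 3); }

/* cc -c add.c && cc -c main.c && cc add.o main.o */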

Most compilers won't require you to explicitly declare things in header files if you really don't want to

Maybe I misunderstand you, but having different memory pools, each with different malloc implementations is a thing, certainly in the embedded world where malloc itself frequently does not exist.

You are why I hate imageboards.

The best book on C I ever read was C Primer Plus by Stephen Prata

It is not impossible just way more work intensive than you want it to be in most cases.
Unless you use a bloated lib of course.

reeee

No, you need header files. But in a better design it'd work via embedded metadata in an ELF section similar to stabs/dwarf and extracted by the compiler when linking and then optionally stripped. It'd eliminate maintaining a lot of duplicate declarations and also obsolete a lot of hacky debugger support. That's more like how modern languages work where there's a heavy reliance on introspection but it wasn't a concept at the time C was developed.
I'd also like to see a return of automatic external template compilation, with the necessary language restrictions to make it work. Old oldfags (I'm 38) might remember this from Sun's compiler in the SunOS 4 days. It would prevent the hell that is manually wrangling explicit instantiation.

Header files are a thing because the preprocessor runs before the actual compiler. If you can't grasp this, you don't understand C. There's no possible reality where header files wouldn't be a thing.

Templates were a mistake.

You're a moron, user. I don't even know how to explain this to you without just restating what I wrote. Consider this: could you embed enough information in a library at compile time as to create a tool that generates a header file when provided the library? If this tool was just a step done internally by the compiler would header files be necessary? Why do other languages not require header files?
And to stop several other retarded replies I can anticipate you making: How much of a header file is redundant information? What additional purpose do header files serve for libraries where the source is not available?

No, user, listen to me. Header files are a thing because they are files the preprocessor includes. Do you understand that part? I'll be patient and explain the rest once you make a little progress.

Once you figure out what you're trying to say, explain it to me.

Okay, you seem to be talking about header files as if they're the C equivalent of a module system. But that's not the preprocessor, it's the linker. Each compilation unit is a "module." The linker links them together after everything is compiled.

The preprocessor is a different system from the linker. Putting function declarations in a header file is a convention, not a part of how the linker works. If you don't understand that much, how can you possibly design a "better" C?

tl,dr: A module system would replace the linker model, and you are complaining about the macro system because you can't tell the difference.

I'm beginning to wonder what's wrong with you. Regardless, I can't fix it from here.

Paste anything from the header files inside the compilation units.
Look mom, no headers!
That information can be taken from somewhere else, they don't need to exist. They are a thing because the linking process is made agnostic this way, the linker only needs to know about symbols, addresses and sizes, not types. That's why you can take three objects from three different languages and link them together using ld and it will not give a shit.
If this information was to be stored inside objects the process would become a lot more complex, either the compiler will need to care about other compile units, or the linker will need to learn about types. Both solutions are ugly and the reason why this never changed.

C is only shitty for beginners, OP, simply because it requires them to master many subtle complexities and outright plain idioms before being able to create even moderately functional programs. For medium and high-level developers it's fine (as long as the project is smallish).

C++ is better for beginners, or even Python.

...

It's better than Rust.

PEMDAS DOESN'T EXIST. STOP THIS NOW.

Everything is better than Rust.

Does everything have fearless concurrency? No? I thought so.

Good job fagtron, you named the one feature in Rust that makes you look like a retard and a pajeet.

Kill yourself immediately.

Does everything have guaranteed memory safety? No? I thought so.

...

Does everything have zero-cost abstractions? No? I thought so.

This would be expected from a multi-platform language that runs close to the hardware, as different architectures may implement overflow behavior differently. If you want a slow language then choose a slow language; if you want to research your target system and ensure your compiler will behave in the way you want then go right ahead. If you want to create your own structure that rechecks for overflow situations and forces the overflow to behave as you wish then go ahead.
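
E.g. a checked add is just this kind of thing (sketch):

#include <limits.h>

/* returns 0 and stores a + b in *out, or -1 if the add would overflow;
   the test runs before the add, so no signed overflow (UB) ever happens */
int add_checked(int a, int b, int *out)
{
    if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
        return -1;
    *out = a + b;
    return 0;
}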

Again, you're doing low level shit that is close to the hardware, and different hardware will implement this shift differently in an overflow type condition. What happens when you exceed the torque rating on a bolt? Well, some languages will just not let you torque it down any farther, taking the control out of your hands. C lets you use your tools how you wish and expects you to know what the fuck you're doing.

Kek, even your example is ambiguous: reinterpret_cast or static_cast? Also sizeof(char*) may be different from sizeof(float*), so fucking check please

If you dereference a null pointer then you're doing something wrong.

On some platforms an int* and a char* have different sizes.

False

If your debugmode code runs and your optimized code does not then you need to learn your compiler better or stop using so much undefined shit.

Then use higher level libraries pleb

Yup

Speed limits in general are useless because you can (almost) always get away with speeding.

C/C++ tries to give you a large set of low level really high speed tools, if you are not able to use them responsibly then go play with python java or c# instead, you whiny little bitch.

Really the thing I would add to C++ is the ability to ask a pointer to memory on the heap how large the thing it points to is (as it tracks this info in order to run the destructor already)

Oh let me take a fully featured x86 computer, convert all my programs down to some weak ass RISC instruction set then try to re-optimize it back up to full x86, at run time every fucking time I run. That sounds fucking brilliant.

If you even think of using this, you should stop programming and start drinking bleach.
Do you need everything to be defined, legislated and enforced by the Streetshitter Programmer Police Department? It's not undefined in practice.
If you need it, define your own shift through %sizeof(type).
etc, blah blah blah
Would you prefer having to write your own casts between what are basically just different names for "a number"?

Nice arguments, they boil down to: "don't do that" in a language that presumably allows you to do whatever you want.
GG C CUCKS

The fuck are you talking about? Is this your average C fag?

If you knew enough to write an optimizing C compiler you'd understand.

nothing stops you from licking out a 11 year old girls cunny either so go ahead

But user, that's only natural. The software world operates on the law of the jungle. It's every man for himself. You are the only one responsible for your own mistakes. If you ever fuck up, you will die. Understand?

That's how it's always been. The millennials are trying to change it and make it into a "safe" profession. They just don't understand that the whole point of programming is paying for your mistakes with your life.

- t. someone who doesn't know how to write an optimizing compiler

Good job copying and pasting this off some wiki page aptly named 'YourLanguageSucks'. You're legitimately retarded.

Link: wiki.theory.org/YourLanguageSucks

...

Those "weak ass RISC instruction sets" were made for C compilers.

Compilers for other languages like PL/I actually used these CISC instructions that can't be used by C compilers because they don't exist in C. Some x86 examples are two-operand ENTER for nested procedures and BOUND for (lower and upper inclusive) bounds-checked arrays. These don't exist in C, so C compilers will never emit them.

Compare your classical RISCs (such as MIPS) with the features of the C language.

Why do I have to do two shifts and an "or" to implement rotate?

Why doesn't it have any way to properly handle an integer overflow?

Why do I have to use double-width numbers when all I want is a carry flag?

Why does it have per-process flat address spaces instead of segment-based or object-based memory protection?

Why isn't there any support for nested procedures or closures?

When will implementations stop having to turn loops into single machine instructions because the weak ass abstract machine doesn't have those instructions?
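
To make the carry-flag complaint concrete, this is what you're stuck writing in portable C (sketch; compilers may or may not turn it back into an add-with-carry):

#include <stdint.h>

/* widen to get the carry out of a 32-bit add... */
uint32_t add_carry_wide(uint32_t a, uint32_t b, unsigned *carry)
{
    uint64_t sum = (uint64_t)a + b;
    *carry = (unsigned)(sum >> 32);
    return (uint32_t)sum;
}

/* ...or compare, if no wider type is available */
uint32_t add_carry_cmp(uint32_t a, uint32_t b, unsigned *carry)
{
    uint32_t sum = a + b;   /* unsigned overflow wraps, it is not UB */
    *carry = sum < a;
    return sum;
}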

The point of C is being fully portable with the bare minimum amount of assumptions. Any assumption you make for one arch may deteriorate the implementation of another.
And that shit about ENTER and BOUND not being used by compilers because C doesn't have those features is bullshit. It's like saying C doesn't support vector instructions so compilers will never emit SSE. They don't emit ENTER because it is slower than just pushing manually, and they will never emit BOUND because that would unnecessarily gimp performance; if you have assembly code that works, BOUND is useless.
I can agree about things like not having a rotate, but it isn't that big of a deal.

C does have rotation via compiler intrinsics. GCC and MSVC both support the same set of functions for rotation, although in different headers because MS is special, and I'd imagine Clang, ICC, and AMD's compiler all support them as well.
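
If you'd rather not rely on intrinsics, the usual workaround is the shift-and-or pattern, which GCC and Clang generally recognize and compile down to a single rotate instruction anyway. A minimal sketch for a 32-bit left rotate:

#include <stdint.h>

/* Masking n keeps both shift counts in the 0..31 range, so the expression has
   no undefined behavior even when n is 0 or >= 32. */
static inline uint32_t rotl32(uint32_t x, unsigned n)
{
    n &= 31;
    return (x << n) | (x >> ((32 - n) & 31));
}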

C makes a lot of assumptions about memory layout, integers, and pointers.

The only assumption it makes is that it's running on a computer.

It does, but it's a bare minimum (or at least, it was). Embedded C was created to add support for multiple address spaces and fixed-point arithmetic, for example, which are common in DSPs.

user, I...

Just a reminder, there are languages that people complain about, and then there are languages that no one uses because they are shit. C is one of the former and still has its place in the world.

Deal with it.

Also, most of the OP's complaints can be put down to being a scrub and wanting the language to hold your hand. As Stroustrup says, C may make it easy to shoot yourself in the foot, but when you shoot yourself in the foot with C++ (or higher-level languages) it takes your entire leg off as well.

That last bit makes no sense. Shooting yourself in the foot in C may mean compromising the whole system, akin to taking your head off, whereas shooting yourself in the foot in a saner language often just means shooting yourself in the foot.

Alright you C pr0s, answer these two questions:

Lots.
Compared to what? Bugs exist in any language.

If I wanted a thin abstraction layer close to Assembly, I'd be fucking using Fortran (which is actually pretty fun to mess around in tbh).

To break C++ you have to purposefully go outside of C++'s safeguards; when that happens you venture into the realm of undefined behavior and get fatal errors in pieces of code which can be far removed from the cause.

The compiler will let you go outside of the language's safeguards if you want, but as soon as you do, you can't rely on diagnostics to save you if things go bad. Not to mention that when C++ fails, it fails really badly.

By contrast, C tools and compilers almost expect you to fuck up, and because of how the language is structured, bugs are often near the cause, so it's often easier to diagnose them.

lol

You're fucking retarded.

Others have answered already, but.
Why would you do that?
Why would you do that?

Just do your +1s as separate statements and assign before or after. That's a blatant array-out-of-bounds waiting to happen. Same goes with the function call.

A pointer just points to a memory address. If there's anything there it can use, it uses it. The type cast is basically telling it how to interpret what's there. This actually gives you more freedom.
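
For what it's worth, the one reinterpretation the standard always blesses is looking at an object through an unsigned char *. A small sketch that dumps the bytes of a float (the exact output depends on your float representation and endianness):

#include <stddef.h>
#include <stdio.h>

int main(void)
{
    float f = 1.0f;
    const unsigned char *bytes = (const unsigned char *)&f;

    /* Reading any object as unsigned char is well-defined, unlike arbitrary
       pointer-type punning. */
    for (size_t i = 0; i < sizeof f; i++)
        printf("%02x ", bytes[i]);
    printf("\n");
    return 0;
}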

C is like a butane torch and a canister of gasoline: if you give them to an idiot, he will fuck shit up, but if you give them to a genius... well, I'm not quite sure what a genius would do, but I think I already made my point with the idiot.

Right. The solution is not to use other languages that aren't C. It's to hunt down the idiots and stop them from programming for the rest of their lives.

according to the standard you don't get to modify the same object twice without a sequence point, so stuff like a++ + ++a is undefined and only good for burning noobs on job interviews (and ++a++ doesn't even compile, since a++ isn't an lvalue)

Stroustrup didn't say that about C++ because it's higher level than C. He said that because it has the safety of C but there are even more ways for bugs to cause damage.

Most higher level languages require you to use a special mode or library to shoot yourself in the foot. Higher level languages are also safer because "unsafe" features like pointer arithmetic and pointer to integer casts do what they are meant to do on the actual hardware.

It has to do with the programmer's intent. These safer languages only need these features when you need raw access to memory, so the compilers assume that you actually need to use those addresses in those particular ways. High level features like strings and arrays are totally separate from these low level features and have separate optimizations.

C (and C++) sometimes require you to use unsafe features because there is no other way to use arrays, strings, and other useful features. In order for C to be fast, the compiler has to "optimize" these features by treating them more abstractly than your hardware does. It might treat contiguous bytes terminated by '\0' as an abstract string, which may not do what you want for lower level code.

An idiot with C sets himself on fire. A genius with C sets millions of people on fire.

Either way, you'll have a good time if you bring marshmallows.

Okay, because this thread won't die, what syntactic sugar would you add to make C nicer?

I'd personally like some sort of template support, so that common stuff like MIN() would no longer have to be handled through the pre-processor.

They did that in C11 with type-generic expressions (_Generic). You still have to write all the functions for all the types you want to support, but when you use it, it expands to whichever function is appropriate for the type you provide. It still requires the preprocessor, because C will never have function overloading and nobody wants to use the syntax for it when you can already mangle the name yourself, so each different type requires its own function with its own name. It's better than nothing, and <tgmath.h> already provides this for most (all?) of the functions in math.h.

Forgot reference link.
en.cppreference.com/w/c/language/generic
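
A minimal sketch of the MIN() case with _Generic, using made-up helper names min_int and min_double; dispatch happens on the type of the first argument, and every supported type still needs its own hand-written function, as described above:

#include <stdio.h>

static int    min_int(int a, int b)          { return a < b ? a : b; }
static double min_double(double a, double b) { return a < b ? a : b; }

/* Expands to whichever helper matches the type of (a). */
#define min(a, b) _Generic((a), \
        int:    min_int,        \
        double: min_double      \
    )((a), (b))

int main(void)
{
    printf("%d\n", min(3, 7));      /* dispatches to min_int    */
    printf("%f\n", min(2.5, 1.5));  /* dispatches to min_double */
    return 0;
}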

That's actually pretty interesting, thanks. Still, it would be great if the function definition itself didn't need alteration. Maybe it would just fuck up C's syntax too much, I suppose.

Indeed templates. _Generic doesn't cut it.
Your MIN example is solved by _Generic though.
Rework the stdlib to fix inconsistencies (like calloc having different args than malloc, or fput* having the inverse arg order of *printf*), use stdint types everywhere, maybe a better switch.

Golang's defer keyword.
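
The closest existing approximation is the GCC/Clang-only cleanup attribute; it isn't standard C, but it gives you defer-ish scope-exit behavior. A rough sketch:

#include <stdio.h>
#include <stdlib.h>

/* Receives a pointer to the annotated variable, so dereference once before
   freeing. */
static void free_charp(char **p)
{
    free(*p);
}

int main(void)
{
    /* free_charp(&buf) runs automatically when buf goes out of scope. */
    __attribute__((cleanup(free_charp))) char *buf = malloc(64);
    if (!buf)
        return 1;

    snprintf(buf, 64, "cleaned up on scope exit");
    puts(buf);
    return 0;
}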

I just want a proper macro system instead of the character mangling that is the CPP. Ideally it would involve sexprs, but this is even less likely to happen. I don't think templates in the C++ sense are the way to go, they are a pain in the ass to use and overly complex. A major strong point of C is that you can easily see what is going on and the compiler doesn't do too much behind your back.
Some sort of allocarray and bzero in the stdlib would be nice. Especially bzero is something that is best done at the language level.
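
An allocarray along the lines of OpenBSD's reallocarray is short enough to sketch; the whole point is refusing to allocate when nmemb * size would wrap instead of silently returning a too-small block. The function name here is just what the post asks for, not anything standard:

#include <errno.h>
#include <stdint.h>
#include <stdlib.h>

void *allocarray(size_t nmemb, size_t size)
{
    /* Reject requests whose true size doesn't fit in a size_t. */
    if (size != 0 && nmemb > SIZE_MAX / size) {
        errno = ENOMEM;
        return NULL;
    }
    return malloc(nmemb * size);
}

int main(void)
{
    double *a = allocarray(1000, sizeof *a);  /* fine */
    void *b = allocarray(SIZE_MAX, 2);        /* would wrap: returns NULL */
    free(a);
    return b == NULL ? 0 : 1;
}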


Calloc has different args than malloc because it does something different, what kind of complaint is that?
The stdint types part is also nonsense. An implementation isn't even required to have the intN_t types and int_leastN_t does what the standard integer types do with a few more restrictions.

Templates would necessitate name mangling, which would kill one of the major bonuses of C: the name of a function is the name of the symbol. With templates, you end up with the name of the function not being the name of the symbol because of all the potential variants that template expansion would generate.

I agree that templates are super useful, but they're not really able to be implemented in C due to the requirement for compile time generation of multiple functions of the same name. If you find yourself needing templates, then it's easy enough to write a Python (or your preferred language) script to create the templates at build time, or just switch to C++ if you don't need to expose your compiled functions.

I always feel like I'm rambling when I post here
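
If all you need is a handful of instantiations, the usual stand-in for templates in plain C is token pasting, which keeps the symbol names visible (pair_int, make_pair_int and friends are just made-up names for this sketch, and the trick only works for single-token type names):

#include <stdio.h>

#define DEFINE_PAIR(T)                                \
    typedef struct { T first; T second; } pair_##T;   \
    static pair_##T make_pair_##T(T a, T b)           \
    {                                                 \
        pair_##T p = { a, b };                        \
        return p;                                     \
    }

DEFINE_PAIR(int)
DEFINE_PAIR(double)

int main(void)
{
    pair_int    pi = make_pair_int(1, 2);
    pair_double pd = make_pair_double(1.5, 2.5);
    printf("%d %f\n", pi.second, pd.first);
    return 0;
}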

Calloc is equivalent to malloc + memset. It has no reason to have different arguments.

How about knowing the language before pointing out its problems? Most of the "beginner-friendly" features of JS (automatic semicolon insertion, loose conversion between types, automatic global binding) are things that make your life worse, not better. Kinda like a bunch of C's "features" (that's why people created lint for C in the first place, and why Crockford adapted it to JS).

Because Moore's Law still hasn't brought us completely out of having to worry about optimization. If every computer running every application in the world had terabytes of RAM and SSDs and latency wasn't really a factor, then maybe the tradeoff between C's efficiency hacks and legibility plus defensive code would favor using something else.

Calloc allocates memory for arrays, which is why it takes the size of the type and the number of elements. It can automatically check for overflow if both parameters are big, and arguably it is even required to do that.
Please avoid thinking in dubious equivalences when doing C; it's a massive source of bugs.
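
To make the difference concrete, here is a sketch where the element count and size are chosen so the product wraps size_t on any platform: malloc only ever sees the already-wrapped product, while calloc gets both numbers and is expected to fail cleanly instead.

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t nmemb = SIZE_MAX / 2 + 1;
    size_t size  = 2;

    /* nmemb * size wraps to 0, so malloc may "succeed" with a useless block. */
    void *bad  = malloc(nmemb * size);

    /* calloc can check the multiplication and should return NULL here. */
    void *good = calloc(nmemb, size);

    printf("malloc: %p, calloc: %p\n", bad, good);
    free(bad);
    free(good);
    return 0;
}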

C++. It's C, with diabetes.

...

yes starting stories with >be me is a reddit thing
normal people say

instead of


now kys

Oh dear, you are new to this Nihonese cartoon imageb- I mean, "chan" stuff, aren't you?

...

...

The pointer "types" only service as the integer difference for incrementing addresses. In that sense, your argument is acceptable.

do not encourage it. you know very well the user you replied to is full of shit. pointers have no type, they are just memory addresses. full stop. and memory alignment in case of casting pointers to literally the same memory address... someone had a massive brainfart.

Pointer arithmetic is a major part of pointer semantics you faggot, don't brush it off like some irrelevant detail. Pointer types are also important for strict aliasing and alignment, like I literally told you in the previous post.
What the fuck is this retardation where you ignore most of the differences, declare the remaining ones irrelevant and then proudly present a "proof" of equality for two distinct things?


C99 6.3.2.3.7, try reading the standard before you spout shit. I'm legitimately sick of self-appointed C experts and their stupid assumptions that create buggy code. Not everything is glibc on an i386.

...

that's literally you at this point you utter fucking retard. which part of the standard disagrees with what i said?

The one I linked, you moron.

you mean this?

Because every type is character, right?
Wow, reading is hard. This better be a shitty troll.

this is the thing we are arguing over. now try and read yourself for a moment.

Pointer P to type T is cast to T2*, where T2 is chosen such that (T2*)P is not correctly aligned as a pointer to T2. In particular, T2 isn't char. In case T has stricter alignment than T2, this situation occurs during the cast back to T*.
This invokes undefined behavior as per 6.3.2.3.7. Note that it is the cast itself that invokes it.
Therefore, pointers are not "just memory addresses" like they incorrectly teach you in most shitty introductions to C.

What is so hard to understand about this? It is the simple difference between the language and an implementation of the language.

Correction, this is wrong. The point still stands.

jesus christ. and this happens on modern architectures? the last time i had to even consider memory alignment was on a fucking PowerPC

i want to try it out. how can i force misaligned data structures for example?

They have a type just as much as other C types do. You can cast some regular C types to other types and back without always running into trouble.


You only have to worry about this if you're doing unsavory pointer casting.

If I remember correctly, x86 is very lenient with unaligned memory accesses; they are just dog slow (SSE instructions have their own rules). However, C must account for architectures where this is an issue.
This is one of the edge cases you only run into if you are doing arcane shit, which is probably why so few people know about it.


Allocate a structure and malloc a big char buffer B, read the structure's object representation by casting a pointer to it to char * and copy it byte for byte into the buffer at offset 1 (this should misalign it on basically every piece of hardware). Now (structure *)(B + 1) should be a misaligned pointer to the copied object.
Whether the program actually blows up during the cast or a potential read afterwards depends on your implementation and architecture, but it is definitely undefined behavior.
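
A sketch of that experiment, with a made-up struct foo standing in for the structure; formally the undefined behavior is already in the cast back to struct foo * (C99 6.3.2.3.7), whether or not the read afterwards faults on your machine:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct foo { int x; int y; };

int main(void)
{
    struct foo src = { 1, 2 };
    char *B = malloc(sizeof src + 1);
    if (!B)
        return 1;

    /* Copy the object representation to an (almost certainly) misaligned offset. */
    memcpy(B + 1, &src, sizeof src);

    struct foo *misaligned = (struct foo *)(B + 1);  /* undefined behavior */
    printf("%d\n", misaligned->x);                   /* may fault on strict targets */

    free(B);
    return 0;
}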

i'm thinking this is going to be a runtime problem and willing to bet the compiler will play along with such nonsense
gonna have a go at it later tonight

...

He's actually right though. Imo people should be taught to think of division as multiplication by a fraction/decimal and subtraction as addition of a negative. That would clear up the issue.

That's a somewhat circular way to go about it. Remember, the children you'd have to teach about division/subtraction don't yet know what a fraction/negative is. I think the reason for the acronym wars is that bad teachers just teach the acronym as if it was the actual rule, hence all the multiplication goes before division nonsense.

It's circular, but that way one not only teaches the rules, but why they are the way they are. The why is also important to know.

As for students not knowing about fractions, decimals and negatives, I think that's part of what's wrong with how we teach math. Those things are much more basic and should be taught along with number-line stuff.

How would you teach fractions before division?

fractions I would teach alongside division, decimals I would teach before

None that I know of, but unaligned *accesses* will fault on most non-x86 architectures. On x86, unaligned accesses usually carry a performance penalty and you also surrender atomic access.

The reason the misaligned pointer cast is left undefined is to allow pointers to types of different alignment to have different sizes or storage characteristics.

just for the fun of it!

uint32_t i;
uint32_t *p_i = &i;
p_i = (uint32_t *)((uint8_t *)p_i + 1);

as expected got a "garbage" value, but also, as expected i got the value that was contained in memory a byte away from where i pointed to before. as you would expect because pointers are literally just memory addresses

now

p_i = (uint32_t *)((uint8_t *)p_i - 1);

got 0, just as expected because pointers are just fucking memory addresses.

did -/+ 20000

got segfault, just as expected.

remind me again: what exactly is """"undefined"""" about doing these stupid casts when you already know beforehand where you are in memory and what you can expect to get or not get from there?

The behavior is undefined because an implementation or platform may not have any meaningful conversion from short* to long*, for example.

Besides, your example will blow up on MIPS and I'd bet ARM as well, because you're making an unaligned access.

...

Last time I did something like this, it generated a SIGBUS error. After reading up about it, it seems like a memory access to an N byte object that isn't aligned to an N byte boundary will cause this error. This was on Linux with GCC 4.8. The compiler didn't generate an error or a warning, although GCC 6 or Clang might.

I'm sure if you set up a signal handler you could figure out more info, or the system might handle the unaligned memory access and just give you the values there, as the post above found.
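
A rough sketch of that signal-handler idea, assuming a POSIX system; whether the misaligned read below actually raises SIGBUS is platform-dependent (x86 usually just tolerates it), and fprintf inside a handler is only for demonstration:

#define _POSIX_C_SOURCE 200809L
#include <signal.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Print the faulting address and bail out. */
static void bus_handler(int sig, siginfo_t *info, void *ctx)
{
    (void)sig;
    (void)ctx;
    fprintf(stderr, "SIGBUS at %p\n", info->si_addr);
    _Exit(1);
}

int main(void)
{
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_sigaction = bus_handler;
    sa.sa_flags = SA_SIGINFO;
    sigaction(SIGBUS, &sa, NULL);

    char buf[16] = { 0 };
    uint64_t *p = (uint64_t *)(buf + 1);   /* deliberately misaligned */
    printf("read: %llu\n", (unsigned long long)*p);
    return 0;
}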

C was pretty alright back when you had stuff like DOS and Amiga computers, and no memory protection or other shit to deal with. C let you do whatever the fuck you wanted, and the OS also didn't care much. Good thing too, because you often needed to code directly to the hardware.
But now it seems counterproductive to use a language that gives you this freedom when at the same time the OS will swat down your process like a fly if it so much as farts wrong.
I guess now it's mainly just useful for writing the OS itself.

You are a curryshitting Pajeet if you think C shouldn't be used for everything. C isn't unsafe or unstable, you just don't know how to use it.

I can't even tell if this is sarcasm or not.

Wow my thread got way more replies than i thought.
Well done Holla Forums.

There are people right here, right now who genuinely believe that. Does it really matter if that post is just being sarcastic?

Literally all of you are filthy bluepilled SJW milkdrinkers.

as part of the C standard all memory allocations are guaranteed to be suitably aligned, and padding is added where necessary, so this scenario is unrealistic

It is unrealistic, and you're retarded for not reading the entire chain. Allocate a char buffer, B, of 9 bytes. As part of the implementation you've probably got memory allocated at a 16 byte boundary. Now access B+1. It's valid provided you're accessing it as a char. Cast B+1 to a double* and now you've got a misaligned pointer and an access to it will likely break. Or in untested code:
double d[2];
double x = *(double *)((char *)(&d[0]) + 1);

you sure you are not the one being retarded?

tested the proposed scenario
char *B = malloc(9);
memset(B, 0, 9 * sizeof *B);
*(B + 1) = 127;
printf("%f", *(double *)(B + 1));   /* misaligned double read */
nothing broke on linux w. gcc 6.2.1

so tell me pls, where and how have you ever seen pointer casting like this break anything, and what exactly broke?

*pointer to first array elem
*+1 so pointer now points to second array elem
*for what ever reason: cast the pointer to point to char*
*now cast the pointer back to double*
*dereference

fucking brilliant user 10/10 code, would obfuscate further

The fact that you find that remotely hard to understand proves you are a sub 100 IQ Pajeet who can't into C.

Yes but the &d[0] thing is still retarded, &d[anything] is retarded.

no, this increases the pointer by 1 byte since it's a char*, the second array element of the double* would be at +8. or am I being retarded?

Explain.

You are being retarded, a[b] is literally *(a + b), to the point shit like ["wew"]1 is valid and equals to 'e'.

I mean 1["wew"]

that's not my point, I know that (&d[0]) equals (d); my point is that the cast happens before the + operation, so the pointer is increased by 1 (char *) instead of 8 (double *), then it's cast back to a double *.

Oh you mean the precedence, yes you are right.

C is great - plain and simple.
If you don't like it change the language or wait for C2.

p.s.: calm down Bjarne

I sure do love staring at a line wondering whether that asterisk is a multiplication, a pointer type declarator or a dereference, and trying to make a wild guess about a Lisp-tier mess of parentheses to figure out what it applies to.

Contextual symbols were a mistake.

I couldn't tell you which one of these posts is dumber.

...

welcome to the internet

Smart posts, yeah.

Well, as bad as all those and more things are, the main reason not to use C isn't those things; it's this:
"[Not having to deal with low-]level implementation details reduces your mental workload."
- Steve McConnell, CC2E

This is why C++ was created in the first place: even all the way back in the late '70s the engineers at Bell Labs recognized the need for a higher, more abstract way to deal with complex problems--while still retaining the ability to 'open the hood' and tinker around when needed.

artlung.com/smorgasborg/Invention_of_Cplusplus.shtml

This reads like one of those blog posts written in the form of an interview. You know which ones.

I don't believe this argument against C applies to everybody.
I personally enjoy working with the low-level representation of programs, rather than with the actual problem domain. I do think that it is a legitimate programming domain in its own right. And anyway, someone's gotta do it.
There is also no other language that covers this domain, besides assembly languages.

And using C++ as an example for this is... stupid, absolutely retarded. You could've used brainfuck or piet and it'd make more sense.
C++ is C with OOP badly slapped onto it. It doesn't cease to be low level as shit, and it does little in the way of domain abstraction.
Of course, Haskell and the lisps are 1000000x better examples for this.
And yet, no language can be completely tailored to the problem domain at hand, because you always have to think in terms of the native objects of your language, be it lists or the AST of the program itself (plus problems with scope and symbol collision) in the case of Lisp, or... monads, in the case of Haskell.
Indeed, languages are tools and not all problems are nails. As a corollary, there is truly no such thing as a 'general purpose programming language'; while Norvig is right in principle, he is not right in practice, because there are a lot of things that are simply not worth doing in Common Lisp any more than they are worth doing in awk or Prolog.

it all makes sense now

Dude, your nogramming is showing.
What do you mean "someone's gotta write what should be a high level program in a low level language"? Nobody has to do that; just because you are an autistic fuck who enjoys writing more code than necessary to perform non-trivial tasks that could be solved in a few lines if you chose other languages more suited to the task at hand doesn't mean that role is necessary.

There are. There are many other low level languages that can perform tasks commonly associated with low level languages, and they aren't necessarily assembly. Just because they aren't the majority doesn't mean C is special in any way in this regard.

Here is where I realized you were LARPing.

actually c++ has the same domain, more or less by definition. The only greater impetus behind C is its accidental ABI and the slightly smaller pool of compilers supporting C++. Otherwise, C++ brings many, many benefits to the table that the NIH-syndrome mentality of C devs can't cope with, apparently.

True. That's what libraries are for, not production code.

And your lack of intellect and reading comprehension is showing
I said *I* enjoy it, not that people have to do it.
But there will always be a need for low level programmers, lest we all write our programs in motherfucking node.js
Nah, I enjoy the low level representation of machine stuff. No other language has that domain.
I'm waiting for you to name one.
FORTH? that's a stack language that's only ever used for special purpose devices, and whose philosophy is that no source code should exceed one page
And what an abstraction, it really sketches the domain of all problems neatly, and it really does take the language to a whole new level, man, C++ is so high level, not low level at all

it brings some benefits over C, at the price of overwhelming complexity and a huge standard that nobody can ever possibly fully understand, full of pitfalls everywhere.
I agree that C is not a great language, but if you're going to compare *any* language with C, don't let it be C++. That's one of the few languages where everything that's wrong with C and right with a higher-level language is the other way around: absolute shit in C++, and much saner in C.
The only improvement of C++ over C is namespacing.

I wasn't expecting a literal "enlightened by my favourite programming language", but I guess it should be expected from C retards.

Now, before we get any further, answer the following questions:
>How many C programs have you written? inb4 indeterminate large amount
Small contributions to other people's programs don't count. I am talking about having a significant impact on a project.

Because it teaches you to write good, self-explanatory code without relying on garbage dumps of libraries like node.js.