Hey Holla Forums...

Hey Holla Forums, I wrote an article on software development coming from the angle of the recent initWallet() bug in the Parity wallet contract (the one that sent $280m into limbo). It's just me waxing philosophical about the current state of software, I think most of you would agree.

Please check it out and drop any criticism you might have, cheers.
steemit.com/programming/@gassedupoldman/keep-it-simple-stupid-more-than-just-an-aphorism

Also fuck Rust, it's garbage

Other urls found in this thread:

motherfuckingwebsite.com

"Keep it simple, stupid": More than just an aphorism

"90% of everything is crap."
- Theodore Sturgeon, kinda

$280 million, trapped in time. You've likely heard it by now: the bug in Parity's WalletLibrary, leaving all the Ether held in Parity multi-sig wallets stuck in limbo, potentially forever. They fucked up.

But I'm not here to rag on Parity in particular, no matter how much they deserve it. I'm here to rag on all developers and their software - or at least, the 90% Sturgeon was talking about. Today's code is garbage. The Renaissance period of computing history has come to an end. Elegant, Unix-style C has given way to buggy, bloated C++. People are running JavaScript server-side. Indian code farms pump out thousands of enterprise-tier lines of Java a day, with no regard for quality.

Whatever happened to the philosophy of clean, simple code? Code that almost comments itself as you write it, concise in its implementation. UNIX had the problem solved in the 70s: each tool was designed with a single purpose, sharing useful code through refined, well-implemented libraries. Debugged not through Valgrind or complex analysis, but through thorough examination, rewriting bits here and there until nothing else could be rewritten. A great quote from Ken Thompson comes to mind:

One of my most productive days was throwing away 1000 lines of code.
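
To make the "each tool designed with a purpose" point concrete, here's roughly what such a tool looks like - a bare-bones line counter in the spirit of wc -l, just an illustrative sketch rather than the real implementation:

#include <stdio.h>

/* Count newline-terminated lines on stdin and print the total.
   One job, no options: read a stream, write a number. */
int main(void)
{
    int c;
    long lines = 0;

    while ((c = getchar()) != EOF) {
        if (c == '\n')
            lines++;
    }
    if (ferror(stdin)) {
        perror("read");
        return 1;
    }
    printf("%ld\n", lines);
    return 0;
}

One job, text in, a number out; it composes with everything else through a pipe.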

Today, entire kernels are implemented in fewer SLOC (source lines of code) than the fucking Parity wallet. Proper, real, actual kernels, too - MINIX 3, roughly 12K SLOC; Parity Wallet's GitHub, roughly 550K SLOC. It would almost be funny if it weren't terrifying. Every day, people are putting real money and real risk into the hands of these developers, and these same developers can't even write a fucking library correctly. A library, might I remind you, that managed $280 MILLION worth of Ethereum.

Developers: it doesn't matter if you're using a super-duper memory-safe, hyperfast megalanguage if your code is shit. When you write code, you are writing a piece of software designed to achieve a goal. The best way to achieve that goal is not 100MB of npm packages or bower install everything-one-can-think-of; it's short, simple, concise code. This philosophy lasted for 50 years and persists today only in the best of circles - developers like those working on OpenBSD, for example. Are the OpenBSD developers wanking off about memory safety, or are they writing code again and again, cutting it down, refining it, and keeping things simple? Which of those do you think is the road to good software?

Apologies if this article is a little rambly, but you should get my point. Stop overengineering. Overengineering is death. Next time you're writing code, take a good look at it. Maybe you'll be able to cut out the cruft and the bugs before it's storing $280 MILLION worth of Ethereum.

Amen. It's just modern plumbing. All people want is someone to connect the pipes and not get too wet.

I'd beg to differ that there are complexity issues inherent to languages, though. You can write bloated or minimalist code in any language. Some communities are just worse than others.

God, this. People are missing the forest for the trees. The issue is shit code, period. Memory management might be a little harder to debug, but software will hardly be any better if those issues alone are gone.

All in all I believe you make a good point, but you didn't expand on everything you could have to get it across. People compensate for ineptitude with complexity. They shove in a billion libraries for the most mundane tasks due to the simple fact that they are too inept to do a proper job. Never forget fucking left-pad. I've seen cases where using a certain library adds more LOC than writing the actual thing the library does.

I did mean to add this to the end. Good code comes from good programmers, not good languages.
Unless you're using Java. Then there's no good code.

Yeah, kinda. Java pisses me off because they decided to simply strip out global functions and function pointers because of their unhealthy obsession with OOP.

I'd still argue one can write passable code in Java, though.

By Brian Fagioli

appreciate the criticism

what's this parity wallet thing? what happened?

UNIX and C are the root of the problem. There is nothing well-implemented about the C/UNIX libraries or the UNIX commands. If you really want to solve the problem, you have to understand why people in the 70s and 80s outside of Bell Labs thought C and UNIX were bad. If you understand this, you will also understand that these problems we face today are not solved by returning to the UNIX philosophy, but that they are the end result of the UNIX philosophy.

Go do some low-level win32 code and get some perspective.

You are lazy, user. C requires the programmer to think about consistency and constancy; i.e. the programmer must handle all edge cases, error conditions, and memory management. This is beneficial to programmers because it keeps them on their toes. It is only detrimental in that they might get burned out; hence, they should write their own libraries to handle these kinds of tasks so they don't burn out as easily. Other languages are bloat-filled, and it is undesirable to dig through millions of lines of code to find bugs in the bloat of their VMs or interpreters or whatever else.
Perhaps you shouldn't be lazy instead of calling the language foul. The language only asks of the hardware what the hardware is capable of. If you want to talk about overflow exploits, that is a fault of the hardware and not the language.
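
The kind of tiny personal library that post is talking about might look something like this - an illustrative sketch, not anyone's actual code, with conventional made-up names like xmalloc/xfopen and a hypothetical example path:

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Allocate memory or die loudly - no silent NULL propagating through the program. */
void *xmalloc(size_t n)
{
    void *p = malloc(n);
    if (p == NULL) {
        fprintf(stderr, "out of memory (%zu bytes)\n", n);
        exit(EXIT_FAILURE);
    }
    return p;
}

/* Open a file or die, so every caller doesn't repeat the same error check. */
FILE *xfopen(const char *path, const char *mode)
{
    FILE *f = fopen(path, mode);
    if (f == NULL) {
        fprintf(stderr, "cannot open %s: %s\n", path, strerror(errno));
        exit(EXIT_FAILURE);
    }
    return f;
}

int main(void)
{
    char *buf = xmalloc(1024);
    FILE *f = xfopen("/etc/hostname", "r");   /* hypothetical example path */

    if (fgets(buf, 1024, f) != NULL)
        fputs(buf, stdout);

    fclose(f);
    free(buf);
    return 0;
}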

Then how do you explain the millions of exploits in C apps? Certainly those programmers were not kept on their toes.

Lazy programmers failed to write to the hardware.

Getting retards to agree with you is not an accomplishment.

Fagioli Bless

I can agree with this too. IMO the issues we see today are a bad reaction to the issues we had up to the 90s.

[Citation Needed]

This is the dumbest shit I have heard in a while. Congratulations.

Unix is famous for handling edge cases by blowing up. Silently, usually. That's not even a secret, or controversial, it's a conscious design decision.
When fuzz testing was invented, it was used on different implementations of Unix. GNU, the least Unixy Unix of all, was the only one that did somewhat well, because it really did care about correctness and edge cases. Traditional Unix cares about simplicity of implementation, even if it comes at the cost of the program working properly.
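
For reference, the original fuzz idea really was about that simple - a sketch of a random-byte generator you pipe into the program under test (nothing here is the actual 1990 tool, just an illustration):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Emit N random bytes to stdout; pipe them into a utility and see if it crashes.
   Usage: ./fuzz 100000 | some_utility */
int main(int argc, char **argv)
{
    long n = (argc > 1) ? atol(argv[1]) : 100000;

    srand((unsigned)time(NULL));
    for (long i = 0; i < n; i++)
        putchar(rand() % 256);

    return 0;
}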

What about the edge case of a user pressing the power button when they really meant to minimize the current window? Which OS handles that?

Old oldfag here. You have to understand that the mindset when a lot of that stuff was written was a million miles apart from where we are today.
Software on the internet used to be extremely cooperative. Everyone trusted each other, everyone shared their services with one another, very few things required authentication. I used to send mail to people by telnetting to some random mail server and just asking it to drop it off. I'd read USENET via telnetting to some random NNTP server. Nothing was encrypted as there was no one to encrypt it from. RMS even used to refuse to use passwords (I was party to the Lucid Emacs squabble and have met him). Everyone on the internet at that time was part of the same collective and there was mutual trust and respect. Writing software defensively wasn't even considered.
And then it diversified. Starting in the early '90s, all the things that could be easily abused started being abused. There are two readings of this tale from here: one is that programmers learned to write defensive code and spent the next two decades shoring up older software; the other is a warning as to how dysfunctional and self-destructive a diverse group becomes, where you spend all your time protecting yourselves from yourselves.

Placing it in the context of its time doesn't help, because Unix was bad at handling edge cases even compared to its contemporaries.
I think it's a defensible claim that Unix's design was a good trade-off, but it's ridiculous to hold it up as an example of guarding against edge cases.

But implementation != design. I don't remember the UNIX philosophy including "don't handle any error".

Old school Unix worked ok for its purposes, in the context of the Bell Labs offices where people trusted each other. Much in the same way, Commodore 64 and other simple computers that don't care about error handling work fine in their own context, where the user is fully in control at all times. It's only when you misapply the technology that it becomes a problem. You wouldn't stick a Commodore 64 on the Internet with a TCP/IP stack and expect it to withstand constant hacking attempts. At least in the old days those machines were safely behind a modem that didn't route TCP/IP. The only obvious way to damage someone's computer was with a trojan/virus or similar, but even there it was easily contained and cleaned-up because the hardware was simple, and always booted into a clean state. And even if a floppy disk got infected because you didn't write-protect it, it didn't spread to other floppies unless you were extremely careless. So you just reformatted the infected floppy, and that was the end of it. If your system is simple and resilient like that, you don't need lots of error checking. But if you want everything connected over TCP/IP to untrusted networks, it's time to re-evaluate all the assumptions, and everything has to be redone from scratch, starting from the hardware level. Piling on mitigations and such is just an arms race.

You're taking a joke as more than it's meant to be.

Ken Thompson, in a transcribed 2007 interview with Peter Seibel, refers to Multics as "...overdesigned and overbuilt and over everything. It was close to unusable. They (i.e., Massachusetts Institute of Technology) still claim it’s a monstrous success, but it just clearly wasn't." He admits, however, that "the things that I liked enough (about Multics) to actually take were the hierarchical file system and the shell — a separate process that you can replace with some other process."

Not Unix's own description of its philosophy, perhaps, but other descriptions, like the one in Worse is Better, have it. It's a natural consequence of Unix's flavor of simplicity.
Separating design from implementation makes bringing up Unix almost meaningless. But even if you do that, Unix still comes out badly. gets is flawed by design. Writing shell scripts that correctly handle filenames containing newlines is nearly impossible.
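
For anyone who hasn't run into it: gets() is flawed because it has no way to know how big your buffer is. A minimal sketch of the classic failure and the boring fix:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char buf[16];

    /* Broken by design: gets() can't be told the buffer size, so any line
       longer than 15 characters overruns buf. C11 removed it for a reason.
       gets(buf); */

    /* The boring fix: fgets() takes the size and truncates instead of overflowing. */
    if (fgets(buf, sizeof buf, stdin) != NULL) {
        buf[strcspn(buf, "\n")] = '\0';   /* strip the trailing newline, if any */
        printf("read: %s\n", buf);
    }
    return 0;
}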

there are far more problems in the software industry than just overengineering, and this is nothing new. they've been fucking up security since day 1. UNIX and its ilk were never good like you claim either. companies are flooded with new grads writing security-critical code without any proper training. the managers are not trained properly either and do not know how to set security requirements in their dev process. every published vuln is completely predictable and follows one of a few templates:
product x deserializes stuff in a retarded way, relying on eval or some other dubious bullshit like what Java does, and thus allows calling an arbitrary method on an arbitrary object, leading to full control
product x enables some advanced remote admin capability, but the devs were too retarded to set it up properly and now this admin capability can be accessed over vanilla USB, or by sending some out-of-band data over a network channel, etc.
product x takes user input and retardedly appends it to some string passed to system() (command injection) or printed to HTML (XSS) - see the sketch below for the first case
product x writes to a shared variable without locking it properly because some stupid new grad barely passed his concurrency 101 class and doesn't really even know how anything works
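
To spell out that command-injection template: anything shaped like system(string built from user input) hands the shell whatever the user typed, and the fix is to pass arguments as a vector with no shell in between. A hedged sketch (the grep_unsafe/grep_safe names are made up for illustration):

#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

/* Vulnerable: a filename like "x; rm -rf ~" becomes shell code. */
void grep_unsafe(const char *user_file)
{
    char cmd[512];
    snprintf(cmd, sizeof cmd, "grep TODO %s", user_file);
    system(cmd);                      /* shell re-parses the string: injection */
}

/* Safer: no shell involved; the filename is a single argv entry no matter
   what bytes it contains. */
void grep_safe(const char *user_file)
{
    pid_t pid = fork();
    if (pid == 0) {
        execlp("grep", "grep", "TODO", user_file, (char *)NULL);
        _exit(127);                   /* exec failed */
    } else if (pid > 0) {
        waitpid(pid, NULL, 0);
    }
}

int main(int argc, char **argv)
{
    if (argc > 1) {
        grep_unsafe(argv[1]);         /* don't do this */
        grep_safe(argv[1]);
    }
    return 0;
}
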
And this isn't just vulns. All the bugs you see in software are the same basic shit as well.
Then about 0.000001% of the set of public vulns up to 2017 are actual hard stuff like side channels, nuanced crypto or concurrency problems, etc.
Then there are the shitty modern capitalist and communist systems of today, which cannot produce quality products. In the west, you cannot put effort into quality. If you spend a day making sure nobody fucked up the latest set of patches, you get fired for being unproductive. If you want to actually read the manuals for an API, RFCs, standards documents, you get fired for being unproductive. Basically the only time you can do any testing is when the boss says to; otherwise it's too risky. Meanwhile, in China, they're just retarded for pretty much the same reasons. Back here in the west there's no point in putting effort into building a robust product. People just want to consume. They don't and can't care whether what they're consuming is good, since they're just going to throw it out after getting bored with it a few days later anyway. If you put effort into a quality product you risk losing a few dollars, which is unacceptable and will get you fired, even though it might pay off in the long run. The software industry is basically just a giant cringe fest.
Claiming people are "overengineering" is giving them too much credit. They are pure retards without the slightest clue what they're doing. Overengineering is a problem too, but there are about 50 other big problems with the industry. Software engineering doesn't even exist: there is no philosophy or regulation in the industry. In every big company they just hire whoever the fuck, no engineering background needed. Higher education doesn't even offer such a thing for software, and when it does, it's fake. All basic engineering skills are missing. People can't even figure out how to write to the framebuffer on time.

fixed

POSIX gave us getline/getdelim.
Yes, this is the WORST error POSIX made: allowing newlines in filenames (actually, they're not portable; see pathchk -p) while having tools work on newline-delimited lists.
I limit myself to POSIX sh and have a little tool to check for newlines in NUL-delimited streams to mitigate this. Now give me set -o pipefail, a -0 option for every tool (including read), and arrays and associative arrays in POSIX 2020, and I'll know true happiness. Maybe a good ternary operator too.
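
A minimal version of that kind of newline checker might look like this in C (an illustrative sketch, not necessarily the tool described above): read NUL-delimited records, e.g. from find -print0, and exit non-zero if any record contains a newline.

#include <stdio.h>

/* Read NUL-delimited records from stdin and report records containing a newline. */
int main(void)
{
    int c;
    int bad = 0;        /* current record contains a newline */
    int found = 0;      /* some record contained a newline */

    while ((c = getchar()) != EOF) {
        if (c == '\0') {
            if (bad)
                found = 1;
            bad = 0;
        } else if (c == '\n') {
            bad = 1;
        }
    }
    if (bad)            /* last record may not be NUL-terminated */
        found = 1;

    if (found)
        fprintf(stderr, "newline found in a NUL-delimited record\n");
    return found;
}

Usage would be something like find . -print0 | ./nlcheck, where nlcheck is just a made-up name.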

The UNIX philosophy (everything is a file, KISS, simple tools to build pipelines) is the best I've seen, but POSIX made the mistake of not improving the implementation beyond what it currently is.

can't make an account to comment on your blog

You can use one of those online SMS verification services.

It sent ethereum into limbo. ETH has no dollar value except that which fools assign to it.

All I can say is that you haven't seen any other OS philosophies.

If POSIX was meant to improve things, it wouldn't have been based on UNIX.

oh it's this retard again

There's no magical thing, but Plan 9/Inferno at least are simpler (something poetteringware isn't). Nobody adopted them because muh Leenoox wreks gud enuf.

I fucking hate this trend of everything needs to be phone verified or some sort of shit.

Are you retarded? Plan9 went even harder into the UNIX way. Especially the "everything is a file" bit.

The problem with all this crap is underengineering, not overengineering. The growth and bloat happened as compensation for the initial underengineered bad design. The new solutions still have to get more bloated because they are still underengineered. Overengineered solutions remain the same as they were years ago because they already covered everything, and the only thing that might fit is CLOS.

I'm not sure. We're in a year where there are dudes making terminal emulators with graphics via a web framework. How much more bloated can it get?

Web browsers are the terminals of the 21st century anyway.

I think some applications (like your terminal emulator example) should have the bare minimum necessary code, but "overengineering" can prevent problems later on. It's always better to do these things right as part of the initial design than to keep having to add more and more later. Underengineering leads to bad design that can't be fixed no matter how much bloat you add.

You're saying that like you can know ahead of time what underengineering and overengineering would look like in a project. Feature creep is a real problem because you can't know ahead of time what demands people will place on the application in the future. The Unix philosophy solves this by trying to create tools that only do one thing, so they can't become bloated. The "problems" you're speaking about are not that the software fails to meet the user's needs, but that the user has needs which are unmet. This desire to take a program that almost does what the user wants, or is in the same vein as a tool that would solve their needs, and change it to suit that need is lazy, and it's what leads to bloat.

"Good design is as little design as possible."
- some German motherfucker
source: motherfuckingwebsite.com

Enjoy your feature creep.

Features are creepy things that get shot on sight. 640K really is enuf tbh fam.

UNIX may not be perfect, but it's a hell of a lot better than the web ecosystem we have now.
I feel like I'm living in a nightmare.

The ride isn't over yet. Web's gonna keep getting worse, and will be more terrible than you can possibly imagine. Think about how TV operates: ads all the time, and you can't skip them. Constant MSM propaganda. That's the future of web. You'll have to watch 30 minutes of ads and propaganda for every 15 minutes of shitposting. Also no way to download files (DRM), everything will be in their cloud.

Me too, you should post number so we can talk about it.


This is a Poe's law tier comment.

UNIX is cancer. If those 90s workstations were based on e.g. Xerox PARC operating systems instead of UNIX, there would be no "web ecosystem" because the OS and hardware would be safe and high-level already. There would also be no need for scripting languages or many of the UNIX tools if we used high-level languages.

With CPUs getting exponentially faster, albeit over a longer period than before, the drawbacks of so-called "bloat" pale in comparison to the decreased labor cost and time-to-market.

I've used Plan9. Simple directories and files are way easier than virtual filesystems.
Linux is better because it gets the job done and isn't that fucking complicated.

That is p9. Mounting network drives transparently as local directories.