
Why people (and by people I mean Hacker News pseudo-redditors) hate Go: Go attacks everything they held as conventional wisdom, and then they scramble to destroy it as Go becomes popular and widespread, à la SocJus and GG.

Because Go eschews harmful programming constructs like classes and exceptions, hipster fags felt they needed to leap onto it and deride it for being "old-fashioned". Now whenever Go is raised, they feel the need to sperg out against the ebil monster that is Golang.




Why everyone else hates Go: THAT FUCKING LOGO

It's not compiled, so it's shit. At least, it's not Java.

it is compiled you monkey brain piece of shit.
how fucking retarded can you be?

This is the very near future of programming, isn't it. Some mangina with a very loose grasp of a language makes a blogpost about how certain data structures commonly used in that language are oppressive. Minutes later, there's a social media outrage starting about how binary trees are transmisogynist and that if you support binary trees, you need to be publicly called out so people can pressure your employer to fire you.

Go is trash. It's garbage collected and BLOATED to an extreme level. A Hello, World! is over a megabyte. Who's the target audience of this thing? For embedded, I need small, fast, and predictable latency. Go is worse than Python here in 2/3 of the categories. It's got such a narrow appeal that the only community I think would care is the Ruby community.

Great article, oppressed/10, would have my privilege checked again.

I like Go, but... I'd have to learn the quirks of a new programming language *and* the quirks of a new compiler (which is not nearly as well documented as gcc) *and* of a new build system. It's just not worth it when it's basically strongly typed, garbage-collected C99 with some syntax sugar, a lack of support for parallelism and a lack of the libraries I work with.

tl;dr of the article: "OMG THEY ARE HATERS"
Why did you even bother bringing up this 2 year old piece of shit?

The binary is probably smaller than all the dependencies you need to lug around for that Python program, and since it's statically linked you practically don't need to ship any dependencies.

Concurrency is the whole USP of the language: you run new goroutines using the go keyword.
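A minimal sketch of what that looks like (the function and the numbers are made up for illustration, not from the thread):

```go
package main

import (
	"fmt"
	"sync"
)

// sumOfSquares fans the work out to one goroutine per input
// and collects the results over a buffered channel.
func sumOfSquares(nums []int) int {
	var wg sync.WaitGroup
	results := make(chan int, len(nums))
	for _, n := range nums {
		wg.Add(1)
		go func(n int) { // the go keyword starts a new goroutine
			defer wg.Done()
			results <- n * n
		}(n)
	}
	wg.Wait()
	close(results)
	sum := 0
	for v := range results {
		sum += v
	}
	return sum
}

func main() {
	fmt.Println(sumOfSquares([]int{1, 2, 3})) // 1 + 4 + 9 = 14
}
```

Goroutines are multiplexed onto OS threads by the runtime, so they're much cheaper than spawning a thread per task.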

low kek fams

Why does everybody in "tech" take .io domains?


python-minimal is about 4M. I've got about 30 programs on our box I'd want to replace. Doing that in Go would be at least 30M, probably closer to 100M once converted. That's several times the size of the entire firmware for the box. The box has 256M of RAM and since Go statically links everything that's a minimum of 100M of ram if their executions line up, probably more like 1G once you count the data bloat in Go.

I've talked with the Go faggots about this and suggestions were to compile everything into a single program that tries to act differently depending on what it is. This is a pain in the ass since we have different groups producing the programs and having an extra step where their work is squished together after they've already tested and signed off on their program is adding a risk of things breaking and would need an extra test step. Since we ship a different grouping of programs per device type and they need to be kept separate for licensing reasons, we'd need a dozen different squished binaries. These each have to be tested individually (if you don't understand why you might be a webdev). I suspect the squished binary solution also has nightmarish startup times but I've not tested that as it just seemed too stupid to even bother with.

So, fuck Go. No one but webdevs will ever use it.

Because they like I/O in their buttholes.

pic related is how I imagine a Golang conference.

give back my click

Task: Print out the odd numbers from 1 - n
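One way to do the task in Go (the function name and returning a slice are my choices, not from the thread):

```go
package main

import "fmt"

// odds returns the odd numbers from 1 to n inclusive;
// stepping by 2 skips the even numbers entirely.
func odds(n int) []int {
	var out []int
	for i := 1; i <= n; i += 2 {
		out = append(out, i)
	}
	return out
}

func main() {
	for _, v := range odds(9) {
		fmt.Println(v)
	}
}
```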

Wait, wait, you can't use dynamic linking on go?

If I wanted a convenient scripting language, I'd just use Python. If I wanted a modern compiled language, I'd just use Rust.


yeah no thanks

Other than Racket (which is master race), which is the best language other than C for being performant yet without the baggage of these hipster socjus weirdos?

But C is all you need.


You might as well use C++.

C can be a pain in the ass when you need rapid prototyping and for code not in a critical path. As I say, personally I like Racket, but there are not many people using it.

No metaprogramming. No macros. Absolutely unacceptable in the current year.
C or C++ over this crap for me anyday.

bait overflow on line 1

Go doesn't attack or challenge anything, you shit. It's a remake of some old dialect of Algol. At best, it's equivalent to Java 1.4. Everyone who associates with Go is fucking retarded like you.

Because it's yet another procedural language when there's a functional competitor being developed. Unfortunately Mozilla doesn't have as much money to throw at their devs and shill teams.

Did you ever stop to check its features? The language itself is just way, way better than C++, that's just a fact. Not talking about compiler optimization or tooling, just that the language in itself lends itself very well to concise yet explicit code.

That's just brute force shilling

What a waste of potential. And a waste of dubs.

It's a moot point. You're like a blood cell asking where it can go to get away from the leukemia that's ravaging a body.

We need to face this problem head on.

learn to pick your battles m8

The language was created to be Google's "perfect" language in that context, so it's not really a surprise.
It was tailor made to fix the problems they have with C++ as well as Python.

Rust borrow checker is a gigantic pain in the ass, huge waste of time.

Go was made by Google to get recent college graduates productive as quickly as possible at Google. Why did Google do this? Because Google has high turnover. Employees work there for a while to get the name on their resume then move on.

Java was created for similar reasons. Java forced OO and is super verbose so you could treat developers like replaceable cogs in a machine. If a developer quits, simply drop another one in his cube and it'll be like the first guy never left.

The way I see it there are currently the following contenders to the C throne. I didn't count interpreted languages or languages that don't focus on performance.

Really old and tested, multiple compilers to choose from, can be even faster than C. The downside is that the language has grown way beyond what it was originally intended to be, with every cool new paradigm bolted onto it. But you can't argue with the results it produces.

No idea about this one.

No idea about this one.

No idea about this one.

Developed for Apple by the inventor of LLVM, so I am going to trust that he at least knows his shit when it comes to languages instead of being just an academic who faps to Chomsky every night. I don't know about the performance though, and the language is still run by Apple who are a magnet for SJWs. This wouldn't be much of a problem if they hadn't invited everyone to work on Swift. For fuck's sake, they merged a joke PR by PC Principal from South Park that replaced Master/Slave with Leader/Follower just to later replace that again with something else because the German word for Leader is Führer.

Isn't Racket just an implementation of Scheme?

Isn't Brian Kernighan the main Go developer? How can it be so shit?

Racket has extended far beyond Scheme, but there is an implementation of Scheme R6RS in racket.

That's marketing. Apple hires WHITE MALES to do the actual coding, which is why they ship a competent product.

Are you serious? The language's design is hopelessly fucked up and cannot perform without compiling everything as if it was one huge source file. Trying to say the language's design is separate from the impossible to solve compilation problems it created is full retard. Rust will never be more than a toy language.

Those faggots can't even decide properly on type names.
They don't know how to design a language.

Go doesn't really offer anything over C so no one cares about it.


the syntax is shit and so is being forced to check for errors after nearly every statement

also the logo fucking blows
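For reference, the pattern being complained about is the explicit check after every fallible call; a minimal sketch (parseAndDouble is a made-up helper, not a real API):

```go
package main

import (
	"fmt"
	"strconv"
)

// parseAndDouble shows the check-err-after-every-call style:
// each fallible call returns (value, error) and you branch on it.
func parseAndDouble(s string) (int, error) {
	n, err := strconv.Atoi(s)
	if err != nil {
		return 0, fmt.Errorf("bad input %q: %w", s, err)
	}
	return n * 2, nil
}

func main() {
	v, err := parseAndDouble("21")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(v) // 42
}
```

Whether that's "being forced to check errors" or "being forced to actually handle errors" is the whole argument.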

All this hate for Go? I know the real reason there's so much of it. It comes from the GDF Propaganda group that works hard day and night to quell any interest in the language. You see it constantly because they're dedicated and they like their espresso and vodka. So in this light, I'm giving away THEIR SECRET. The mission is simple, promote at all costs the "deficiencies" of Golang in all forums large and small, throughout the Internet.

Don't let the GDF win. Or... come to the dank side.

Only thing I don't like about it is := instead of = but honestly it's not too bad at all. It's a cool language.

It signals that you're hip and trendy. Now give me that VC money.

enforcing one kind of linkage is a very bad thing.
I prefer dynamic linkage because of the memory savings and the security benefits, but I can see why static linkage is sometimes desirable.

the SJW part of the community sucks but the language is an improvement over C and C++ IMO

disgusting. C++ is exactly the reason why Rust exists

the compilation process sucks but that's the cost of statically checking for unsafe code, I guess

Because it isn't that it's shit; it's that it's polarizing. Most people either worship it or hate it, without reason.


Your arguments boil down to

Kernighan wrote a language for Google's internal needs, if anyone else finds it useful outside of Google for anything aside from web dev it's just a happy accident.

>OSX iOS Webkit LLVM Fagbooks and iMacs mobile iThings

off the basis of literally one shitter's post.

go is among the lesser sjw-infested newer languages compared to ruby, js, rust

go attacks oop hegemony generated by three decades of oop monopolising software design. waste of triples faggot.

its ken thompson

zero kek. rob pike is friends with penn teller (>>560696) who is an ancap. you have no other evidence for sjw infestation and no grounds for attacking it for political opinions held by its users and contributors

Holla Forums just love lying over and over again like their meme god trump and believe they have the right to, its just amazing

im beginning to believe this meme, the shills really come out of the woodwork whenever go is mentioned for whatever (often disingenuous) reason, like sjw mccarthyism, again challenging everything these faggots hold true and holy, attacking it for using old concepts instead of fucked up nonsense like object autism and functional rabies, i hope it continues to trigger frothing at the mouth for many a year longer

steve klabnik, 1919 tier semite

Programming languages are tools. Tools come with cognitive load.

The simpler it is to think about how the language operates, the faster you can write a working program. This is go's primary advantage. Even if you're a fucking genius, you're still saving effort and developing working software faster than your peers by avoiding complex "modern" paradigms.

But it meshes extremely well with the method every human brain falls back on to solve problems. Standard procedural programming, nothing fancy.

Simplicity is high level optimization.

He's still a card-carrying member of the holy trinity of Holla Forums. How the fuck does an inventor of UNIX and C make something as shitty as Go?


Time is harsh.

Go has almost all the same OO shit as Java. It even has a retarded inheritance system (though I think they don't call it that). Go is almost the same as Java 1.4. Sure the libraries are different, but you could just as well have written a new set of libraries under this New Enlightened Non-OOP design in Java instead of making a new language. Not that I really care, as I see both languages and doctrines as retarded. At least Go's popular implementation doesn't have a slow as fuck startup time, I guess.
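What the post is alluding to is struct embedding, which Go's docs call composition rather than inheritance; methods of the embedded type get promoted to the outer type. A rough sketch (the types are invented for illustration):

```go
package main

import "fmt"

type Animal struct{ Name string }

func (a Animal) Describe() string { return "animal " + a.Name }

// Dog embeds Animal. There's no subtype relationship (a Dog is
// not an Animal to the type system), but Animal's methods are
// promoted, so it feels a lot like inheritance in practice.
type Dog struct {
	Animal
	Breed string
}

func main() {
	d := Dog{Animal{"Rex"}, "collie"}
	fmt.Println(d.Describe()) // promoted from Animal
}
```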

pick. fucking. one.
this better be fucking bait


From what I've seen of Go it's just a hipster C++ which runs slower, uses far more memory, and generates much larger executables. Not to mention the lack of dynamic linking and the go-vs-gccgo madness. I don't see any upside, just use C++ if you want to go there.


Jesus fucking Christ. Is the "GDF" going to be the new "JIDF" now?

Go. The fuck. Away. Holla Forums

Different user here.
No, you can't, and that triggers my spergs very hard.

Go's not for embedded use. That's what C is for.

That said, compile with gccgo instead of gc and you will get much smaller binaries.

:= is short for "declare variable and infer type".
= is assign value

It's no different from the var keyword in C# (or auto in C++).

Don't forget it's a statically typed language.
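A small sketch of the difference (variable names are arbitrary):

```go
package main

import "fmt"

// shortDecl contrasts := (declare a new variable, infer its
// type) with = (assign to a variable that already exists).
func shortDecl() (int, string) {
	x := 42 // declares x; type inferred as int
	x = 7   // assigns to the existing x; no new declaration

	var y string // long form; y starts at its zero value ""
	y = "hello"

	return x, y
}

func main() {
	x, y := shortDecl()
	fmt.Println(x, y) // 7 hello
}
```

The types are still fixed at compile time; := is only sugar over the declaration, not dynamic typing.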

Because you either haven't tried doing real development in Go, realizing its strengths compared to other languages; or you're just an idiot.

I use C, C++, and Python. Which do I replace with Go? It can't replace C/C++ because of garbage collection. It can't replace python because it's so bloated and can't dynamically link. I don't see why anyone would use it.

>happy accident
Just like with javascript :^)

Of course you can do dynamic linking in Go. That just shows that you don't even know what you are talking about.

Pray tell, what library are you needing to dynamically link? Perhaps there's already a battle tested Go equivalent you don't want to admit exists. Hmm?

Also, have you actually tried and measured your code under Go or are you prematurely optimizing away your fear of another programming language? GC latency in Go has been improved to the level that "humans" cannot detect the pauses anymore.

Rick Hudson's talk "Go GC: Solving the Latency Problem" July 28, 2015

So, if Go GC's is so evil and you're shopping around for a new language (why else would you bother posting in a Go thread?), why haven't you upgraded your code to Haskell already?

tru dat fam


Why does it matter? And custom domains are nice.


At least it doesn't look like a bloody black-headed pussy zit with legs.

lol you are literally mentally retarded. 'bloated' ahaha *vomits virtualenvs into system for every little script*

hn hates go? i've never actually browsed the user comments but I see go stuff being posted all the time

kill yourself


Reason enough for me to never touch it. If the creators don't take it seriously, neither will I.

Plenty of other languages.


So, in one end we have cuck developers concerned about PR feeling sorry for panties jokes and, on the other side, SJWs developers bitching because the language is simple and tries to stick to this virtue?
I actually feel sorry for true GoLang developers. I don't mind the logo and I like simplicity, I would actually use it.

When Java GC sucks.


wow that article was shit

and neither data nor program code is included, so we gotta believe our guru

Fucking reading comprehension 101 fail, go back to school or something.

This is the real reason why Go sucks - it rightly eschews some stuff while not fucking fixing other things when it has the chance.

By the way, it feels a bit weird to be defending OOP (or to be exact, encapsulation of state), but it's about the only way to make imperative programming at scale saner. It is basically one big band-aid over being imperative and mutable, and Go deciding to not do objects while still being imperative and mutable is just fucking dumb. C at least has the excuse of "no one had figured that out yet".

You don't understand; if you let the actions of SJWs dictate your technology choices, instead of technological merit or availability, then you have ALREADY LOST.

Look, buddy, you don't have to be Usain Bolt to know what running is.

How big was n, or how much does go suck?

$ cat odds.pl
#!/usr/bin/env perl
use warnings;
use strict;
for ( my $i = 1; $i <= $ARGV[0]; $i++ ) { print "$i\n"; }
$ time ./odds.pl 1000000 > /dev/null

real 0m0.087s
user 0m0.035s
sys 0m0.001s


HN loves Go and to a lesser extent Rust. If you want to get a post to the HN front page the best way is to post [Some stupid program that would be completely uninteresting in any other programming language written in GO]

whoever made that probably used println because /tech is full of shamelessly retarded proles


time ./odds > /dev/null

real 0m0.053s
user 0m0.024s
sys 0m0.028s

So, you forgot to skip the even numbers in your sample...


What does your hardware do with this?

the java comment makes me think its not bait and that the poster is just a fucking retard

uhh you aren't even doing what the task requires

It still blows my mind how someone can say something so blatantly false so confidently.


time ./odds > /dev/null

real 0m0.028s
user 0m0.020s
sys 0m0.008s

meme languages for people who can't be bothered to try and understand anything but still need to squeeze out a turd

C(++) for everyone else

Double i++ can be replaced by i += 2

Try this : play.golang.org/p/v6qKB2O4Zx

More like this, user : play.golang.org/p/B5QYgeYMw5

Looks good. Ship it.

U dumb, user? C++ is the second biggest shit pile since punched cards were invented (the biggest is of course PHP; Java comes in third).

no, it's absurdly great because you can just revert to doing things the C way whenever you want or don't understand some feature. Maybe I'm weird but I like to understand what's on the screen and come to an obvious solution that is self evident by natural law. Not go to stack exchange and copy paste I.am(a, cuck)

It's cool you can revert to the comfort zone, but the thing is not all people are smart enough to do so, and instead they write dumbshit code with features they only barely understand. And yes, if you know C++ well you can write good code, using smart pointers instead of naked ones and so on, but the problem is most C++ programmers don't really know what they are doing, and universities don't teach it properly either (case in point: my university grades C++ projects on how many C++ features you used and then gives topics like "System L fractal drawing", which is at most three classes with no inheritance, so you have to start artificially stuffing things in to pass). In that case something simpler is just better.

If you're bored take a look at this:
or this:
It illustrates quite well how apparently harmless C++ feature of classes can lead you to write slow code (granted, it's less pronounced on x86 than on CELL but still), then imagine what people can and will do with less innocuous concepts.

Also with regard to being self evident, well for me this:
multiplyListByTwo aList = map (* 2) aList
is a more self evident way to say "I want this list multiplied by two" than:
std::vector<int> multiplyListByTwo(std::vector<int> aList) {
    std::vector<int> resultList;
    for (std::vector<int>::iterator it = aList.begin(); it != aList.end(); ++it) {
        resultList.push_back(*it * 2);
    }
    return resultList;
}
so YMMV. I mean, I understand that it does multiply things by two, but why the fuck do I need so much line noise to say that in C++? Inb4 <algorithm> - yes, it helps a bit, but you still have to specify iterators even if you want to transform the whole collection, not a subset. Also, it's just a simple example: the more complex the algorithm, the less that line noise helps you understand it.
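For comparison, since this is a Go thread: the same function in Go lands somewhere between the two in noise (my sketch, not from the thread):

```go
package main

import "fmt"

// multiplyListByTwo is the imperative Go equivalent of
// Haskell's `map (* 2)` over a list of ints.
func multiplyListByTwo(in []int) []int {
	out := make([]int, 0, len(in)) // preallocate to avoid regrowth
	for _, v := range in {
		out = append(out, v*2)
	}
	return out
}

func main() {
	fmt.Println(multiplyListByTwo([]int{1, 2, 3})) // [2 4 6]
}
```

Go has no generics-free way to write this once for all numeric slices in the era being discussed, so it's monomorphic like the non-templated C++ version.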

Pretty much that. C++ when the templates get going was just pushing the language towards linenoise. After some point you need to scrap the syntax. Racket's meta-languages are nice for these kinds of DSL setups. Rust kept changing their syntax too much in the early days so I gave up keeping up with it. Later the SJWs got flamboyant and toxic to the pragmatism of techne.

If someone is going to recommend C++, I'll just politely point them to Nim. Because remember, I want Go to maintain its status of a secret weapon of the 1000x programmer.

C++ code re-use is a travesty, you have multiple decades of template hell when people "adopt" the latest features and write a bunch of code and then abandon ship for the next new paradigm. Bitcoin should have been written in C, but then there's now Decred in Go, so that's not as important now.

What language is the first one? Scala or some sort of unholy parenthesis-less Lisp?



multiplyListByTwo :: (Num a) => [a] -> [a]
multiplyListByTwo = map (*2)

That compiles to a 1.6MB executable (after stripping).
It runs in 0.092~0.098 secs on my machine.

Compare to:
#!/usr/bin/env perl
$\ = "\n";
for (my $i = 1; $i <= 1000000; $i += 2) { print $i }

Namespaces don't really hurt. I'll try to "fix" it though:
template <typename T>
T multiplyListByTwo(T in) {
    T out;
    for (auto val : in) out.push_back(val * 2);
    return out;
}
That is a little less messy, and to the untrained eye it almost looks like duck typing.

But the first one wasn't templated. A fairer comparison would be:
vector<int> multiplyListByTwo(vector<int> in) {
    vector<int> out;
    for (int val : in) out.push_back(val * 2);
    return out;
}

Which is perfectly understandable as long as you know any other C-based language. In fact, this should be valid Java code with some minor modifications (namely the fact that you would have to explicitly initialize the vector first).

Hw plox

Well, IMO Nim(rod) is pretty cool as a better-C replacement if you can stomach Pascal-like syntax (and has macros!). Except it has a non-optional soft-realtime GC, but that hurts only if you need to be hard-realtime.

Racket is really nice, especially for teaching, but it's nowhere near as performant as C/Nim/Rust.

As for Rust's syntax they seem to have gone into "make everything explicit as fuck even if it makes language read worse" direction I'm not sure I'm fond of. I mean, yeah, you can layer syntax sugar on top of that, but still. It felt more elegant before, though maybe it's a conscious choice, just like the "sacrifice neat features if performance suffers" approach which killed spaghetti stacks or optional GC.

Oh, I didn't do that to make C++ appear worse than it is, I just prefer to be explicit about where symbols come from (for example I almost never refer to symbols in namespaces directly when writing Clojure, preferring to alias them so I can write `time/now` instead of `clj-time.core/now`).
I did however consciously not use C++11 features like the other user did*, because it's fairly new. As you can see, using them kinda helps, but it's still fairly verbose compared to just `map (* 2)`.

Also yeah, that's Haskell, just with explicit argument compared to the other user who used currying.

* - actually, him templating the whole function is somewhat more true to parametrically polymorphic spirit of the Haskell version, so yeah, my bad.

Well, yes. There are several topics "discussed" in the labs - classes, multiple inheritance, RTTI, exceptions, templates, STL containers, STL algorithms and probably a few I'm forgetting. If you cover fewer than 2/3 of those they dock you points, which is so stupid because most programs second-year students can write don't need esoteric things like RTTI or multiple inheritance.
Oh, and by "discussed" I mean the students are given a topic to prepare by themselves, with the lazy bum of a supervisor nodding absentmindedly because they don't know C++ well enough to tell if students make shit up either.
No wonder there are so many dumbfuck programmers nowadays.

Actually, the templated version is more apples to apples, since Haskell functions are parametrically polymorphic like C++ templates by default. I just forgot about that when writing the example. : V

And yeah, it is understandable if you know something C-like, but the point is there's just so much line noise for something so simple, it obscures the clou of the algorithm. Imagine how it would've looked for something more complex. And yes, I'm very well aware you can't as easily compose imperative code together as you can with functional one.

While "food comes in, shit comes out" is probably much easier to do in functional languages, I would rather kill myself than attempt to make anything that has to hold state between iterations while avoiding state.

It's a shitty rip-off made by a retarded nigger named Rob Pike (aka Knob Kike) who had to make an alternative to C++ so retards like himself could use. Mozilla did the same thing. Retard SJW niggers couldn't use C++, so they created Rust to replace it. C++ is for craftsmen, and Go is for code monkeys churned out by academia to be drones for companies who just want some cheap, slapped together software.

I did this because IMO C++11 makes C++ a much more palatable language. I don't really use C++ pre-C++11 any more.

It's really not as difficult as you think.
I'm not great at haskell but here's an example of something you'd use "state" for:
count :: (Eq t) => [t] -> t -> Integer
count els what = count0 els 0
  where count0 (el:els) acc = count0 els (if el == what then acc + 1 else acc)
        count0 [] acc = acc
It counts the amount of "what"s in a list, going from first to last, then returns that amount. It keeps an "acc"umulator as it goes along (which is the count so far). A lot of stateful loops can be rewritten like that.

If not, there's always wiki.haskell.org/State_Monad

s/you'd use/you'd usually use/

Not necessarily, it's not much of a problem to pass state along as function argument in recursion, for example and just return that when recursion terminates. It might feel weird when you come from languages where everyone said "omg don't use recursion it blows the stack", but once you become accustomed to that (and that tail recursion is isomorphic to loops so doesn't blow stack) it's fairly natural. It all depends on how you were taught programming, the other paradigm will invariably feel weird at first.
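The same accumulator-passing trick works even in an imperative language like Go; a sketch (countRec is a made-up name, and note Go does not guarantee tail-call optimization, so this can blow the stack on huge inputs - it's only to show the shape):

```go
package main

import "fmt"

// countRec counts occurrences of what in xs by threading the
// running count through as an argument instead of mutating a
// local variable - the functional accumulator style.
func countRec(xs []int, what, acc int) int {
	if len(xs) == 0 {
		return acc // recursion terminates: return the state
	}
	if xs[0] == what {
		acc++
	}
	return countRec(xs[1:], what, acc)
}

func main() {
	fmt.Println(countRec([]int{1, 2, 1, 3, 1}, 1, 0)) // 3
}
```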

Also, not all functional languages are as obnoxiously pure as Haskell either, in Clojure for example you can just do
(let [state (atom initial-state)]
  (loop []
    (use-state @state)
    (reset! state modify-state)
    (recur)))
and it's not really all that annoying to write, other languages like MLs also have something similar:
let val state = ref initialState
in
  while true do (
    useState (!state);
    state := modifyState (!state)
  )
end
and even in Haskell you can use the Writer, Reader and State, though it's certainly more annoying because it changes the signature of the function (which makes sense, since it's no longer stateless, but not everybody cares as much about that).

But I'll give you this - while some algorithms are easier to express with recursion, some algorithms are certainly easier to express imperatively with mutability, especially if the state is non-local for some reason (like if you have a global cache of something for performance reasons). So depending on what exactly you are implementing one can be more awkward than the other for sure.

i do not hate Go. i don't think about it at all.

Yeah, I agree. While it's certainly possible to write a functional game engine with tail recursion, I can't imagine how fucking slow it would be with immutable data structures. Same goes for anything regarding databases, more because of spergs sperging about muh purity rather than it being actually difficult.

HW let me post

Yeah, with post-C++11 features it's certainly a more digestible language, but I kinda wanted to illustrate the general idea of implementation details sometimes getting in the way of the clarity of algorithm.
Sure, range-based for helps in this case, but not everything can be abstracted enough - it's a price you have to pay for performance and unrestricted mutability.

Well, immutable *persistent* data structures are not as inefficient as you may think - for example because they are immutable there's no need for defensive copying and you can share the parts of the datastructure that didn't change between copies.
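A tiny illustration of that structural sharing, using an immutable cons-style list in Go (names invented; real persistent collections use trees, but the sharing principle is the same):

```go
package main

import "fmt"

// node is an immutable singly linked list. "Updating" the list
// means allocating one new node that points at the old list,
// so the unchanged tail is shared rather than copied.
type node struct {
	val  int
	next *node
}

// push returns a new list with v in front; head is untouched.
func push(head *node, v int) *node { return &node{v, head} }

func main() {
	a := push(push(push(nil, 3), 2), 1) // list 1,2,3
	b := push(a, 0)                     // list 0,1,2,3; shares all of a
	fmt.Println(b.val, b.next == a)     // 0 true
}
```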

Incidentally, immutability in databases actually makes sense, because it makes transactions simpler to implement, for example. PostgreSQL uses MVCC to implement transactions, which basically means it doesn't immediately mutate the datastore, but keeps snapshots of database changes as long as they are relevant to currently running transactions, and only compacts them after it's sure it won't introduce any inconsistencies.

But yeah, writing a graphics engine with immutability sounds like a sure way to shoot the performance to hell due to high allocation churn; IIRC the Haskell clone of Quake was pretty slow at ~60 FPS (Athlon XP 1900+ CPU, 512MB of DDR266 RAM and an Nvidia GeForce 4 MX 64MB).
On the other hand, writing your game logic that way sounds quite feasible, as it's not as performance-sensitive as rendering, and might have nice benefits (easy interpolation between previous and current simulation frames, the ability to write as much logic as you want as pure functions, and potentially easy time travel if needed).
Would be interested in trying writing logic like that one day.

Game engines are often composed of several "subengines", such as the rendering engine, the physics engine, the AI engine...

In the case of physics, for example, I doubt it would be a great idea to write a stateless, pure engine. Sure, you can do it, but considering how fast most engines tick and, depending on the game, how fast objects change their state and/or generate new objects, you would have to keep many "copies" (they aren't full copies, but you get my drift) of the data structure holding the bodies, which would severely abuse the garbage collector and allocate/deallocate way too much memory to be feasible.

Furthermore, in order to build the input engine, you would have to hold some stack that could asynchronously receive input and then process it at the right moment. You would need to hold some state at some point if you don't want to lose input between pollings, but that would make the engine no longer "pure".

I have no doubt it's possible to write some sort of "stateless" pure engine, but I doubt the practicality of it. That said, it would be a very interesting project, and maybe I will get to it some day if I feel like being a masochist.

Talking about databases, the concept of an immutable database is great because you know how ugly things can get if you have multiple connections, but I am not sure how you would manage pulling information out of a database without compromising the purity of a function. You would introduce "impurity" at some point of the execution, even if all following functions can be perfectly pure. Although I haven't done functional programming in depth, I am not sure how important it is to keep everything pure for functional programmers, but the few ones I knew were easily triggered by anything that smelled of state (I knew one who was triggered by singletons not because they were useless, but because of muh impurity).


Well, like I said before, I don't mean the whole engine (though the guys who wrote that Quake clone have publications on it if you're interested - wiki.haskell.org/Frag), because as you say it's not very practical; things like graphics or physics would probably buckle hard.

Input and game logic are probably the most feasible to try FP in, since that's where it can net you concrete benefits, as I mentioned previously. WRT input there are FRP (functional reactive programming) libraries that deal with all that impure stuff for you and you get a pure stream of events you can derive the current state of your application from. It's pretty interesting, really.

Well, a perfectly pure program would be a black box that neither consumes input nor produces output and just eats your CPU cycles for some inexplicable reason. For any practical program you need some impurity to talk to the external world; the trick is to partition it in a way that keeps most of your computation pure and only admits impurity where it's needed for the program to interact with the world.
I don't claim to have a PhD in monadology or whatever, but as far as I understand in Haskell you use the I/O monad to build a sequence of pure output actions given some inputs, and the runtime takes care of the impure part, injecting the impurely acquired inputs the rest of the program can then treat as pure.
As for the seemingly knee-jerk reaction to impurity - it's somewhat justified because the more impure you get, the more you lose the benefits of FP (unless you have one of those fancy type systems that can track effects), but it's - like it always is in engineering - a question of trade-offs.

As a side note, immutability in databases is a fun topic.
There's Datomic by the Clojure guys, where they took the EAV databases and layered time onto that, so you can basically query the database as it was at any point in time (the Datalog query language they use is also neat, it's basically querying the DB with Prolog). As a bonus, replication and caching are vastly simplified due to that.

Another interesting concept is Event Sourcing, where you basically only record the inputs from the user and derive the current state of the application when you need it, which means you don't really need a real database, just some durable log to store the events in.
This allows you to easily have multiple views on data, interpreting the events as it fits the concrete use case (this ties well with concepts such as CQRS and DDD, feel free to look them up if interested).
Also in relational databases if you don't think in advance about the kinds of statistics you need you're screwed - with ES you can derive them post factum easily, since you have all the intermediate data.
Another consequence of that is you don't have to share your logic between the application and database - since the state only exists as a derived value in-memory, then all the querying and business logic can be written in your language of choice; no need to write PL/SQL scripts ever again.
Also important: immutability lets you have easy replication and synchronisation (for example between your webapp and a smartphone client) because you only need to merge immutable events into one stream, not reconcile different mutable states on both ends or lock everything in transactions.
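To make event sourcing concrete, here's a toy sketch in Go (the Event type and Replay function are made up for illustration, not from any real ES library):

```go
package main

import "fmt"

// Event is a hypothetical event type for a toy bank account;
// the names are illustrative only.
type Event struct {
	Kind   string // "deposit" or "withdraw"
	Amount int
}

// Replay derives the current balance purely from the event log.
// Different "views" on the data are just different folds over the
// same immutable slice of events.
func Replay(log []Event) int {
	balance := 0
	for _, e := range log {
		switch e.Kind {
		case "deposit":
			balance += e.Amount
		case "withdraw":
			balance -= e.Amount
		}
	}
	return balance
}

func main() {
	log := []Event{{"deposit", 100}, {"withdraw", 30}, {"deposit", 5}}
	fmt.Println(Replay(log)) // 75
}
```

The point is that no mutable database state exists anywhere: the durable log is the source of truth, and any statistic you forgot to plan for can be derived later by another fold over the same events.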


Bloat is a buzzword.

Go is literally everything that's bad about C combined with everything that's bad about Java. Oh but we have goroutines! A novel construct so new that everyone had it 20 years before Go was even conceived!

The only new language worth a shit is rust.
Then again, it's not very fair because rust is actually ridiculously good. The only bad part about rust is who's behind it (namely mozilla) because they enforce a draconian, anti-productive, suicidal CoC on their communities. Thankfully, it does not apply to the compiler itself and other projects in rust are free of that bullshit.

When you do a functional update on objects, you don't actually make a deep copy of the object. You copy almost nothing, in fact, because the compiler can optimize the operation and make the new structure share any non-modified data with the previous object. Functional programming languages have GCs that are tailored to this approach: they allocate objects on a "fast heap" and a "slow heap". The "fast heap" is freed every frame if the object is dead. If it's alive, it's moved to the slow heap instead. The logic is that objects that survive a frame are typically going to live long, so the overhead is asymptotically irrelevant, whereas most objects will die very quickly, such as temporary variables and iterated builds (i.e. quick functionally-updated structs). One example of this GC scheme is realworldocaml.org/v1/en/html/understanding-the-garbage-collector.html and as far as I know, haskell uses a very similar scheme, but with modifications for parallelism, among other differences.

In practice, if you do many writes per frame, it will be slower to use functional data-structures. However, if you do few updates every frame, not only will non-functional structures no longer be faster, they'll in fact be notably slower.
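A minimal illustration of that structural sharing, written as a persistent linked list in Go (toy code, not any real library):

```go
package main

import "fmt"

// Node is a cell of a persistent (immutable) singly linked list.
// "Updating" the list allocates exactly one new node and shares the
// entire old tail - no deep copy, which is why functional updates
// are cheap.
type Node struct {
	Val  int
	Next *Node
}

// Push returns a new list; the old list is untouched and fully shared.
func Push(v int, tail *Node) *Node {
	return &Node{Val: v, Next: tail}
}

func main() {
	shared := Push(3, Push(2, Push(1, nil))) // [3 2 1]
	a := Push(10, shared)                    // [10 3 2 1]
	b := Push(20, shared)                    // [20 3 2 1]
	// Both new lists point at the very same tail nodes in memory.
	fmt.Println(a.Next == b.Next)         // true
	fmt.Println(a.Val, b.Val, shared.Val) // 10 20 3
}
```

Two "versions" of the list exist at once, but only two new nodes were ever allocated; everything else is shared, which is exactly what makes the generational GC scheme above pay off.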

For input, surely a reactive style would be a better approach. It's already a popular style for some C or C++ input managers and it meshes well with functional principles.

Over all, I think you greatly underestimate how powerful modern functional optimizations are.

In the end, pure functional is only a useful representation at the conceptual level. The underlying system will always be procedural instead. The thing is, we don't care. The user code is safe from any user-level code, and that's the point of it.

The binaries are literally 100 times larger than my existing code to do the same thing but slower and with more RAM. That's bloat.

I knew that post sounded familiar. Rust is a disaster, deal with it.

enjoy your memory leaks, buffer overflows, and dangling pointers your Heartbleed faggot

Butthurt gotard detected. Enjoy your meme language that's already dead. Meanwhile rust is only growing because unlike your meme, it's a genuinely good language with no real flaw.

stay mad bro

going off topic but i dont care. have you heard about nim? nim-lang.org/

seems to be good as well.

What a faggot. Abandon ship, owner/maintainer is a simpering beta.

What a piece of shit. Get a better box instead of crying about other languages.

.NET stack. Especially if you want to actually find employment in the world.

Pick one.

Top kek, what faggots. Can't even at least get an American company for part sourcing?

Yes, I've checked it out too, but it's pretty shitty actually, and mostly because of the main dev. He's so autistic that despite everyone telling him otherwise, he refused to believe that making aFunctionName be a synonym for a_function_name or AFunctionName was a horrible idea, for instance. As I recall, he finally decided to accept that maybe it wasn't such a cool feature after 2 years of his community dying. Nim also doesn't actually do anything new or interesting at all, has a GC, and compiles to C (as opposed to having a native or IL target). It currently has a rather poor ecosystem and community and I really don't see it ever becoming successful.

I haven't even read half the thread yet and I'm already seeing the same misinformation that I keep seeing for Go. Why do people keep doing this? Can you please at least check your facts before posting? I'm starting to believe people are doing this on purpose.

/tech should embrace Go. It is fast and compiles to a binary. Most SJWs will be scared off because of pointers (too hard). It has all the same autism as C so you should feel right at home!

Unless it's been added very recently, go doesn't support dynamic linking at all. This was a decision made on purpose by rob pike because he's literally got brain problems and believes that dynamic linking is """bloat""". If this has changed, then either he backpedaled (a good thing), or someone less retarded than him took the reins (also good). Either way, it's not misinformation.

Go has supported dynamic linking via gcc since 1.0 iirc and officially in the Go compiler since 1.5.

Either way it's wrong.

After some googling I found this:

So until less than a year ago, it had 0 dynlink support, and I don't know if it supports dynlink on any platform beside amd64 at this point.

Literally the worst of both worlds.

Stop moving the goalposts and, more importantly, stop spreading misinformation. It has been possible since release, it is now officially supported, and in 1.6 they supported more platforms. To say that it's not supported is flat-out wrong.
Even prior to 1.5 the official compiler could link in C libraries dynamically.

Even if it was the onus is on you to check that your facts are up to date before speaking on a subject you're not familiar with.

Fuck off, pike, you're fooling nobody.

Go is just "C with shit syntax, no libraries, a forced GC, no parallelism, no low-level features, and slow". Why would anyone want to use it? To top it off, it's SJW-controlled.

Hopefully, neither are you.

The standard library seems pretty good for what it was designed for: network services, web servers, etc.


It's probably a consequence of the GC's global lock. Go doesn't do parallelism. The go devs think it's a good thing.

What is the largest project anyone here has written in Go? Hello, World does not count.

To be fair, there's docker.

and IPFS.

And I'm not talking about memory leaks, rather stack overflow and co, which are much more common than you can imagine.

So explain to me what the fuck goroutines and channels are, then. Go *is designed* for parallelism. But they call it "concurrency", which is roughly the same thing for fucktards like you.
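For reference, this is all goroutines and channels amount to - a toy fan-out/fan-in sketch (SumOfSquares is a made-up name for illustration):

```go
package main

import "fmt"

// SumOfSquares sends n jobs to `workers` goroutines over one channel
// and collects the squared results over another. The channels give
// you concurrency; whether the workers also run in parallel depends
// on GOMAXPROCS and the number of cores.
func SumOfSquares(n, workers int) int {
	jobs := make(chan int, n)
	results := make(chan int, n)

	for w := 0; w < workers; w++ {
		go func() {
			for j := range jobs {
				results <- j * j
			}
		}()
	}

	for i := 1; i <= n; i++ {
		jobs <- i
	}
	close(jobs) // lets the worker range-loops terminate

	sum := 0
	for i := 0; i < n; i++ {
		sum += <-results
	}
	return sum
}

func main() {
	fmt.Println(SumOfSquares(5, 3)) // 1+4+9+16+25 = 55
}
```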

Okay, you're probably confusing it with the GIL found in interpreted languages like Python or Ruby. Also, document yourself about Go's GC.

Source please.

kek indeed

It's literally designed to have no parallelism. You can even google "go parallelism" and what do you get? 20k results about a go dev presentation that explains that concurrency != parallelism.
Oh, you're just trolling. 6/10.

Prove me wrong.

Explain to me what the difference is between multiprocessing and concurrency.

Not the same guy, but I only know of one language that has parallelism built in and that is Julia. It's essentially a wrapper for IPC and RPC, so in the language itself you can run functions across any number of processes and machines just with a simple macro.

A lot of languages have parallelism, from C and C++ to rust and haskell. It is not, in any way, shape or form, a wrapper for IPC and RPC.

Hey, you weren't at the last GDF meeting, were you? Did you get your decoder ring? They're about to change the secret handshake.

P.S. The next password is "TypeScript." Remember that. Idris won't work.

The #1 reason that Go is a shitty meme language:

It doesn't have a debugger. :^)

silly boy, nobody writes bugs in go, so there's nothing to de-bug

If you need a debugger in any language you can't code for shit. :^)

Meanwhile, even docker is chock full of bugs.

I haven't played with Rust much, but I liked Cargo. What is your problem with it?

I find rust code far more readable than perl

cargo seems pretty good to me, what are your specific issues with it?

He's probably just memeing. Nobody can seriously think rust is hard to read, let alone harder to read than perl, to begin with. Hell, it's even clearer than C.

Atheism was fun while it lasted.

You're nuts, even tutorial-level code needs a lot of explanation in rust.

let guess: u32 = match guess.trim().parse() {
    Ok(num) => num,
    Err(_) => continue,
};
println!("You guessed: {}", guess);

There's a lot going on in the first goddamn tutorial of the book that has nothing at all to do with how the machine it's running on works.

Not sure if retarded or just pretending.jpg

So wait, all parallelism is concurrency, but not all concurrency is parallelism? Is this correct?

Yes, that is correct.
Parallelism means two atoms can execute simultaneously, whereas concurrency means tasks are split into atoms with interleaved execution.



Thanks for confirming you're clinically retarded.

I've never used a debugger other than to backtrace and reverse engineer. Who does this rather than just instrument the code? That's far more flexible than trying to reconstruct the info you wanted every time.

Average go advocate, ladies and gentlemen!

I'm actually a pretty low level C/C++ programmer.


I'm curious as to why you think it's so shocking that people do inline debugging rather than use debuggers. There are plenty of situations where you can't use a debugger at all, extremely few situations where it is more convenient than inline debugging, languages with no debuggers or a wildly different debugger for each language. How much thought have you put into this, or are you just some loser undergrad who wants to pretend and shit up Holla Forums in the process?

I don't even know rust and that code is clear as day. There is nothing at all I don't understand in that code.

Explain what the exclamation mark here is doing and why it couldn't have been written without it like the other functions.
It took me a day to originally figure out what was actually happening as the result of using "println!", can you figure it out?


Dude it's right there in the second chapter, directly after hello world:

guess why i don't use rust?

Ok, faggot. Let's see how clear it actually is.

So it's a macro, not a function. To understand what it actually caused to be dumped into your code, and which actual functions were executed on its behalf, you'll need to unwind it. Here, I'll help you. Here are the relevant pieces.
#[macro_export]
#[stable(feature = "rust1", since = "1.0.0")]
macro_rules! println {
    ($fmt:expr) => (print!(concat!($fmt, "\n")));
    ($fmt:expr, $($arg:tt)*) => (print!(concat!($fmt, "\n"), $($arg)*));
}

#[macro_export]
#[stable(feature = "rust1", since = "1.0.0")]
#[allow_internal_unstable]
macro_rules! print {
    ($($arg:tt)*) => ($crate::io::_print(format_args!($($arg)*)));
}

#[stable(feature = "rust1", since = "1.0.0")]
#[macro_export]
macro_rules! concat {
    ($($e:expr),*) => ({ /* compiler built-in */ })
}

#[stable(feature = "rust1", since = "1.0.0")]
#[macro_export]
macro_rules! format_args {
    ($fmt:expr, $($args:tt)*) => ({ /* compiler built-in */ })
}
Wow, it's so clear! And it's not even all there as some of it has been implemented as compiler built-ins (...why?)! The added pieces from the compiler of all places that you need to understand to make sense of what it actually did are too long to post on cripplechan, so here are just the two function definitions to look up:
pub fn expand_format_args Box {
Even iostreams isn't this incomprehensible, and I'd call iostreams operator overloading a clarity disaster. glibc's implementation of vfprintf is more clear (and shorter) and it's a hand-written jump table! Even a variadic template printf (the modern C++ equivalent to this mess) is more clear, and it's the least clear syntax of a major language I'd seen up until this point. Do you realize that this is the kind of code that you'll have to write if you ever leave the tutorial and attempt non-trivial rust? That anything remotely similar to something as incredibly simple as println will require oceans of code excrement to make it work?

Now you should be asking yourself, "Why wasn't I more curious as to what was going on here? I totally missed the obvious clue in that example that rust would be an incredibly unclear language. Am I going to be a webdev when I grow up? How have I failed this badly?". But you won't because of that lack of curiosity.

Good fucking goy

Everytime you post you show even more that you have brain problems. I don't know why you even bother anymore.

Have fun in webdev.

So expanding a templated string at compile time is "incredibly simple"?

Hey, here's a fun task, why not write me some C++ that does that. If it's incredibly simple, it should be easy for you right?


Mmph, something as incredibly simple as println. Oh baby lets see how simple it is.

__printf (const char *format, ...)
{
  va_list arg;
  int done;

  va_start (arg, format);
  done = vfprintf (stdout, format, arg);
  va_end (arg);

  return done;
}

oh baby, how simple, oh wait, we're just calling another function.

It's called... vfprintf.

Let's take a look at that shall we.


Damn, only 2k lines of code, making heavy use of C macros. So simple, I love it. Clearly this is superior.

No, please, don't use Go. Just... don't. You won't like it--it will give your applications herpes and kill your youngest iguana.

I'd not do it at compile time as that would be retarded. Localizers will often want to modify order and formatting and do so without knowing how to do software development so format strings are much better treated as data and processed at runtime. If I did want to do it at compile time I'd just do the same trick as gettext's macro but compile the extracted string. It'd be pretty clear. Qt also used to do something similar for its slots system a long time ago but I have no idea if they still do.

I already mentioned vfprintf uses a hand-coded jump table and despite being very unclear is still more clear than rust's code. Have you looked at the rust functions I pointed you to but couldn't paste here? I'm betting not.

By the way, string formatting in rust is so poorly designed that it doesn't even support localization.
It will have to be re-coded at some point in the future although the interface itself shouldn't change. The implementation is a total loss.


I'm waiting for your reply here. Did you read rust's code? The post you're replying to is a separate topic I included for the guy (possibly you) who seems to think compile-time formatting is a feature, when they now realize it was a mistake.

I don't want to interrupt your discussion, but since this is actually a thread about Go, why don't you take a look at Go's implementation of the C-like printf?

In my opinion, this is extremely straightforward, like pretty much all Go code, and it's one of the reasons why I like this language.

Because Go's a hopeless joke of a language as GC makes it unsuitable for everything but replacing Ruby. Who cares.

if you want a language that no one cares about you'd have rust.


728 lines of code for something that's well commented and statically verifies that all of the data being passed is capable of being displayed in a sane way, whereas the whole printf mess in C explodes if you give it things that aren't the right size.

Rust not only manages to implement it in fewer lines of code, but rather clear lines of code, and it prevents a lot of the "btw remember because of the fact that we can't into static analysis you have to make sure to only handle this function in this specific way!"

As for the expand_syntax_ext, that's a function that is used to parse all macro syntax, it's not unique to println, and it's only 71 lines of code, bringing us to a grand total of 799 lines of code in something that is doing much more than simply printing text, but is actually a component of expanding all function arguments, which thus should be compared to parts of C's pre-processor.

Because it's not equivalent. glibc's is long due to optimizations making it screaming fast. Their jump table-based parser reads almost like compiler output and it calls several functions hand-written in assembly, but it'd be very hard to go any faster. Rust's approach was to sidestep optimizing parsing and process the string at compile time instead. The problem with that is it's amaaaazingly retarded and prevents internationalizing code (and here I thought Mozilla was the champion of diversity..). As a result, all this code will need to be rewritten or deprecated, at which point they'll have to pay attention to optimization, too.
So which Rust dev are you?
let pat = self.ecx.pat_tuple(self.fmtsp, pats);
let arm = self.ecx.arm(self.fmtsp, vec!(pat), args_array);
let head = self.ecx.expr(self.fmtsp, ast::ExprTup(heads));
let result = self.ecx.expr_match(self.fmtsp, head, vec!(arm));

You are literally the most retarded poster I've encountered in 10 years. Was that part of your plan?

fuck internationalization. if it ain't English, it ain't worth it


Yeah guys fuck compile time checks, heartbleed wasn't a bug, it was a feature, we don't need to worry about the sizes of strings and buffers!

C's library isn't any more capable of internationalization than Rust is, both of them are equally easy to implement it on top of.

It's not a feature of the standard library because both are able to target embedded systems and don't need more bloat than necessary. Checking at runtime and loading the correct map of strings is a simple task for anyone with even minute programming skill. I'm sure even you, who can't understand the simple pattern matching code could read the first chapter of the book and manage it.

I like how what you just posted doesn't relate to what that user said in any way.
And what I like even more is the fact that this dumb picture doesn't have anything to do with the discussion at hand either.
Stay classy, Holla Forums.

But that's not true. If you read the link I gave, one of the Rust devs explains the inability to internationalize is a known problem they decided to wait until after 1.0 to address.
And this is just dumb. It's especially dumb as rust's method will cause more bloat. If you're confused here, they're effectively baking part of the parsing and formatting per string into your code rather than deriving it at runtime which will obviously require more code/data unless you were formatting literals. In other words, they bloat your code in an attempt to increase performance. A better way would have been to handle it like compiled regexes (and I bet when they rewrite it that's what they'll do) but rust's community banished devs like me who could have corrected them so they currently have a fuck up on their hands.
This won't work in rust. Pay attention.
Fuck you. Every time I post you learn more about this and how little you actually understood.

How many retard Olympics gold medals have you won this year? I'm sure a grandmaster like yourself could not fail to acquire more than 30!

Doesn't it make you feel bad that you know so little about this stuff that the best rustlin' you're capable of are random insults? We could have a real good tech fight here but you came unarmed.

If you read your own link you'd see that they actually link to a package that does it, while stating that internationalization is a feature that's not used widely enough to warrant being put in the standard library.

Reading comprehension, user. Graydon posts about the general and unsolved Internationalization issues in fmt, how they plan to eventually solve it, the code they've been working on to solve it, then gives a half-answer to the streetshitter asking about l20n, referring him to a third party library. Exert yourself.

Somewhat unrelated, but no one, including me, gives a shit about l20n. It's the xhtml2 of internationalization. I hope they don't fall for it. Mozilla is the only group still pushing that shit uphill.

Nobody gives a shit about you either, user; next thing we know you'll be namefagging.

Dem digits tho