Insane bloat

* 21k lines of C code, in 2018 (WHY?)
* literally pages of open segfault bugs (GitHub issue search: is:issue is:open segmentation)
* implements an entire flex/bison-generated parser, plus a compiler and bytecode interpreter (in C)

* implemented in rust, yet depends on V8 (node js)
* works by parsing the query in rust and then handing the result off to nodejs (WTF?)
* contains a 70k line javascript file

* 60 lines of nodejs


Why even use C if you're using JSON? Why do people do this?

It's easier for humans and machines to parse than XML.
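To make the comparison concrete, here's a minimal stdlib sketch (the record and field names are made up for illustration): extracting a typed value from JSON is one call, while the XML side hands you untyped element text to navigate and convert yourself.

```python
import json
import xml.etree.ElementTree as ET

# The same made-up record in both formats.
json_doc = '{"post": {"no": 11318862, "name": "Anonymous"}}'
xml_doc = '<post><no>11318862</no><name>Anonymous</name></post>'

# JSON: one call, and you land directly in native dicts/ints/strings.
post = json.loads(json_doc)["post"]
print(post["no"], type(post["no"]))      # 11318862 <class 'int'>

# XML: you get an element tree of strings; types and structure
# (attribute vs child element) are your problem.
root = ET.fromstring(xml_doc)
print(int(root.find("no").text))         # 11318862
```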


I'm going to be sick...

No bloated C is bad because bloated code in any language is bad.
People would be more likely to take you seriously if you actually posted your own opinion instead of copy-pasting the same canned opinion in every thread.

It's (you) again.

o I am laffin
C is more verbose than other languages; that doesn't mean the implementation is bloated or bad, it's just meticulous.
Define bloat: do you simply mean more lines, or the typical meaning of the word, full of useless features?

Is JSON insane bloat? I've never used it; I thought it was just some kind of map?

I mean loads of useless (already implemented elsewhere on targeted systems) features. From OP:
I can use the classic OpenSSL example of re-implementing malloc as another example of bloat. Of course there were (still are?) many other issues with this, but that's beside the point.

C is verbose, bloated, and bad.

How about more lines and no useful features?

The solution, as any well-seasoned unix veteran will tell you, is to use "tar" if you want to copy a hierarchy. No kidding. Simple and elegant, right? For me, using "tar" to copy directory hierarchies is the archetype of what is fundamentally wrong with Unix, the fundamental wrongness being that in Unix the user is forced always to say *how it is* s/he wants something done, rather than *what it is* s/he wants to do.

The notion of using the tape archiver to move a directory hierarchy is not inconsistent with the notion of using C to write robust, maintainable programs. Think about it.
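For contrast, a "say what you want" interface looks like this: a minimal Python sketch using the stdlib's shutil.copytree, which copies a hierarchy in one call with no archiver involved (the paths and file contents are made up).

```python
import os
import shutil
import tempfile

# Build a small hierarchy to copy.
base = tempfile.mkdtemp()
src = os.path.join(base, "src")
os.makedirs(os.path.join(src, "sub"))
with open(os.path.join(src, "sub", "file.txt"), "w") as f:
    f.write("hello\n")

# One "what"-level call, instead of the "how"-level tar-pipe idiom
# the quote above complains about.
dst = os.path.join(base, "dst")
shutil.copytree(src, dst)

print(open(os.path.join(dst, "sub", "file.txt")).read())  # hello
```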

It's a language. Do you expect it to not implement a lexer and parser? If not, would it really be better to roll their own?

check your reading comprehension

Are you saying that it reimplements bison itself? Where?

Use C for the hard stuff, and a proper parser written in something else for everything else.

JSON is easy to parse.
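It really is: a workable parser for a JSON subset fits in a few dozen lines. This is only an illustrative sketch (no string escapes, no exponent notation, no error recovery, and all numbers come back as floats), not a substitute for a real parser like jq's.

```python
# Minimal JSON-subset parser: each helper returns (value, remaining text).
def parse(text):
    value, rest = _value(text.strip())
    if rest.strip():
        raise ValueError("trailing data")
    return value

def _value(s):
    s = s.lstrip()
    if s.startswith('{'):
        return _object(s)
    if s.startswith('['):
        return _array(s)
    if s.startswith('"'):
        end = s.index('"', 1)          # no escape handling in this sketch
        return s[1:end], s[end + 1:]
    for lit, val in (("true", True), ("false", False), ("null", None)):
        if s.startswith(lit):
            return val, s[len(lit):]
    i = 0                              # number: consume sign/digits/dot
    while i < len(s) and s[i] in "-+.0123456789":
        i += 1
    return float(s[:i]), s[i:]

def _object(s):
    s = s[1:].lstrip()
    obj = {}
    while not s.startswith('}'):
        key, s = _value(s)
        s = s.lstrip()[1:]             # skip ':'
        val, s = _value(s)
        obj[key] = val
        s = s.lstrip()
        if s.startswith(','):
            s = s[1:].lstrip()
    return obj, s[1:]

def _array(s):
    s = s[1:].lstrip()
    arr = []
    while not s.startswith(']'):
        val, s = _value(s)
        arr.append(val)
        s = s.lstrip()
        if s.startswith(','):
            s = s[1:].lstrip()
    return arr, s[1:]

print(parse('{"posts": [{"no": 1, "ok": true}]}'))
# {'posts': [{'no': 1.0, 'ok': True}]}
```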

Okay, given that there's a big FSF copyright header at the top of parser.c, I think it's safe to say that they used flex and bison in the standard way.
Which means that parser.c and lexer.c are generated from the .y and .l files.
So I'm asking you again: would it have been better for them to write their own parser and lexer, instead of handing the task over to an old respectable set of code generators? Did you count the generated code as part of the 21k lines?

Probably the smallest standard library of all the usable languages.
MANPAGER="sed -n '4p'" man pax
pax — read and write file archives and copy directory hierarchies

Okay, I think I figured out where you went wrong. You're comparing jq to the 60 lines of node.js used by jq.node.
Let's say I've got a server running a pretty old system that doesn't package a useful JSON processor. I could use jq, or I could use jq.node.
If I use jq, I make sure I have some very basic dependencies installed, I put the code on my server, and I compile it. It's very easy. I know that because I've done it, on my server.
If I use jq.node, I have to install all of node.js and the node package manager before I can even get started. Now, I don't know exactly how complicated building node.js is, but I know it's a lot more complicated than building jq. Look at this shit:
Look at that. Building node.js actually requires Python. What the fuck?
jq.node isn't just sixty lines of javascript, it's an entire bloated javascript implementation. You're writing node.js and pretending you're using a language implemented in 60 lines. It's enormous, bloated, and a hundred times as slow as jq according to the benchmarks provided by jq.node itself.
jq.node has a parser, and it's much larger than jq's, it's just in a different repository.

isn't java like flash? I'm surprised people use this shit. Hell, even hooktube makes me disable noscript because of json bullshit.
It gets tiring trying to defend yourself against botnet. And that's just from a browser perspective.

This might be bait but if it isn't bait then please stop talking about this subject until you know more about it

>contains a 70k line javascript file
jq.node uses that one too, it just doesn't put it directly in the repository because it expects you to pull it in with npm.

quads of truth tbh
if Holla Forums followed this advice, it would be a better board

why should anyone care about this project anyway? never heard of this shit before.
do a proper post with all the info, or fuck off this board.

Oh my god, who gives a fucking shit about how many lines of code it has? I just want the one that works the fastest and most reliably.

No you, /g/ owns this board now.

Just because an implementation of something already exists, doesn't mean it's the best implementation, especially in all contexts. I unironically trust anything made by the OpenSSL team more than I would the standard.

Ironically, sometimes people reimplement things without the extra features, in order to be more performant or portable.

kill yourself

$ ls a
directory-a  directory-b
$ cp -r a b
$ ls b
directory-a  directory-b

Well, maybe you were talking about backups.
Your argument is still retarded because tar was literally made for tarrin' shit up for tape backups.
It's literally the purpose behind the tool that you are griping about using it for.

Exactly this. The OS was not meant for desktop use and it's really only like 1% who persist anyway. Unix was meant for servers or phones or any place where the permanence of data was not a thing.
This is one of the reasons Windows GNU thing is so fucking funny.

This will give different results depending on whether b exists or not.
Use `cp -r a/. b/` to copy reliably.
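The difference is easy to reproduce by driving cp(1) from a script (a sketch assuming a GNU/BSD cp: when the destination directory already exists, the source is copied *into* it rather than *onto* it).

```python
import os
import subprocess
import tempfile

# Reproduce the cp -r pitfall: the result depends on whether the
# destination already exists.
base = tempfile.mkdtemp()
os.chdir(base)
os.makedirs("a/sub")

subprocess.run(["cp", "-r", "a", "b"], check=True)   # b did not exist:
print(sorted(os.listdir("b")))                       # b is a copy of a

subprocess.run(["cp", "-r", "a", "b"], check=True)   # b exists now:
print(sorted(os.listdir("b")))                       # a was copied INTO b

# The trailing-dot form copies a's *contents*, so an existing
# destination doesn't change the outcome:
os.mkdir("c")
subprocess.run(["cp", "-r", "a/.", "c"], check=True)
print(sorted(os.listdir("c")))
```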

how is this shit so pervasive? literally all of UNIX userspace is full of bizarre special cases and trapdoors. Fuck it. redesigning everything from scratch is the only solution.

It's a JSON processing tool.
Here's a dumb example:
curl | jq -r '.posts[] | .name + " " + (.time | todate) + " No." + (.no | tostring) + "\n" + .com + "\n"'
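For the curious, here is the same extraction in plain Python, run over a made-up document shaped like the posts API the jq filter above assumes (the .name/.time/.no/.com field names come from the filter; the data itself is invented).

```python
import json
from datetime import datetime, timezone

# Made-up document shaped like the API the jq one-liner targets.
doc = json.loads('''
{"posts": [
  {"name": "Anonymous", "time": 1514764800, "no": 1, "com": "first"},
  {"name": "Anonymous", "time": 1514764860, "no": 2, "com": "second"}
]}
''')

# Equivalent of:
#   .posts[] | .name + " " + (.time | todate) + " No." + (.no | tostring)
#            + "\n" + .com + "\n"
for p in doc["posts"]:
    stamp = datetime.fromtimestamp(p["time"], tz=timezone.utc)
    print(p["name"], stamp.strftime("%Y-%m-%dT%H:%M:%SZ"),
          "No." + str(p["no"]))
    print(p["com"] + "\n")
```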

All the behaviour is consistent. Read the man pages, retard.
Good luck with that, maybe you'll come up with OpenVMS and hate it as much as VMS users do

Check what you said.

They tried with Plan9.

json has a better signal to noise ratio than xml...

And plain text has a signal to noise ratio of 1 while being garbage for structured data, and comparing things to XML is setting the bar unreasonably low. What's your point?

what is this? did a quick web search and nothing relevant came up.

I always hear people shit on XML but I don't see why, and people praise JSON seemingly because it's easy to parse and "human readable". XML seems like it'd be much faster to parse for a machine, I don't see why human readability should even be a factor for structured data, the whole point is to compose human output from the input if you want human readability.

What's so bad about XML, what's so good about JSON, and what are some noteworthy (good and bad) alternatives?

JSON is great because it is based on javascript :-DDDDD

XML is bad because it's tedious to write and too complex for non-hardcore users to fully understand. There's an attack (the "billion laughs") that uses tricky standard-conforming entity definitions to grow exponentially in memory, so even parsing it can be problematic. And after all that it's still annoying for humans to deal with.
JSON is just barely human readable enough to get used for human-authored configuration files, but it's finicky enough to make writing it annoying - no trailing commas, no comments. If it were a little less readable it would be abused a lot less. It's kind of bad.
XML is much more powerful than JSON. There are tasks for which XML is only kind of sucky but for which JSON would be a disaster.
YAML is good for a lot of the things people should stop using JSON for. It's actually useful for human beings. It's not worth using for anything that doesn't involve humans, though.
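The "finicky" complaints about JSON above (no trailing commas, no comments) are easy to demonstrate with Python's stdlib json module:

```python
import json

# Valid JSON parses fine...
json.loads('{"a": 1}')

# ...but a trailing comma is a hard error:
try:
    json.loads('{"a": 1,}')
except json.JSONDecodeError as e:
    print("trailing comma:", e.msg)

# ...and so is a comment:
try:
    json.loads('{"a": 1} // why is this here')
except json.JSONDecodeError as e:
    print("comment:", e.msg)
```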

looks like useless crap; these things are easier to do in a normal programming language like Python.

I use Python for complicated things, but jq is great for one-liners. If I want to explore a document's structure or do quick and dirty data extraction it's much more convenient.
I'd rather write jq .foo[].bar than python3 -c 'import json, sys; [print(x["bar"]) for x in json.load(sys.stdin)["foo"]]'.

No thanks! Ada and C are the best, tyvm. Now kys.

Permanence of data was definitely a thing, but the UNIX filesystem got corrupted all the time because it sucked. Backups on real OSes were made to protect against physical hardware damage. Backups on UNIX were made because the filesystem lacked robustness and because there was no way to undelete a file or restore a previous version.

It's pervasive because they don't care and because of their attitude of everything being the user's fault.

Stanford had a system called "labrea". It was (is?) a VAX 750 with 10 Fuji Eagles (4.5 Gbytes, which was a lot when it was first around...) Tape handling has always been a real weak point of unix. Any real operating system has much better backup/restore capabilities, and a lot of these are 10s of years old...

This poor user tried to use Unix's poor excuse for DEFSYSTEM. He is immediately sucked into the Unix "group of uncooperative tools" philosophy, with a dash of the usual unix braindead mailer lossage for old times' sake. Of course, used to the usual unix weenie response of "no, the tool's not broken, it was user error", the poor user sadly (and incorrectly) concluded that it was human error, not unix braindamage, which led to his travails.

Let me supply you with an example. Just today I had the following dialog with one well known computer manufacturer's version of Unix:
> rm temp
rm: temp directory
> rmdir temp/
rmdir: temp/: Is a directory
(Of course if I type the name of the directory -without- the trailing "/", rmdir works just fine.) Now just what the heck braindamage do you suppose results in this idiotic error message? "OF COURSE IT'S A DIRECTORY", I shout at my terminal, "WHY THE HECK DO YOU THINK I'M USING RMDIR?" Now that's the kind of teeth-grinding experience this mailing list is all about.

If you want to remember the actual last time you edited those files, then keep your own damn database of dates and times, and stop bothering us Unix Wizards.
I thought this is what RCS is for.
I'm TA'ing an OS course this semester. The last lecture was an intro to Unix, since all other operating systems were only imperfect and premature attempts to create Unix anyway. Some lecture highlights...
An aside during a discussion of uid's and many a unix weenie's obsession with them: "A lot of people in the Unix world are weird."
When asked if Ritchie et al regretted some other inconsistency in Unix metaphysics: "These guys probably don't care."
Have another twinkie.

Manual straw management with tape on the can master race.

Works on my machine™

Watch this:
$ mkdir a
$ ln -s a b
$ rmdir b/
rmdir: failed to remove 'b/': Not a directory
$ rm b/
rm: cannot remove 'b/': Is a directory

Oh, and in case you wondered:
$ rm -r b/
rm: cannot remove 'b/': Not a directory


I have never in my entire life used rm -r without -f.

That's because b isn't a directory, it's a symlink.
Shiggy diggy.
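The underlying rule: path checks like isdir follow symlinks, while rmdir and unlink operate on the link itself. A sketch with Python's os module (the rmdir error is caught broadly since the exact exception can vary by platform):

```python
import os
import tempfile

base = tempfile.mkdtemp()
os.chdir(base)
os.mkdir("a")
os.symlink("a", "b")

# Path checks that follow the link say "directory"...
print(os.path.isdir("b"))               # True
# ...but the link itself is not a directory:
print(os.path.islink("b"))              # True
print(os.path.isdir(os.readlink("b")))  # True -- the target is

# rmdir operates on the link path without following it:
try:
    os.rmdir("b")
except OSError as e:
    print("rmdir:", e)

# unlink removes the link itself, leaving the target intact:
os.unlink("b")
print(os.path.exists("a"))              # True
```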

Have you ever actually read the OpenSSL code? It's a clusterfuck of terrible programming. Bob Beck gave a talk[1] on all the cancer they removed in the first month after forking it. I suggest you give it a watch. Not that I think LibreSSL is the solution to the OpenSSL problem, but if you need OpenSSL compatibility it's much better than using the actual thing.

Because `b/` isn't a directory. It's a symlink.
`rm b` works absolutely fine.

literally this
Also unlink.


can you read?

I know why it happens, I set it up myself. But it's inexcusable for a single tool to both complain that "b/" is a directory and that it's not.

My solution to you is to stop using retarded GNU coreutils. Besides emacs, pretty much all GNU software is shit. Following your example:
sbase:
$ rm b/
rm: unlink b/: Not a directory
OpenBSD:
$ rm b/
sbase tells you what you are supposed to do and OpenBSD just does it. Both are more acceptable than retarded gnuisms.