Kill unix

Why do we still put up with the absolute fucking garbage that is UNIX 1.0? It's time for UNIX 2.0. Let me explain: processes on Linux take as input environment variables and an array of C strings, print text (the "universal interchange format") to stdout and stderr, and then return a single byte.

Processes are the fundamental building blocks of our operating system. All the build systems, init scripts and package managers we have are made out of them. So it's important for them to be solid, i.e. built on well-typed structured data - not piles of spaghetti strings. There have been so many problems because of it, like invocations failing when inputs start with '-' because they get misinterpreted as flags, or programs breaking when a file path contains a space.

We should revise the concept of a process to declaratively describe its input in a well-typed form, make a new kind of "terminal" to invoke them that isn't based on simulating a 1970s teletype machine (so shit like resizing and scrolling will actually work), have a scripting language that doesn't misinterpret fucking everything you write and work entirely by pasting strings together, and then implement a new set of coreutils where things like "ls" simply produce a *list* (an actual data type) of the files in a directory instead of just some text that you're advised never to parse.
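
To make the "actual data type" idea concrete, here's a hypothetical sketch (no real ls has a --json flag; it's assumed purely for illustration) of what querying structured output could look like:

[code]
# Hypothetical: an "ls" that emits one JSON object per directory
# entry instead of columned text, so a consumer queries real fields
# rather than parsing whitespace.
ls --json | jq -r '.[] | select(.size > 4096) | .name'
[/code]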

Other urls found in this thread:

multicians.org/multics-vm.html
9p.io/sys/doc/lexnames.html
linusakesson.net/programming/tty/
langsec.org
pastebin.com/zdtHEYV
rush.heroku.com/
code.google.com/p/hotwire-shell/
msdn.microsoft.com/en-us/library/windows/desktop/dd835506(v=vs.85).aspx
tcl.tk/
en.wikipedia.org/wiki/REXX
gnu.org/software/emacs/emacs.html
9p.io/sys/doc/net/net.html
dwheeler.com/essays/fixing-unix-linux-filenames.html
git.suckless.org/sbase/tree/cat.c
metzdowd.com/pipermail/cryptography/2018-January/033574.html
en.wikipedia.org/wiki/Filename#Comparison_of_filename_limitations

I swear this place causes a hive mind, because I was thinking the exact same thing last week. S-expressions or even JSON would be miles better than the arbitrary formats unix tools spit out and accept.

The magic of Unix is that everyone can think of a better solution. That's why there were so many versions of Unix in the early days--all these commercial Unix projects pushing for "standardization", even though it was their transgression of "traditional" Unix that made their shtick, so they conveniently skirted POSIX until the point of irrelevancy. Look me dead in the eye and tell me that FreeBSD is True Unix--I fucking dare you. And then try using FreeBSD without non-Unix amendments to Unix-tier fuckery like dtrace and ZFS and tell me you still like Unix.

What is Plan 9 from Bell Labs

I feel like we need to rethink the filesystem perhaps. Maybe "everything is a file" is a bad idea. Here's what redox thinks of the situation:

This would be horrible for shell scripting. Just decide on field and record delimiters (FS, GS, RS, US could have a use other than taking up four values in the ASCII set, for example), forbid them everywhere they could cause shit (filenames, usernames, etc.) and you're set.
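
For instance, a minimal sketch of that scheme using the ASCII unit separator as the field delimiter (the sample records are made up):

[code]
# 0x1F (unit separator) delimits fields, newline delimits records,
# so spaces inside the data no longer break field splitting.
printf 'Bob Jr.\037/home/bob\nAlice\037/home/alice\n' |
awk -F "$(printf '\037')" '{ print $2 }'
[/code]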

stopped reading there

Ever used jq? I don't think it would be bad for shell scripting at all.
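
For example (a trivial made-up document), jq does structured selection that would be fragile with cut or awk:

[code]
# Select by a real field name, not by column position:
echo '{"files":[{"name":"a b.txt","size":12}]}' |
jq -r '.files[] | "\(.name)\t\(.size)"'
[/code]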

I'd like to offer a counterpoint: there is literally nothing wrong with string manipulation.

Unix 1.0? That's from the early '70s.

Plan 9 from Bell Labs and Inferno from Vitanuova. Learn some history!

In Plan 9 it can return a string.

Indeed that was a problem, and a known one. That's why (virtually) every command accepts -- to signal the end of options.
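
A quick illustration:

[code]
# "--" marks the end of options, so a file named "-rf" is treated
# as an operand instead of being parsed as flags:
rm -- -rf
# An equivalent workaround that avoids "--" entirely:
rm ./-rf
[/code]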

Typed input implies that you'll need translators for every program, or everyone speaking the same type. And if that type needs to change, you'll have some work to update the tools. Like you said, the good thing about text is that it's universal: everyone in the field knows it isn't performant, but it is simple, easy to implement, and everyone understands it.

Check out PowerShell; again, learn some history. PowerShell uses objects instead of text as IPC, but you'll have to learn new methods and objects for every new command you want to use. If you actually use it in production, because of the objects and methods, you'll spend a lot of time reading the command manual to see what objects it accepts and how to deal with them.

The thing is that UNIX tools are made around delimiter-based input/output (awk, for example), and fixing that to do this well is easier than your pipe dream.
Now, having JSON as a format choice could be an idea.

Why would s-expressions be horrible for scripting? They're very easy to manipulate, and as Emacs has proven, extremely versatile.

By putting more power in the hands of the user, you can cut down on the number of core utils you need to supply.

I feel like we need to rethink the filesystem perhaps. Maybe "everything is a file" is a bad idea. Here's what redox thinks of the situation:

How is that missing any logic? It's like having the floor plan of your home inside of your home. The entry in /dev is simply the representation of the disk itself.

$ du -h /dev/null
0       /dev/null
Dunno about sysfs.

I knew about Redox before, but I'd have to see this concept in practice to say if it's good or bad.

What you're describing is basically Microsoft Research's Project Monad which later became Powershell. It's nice in theory, but becomes extremely clunky and verbose in practice due to all the boilerplate code you need to write to convert between typed streams. It's decent to write larger programs with (but then why aren't you using a real programming language?) but horrible as a shell.

Unix was already killed by Plan 9, which is better in every way except those ways in which the majority of people interact with their computers.

But it's worth learning about and running just so you can see what you're missing, and if enough people start using it and developing software for it, we can all live happily ever after.

But we're not running any faggot fucking Unix clone written in some gay ass Pajeet-safe language. Sorry kid.

You could see it as a floor plan, but the way it looks like is that / contains /dev/sda, which is a disk that contains /, which is a filesystem that contains /dev/sda, which is a disk that contains / which is a filesystem that contains /dev/sda, which is a disk that contains / which is a filesystem that contains /dev/sda, which is a disk that contains / which is a filesystem that contains /dev/sda, which is a disk that contains / which is a filesystem that contains /dev/sda, which is a disk that contains / which...

They say it's their own implementation, but the idea is apparently inspired by plan 9's 9P.

So it's not better. Ok then.

I believe you but could you show me a couple examples please?

How about you do a simple fucking search?
I'll spoon-feed this one to you, though. Here's how you list the 10 most recent files:

Get-ChildItem -Path 'C:\path\to\your\dir' | Where-Object { -not $_.PsIsContainer } | Sort-Object LastWriteTime -Descending | Select-Object -first 10

In Unix:
ls -t | sed 10q

Your "UNIX 2.0" is called Multics.

multicians.org/multics-vm.html

For reasons I'm ashamed to admit, I am taking an "Intro to Un*x" course. (Partly to give me a reason to get back on this list...) Last night the instructor stated "Before Un*x, no file system had a tree structure." I almost screamed out "Bullshit!" but stopped myself just in time. I knew beforehand this guy definitely wasn't playing with a full deck, but can any of the old-timers on this list please tell me which OS was the first with a tree-structured file system? My guess is Multics, in the late '60s.

Yesterday Rob Pike from Bell Labs gave a talk on the latest and greatest successor to unix, called Plan 9. Basically he described ITS's mechanism for using file channels to control resources as if it were the greatest new idea since the wheel. There may have been more; I took off after he credited Unix with the invention of the hierarchical file system!

Amazing, wasn't it? They've even reinvented the JOB device. In another couple of years I expect they will discover the need for PCLSRing (there were already hints of this in his talk yesterday). I suppose we could try explaining this to them now, but they'll only look at us cross-eyed and sputter something about how complex and inelegant that would be. And then we'd really lose it when they come back and tell us how they invented this really simple and elegant new thing...

Oh boy, here comes the faggot with The Unix Haters Handbook quotes.
It was a good book, and of course Unix isn't perfect. But most of those quotes deal with software that was either fixed or doesn't exist anymore.

Fixed as in neutered. As in, no longer Unix; disowned. The same kind of inane shit suckless won't shut up about.

if you mount your flash drive to /media, is the flash drive part of your hard disk?

The device representation is in /dev. The file system it has is mounted and is part of the file tree. You are just grasping at straws now aren't you?

This reminds me of Web 1.0 vs. Web 2.0, except UNIX systems are already Web 2.0. You don't like them because you didn't grow up with them. Develop with them. Grow with them. You started out on a bloated, confusing mess that you most likely don't know all the ins and outs of. It's too big. The machine running UNIX will never be truly "yours."
I think you're misinterpreting the UNIX kernel as Kernel+Userland. Plan9 has already solved this issue. Everything is a file, and not just strings. But C is not a string processor; it's terrible for that. C is what the kernel is made of. You're conflating C with SH here. And your examples are not the strongest. Using package managers, init scripts, and build systems as your foundations for "operating system" tells me that you're not well versed in the intricacies of UNIX. I honestly believe you should try Plan9. UNIX's golden days are behind us already.
This is defined behavior and is on the individual programmer to program against. This is not a strong argument. A stronger argument would've been: "The OS should be done with a memory-safe language, instead of "buffer overflow" the language."
How do you think types are processed? A program looks at the input, checks it against a list, and picks the most fitting description. Chars are the building blocks of strings. Strings are the building blocks of the flag meta-type. I don't see what you're against here. You can make new types in C too. Even complex data types.
This is a solved issue. Get a new terminal emulator if you want dynamic scaling on resize. I don't know what your gripe is with scrolling. Do you not like how it auto-scrolls down when a new line is printed? You can change that. Do you not like how you have a max line count? You can change that. Your terminal is behaving weirdly? Change it. You seem to me to be someone not well versed in any of the topics you're speaking about, just looking to vent after a failed dip into running a CLI-focused environment instead of relying on GUIs and more packages.
The behavior is defined. It takes time to get used to it, but it's the best at what it does: work with strings. You can use single quotes to make literals, double quotes to safely expand variables and functions, a new line or carriage return to print a new line. A file is not the same as a line. You can convert a file to a line and vice versa, but you can't treat them the same. A line is a 2D array. A file is a 3D array. I will be more than happy to teach you some of the tricks I learned while spending a weekend scripting something in bash and thinking about offing myself because sed was being retarded. But no, it was not sed, it was I.
Strings are there for the benefit of humans, not machines. If you wish to not use strings, there is machine code. If you wish to not use a language that's a glorified factory line for strings, you can also hold them in variables. If you don't want to use it as your main system scripting language, you can switch to Python. You can make syscalls from Python and use bash functions in Python too.

(2/2)

I almost forgot. There are lists and JSON files in bash. ls -la produces a table of space-delimited entries: 9 columns, unlimited rows. It also produces a JSON file. The line is a nested hash table of values, called by getting its key or "number." You then call the line's "column" key for its hash table. You can even specify the delimiter. You can format the data into any specification you want. It doesn't have to be spaces.

JSON and XML aren't actually real formats. They're human-defined specifications, merely UTF configurations that some humans found easier to work with than plain text. Plain text is smaller, simpler, and more flexible than both XML and JSON. You can convert plain text to JSON and XML easily, but not so in the reverse. Because JSON and XML use variables instead of the line and column consts, they're a bit tougher to automatically navigate than plain text, but not hard. You can use regex to strip out all the markup and get plain text back. It doesn't really matter; whatever format you can think of is built on the foundation plain text laid. The only reason you would use a different format is if it were quicker to traverse your specification with the tools you had on hand, and those tools didn't include a rich text parsing library. I can think of Python, with its downright awful bastardization of regex, and JavaScript, which will likely literally misinterpret your strings (I haven't forgotten your qualms; less so, but not gone, with Python). And then there's stuff like CSVs and spreadsheets. Unless they're compressed, they're just plaintext with space delimiters replaced by commas.

After writing all of this out, I believe you're just angry because whatever you were trying failed, using UNIX's tools. But just like with JSON, sh is a product of only having a certain set of tools on hand. That is, humans can recognize words better than numbers, and it's easier to distinguish them from each other, so there was a focus on making it as word-centric as possible. This may become moot, as in your case, when systems start deviating from the text norm and start using pictures to represent all of their data. I'm sure you're more familiar with working with a GUI than a set of CLI tools, otherwise why make this thread?
You shouldn't parse it if you're looking for a portable and safe script, but if you're only working on your own machine, and you're not naming your files like a monkey, you will, for the most part, be fine. If something breaks while you parse ls, you can just loop over it, use globs, change the delimiter, or use it as a 2D array instead of a 3D array.

I am peeved you baited my autism into replying, and stopping me from having a very satisfying wank, but here is your (You).

I'm just gonna leave this here, since I already brought up the redox filesystem. >>870273
What does everyone think of this pic related?

Terminal emulators "work" in the same way that X11 "works".
It is functional, but I think what OP was asking for was a completely redefined sort of command line that doesn't involve emulating teletypes.
Even the best terminals we have now still rely on being "emulators" of ancient mainframe terminals.

I don't believe OP was thinking too deeply into it. He mentioned
and that was his only point about teletype emulators. Although I recognize that there's a minor crevice filled with term disillusionists who ask "is this really it?", I don't know where to go from here without a specific complaint. Is it all the typing? Is it too simple? Is there something faster? Etc.

Powershell is fucking garbage, both in concept and execution. It's actually a step back from DOS-style interpreters.

You mean plaintext?

A string is a sequence of data that ends in a NULL character. As everyone knows, it's super easy to work with and would never cause any problems.


I'd be interested to see their file because file on my system is just a guessing program.

>no cp -r
1 / 10, made me reply.

The problem with rethinking such things is that you will have to support compatibility with other systems or else nobody will use it. And supporting compatibility kind of defeats the purpose of "Kill UNIX".

cp -r is bloat, just use tar
this is what Plan 9 weenies actually believe

Support compatibility with windows and mac? That's a good idea but probably very hard to achieve. Better to just make a new OS and hope to attract steam and video/audio/business companies to the platform.

No, I mean, OP kind of implies that things like ELF and the POSIX standards are bad too, so this UNIX-killer OS should ditch them as well, but by doing so you will break compatibility with Linux and other UNIX-like systems.

`cp -r` was replaced with `dircp`, brainlets. Recursion with cp leads to all sorts of AIDS with symlinks and the like.

A process needs to be part of several namespaces: the filesystem, the process IDs, users, groups... etc. What if a process' namespaces were controlled by its parent process, and the parent could either pass its own namespaces down to child processes or create a new environment for each of its children? While we're at it, just allow processes to have multiple namespaces of each type. Add a system call getfs() that returns all filesystem trees that the calling process has access to. Allow namespaces to be sent through sockets (just like you can send file descriptors through sockets in unix).
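
Linux namespaces already work roughly along these lines; a minimal sketch using util-linux's unshare(1), run as root:

[code]
# Give a child its own mount and PID namespaces. --fork runs the
# command as PID 1 of the new namespace; --mount-proc remounts
# /proc so process listings reflect it.
unshare --mount --pid --fork --mount-proc sh -c 'ps ax'
# Inside, ps sees only the new namespace's processes.
[/code]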

Why do processes have to be created from a file that is on a filesystem? Why can't we create processes from memory? Just give it a pointer to some memory that contains the new process' code and there's your new process. Add a feature to tell the kernel you want the child process to share the parent's page table and you've made threading obsolete.

What's that? Processes are isolated? Then why do we need users any more? That's right, users are too high-level for our shiny new microkernel world. You can implement users on top of that.

Oh right, microkernel. A filesystem is just a service that another process provides. Where do we get that from?... Right. Call the parent. Which might in turn call its parent. Well.. sounds slow. Can we solve that with shared memory? Or does that fuck with processor caches too much?

Hasn't Poettering been working on that for almost a decade now?

#!/bin/rc
switch($#*){
case 2
    @{builtin cd $1 && tar cf /fd/1 .} | @{builtin cd $2 && tar xTf /fd/0}
case *
    echo usage: dircp from to >[1=2]
    exit usage
}
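
For comparison, the same tar-pipe idiom sketched in POSIX sh ($from and $to are assumed to hold the source and destination directories):

[code]
# Copy a directory tree by piping tar to tar, sidestepping cp -r's
# quirks with symlinks and special files:
(cd "$from" && tar cf - .) | (cd "$to" && tar xf -)
[/code]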

You mean that shitty example from cat -v?
$ pwd
/a/b
$ cd ..
$ rm -rf b
$ pwd
/d
$ ls -l /a
b -> /d/e
...
$ # Drat! I've just removed the wrong set of files, d/e!
No matter how many times I tried to replicate it, nothing ever happened.

On top of that Plan 9 doesn't even have symbolic links.

If it were a new OS, it would depend on whether the standards were sane or not. I mean, there's no reason to throw out good ideas.
However, there's definitely no reason to keep shit ideas just because 10 people might be used to them being broken.
There might also be a different design idea. Unix is kind of a server OS. It's really not designed for real users and you can see that. So if someone was making an OS for users they'd probably break a lot of ideas because it's a different design goal.

which one is more readable?
which one will break when faced with full Unicode character diversity including bidirectional text in file names?

What makes you say unix is a 'server OS'? As opposed to an OS for 'real users'? It's equally good (or shit) for all use cases.

Not really. An errant rm is a mere accident on a server; it's potentially half your life for a real user.
Did you never wonder why so many unix people are anal about backups? It's because servers assume you have a sysadmin to blame/fire for data loss. A user OS knows the sysadmin is the user, and blaming your user after fucking them over is total faggotry.

THIS IS NOW A RUST THREAD



kys fag

Everyone in this thread was tricked. This thread, like many on this board, is a cross-post from lobotomychan. Please exit stage life.

This thread was before the /g/ one.
Also,
go back to reddit

The handling of symlinks has gotten a lot better since that example was written. pwd reports the expected result; to see the actual path you have to pass -P, as in pwd -P.
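
For instance, assuming /a/b is a symlink to /d/e:

[code]
$ cd /a/b
$ pwd        # logical path: what you typed
/a/b
$ pwd -P     # physical path: symlinks resolved
/d/e
[/code]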

Read Pike's paper about lexical file names (where that example was taken from).
9p.io/sys/doc/lexnames.html

I agree OP. I've been studying the Linux system call interface for a while now. My long term goal is to build my own programming language and use it to erect my own user space on top of Linux. While I agree that Unix as defined by POSIX is outdated, I think Linux is way too valuable to just throw away. Maybe it would be better to build a new cohesive system on top of it.

Let me comment on specific points:


"Printing" is an outdated notion of output that dates back to the ages where program output was literally printed on paper for the operator. On Linux, the interface is read/write and it operates in terms of bytes/octets, not text. They're often the same thing, but not necessarily. You can just as easily send serialized text or binary data according to some format like newline-separated JSON or CBOR.


I agree. PowerShell on Windows, of all things, implements this concept. I would go with the "universal programming language" approach and use serialized objects over untyped interfaces for interprocess communication.


Please expand on this.

A reference for everyone: linusakesson.net/programming/tty/

It's called using an existing tool to solve a problem, as opposed to writing more complex tools. Is sed a shitty hack that uses ed?

Didn't even know cat-v had problems with symlinks. I don't even know if I care, all I was saying is that it requires fuckery to deal with them.


Because of the fuckery required to manage them.


See

Traditional line-oriented Unix tools would obviously suck at whatever new paradigm we think of. We're in a thread about Unix 2.0, it implies the tools would have to be redesigned.

There is a middle ground: newline-delimited JSON. Essentially, program output is a list of JSON objects separated by newlines. The objects can then be processed independently of each other and even streamed. The only change to the standard JSON grammar is that newlines cannot appear in the middle of an object. To make it human-readable, each object can be turned into a regular multiline JSON object. Pretty neat.
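
A minimal sketch of that processing model (the records are made up):

[code]
# One complete JSON object per line; each record can be filtered
# and streamed independently, with no whole-document buffering.
printf '%s\n' \
  '{"name":"a.txt","size":120}' \
  '{"name":"b.txt","size":3400}' |
jq -r 'select(.size > 1000) | .name'
[/code]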

Strings are pointers to characters, and pointers are numbers. Not a significant innovation in my opinion.

On Linux, processes don't actually return anything because there is nowhere to return to after the program is over and it can't return to a kernel address. Programs which don't call sys_exit to request to be killed will segfault. Also, the notion of a return value is tied to synchronous programming, and processes are inherently asynchronous.

If we want to return complex data in addition to error codes, a new sys_exit2 could be created that takes a block of data and associates it with the process results structure that mediates retrieval of the return code by PID. It'd probably be complicated though, and involve copying data to kernel space.


I disagree. The fact is, regular expressions are the most sophisticated string manipulation mechanism available in all current mainstream languages. If they had an Earley context-free grammar parser that let people feed it a grammar and input, then get an abstract syntax tree or an error back, I'd be much more friendly towards text manipulation.

Remember that most security issues arise in input handling layers. Buggy parsers cannot exist in a secure operating system.

langsec.org

ELF is great, it's POSIX that is dated and doesn't even encompass all the functionality of any OS. It's full of incomplete "implementation-defined" sections that Linux manual pages specify fully. The latter is much, much easier to program against and there are great new features, system calls and interfaces which are not in POSIX.


I just said PowerShell existed. In practice, I think it's too verbose to be used as a shell. There was an answer on stackoverflow that talked about this at length, but it's been deleted. The question was:


Wish I could find some archived version.

You can do this on Linux via the clone system call. It's how user space "threading" libraries are implemented on top of the "execution context" abstraction present on Linux. You can tell Linux exactly what you want the new process to share with the current one; if all you're going to do is execve some throwaway command, you don't have to share much. Sadly, shit like glibc gets really freaky when you call clone behind its back. Maintainers have gone on record saying the use case can leave glibc in an inconsistent state and is not supported. God I hate libc and its global faggotry like buffers and errno.


He just created software that's meant to be used specifically on Linux rather than the entire Unix-like ecosystem. Systemd makes use of Linux features like cgroups. It's not GNU/systemd, it's systemd/Linux. GNU can fuck off.

I think this is the way to go. Building Linux up to be a better system at the expense of traditional Unix is how it will be killed.

God please NO. There is no need for this crap.

I worked in UNIX and UNIX-like environments from '92 - '01 and ran FreeBSD on home computers for much of that time (after finding that every version of LINUX in those days absolutely sucked - they crashed regularly while FreeBSD never did). I've gone over to the M$ Dark Side only because my current employer uses winblows as the system of (stupid) choice. Yes, there are inconsistencies and inconveniences in the different 'NIXes, but they all beat the hell out of anything else I've seen come down the pike since I started programming in '81. If you can build the perfect OS, knock yourself out and program it or organize a group and get it done.

Turns out I can see the post because I have access to the moderator tools. Here's the top answer:

---

There are a couple of differences that I can think of; just thoughtstreaming here, in no particular order:

1. Python & Co. are designed to be good at scripting. Bash & Co. are designed to be *only* good at scripting, with absolutely no compromise. IOW: Python is designed to be good both at scripting and non-scripting, Bash cares only about scripting.

2. Bash & Co. are untyped, Python & Co. are strongly typed, which means that the number `123`, the string `123` and the file `123` are quite different. They are, however, not *statically* typed, which means they need to have different literals for those, in order to keep them apart.
Example:

                | Ruby             | Bash
-----------------------------------------
number          | 123              | 123
string          | '123'            | 123
regexp          | /123/            | 123
file            | File.open('123') | 123
file descriptor | IO.open('123')   | 123
URI             | URI.parse('123') | 123
command         | `123`            | 123

3. Python & Co. are designed to scale *up* to 10000, 100000, maybe even 1000000 line programs, Bash & Co. are designed to scale *down* to 10 *character* programs.

4. In Bash & Co., files, directories, file descriptors, processes are all first-class objects, in Python, only Python objects are first-class, if you want to manipulate files, directories etc., you have to wrap them in a Python object first.

5. Shell programming is basically dataflow programming. Nobody realizes that, not even the people who write shells, but it turns out that shells are quite good at that, and general-purpose languages not so much. In the general-purpose programming world, dataflow seems to be mostly viewed as a concurrency model, not so much as a programming paradigm.

Here's the full answer:

pastebin.com/zdtHEYV

Agreed. It's like having to perform a lengthy exercise for an anal retentive typing teacher when a simple shorthand note would suffice.

It is unstructured plaintext with inconsistent delimiters.

I like the idea but how would one implement cross-program typing?
It would surely require a change to the C standard.

There's a damn good reason your hipster OS never caught on.
With any luck, it never will.

Why can't we drop C as well?

I like the idea of a fundamental data structure transfer mechanism. Things like JSON and CBOR implement this idea for human-readable and binary formats, respectively.

C data types are essentially integers of different sizes, floating point numbers and pointers. The language also supports various aggregates of these things: arrays, structs and unions.


Most binary file formats of old mirror these structures. I don't think it is a coincidence. Modern languages support more data structures at the language level. Things like linked lists, dynamic arrays and hash tables. The new formats have risen to mirror that. Decoding JSON data in C is some kind of absolute madman endeavor, but other languages just get a bunch of nested objects.

I think the format should be something like CBOR or JSON, so that we can parse it behind the scenes in a secure manner and return data to applications as programming language data structures. They can then extract whatever data they need without any need to implement a schema or type system for any specific application. The application's only concern would be to validate the data; no parsing it from the ground up. We already have tools that check whether JSON data follows a given schema; this could be our "type system".
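
A tiny sketch of that idea at the pipe boundary, using jq for the check (the record and its fields are made up):

[code]
# Reject records whose fields have the wrong type before the
# application ever sees them; a non-zero exit signals bad input.
echo '{"name":"a.txt","size":"big"}' |
jq -e 'if (.size|type) == "number" then .
       else error("size: expected number") end'
[/code]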

While S-Expressions are indeed quite elegant, they're just linked lists. They don't directly support other data structures such as hash maps or arrays. Sure, they can be extended in order to do that, but is it still S-Expressions if we do?

YESS
That's where Redox comes in. Rust ftw

That's the ideal, in my opinion.

I often hear Go programmers saying it's better to reimplement things in Go than link to a C library. If they were to take this idea far enough, they could one day build a cohesive pure Go user space.

Personally, I think a language that respects the SysV ABI and produces ELF executables is best. Most "modern" languages don't care about this. Dynamic languages are fully virtualized; interoperating with them means hosting the virtual machine itself in your process. C++ produces ridiculously mangled symbols and extremely incompatible executables due to things like exceptions. It's virtually impossible to link to a C++ program from anything but another C++ program, and often times the exact same compiler must be used due to ABI issues. This is why C++ people sometimes wrap their code behind a C interface. Ironically, only old ass languages like C, Fortran, Pascal etc., the same ones we're trying to replace, care about this kind of stuff.

The pastebin page was removed. Can you reupload?

What the fuck? They removed a fucking CC-BY-SA text that had attribution in the title? Fuck these people.

Sure, of course I'll post it. I'll split it up.

---

I have the feeling that trying to address these points by bolting features or DSLs onto a general-purpose programming language doesn't work. At least, I have yet to see a convincing implementation of it. There is RuSH (Ruby shell), which tries to implement a shell in Ruby, there is [rush][1], which is an internal DSL for shell programming in Ruby, there is [Hotwire][2], which is a Python shell, but IMO none of those come even close to competing with Bash, Zsh, fish and friends.

Actually, IMHO, the best current shell is [Microsoft PowerShell][3], which is very surprising considering that for several *decades* now, Microsoft has continually had the *worst* shells *evar*. I mean, `COMMAND.COM`? Really? (Unfortunately, they still have a crappy terminal. It's still the "command prompt" that has been around since, what? Windows 3.0?)

PowerShell was basically created by ignoring everything Microsoft has ever done (`COMMAND.COM`, `CMD.EXE`, VBScript, JScript) and instead starting from the Unix shell, then removing all backwards-compatibility cruft (like backticks for command substitution) and massaging it a bit to make it more Windows-friendly (like using the now unused backtick as an escape character instead of the backslash which is the path component separator character in Windows). After that, is when the magic happens.

They address problem 1 and 3 from above, by basically making the opposite choice compared to Python. Python cares about large programs first, scripting second. Bash cares only about scripting. PowerShell cares about scripting first, large programs second. A defining moment for me was watching a video of an interview with Jeffrey Snover (PowerShell's lead designer), when the interviewer asked him how big of a program one could write with PowerShell and Snover answered without missing a beat: "80 characters." At that moment I realized that this is *finally* a guy at Microsoft who "gets" shell programming (probably related to the fact that PowerShell was *neither* developed by Microsoft's programming language group (i.e. lambda-calculus math nerds) nor the OS group (kernel nerds) but rather the server group (i.e. sysadmins who actually *use* shells)), and that I should probably take a serious look at PowerShell.

Number 2 is solved by having arguments be statically typed. So, you can write just `123` and PowerShell knows whether it is a string or a number or a file, because the cmdlet (which is what shell commands are called in PowerShell) declares the types of its arguments to the shell. This has pretty deep ramifications: unlike Unix, where each command is responsible for parsing its own arguments (the shell basically passes the arguments as an array of strings), argument parsing in PowerShell is done by the *shell*. The cmdlets specify all their options and flags and arguments, as well as their types and names and documentation(!) to the shell, which then can perform argument parsing, tab completion, IntelliSense, inline documentation popups etc. in one centralized place. (This is not revolutionary, and the PowerShell designers acknowledge shells like the DIGITAL Command Language (DCL) and the IBM OS/400 Command Language (CL) as prior art. For anyone who has ever used an AS/400, this should sound familiar. In OS/400, you can write a shell command and if you don't know the syntax of certain arguments, you can simply leave them out and hit F4, which will bring a menu (similar to an HTML form) with labelled fields, dropdown, help texts etc. This is only possible because the OS knows about all the possible arguments and their types.) In the Unix shell, this information is often duplicated three times: in the argument parsing code in the command itself, in the `bash-completion` script for tab-completion and in the manpage.

Number 4 is solved by the fact that PowerShell operates on strongly typed objects, which includes stuff like files, processes, folders and so on.

Number 5 is particularly interesting, because PowerShell is the only shell I know of, where the people who wrote it were actually *aware* of the fact that shells are essentially dataflow engines and deliberately implemented it as a dataflow engine.

Another nice thing about PowerShell are the naming conventions: all cmdlets are named `Action-Object` and moreover, there are also standardized names for specific actions and specific objects. (Again, this should sound familiar to OS/400 users.) For example, everything which is related to receiving some information is called `Get-Foo`. And everything operating on (sub-)objects is called `Bar-ChildItem`. So, the equivalent to `ls` is `Get-ChildItem` (although PowerShell also provides builtin aliases `ls` and `dir` – in fact, whenever it makes sense, they provide both Unix and `CMD.EXE` aliases as well as abbreviations (`gci` in this case)).

But the killer feature IMO is the strongly typed object pipelines. While PowerShell is derived from the Unix shell, there is one very important distinction: in Unix, all communication (both via pipes and redirections as well as via command arguments) is done with untyped, unstructured strings. In PowerShell, it's all strongly typed, structured objects. This is so incredibly powerful that I seriously wonder why no one else has thought of it. (Well, they have, but they never became popular.) In my shell scripts, I estimate that up to one third of the commands is only there to act as an adapter between two other commands that don't agree on a common textual format. Many of those adapters go away in PowerShell, because the cmdlets exchange structured objects instead of unstructured text. And if you look *inside* the commands, then they pretty much consist of three stages: parse the textual input into an internal object representation, manipulate the objects, convert them back into text. Again, the first and third stage basically go away, because the data already comes in as objects.

However, the designers have taken great care to preserve the dynamicity and flexibility of shell scripting through what they call an *Adaptive Type System*.

Anyway, I don't want to turn this into a PowerShell commercial. There are plenty of things that are *not* so great about PowerShell, although most of those have to do either with Windows or with the specific implementation, and not so much with the concepts. (E.g. the fact that it is implemented in .NET means that the very first time you start up the shell can take up to several seconds if the .NET framework is not already in the filesystem cache due to some other application that needs it. Considering that you often use the shell for well under a second, that is completely unacceptable.)

The most important point I want to make is that if you want to look at existing work in scripting languages and shells, you shouldn't stop at Unix and the Ruby/Python/Perl/PHP family. For example, [Tcl][4] was already mentioned. [Rexx][5] would be another scripting language. [Emacs Lisp][6] would be yet another. And in the shell realm there are some of the already mentioned mainframe/midrange shells such as the OS/400 command line and DCL. Also, Plan9's rc.


[1]: rush.heroku.com/
[2]: code.google.com/p/hotwire-shell/
[3]: msdn.microsoft.com/en-us/library/windows/desktop/dd835506(v=vs.85).aspx
[4]: tcl.tk/
[5]: en.wikipedia.org/wiki/REXX
[6]: gnu.org/software/emacs/emacs.html

pajeet pls go

Appreciated.

Another answer quoted _why, that Ruby guy. It's a nice example:

---

Not long ago a friend asked me how to recursively search his PHP scripts for a string. He had a lot of big binary files and templates in those directories that could have really bogged down a plain grep. I couldn't think of a way to use grep to make this happen, so I figured using find and grep together would be my best bet.

find . -name "*.php" -exec grep 'search_string' {} \; -print

Here's the above file search reworked in Ruby:

Dir['**/*.php'].each do |path|
  File.open( path ) do |f|
    f.grep( /search_string/ ) do |line|
      puts path, ':', line
    end
  end
end

Your first reaction may be, "Well, that's quite a bit wordier than the original." And I just have to shrug and let it be. "It's a lot easier to extend," I say. And it works across platforms.


Oh and this answer is by Jorg W. Mittag. Don't fuck me, DMCAing Stack Exchange bots.


Am I in /g/ ?


No problem.

Because there's no mature enough alternative yet.

/g/ is in you

That's the point. OP's concept looks nifty at first glance, but when implemented IRL its shortcomings become obvious. I actually disagree regarding the "execution" being garbage here. Aside from needless wordiness (but what do you expect from Pajeetrosoft), it's the concept itself that is bad through and through. Everything else stems from that.

You just described Plan9.

jq is such a fucking pain in the ass for anything other than simple selections.

Also if you've ever done PowerShell scripting you'd know how messy this would end up looking. I'm with on this. Though there would need to be some kind of standardized way of inputting these things for scripts and some nice visually obvious way of representing them.

Finally there is someone who agrees that general-purpose languages aren't always better than shell for everything. It's such a fucking annoying mindset that you see almost everywhere.

Diversity confirmed for cancer once again.

Yeah, that Jorg W. Mittag is a pretty based guy. I learned a lot about programming languages just from reading his answers. He makes a lot of good points.

Supporting every language isn't diversity, it's good software.

This entire post is why I suggest there are three types of languages: programming, scripting and gluing.

Bash also cares about interactive use. Just a minor nitpick though.

Not just up to, but also at minimum a couple thousand lines. Often the solution to a very simple problem in Python is at least a few thousand lines and uses all kinds of obscure libraries, when the whole thing can be replaced with 10-100 lines of shell and standard Unix tools.

To the profane, PowerShell. To the user who has used the shell for a few hours, sh.
How is the shell syntax responsible for anything related to multibyte support? Protip: it's not.

For things like filenames it's sometimes the shell's responsibility. Though I believe we should only allow a subset of ASCII for filenames.

Interesting, I've never looked into Plan 9. But it seems like they're still buying too much into the file meme. Files just aren't the best kind of interface, especially when you need special ioctls for all kinds of stuff.

Just increase the number of file types and you're golden.

attaboy, at least use plan 9 from userspace, acme is the best thing ever made

import shell_programs
run_complex_shell_program()
only 2 lines

You could also look at how Hurd solves this. Everything is a filesystem but arbitrary programs can decide what is done with the operations. Think of fuse but for everything.

How does it compare with the alternatives?

Plan 9 doesn't have ioctls. Devices are controlled by writing to files [1]

[1] 9p.io/sys/doc/net/net.html
[code]
The convoluted syntax and semantics of the UNIX ioctl system call convinced us to leave it out of Plan 9. Instead, ioctl is replaced by the ctl file. Writing to the ctl file is identical to writing to a data file except the blocks are of type control. [/code]

No, fuck you. Linux does it right: anything other than the path separator or NUL is allowed. That's how it should be. You can fuck off with your POSIX "portable" file names.

dwheeler.com/essays/fixing-unix-linux-filenames.html

How do you differentiate between fields in tabular data (what df, free, ls -l and hundreds of tools give you) with only one delimiter? You can't; you need a second one. Forbidding only newlines in filenames would fix so much shit (RS=\n and FS=\0 this way).
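
In the same spirit, the standard NUL-delimited plumbing that already works today:

[code]
# NUL as the record separator: filenames containing spaces or even
# newlines survive the pipeline intact.
find . -type f -print0 | xargs -0 grep -l 'pattern'
[/code]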

lmfao

Unicode in filenames is a bad idea as well.
Why would a filename need obscure kanji or emoji in it? It doesn't.
If people want their graphical file manager to display some fancy shit like that, they could very well have some kind of format that is encoded into regular ASCII a la base64 or the punycode used in domains.

Use double dashes to separate options from file names. Next time, let's not design shitty tools that only get 80% of the job done correctly.

Do you people listen to yourselves?

Fix Unix, not a perfectly good file system that isn't doing anything wrong. You people remind me of Windows and its DOS-era interfaces that break when msys2 creates a file name with a ? in it. NTFS doesn't care.

The UTF-8 part is sound, though. It should be standard everywhere.

...

There are plenty of tools that don't support --, so using ./-file would be a more universal solution to this.

That's a nice solution when it works, but it doesn't always work. There's a standard, but people often ignore it and roll their own based on how they sort of think it works. Here's jpegoptim:
$ jpegoptim -alto1.jpg
jpegoptim: invalid option -- 'a'
jpegoptim: invalid option -- 'l'
jpegoptim: invalid option -- '1'
jpegoptim: invalid option -- '.'
jpegoptim: invalid option -- 'j'
jpegoptim: invalid option -- 'g'
Average compression (0 files): -nan% (0k)
$ jpegoptim -- -alto1.jpg
$ jpegoptim ./-alto1.jpg
./-alto1.jpg 2272x1704 24bit N JFIF [OK] 954698 --> 954698 bytes (0.00%), skipped.
-- makes it just ignore the other arguments. I have no idea why it does that, and I feel too disgusted to go digging into it.
./ is a nice solution when you're working with relative file paths, but not all command line arguments are file paths. I'm sure there are badly written programs out there that don't have a way at all to accept certain arguments with a leading dash.
If the underlying API reflected the command line structure this problem wouldn't exist. But it doesn't, the underlying API just passes an arbitrary sequence of arbitrary strings (without null bytes). Each program needs some kind of parser, and it turns out a lot of them use different, incompatible parsers.
Saying that everything is a string doesn't actually get rid of complexity, it just moves it around. Now the complexity has to be implemented a thousand times by different people and all of them do it in different ways, possibly very subtly different ways, and it gets harder to manage, not easier.

I just did a quick read through the source code of the implementation of cat[1] that I was using and I'm fairly certain that '--' is not interpreted by the program and is in fact a function of the shell. Either that or I'm just retarded.

[1] git.suckless.org/sbase/tree/cat.c

ignore this I'm just retarded.

man 3p getopt
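
getopt(3) is the C interface; its shell-level counterpart, POSIX getopts, looks like this (the option letters here are made up):

[code]
# getopts stops at the first "--", so everything after it is
# treated as an operand even if it starts with a dash.
while getopts 'vo:' opt; do
  case $opt in
    v) verbose=1 ;;
    o) outfile=$OPTARG ;;
    *) echo "usage: $0 [-v] [-o file] [file...]" >&2; exit 2 ;;
  esac
done
shift $((OPTIND - 1))
printf 'operand: %s\n' "$@"
[/code]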

Is that RMS? What's the story behind the picture.

Some anime series I have only have their name in kanji/kana. Without those characters it would be hard to tell which is which.

-- is not a feature of the shell. Lol.

I know. I use getopt. The creator of jpegoptim apparently didn't, or used it wrong, and that's a problem whether it's a mistake I would personally have made or not.


It's a pun.

...

He really is autistic.

And that's exactly why we love him.

GIMP?

Yes. Truly a sign of freedom respecting software.

maybe we should stop trying to shove a multi-user timeshare operating system into every teapot?
How about keeping embedded appliances, content servers and personal computers separate?

What non-white hell hole is this?

Some Spanish-speaking south american country, maybe. Governments there simply don't give a fuck. It's not built for cars like USA is.

Not using emoji in filenames I agree with (this is more a side effect, though), but the world doesn't have to learn and use English because of filesystem limitations.
UTF-8 is easy to understand and implement; just use it.

Multics mapped all information into the address space as segments.


The lesson I just learned is: When developing with Make on 2 different machines, make sure their clocks do not differ by more than one minute. Raise your hand if you remember when file systems had version numbers. Don't. The paranoiac weenies in charge of Unix proselytizing will shoot you dead. They don't like people who know the truth. Heck, I remember when the filesystem was mapped into the address space! I even re

What's with all this seemingly lost technology? Every time I read old papers, I am impressed by what our ancestors achieved, and yet it's all but forgotten. Things like operating systems and hardware architectures were alive and advancing, but now it seems they're literally dead subjects because we are at the "state of the art". I used to think programming languages were part of the same group, but now I see this new enthusiasm for type systems, and new languages are being created every year or so.

Unix 1.1

Can't wait for the green light to put Redox on bare metal...

Because as OP said, it's no longer the 1970s. The US is no longer leading technological innovation, and unless you want an ungodly mess of text encodings, we'd better make a standard that can accommodate English, Chinese, Spanish, Japanese, Korean and Arabic. At least.

By what metric?

test

There's already UTF-8. It supports everything and works with legacy tools. Absolutely nothing else is needed. There is no need to justify this any further. UTF-8 is something that should simply be shoved down people's throats. People will deal with it and the world will be better. Defaulting to UTF-8 in OS, library and programming language implementations is just the right thing to do.

this is satire right?

A lot of those papers would still be impressive if they came out in 2018.

When something is better, they want you to learn about the old thing so you know how much of an improvement the new thing is. When it's worse, they don't want anyone to know about the older technology. That's how advertising works.

When it's UNIX, you have to pretend the rest of the world didn't exist.

What I find disgusting about UNIX is that it has *never* grown any operating system extensions of its own; all the creative work is derived from VMS, Multics and the operating systems it killed.

Multics was written in a high-level language first. ITS ran on the PDP-6 and PDP-10. Sure, they came up with an implementation. You just make a machine that looks just like a PDP-11 and you can port unix to it. No problem! The latest idea is to build machines (RISC machines with register windows) which are designed specifically for C programs and unix (just check out the original Berkeley RISC papers if you don't believe me: it was a specific design goal). Now, people tell me that the advantage of a Sun over a Lisp machine is that it's a general-purpose machine ("Of course it's general purpose," they say. "Why, it even runs unix."). Hmm, well this example shows that at least the weenix unies know how to USE recursion!

Are YOU satire?

Non sequitur. There are good memes.

I'm not sure.

Again, I haven't looked into Plan 9 yet. Seems like I should!


What's wrong with separation of privileges? We need more of that tbh. Users might not be the best privilege model, but it's better than nothing. I see the point you're trying to make, but multi-user is a bad argument for that.
It's all a matter of cost and there are only so many talented programmers in the world who can write fast and reliable software for microcontrollers. Shipping a full-blown unix with a few lines of pajeet software is much cheaper.

Have fun with all the filenames you can't type or that look identical.

I don't know much about Multics or ITS other than what I read on Wikipedia, but my government used to have these AS/400 machines and they were extremely impressive. The people I talked to and the papers and posts I've read described a disturbing number of innovations that *nix tools didn't adopt until relatively recently (as far as I know), simple things like autocompletion. Also, it didn't use file descriptors for everything; everything was an object that was transparently persistent. It's amazing that this system saw production use and even supported things like Apache, MySQL and the JVM.

Amazing "legacy" technology.

Brainlet.

We need physical privilege separation.

There was a nice comment on this topic when the meltdown stuff was going down.
metzdowd.com/pipermail/cryptography/2018-January/033574.html

Physical privilege separation is a waste of resources (in terms of ROI) in many areas. Are you going to pay for it? Because a lot of people won't. Customers' demand dictates product availability, so this will make your preferred solution even more expensive. I think the only area where this could be done at reasonable cost is - ironically - cloud computing: you can share machines with other customers, you can horizontally grow or shrink your infrastructure on demand - just make sure you don't share machines with others at the same time.

But there's another point in the e-mail you linked. Restricting untrusted code (i.e. not running untrusted turing-complete-anything on a machine that also runs other bits of code) doesn't seem too hard. Disabling Javascript, limiting HTML to its core features and using old BPF instead of eBPF is no rocket science.

If you're gay enough to want to name a file -file why wouldn't you just suck the dick of the double quotes?

Where's the work towards making this a reality?

...

"Untrusted code" is pretty much any code you run. Really, turing complete virtual machines or interpreters, such as JavaScript ones, can be very secure, since they will, in theory, only do what the interpreters let them do. In practice, buffer overflows and arbitrary shellcode execution exist, so now your problem applies to pretty much any program that can read untrusted input. Yes, even HTML interpreters.

It bamboozles me how Holla Forums always seems clueless about this, almost as if Rust's memory safety only meant preventing segfaults. No, segfaults are good; this is security 101.

It is currently expensive, that's true. And the huge amount of R&D needed to build proper hardware that allows physical separation would cost a shit ton. But I believe it would be worth it in the long term.


Double quotes ain't not gonna do nothing here m8.


We have yet to agree on what we would even be making.
Also, we're still missing the name and the logo.

The "rust will fix all our security problems" meme is retarded.

No. Rust, and Ada, and pretty much any language that's not C and has array bounds checking or iterators will solve our most basic and yet most dangerous security problems. Logic bugs can cause data loss, information leakage and even compromise confidential information from time to time, but they won't let you install backdoors on your target to ensure all these things and more can be done continuously on the infected machines with total impunity.

It's also not just Rust, it's also sane mitigation mechanisms that don't let a program go rogue and wreck the shit out of everything, but we do not have those. Instead, we have shitty patches that attempt to mitigate these, but they are very fallible since they are actual patches over a ragged fabric that's not holding too well. We have to rebuild the base so it is solid from the start, but we won't because the current fabric is just barely functional.

UNIXTARDS BTFO

Legal filenames are edge cases. Like malloc returning NULL is a weird edge case too right?
It's definitely consistently confusing.

Everyone knows that the true patrician's character set is ISO646, the only real cross-platform way to send text. If your C code isn't written with trigraphs, delete it. If you use a language that doesn't support ISO646, kill yourself.

Maybe this is spoon-feeding, but this is for any newfags who aren't aware of which character sets are acceptable for people to use.

Oh fuck off faggot. If some retard is making those sorts of filenames it's his own damn fault for being homosexual. But of course, modern nu-males can't handle that kind of unconstrained environment can they? They have to demand that their OS make everything retard-proof for them so they aren't allowed to make a fucking file named fucking ".

Why don't you go write a script in Rust and fellate yourself over how you don't have to manage memory like you had to in crusty old C.

...

...

...

z/OS is PURE CANCER
JCL GAVE ME AIDS

lol you missed the .filename file. It's such a powerful bug that if you type ls -a in your home dir you'll see dozens of homosexual retarded devs have taken advantage of it. But you're a hardcore man, you can handle all the unconstrained files up in your folders, right? You don't need no protection.

Those are a shell feature. They don't make any difference whatsoever here.

its called a MAINFRAME you NIGGER

its an ENTERPRISE

AT&T shills managed to convince you that UNIX bugs are actually the user's fault. There should be no problem with any bytes in any character set being a valid filename. The shell and "group of uncooperative tools" are broken. The whole way they expand filenames and parse arguments is broken.

This poor user tried to use Unix's poor excuse for DEFSYSTEM. He is immediately sucked into the Unix "group of uncooperative tools" philosophy, with a dash of the usual unix braindead mailer lossage for old times' sake. Of course, used to the usual unix weenie response of "no, the tool's not broken, it was user error", the poor user sadly (and incorrectly) concluded that it was human error, not unix braindamage, which led to his travails.

What do you suppose "csh# rm -rf /tmp/.*" does? If you said "Csh expands that on the command line and tries to recursively remove the directory above /tmp (root)", you were right. Too bad the shithead who wrote rm couldn't be BOTHERED to special case obviously bogus commands like "Recursively remove the directory above me, please". Undoubtedly, though, the Unix weenies will maintain that this is the right thing. Very nice. Particularly in view of the manual page for rm, which states:

 WARNING
 It is forbidden to remove the file `..' to avoid the antisocial consequences of inadvertently doing something like `rm -r .*'.

There is no indication that anything overrides this. What useful protection! How nice to have a system which knows better than I what I really meant to do. Incidentally, I tried this out for myself just to see if it worked the same way on my system. This is Unix, and consistency is not guaranteed. It did the same thing, but with one other delightful addition:

 lancet% mkdir foo
 lancet% cd foo
 lancet% mkdir bar
 lancet% cd bar
 lancet% /bin/rm -rf `pwd`/.*
 Segmentation fault
 lancet%

After all, what Unix feature is complete without the ever popular Segmentation Fault?

good luck handling NULL on any FS

Seriously just neck yourself at this point. This particular excerpt you have taken is possibly the most retarded out of all your shitposts that I have bothered to read.

I like them. You fuck off.

Seems like 8-bit FAT allows it as long as it's not in the first byte and HFS can handle it (but discourages it). There are others on the list, but those are two I've checked specifically. It actually says very early Unix did allow null bytes in filenames - I wonder what's up with that?
en.wikipedia.org/wiki/Filename#Comparison_of_filename_limitations
There is no inherent problem with null bytes. There is an inherent problem with null bytes in strings in C, but that's just because C doesn't have real strings. There have been languages with robust strings both before and after C.

Everything is confusing for brainlets.

The only gripes about Unix I hear are from Millennials who want to shit on raw, efficient perfection because something else "feels like it works better".

It is like they sit in a finely tuned racing car built by a better generation, then bitch about it not having a coffee maker.

"How can UNIX not handle these filenames? If I name a file like this commands fail to work! This is a bug of the UNIX system!"

"How can a car NOT have a coffee maker? If I drive with coffee in my lap I can burn myself! This is a failure in the car design!"

No it's a failure in your driving capabilities. Go get a clown car with coffee holder.

excuse the unintentional Reddit spacing

No shit.

What did "good luck handling NULL on any FS" mean? Maybe I just misunderstood.

You mean Tcl

Misunderstood what? If you put a NULL before the logical end of the string, the string gets truncated there. Good luck handling it.

Only if you at some level use a language with bad string handling.
Like I showed, there are filesystems and operating systems without that restriction, and even early Unix didn't have that restriction - probably before it was written in C.
Handling NULL bytes in filenames is pretty much impossible on Unix, but I think that was the point of the post you were replying to. That's Unix's fault. It's a leaky implementation detail.
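A quick demo of that leak, assuming bash 4.4 or later for the exact warning text:
printf 'foo\0bar\n' | wc -c    # 8 bytes: pipes carry raw bytes, NUL included
x=$(printf 'foo\0bar')         # bash warns: "command substitution:
                               # ignored null byte in input"
printf '%s\n' "$x"             # prints "foobar"; the variable is a C string
                               # underneath, and so is every argv entry and
                               # every path the kernel will ever accept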

ITT: pro-Mac OS X shills vs anti-Mac OS X shills (windows cucks)


So true. You sound as relevant as a Welsh Nationalist.

I don't believe the author is very well-versed in shell, if his GitHub and StackOverflow profiles are any sort of indication. In UNIX shell scripting, everything is a string. When the author speaks of "adapting," or creating a middleman to convert the output of one command to an "API"-compatible format of another, it sounds to me like something someone who's never dug beneath the surface would say. Someone who's used shell as a tool in passing. In his defense, he does label himself as a hobbyist and shell isn't one of the languages he's "interested" in. But to get back to the point, you don't need a translator to convert output into a compatible input. Everything is a string. Some tools take strings as arguments, and some only on stdin. That's all there is to it, in my opinion. As for the "common textual format" remark, I don't understand where this comes from. The regular UNIX utils do not take complex inputs. They all take strings, and usually just words or filenames, along with single-char flags. Strip away the abstraction, and it's just strings. The commands read strings and write strings.

This is another piece of evidence for my suspicions: you don't need all those commands. A cursory grep'ing of the grep manpage, for those following along in their mother's basement:
man grep | grep recursive -B 2 -A 2
Tells you the option for recursively searching directories. All it takes is adding the "-r" flag before your inputs, and an "-I" flag to skip binaries. You could even use the "-l" flag to print only filenames. Using his example it would be:
grep -rlI "*.php" .
It really is that simple, if you take the time to utilize the manuals properly, i.e. RTFM. On his comment about "extending," I don't know what he's saying. If you want to boil down your results and refine them, just pipe them into grep again, or whatever command you want that takes stdin (just about all of them). As for "cross-platform," the above commands work on any system with GNU grep, which means every mainstream GNU/Lunix OS. Maybe not Windows, but likely Mac if you fiddle with the flags.

I recommend you read the manuals and really immerse yourself in Plan9 if you want to get the hang of it. Plan9 isn't like Emacs, or Lunix, where you can just jump right in, pick some things up for the task at hand, and learn the rest later. It's like Vim. Everything is arcane and nothing seems intuitive. Until finally, you've discovered every nook and cranny, and it clicks. You understand, and now everything is simple (the same can be said about bash!). Whereas with Emacs (Lunix) you'll likely never in your lifetime find every nook and cranny. Or two lifetimes even.

If you're using tabular data, every cell needs to have a value. That means there will be an X*Y amount of cells in your table, and you can set a row delimiter by length instead of by character. So if your table is 5*30, you hold a "column" counter for inputs and advance to a new row every 5 cells. Or, you could run it through AWK and choose a character that's not being used as a new delimiter. This can be any of the numerous escape sequences or any font character your system supports. If you somehow end up with data that uses every single character of all the fonts you're using, you should take better care in sourcing your data, or extend the fonts yourself with a custom delimiter that no one will ever have used. If this seems excessive for just data processing, then I retort with: "if you try to bake a cake with jell-o, you're gonna need to do some extra work on your end."
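For what it's worth, the control characters are already sitting there for exactly this job, no font extension required. A minimal sketch using the ASCII unit separator (0x1f), which ordinary text data never contains (the sample fields are made up):
US=$(printf '\037')    # ASCII unit separator
printf 'alice%sbob carol%s42\n' "$US" "$US" |
    awk -v FS="$US" '{ print "name:", $1, "| role:", $2, "| id:", $3 }'
# fields can now contain spaces, tabs, commas, anything except 0x1f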

~>> man grep 2>/dev/null | grep recursive -B 2 -A 2
grep: fopen -B: No such file or directory
grep: fopen 2: No such file or directory
grep: fopen -A: No such file or directory
grep: fopen 2: No such file or directory
~>> man grep 2>/dev/null | sed -n '$p'
sbase 0.0                 February 21, 2018                 sbase 0.0
~>>
Strange, my version of grep doesn't seem to support those flags. I guess it's because I'm at uni and not in my mom's basement :^)

Works on my machine. And user is using after-market utils ///*WON'T FIX*///.

/// ACHTUNG!
Sbase grep doesn't have half the functionality of GNU grep, otherwise known as ggrep on *BSD systems.
Cheeky monkey.

into le trash it goes

>user is using after-market utils ///*WON'T FIX*///.
what if I told you I was using sabotage

That's exactly what he says in his post.


Yes, you do. Textual inputs have grammars. Sometimes they're trivially compatible; many times they are not. Tools like awk, cut, sed and company are often found in between pipes in order to massage one program's output into another's input. The fact that you often can't just pipe some output directly into another program contradicts your claim.
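A typical (hypothetical) bit of that glue, where awk exists purely to re-shape one program's table into another's argument list (the bracket trick keeps grep from matching its own ps entry):
ps aux | grep '[f]irefox' | awk '{ print $2 }' | xargs kill
# ps emits a human-oriented table; awk strips it down to the bare PIDs
# that kill(1) actually wants. That middle stage is the adapter in question.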

>sabotage
Then you should make like Uriel and start a niche blog to spread to the world esoteric, but well-designed, technologies.

Yes, it was bizarre to see him acknowledge that everything is a string, but go on writing as if he didn't understand it when he was scripting. Perhaps it's the language barrier, but what he says he understands and how he displays that understanding are two colliding modes. Nitpick, but the information I am interpreting from his writing is not what the sum of all his words means. It reads like a piece chastising the "only strings" system, yet going back and analyzing the writing word by word paints a different picture. It's such a strange thing. It feels so alien, to see a very fluent grasp of English syntax and meaning, but for it to not be "English." It's like someone took a bunch of single, independent statements, and then deftly welded them all into a cohesive unit. I almost feel like I'm being lied to when I read his words. Is this what cognitive dissonance is like? To see someone so knowledgeable and capable, but be so wrong. The missing context of the OP stackoverflow passage does not help at all in this regard. I have no argument here and I may just be having a psychotic break, or there's something very wrong with those two posts.

Now, onto things with objectivity. Textual inputs have grammars, this is true. Or at least I think so. I'm still trying to shake off the last five minutes. I'm interpreting what you're saying as: programs require their inputs to be in a certain "format." Is this what you're trying to convey? If it is, I agree with you. The following statement, I don't know what to do with it. It reads like a sentence, but why is it there? Sometimes inputs and outputs are a little bit compatible, most times they're not. It just seems to me like saying sometimes it rains, but usually not. Is this a statement? If so, where is the evidence? If not, and it's just used as a fluid transition to your next point, then I have to express how jarring it is. Especially with the sentence that follows! I will tell you how I took your third sentence. You're telling me certain text processing tools are often used to convert one output into a suitable input for another program. Yes? Ok. I, again, do not understand why this is a sentence that needs to be made corporeal. Is this not an implicit constant? Yes, tools are not standardized and you need to format your data to be more effectively utilized. Did you read that last sentence I wrote? Doesn't it sound redundant? It's like common sense, except common sense is uncommon, and I think anyone who uses bash knows intuitively what you just said. Often. But what's even more dissociative to my train of thought is: where are the examples? And when they're provided, if they're true to a large part of a bash program, then what? What is the takeaway? That bash has one messy type and it's not automatically easy to work with? Ok. What is the point of this statement? I do not disagree, but what is the point? It's a behavior that is noticed, internalized, and then affects the way you go about using the language next time. What use is there in stating this? Bash has a clusterfuck of an I/O system. Ok. Now what? What are you achieving by stating this?

Ah, but onto your last statement. You're correct. And now what? You know you need to work your pipes so they can be utilized properly. Ok. What is there now? What has been achieved by you writing this stream of statements?

Ah, now maybe I understand the issues I'm having here. You took two of my minor points and provided commentary on them that seemed needless. And the other is my fault, for language, as complex as it is, is terrible at conveying meaning. For this we have unspoken cues that aren't present in our current communications! My intent, in my post, was that what the author was referring to as an "adapter" is just the natural peculiarity (some would say no good, awful stench!) of the bash scripting language. My biggest qualm was with putting emphasis on that point. It is true that many programs need very well-managed pipes, but I must ask: why bring it up? It's such an insignificant point to address, it seems so useless in its efforts. Ok, you need to massage your pipes. Ok? Ok, we know this. Ok. As for my strings remark, its intent also failed to be included, and was optimized away by the compiler that is digital communication. What I meant was that strings are defined. They're predictable, you know what they're gonna do, how they're gonna behave, and what you can do about them. Just like you know ints are numerical and arrays are indexable. I cannot make a proper simile, but I will try! It's like saying a large part of using arrays is referencing them through their index. Ok. What is the use in saying this? This is not new, this is not useful, this does not achieve anything. So why do you state it?! (Reader take note, the "?!" combination is commonly attributed to angered questioning, almost demanding, but in this case it is excited questioning, or energetic, if that doesn't paint a lurid picture.)

You posted factually wrong statements like:


And I corrected you. That's all there is to it. No need to write huge walls of text asking what's the point of it all.

Still, even with GNU grep, the convention is to list the options before the pattern.
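I.e. the earlier pipeline, written the portable way:
man grep | grep -B 2 -A 2 recursive
The earlier ordering only worked because GNU getopt permutes options and operands (set POSIXLY_CORRECT and it stops doing that); sbase grep follows strict POSIX ordering and read everything after the pattern as filenames, hence the fopen errors above.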

I'm a rebel. I run my desktop as root, make numerous changes to my library headers, and port Windows software to Lunix with the original DLLs and no emulator.

What was gained here?