CoreUtils implemented in pure JavaScript (github.com/dthree)
177 points by fidbit on March 22, 2016 | 125 comments



>Huh? Okay - think Cygwin, except:

Oh boy, another one of these. Cygwin is useful because it provides a POSIX C API on Windows. It is not useful because of its shell and coreutils. And saying it uses "pure ES6" is stretching it - it's using the plain ES5 Node.js require/export patterns instead of ES6 modules.

>1/15th of the size [of Cygwin]

Haha, and 1/100th the functionality


Is this indicative of what's going on in the JS world today, with all the churn of packages and libs? Does it boil down to being sucky and not useful if it isn't written in ES6/TypeScript/Whatever and not invented by your team? Why do the other things suck because they're not this?

EDIT: I've said it on other threads- I like the creativity and exuberance in the JS community. I'm not putting it down. There's just a lot of churn/NIHS.

(I didn't seem to be getting downvoted, but I also want to be clear that I'm not putting the community down)

END EDIT

I think it's a cool exercise. But why would I ever want to use this in the way it says it should be used? Why would I ever use this, period?


Not Invented Here Syndrome has been around a long time. But it used to indicate a sort of conservatism and fear in corporate culture.

This NIHS seems more the result of ADHD and impatience, sprinkled with a little enlightened precociousness.

You can see it in the satire surrounding computer programming. In the 90s, Dilbert had insight into the causes of NIHS. These days, Silicon Valley hits closer to the target.

(PS: the creator of this package looks to be getting a lot of flak here. Kudos to them for doing it, but they would benefit from understanding why it's such a shallow "replacement" for Cygwin.)


> (PS: the creator of this package looks to be getting a lot of flak here. Kudos to them for doing it, but they would benefit from understanding why it's such a shallow "replacement" for Cygwin.)

I agree. Like I said, I think it's a cool exercise. I'm all for doing things just to learn and explore. Which is what I'd classify this as. The things you learn doing stuff like this will likely help you in the future in some tangential way. No one should be excoriated for practicing their craft, especially in a case like this where it's a nifty project.

The problem is when you put down other projects, or even OSes in this case, that are, frankly, immensely more powerful and useful than what you've put forth. I think without the "without the suck" tagline, it probably wouldn't have gotten much attention, especially the negative kind.


In instances like this, there's no issue with it being ES5.

However, in the web world, ES6 modules have a huge (pardon the pun) advantage that's only now being realized. Because of the static nature of ES6 imports, you can eliminate unused code when you package it all together. The newest Webpack versions and Rollup both take advantage of this, and it can result in huge savings in file size.

For example, lodash is a large library with hundreds of methods, and a lot of the time people only use 3 or 4 of them. If you use an ES5 version, you'll get all of those functions in a compiled bundle no matter what. If you use an ES6 version with Webpack 2, only the methods you actually use will be included in the final code.
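A rough sketch of the difference (two alternative files, and this assumes the library ships an ES6-module build - the exact savings depend on your bundler and config):

    // CommonJS: the whole library lands in the bundle, because the
    // bundler can't statically know which properties of `_` you touch.
    var _ = require('lodash');
    _.map([1, 2, 3], function (n) { return n * 2; });

    // ES6 named import: a static, analyzable declaration, so in
    // principle everything except `map` can be dropped from the bundle.
    import { map } from 'lodash';
    map([1, 2, 3], n => n * 2);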


I imagine most libraries have some common functions that the ones in the public API rely on (and of course some exports can be used within the module itself).

Let's say you never import `helperX` and `helperY` from a library, but you import `usefulFunction` which relies on `helperX`. How would webpack know to exclude `helperY` but not `helperX`?

Unless it does some kind of code analysis, or the lib is split in a "module per function" or similar manner, I can't really see a way to achieve it. And if a library is a common core plus a bunch of modules, one per function it exports, how is that different in CommonJS versus ES6 modules?

Am I missing something?


I believe they are doing code analysis, at least that's what the README for Rollup says:

> Rollup statically analyses your code, and your dependencies, and includes the bare minimum in your bundle.


I don't think that's possible in all cases:

    var mod = {};
    mod.foo = function () { return 42; };
    mod.bar = function () { return -1; };

    mod[Math.random() > 0.5 ? "foo" : "bar"]();
How can `Rollup` determine that I need both functions here?


That's why it only works with ES6 imports. ES6 imports are static, top-level declarations and can't be dynamic. You must explicitly import what you need.

Additionally, this method only works with named imports. So for example, you can have two files:

File a.js

    export const a = 5;
    export const b = 6;
    export const c = 7;
    export const d = 8;
    export default {a, b, c, d};
File b.js

    import {a, b} from "./a.js";

    console.log(a);
    console.log(b);
c/d will never be included in the output file, nor will the default export. However, if you do:

    import a from "./a.js";
Then it won't optimize at all, because all of those objects are referenced in the default export.

Note that it's possible I'm wrong as to the specific optimization method, but largely speaking this is how it should work.


That much makes sense, but if I understand GGGP's example properly, they are talking about pruning unused internal references within a module, but GGP's response implies that somehow Rollup will prune those out.


Ah, I understand the confusion now. I'm not 100% sure how that code works, but I do know it uses estree and that case is tested for, nearly verbatim. Here's a link to the source which handles that sort of stuff.

https://github.com/rollup/rollup/blob/6fc8631ba1aad718885151...


Webpack 2 and Rollup don't support tree-shaking with Lodash yet. You'll need to use https://www.npmjs.com/package/babel-plugin-lodash.
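If I understand the plugin correctly, it rewrites member-style imports into per-method ones at build time, roughly like this (a sketch, modulo local variable renaming):

    // What you write:
    import { map, uniq } from 'lodash';

    // Roughly what babel-plugin-lodash turns it into, so that only
    // those two method modules end up in the bundle:
    import map from 'lodash/map';
    import uniq from 'lodash/uniq';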


The same thing has happened in the Python/Java/Ruby/etc. world. There's a reason "written in pure Foo" was an expression even before Node.js.


Granted - I've only been actively developing for about 18-20 years (not being facetious; I was about 14 and wouldn't have been privy to, or really involved enough to perceive, the thrashing or lack thereof with Java 1.0 or Python 1.0, etc.), but I don't think they ever really thrashed like what we're seeing with JS. I don't think they could have.

You didn't have broadband, you couldn't really share entire libraries of code freely, and package managers were not around at the very beginning. In 1995/96 you certainly had message boards and forums, but there wasn't a StackOverflow/whatever like community that made it easy to find an answer to your problem AND/OR a lib to help solve it.

You needed to know how to write a linked list, because you had to write your own. Well maybe not, but you had to write a lot of your own things that just don't need to be written today. God I remember my StringUtils jar from when I was a teenager. That damn thing followed me through college, and I gave the source to friends who then iterated on it (or not). But I didn't have a GitHub to post it to, or an NPM to announce it on and make it available to anyone looking for a new StringUtils library. It was just mine, free to anyone who asked, but no one knew to ask.

Now your ideas and code can spread to millions of people instantly, AND they can find them, which lets people start from an idea and iterate on it rapidly. Everybody is announcing their StringUtils as the next best thing since taking vowels out of words became cool.

So, I guess yeah- we did thrash 20 years ago, but we thrashed silently and alone.


Can't upvote you enough...

Also... Node... Huh? Seriously? Replacing a single DLL with Node.js is simplifying it??


For my purposes, I don't need a POSIX C API. What I need are the shell commands that I commonly use as my IDE to be available on Windows without a massive hack.

Cash seems to be right up my alley, and I'm excited for this.


I'm not saying it's without use-cases. I'm saying the readme is pompous, overstates its usefulness, and was clearly written by someone who doesn't have a clue about the tools they're replacing.


If you have npm already.. I guess?

I personally haven't had a Windows machine without git in years. The distribution for Windows includes a somewhat proper shell with Unix utils. I can use the same .bashrc I have on my Linux machines, write commit messages using vim, have colors in the terminal, etc...


Simpler is better.

Cygwin is overkill unless you are trying to mimic your exact non-Windows environment, and it doesn't mix well with Windows' vanilla CMD environment. For those of us who want to be working with more Windows than UNIX, but who are used to standard coreutil commands, 1/100th the functionality of cygwin actually sounds kind of great!

Not saying that the commands need to be written in JS, but it seems like every language gets at least one coreutils implementation, so why not?

So much negativity around here. Honestly I find it pretty neat that JS has come this far.


node.js is not simple or even simpler


Using a node script doesn't munge the rest of my shell environment. Cygwin, on the other hand...


However, I already have it.


I also have Fortran, Cobol, SWI-Prolog, OCaml, Elixir and Erlang on my computer, so I guess we need to re-implement core utils in these languages too. Using the following in any technical argument is just plain stupid:

- it works on my laptop

- but I already have it


You already have a processor too, and something running directly on that is, in fact, more simple.


It's more simple from the point of view of the processor, I suppose, but that's not a useful metric for me.


Is your unit of simplicity really "I already have this installed"? Because then this hello world line I just wrote is more complex than your OS.


Maybe simplicity isn't exactly what I'm talking about, but yes. In fact, I couldn't run your hello world without running my OS. So from a practical point of view, me running your hello world has a dependency on my OS. From that perspective, it's easier for me to run my OS than your hello world.


I see. So you have these coreutils installed? Because if not, they're exactly as "simple" as cygwin, which is also not installed.


No, I don't. I have a vague recollection that cygwin was a huge and confusing monolith, but I haven't looked at it in years. This new javascript thing sounded more straight-forward. It's not really objectively supported.

I don't even like javascript.


Cygwin is pretty simple to install and use. Download an .exe installer, click click click, open the Cygwin prompt shortcut.


From the point of view of the user, yes.


Yes, but - when you hit some bug or just need some specific combination of commands, there are thousands of posts on the web explaining how to do it on Unix, and they will work in Cygwin. Will they work in this environment?


The question is hypothetical. Sometimes the answer would be yes. Sometimes the answer would be no.


I suppose in this case "less is more".

More often than not, if I'm working on a windows machine, I'm not programming in C. I'm probably working on something in C# or Java. I don't care about the C API. I just want the terminal to work in more or less the same way it does on linux/osx.


As far as I can see, what's been implemented is a shell in JavaScript, which emulates a bunch of coreutils by treating them as built-in keywords.

It's kinda cool I guess (I don't use Windows), I just don't get why the tagline is "Cross-platform Linux without the suck". Apart from the phrase "cross-platform Linux" not making sense in this context, I don't see why other implementations inherently "suck".


I don't understand this at all. UNIX shells suck. Javascript sucks. How could one implement a UNIX shell in Javascript and have it not suck? Does the result suck so bad that it just wraps around and becomes good again?

Is the "way forward" recombining and rehashing the terrible technologies everyone is familiar with already, indefinitely?


> UNIX shells suck. Javascript sucks. How could one implement a UNIX shell in Javascript and have it not suck?

What do you mean by "UNIX shells suck"? I'm fairly sure most people would consider UNIX shells to be one of the bigger revolutions in computing.


Perhaps it's my own shortcoming, but for any given script I write, I find that, say, 20% of all my code is productive, and the remaining 80% is massaging the output of one tool into a shape that's compatible with the input of the next tool. Compare with the experience of using Windows PowerShell, where cmdlets return objects and you can grab precisely the fields you want with minimum ceremony.

Things are better now with builtin tests, but god help you if you need to use /bin/[ and you end up with idioms like 'if [ "x$foo" = "x$bar" ]' so that bad things don't happen if $foo starts with a dash, or either variable is empty.

It gets even more fun when you look at the amount of cruft around terminal emulators themselves, and the support for legacy terminals in the command line. Why do modern versions of OSX still ship with /usr/lib/libtermcap.dylib?

Don't get me wrong -- UNIX shells are amazingly powerful tools, but they're just stuck in the 70s and we can do much better today.


I keep wanting to like PowerShell but finding it limited in silly ways. Text output is crippled, and there's no sensible way to save the intermediate value of a pipeline in a file.

The security ceremony required around PS is also an obstacle to getting started. And of course Microsoft have missed the opportunity to fix the path separator.

I admit that Bourne shell is not great for anything over a few lines, and find it better to drop into perl or python where filename-separator issues are mostly eliminated.


> and there's no sensible way to save the intermediate value of a pipeline in a file

Sounds like you are looking for Tee-Object[0].

"Saves command output in a file or variable and also sends it down the pipeline."

> And of course Microsoft have missed the opportunity to fix the path separator.

Windows supports both path separators in pretty much all cases (open files, changing directories, etc)

  PS C:\> d:
  PS D:\> cd /projects/node_modules/immutable/node_modules/uglify-js/node_modules
  PS D:\projects\node_modules\immutable\node_modules\uglify-js\node_modules>
> Text output is crippled

I guess this depends on what you want out of the system. Some options are:

"format-table -auto" which provides a nicely formatted table view of the data

"format-list" which provides each property on a separate line with a blank line between objects

"format-custom" which allows you to create your own views of objects (with a very readable default), which has ultimate flexibility

"converto-csv" which output the object as a CSV (or with custom delimiters such as TSV)

"converto-html" which outputs the object as an html table (with somewhat customizable markup)

"convertto-xml" similar to the above

But really, if it's possible, then it makes sense to use the object pipeline until you're done (so that different parts of the pipeline can use different parts of the object). Then spitting out CSV, TSV, XML, etc should get interop with basically every other tool that's going to process text in some way.

[0] - https://technet.microsoft.com/en-us/library/hh849937.aspx


Text output is chopped at 80 characters unless you specify otherwise, even when directed to a file; I'm probably going to try convertto-csv next time I want to input it into Python, thanks for the tip.

The output of tee-object can't easily be read back in to a new pipeline. I suppose we're supposed to stash it in a variable which will be good enough for 99% of cases.


I think that's the point- they're amazingly powerful. Additionally with POSIX, you can assume that the command you used on one OS behaves or at least outputs the same on another.

I feel like the interface for these commands is exactly what you'd want from a collection of executables that do only one thing.

As for shell scripting, you don't have to use bash/sh/whatever. You could write your script or executable in any language that will let you call system commands, so it's really a non-issue if you absolutely can't stand the syntax. I write plenty of bash scripts and I do feel your pain, they're not usually very pleasant if you have to go beyond trivial use cases- but it's also (generally) not impossible.


If you like PowerShell, try using jq in your Unix shell pipelines. Ingesting data into json is still a bit finicky, but once you know the patterns you need to use with cut and a csv-to-json tool things become much faster.


I do love jq :)

Hadn't thought of chaining that with a csv-to-json workflow, that's a neat idea -- but if that's the easiest way out, that's clearly a symptom of the problem I'm describing, isn't it?


Quite possible. I will say that I'd much prefer structured text that I can massage to my liking over an actual object model. Having things print and read JSON streams is just as good as having cmdlets push and pull .NET objects, except that we can write our programs in languages that aren't .NET-aware. And it makes it way easier for the system to interact with systems that don't speak an interchange data format, including things like filesystems and websites.

One of the things I've been trying to figure out for a while is a way to build a trivial adapter for the csv-to-json | jq pipelines I've been building, something sufficiently easy to use that I could bake it into a |-equivalent extension to something like zsh.
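For the csv-to-json half, something like this minimal Node filter is what I have in mind (a naive sketch: assumes a header row, comma-separated fields, no quoting or escaping):

    #!/usr/bin/env node
    // csv2json.js - naive CSV-to-JSON filter for jq pipelines.
    // Reads CSV on stdin, prints one JSON object per row on stdout.
    process.stdin.setEncoding('utf8');
    let buf = '';
    process.stdin.on('data', chunk => { buf += chunk; });
    process.stdin.on('end', () => {
      const lines = buf.trim().split('\n');
      const headers = lines[0].split(',');
      for (const line of lines.slice(1)) {
        const fields = line.split(',');
        const row = {};
        headers.forEach((h, i) => { row[h] = fields[i]; });
        console.log(JSON.stringify(row));
      }
    });

Then a pipeline reads like: cat people.csv | node csv2json.js | jq '.name'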


Heh, kids these days - I bet the next thing you're going to demand is a consistent configuration format that all applications will use! (a man can dream)


Could not agree more, best thing that could have happened for CLIs is PS. I wish there was something like that for Linux available today.


I was told XML will solve this.


Some people would agree with both statements. UNIX shells are amazing, and the recent improvements in their UX and environment integration are great. (git repo status, programming environments, intelligent completion, the whole of oh-my-zsh, ...)

But as soon as you want to do anything moderately interesting, it's better to grab a real programming language, because UNIX shells suck. Escaping failures, whitespace handling, almost-real-arrays, fail-and-continue by default, and many, many other ways you'll fail by accident one day are why they absolutely suck for almost anything past the basics.


The next step up would then be a REPL shell. As much as I love the idea of us running Lisp machines, at least $SHELL has a moderate enough learning curve that both beginners and experts can exploit many of its features. Plus while many (most?) of us are programmers, there are quite a number of administrators who don't enjoy coding or even can't code at all. $SHELL is entry level enough that even they can chain a few commands together, where a fully featured programming language's REPL shell would just intimidate them away from the command line entirely.

This is, in my opinion, the beauty of UNIX shells. They're simple enough to use like DOS but powerful enough to use like a REPL. It's not perfect; far from it. But few other tools support such a wide demographic.


Unix shells are notoriously not user friendly. The commands and their interfaces are full of inconsistencies and often really strange. Reading man pages is a horror.

The shell languages are not much better.

There are some useful concepts, but in general Unix shells are used despite their user unfriendliness.

The shells of the various Lisp Machines were quite different. The Symbolics shell, later called the Dynamic Lisp Listener, was quite nice on the GUI side, with its management of commands, completions, defaults, interactive help, etc.

See for example: https://www.youtube.com/watch?v=o4-YnLpLgtk

The interactive help of that Lisp Machine OS is quite a step up from what any typical shell offers. Though it was mainly developed for single user machines with powerful GUIs.

The problems of that approach: it wasn't very sophisticated on the text terminal - actually it was quite bad. The whole UI mostly assumed a GUI. For development one needed an extended Common Lisp (or Zetalisp), which is a bit too complex for many users.

See also a video I made long ago about the user interface of Symbolics Genera, the operating system of the Symbolics Lisp Machine line of personal workstations.

https://vimeo.com/159946178


> Unix shells are notoriously not user friendly. The commands and their interfaces are full of inconsistencies and often really strange. Reading man pages is a horror.

They're not that bad (GNU coreutils is generally pretty good) and you usually remember the edge cases fairly quickly (eg `dd`). Sadly you get inconsistencies in all coding frameworks, whether it's semantics, function names in the core libraries, or whatever.

UNIX shells do have a lot of hidden gotchas which can make life "interesting" (read: "oh fuck oh fuck oh fuck" lol). But the power of being able to use reusable blocks of any language through pipelining (and to a lesser extent, exit codes) is genius. It means one can mix and match LISP, Java, Perl, Python, C++, Go all inside one shell script.

I do understand the hate towards UNIX shells. There are a lot of faults, and a lot of times I'd be halfway through writing a Bash script and then be wondering if I should have just written it in Perl or Go instead. But no tool is perfect, and pragmatically I've found shells to be far more productive than anything I've ever attempted to replace them with. Which is the real crux of why we use these tools. But like anything in IT, this is just my personal preference. Your mileage may vary.


> They're not that bad (GNU coreutils is generally pretty good) and you usually remember the edge cases fairly quickly (eg `dd`). Sadly you get inconsistencies in all coding frameworks, whether it's semantics, function names in the core libraries, or whatever.

Generally this is all awful and low-level. Just see the UI difference between 'dd' and 'Copy File' on a Lisp Machine. The UI is worlds away.

> UNIX shells do have a lot of hidden gotchas which can make life "interesting" (read: "oh fuck oh fuck oh fuck" lol). But the power of being able to use reusable blocks of any language through pipelining (and to a lesser extent, exit codes) is genius. It means one can mix and match LISP, Java, Perl, Python, C++, Go all inside one shell script.

You can do that on a Lisp Machine, too. With the difference that no pipelining of text is necessary. Just reuse the objects. The data is all fully object-oriented and self identifying.

Note that I'm using Unix and other OS variants since the 80s and I'm fully aware of command line UIs from VMS, SUN OS, Cisco, IBM AIX, OSX, GNU, Plan9, Oberon OS, and various others.

> I do understand the hate towards UNIX shells.

It's not hate. Most Unix shells are just fully dumb. Many people like to use primitive text-based UIs with lots of corner cases, which make them look intelligent for remembering obscure commands and obscure options without real UI help.

Take cp.

The shell does not know the various options the command takes. The shell does not know what the types and syntax of the options are. The shell does not know which options can be combined and which can't. It can't prompt for options. It can't check the command syntax before calling it. It can't provide any help when the syntax is wrong. It can't deal with errors during command execution. There is no in-context help. It can't reuse prior commands other than by editing them textually. The output of the command is just text, not structured data. There are really, really zillions of problems.

There have been attempts to address this putting different user interfaces on top. For example IBM provided an extensive menu based administration tool for AIX.

> But no tool is perfect and pragmatically I've found shells to be far more productive than anything I've ever attempted to replace it with. Which is the real crux of why we use these tools. But like anything in IT, this is just my personal preference. Your mileage may vary.

Many people have found shells productive. That's why there are a zillion different shells. You can even use Lisp-based shells like scsh and esh (http://www.serpentine.com/blog/2007/02/16/esh-the-fabulous-f...) or GNU Emacs.

But for most part all these attempts stay in that little box and don't escape the general problems.


I suspect your argument is now more about personal preference than anything. So I'll just address a few specific points you've raised:

> Generally this is all awful and low-level. Just see the UI difference between 'dd' and 'Copy File' on a Lisp Machine. The UI is worlds away.

Well yeah, I did already cite `dd` as an inconsistency. :)

> You can do that on a Lisp Machine, too. With the difference that no pipelining of text is necessary. Just reuse the objects. The data is all fully object-oriented and self identifying.

Indeed you can. As you can also with DOS, Powershell and so forth. I wasn't suggesting that UNIX shells were unique (though I can see how it might read that way), but that it was UNIX shells which pioneered that concept. For all their faults and the technology that might have superseded it: the idea of pipelining reusable blocks of code was a genius idea for its era.

Powershell also supports passing objects like Lisp does. Personally I prefer the dumb approach; however in all other aspects of programming I do prefer strongly typed languages. This is just personal preference.

> The shell does not know the various options the command takes. The shell does not know what the types and syntax of the options are. The shell does not know which options can be combined and which can't. It can't prompt for options. It can't check the command syntax before calling it. It can't provide any help when the syntax is wrong. It can't deal with errors during command execution. There is no in-context help. It can't reuse prior commands other than by editing them textually. The output of the command is just text, not structured data. There are really, really zillions of problems.

This part isn't accurate. The original Bourne shell cannot, but bash, zsh and fish can all do all of the above. Albeit sometimes (particularly with Bash) you need to install additional helper routines that aren't always shipped / configured with the default package. I believe csh also supports most if not all of the above too.

I'm sure Lisp does it better, but I was never arguing that UNIX shells are better than Lisp to begin with. Just that I believe Bash et al have a lower barrier of entry than Lisp. I think on that specific point we might have to agree to disagree - but I don't see many non-programmers using Lisp, and this is why I think Bash has a lower barrier of entry (to go back to my original point).

I can completely understand and relate to why you enjoy working inside a Lisp REPL shell though.


> I'm sure Lisp does it better

Lisp does nothing. It is a programming language.

> Just that I believe Bash et al have a lower barrier of entry than Lisp

I doubt that, given how horrible Bash as a language and as a shell is. There is no reason why there can't be more sane command systems and better shell languages.

See the zsh documentation on completion:

http://zsh.sourceforge.net/Guide/zshguide06.html

This is all totally over the head of the average user.

     _perforce_revisions() {
        local rline match mbegin mend pfx
        local -a rl

        pfx=${${(Q)PREFIX}%%\#*}
        compset -P '*\#'

        # Numerical revision numbers, possibly with text.
        if [[ -z $PREFIX || $PREFIX = <-> ]]; then
            # always allowed (same as none)
            rl=($rl 0)
            _call_program filelog p4 filelog \$pfx 2>/dev/null |
               while read rline; do
                if [[ $rline = (#b)'... #'(<->)*\'(*)\' ]]; then
                    rl=($rl "${match[1]}:${match[2]}")
                fi
            done
        fi
        # Non-numerical (special) revision names.
        if [[ -z $PREFIX || $PREFIX != <-> ]]; then
            rl=($rl 'head:head revision' 'none:empty revision'
                    'have:current synced revision')
        fi
        _describe -t revisions 'revision' rl
      }
Tell me that this piece of code has a 'low barrier of entry' or even a 'lower barrier of entry'. It's just that a generation of experts has been self selected to find that usable.

> but I was never arguing that UNIX shells are better than Lisp to begin with

I was not talking about Lisp. I was talking about software. One could write much better command shells in C than what Unix shells offer. I understand that certain people find Unix-like shells attractive. From a general user interface perspective they are horrible.


You are confusing terminology here. Typical UNIX commands like ls, grep, rm, and so on are not part of the shell.


The harmonization I've come to for "shells are awesome" and "shells suck" is that interactive usage and non-interactive usage are too different to be fully covered by one tool (i.e., you can occasionally cross over for small tasks but only very small ones). In particular, it's the error handling. Interactive error handling by a human in the loop and fully engaged vs. error handling in a program are too fundamentally different.

In both cases, for small uses, you can stretch your shell or REPL or programming language over to the other case, but for long term use it's just not practical. After all, in theory many languages have REPLs that could have long since replaced shell, but only a handful of very dedicated people actually replace their shell with a REPL for some language. I've tried, and I can't do it, even with shell support libraries in the relevant language.

The suckiness of UNIX shells is mitigated if you view them as "the tool optimized for interactive usage". And once you let them be that, they really aren't so bad. It's only if you try to force them to be programming languages, especially past a couple dozen lines or when they try to do something other than just run lots of commands pretty much blindly, that they become really bad.

Also, complicated quoting is a pain, but as a percentage of the commands I run in my shell, those cases aren't that common - they just loom large in my memory. Run history and really look at what you're doing day in, day out with your shell. Unless you've got a really crazy usecase, there's a lot of things like "ls" followed by a "cd dir" and such. It's part of why I can't get into REPL replacement; replacing "cd tmp" with even "cd('tmp')" is, percentage-wise, a huge increase in keystrokes. (And don't forget, the space is much cheaper than left paren; the space bar is huge and I basically have an entire digit dedicated to it, whereas left paren is two not-on-home keys at the same time.)


>The next step up would then be a REPL shell.

I had been juggling the idea of using a modified REPL for shell work, but I haven't written much code. However, someone has written such a shell, namely Avesh[0]. Check out the Reddit comments as well [1].

I haven't used it yet, but the examples seem to fit what I would want in a Common Lisp shell:

* Simple sh-like commands with flags (e.g. ls -o, grep -i)

* Handling commands and piping as native CL functions (e.g. | (| ls (grep c)) (grep r))

[0] = https://gitlab.com/ralt/avesh

[1] = https://www.reddit.com/r/lisp/comments/48r70b/avesh_the_supe...


If someone were to make a distro with powershell instead of the unix shell I'd lose my mind.


I haven't really used PS, but from what I understand you'd also need to rewrite every other program to adhere to its rules, as - if I understood it correctly - the real benefit of PowerShell is that they all use the same output format. This decreases the trouble of handing output from one command to the next significantly.


It should be noted that basically all tools use the same format: text. And badly behaving tools can be coerced with a single line of awk.


However, if you want to string programs together and script pre-written code, everything except UNIX shell sucks.


That's why perl never took off, right?


Perl is actually more in the "real language" camp, believe it or not: It makes it more verbose to launch other programs than to not launch them, and it doesn't have pipes.

In a shell, any bareword is either one of a few built-ins or an external program. Using a bareword which doesn't correspond to anything isn't a syntax error, because the parser can't know what programs you have installed from moment to moment. That is the biggest difference between a shell CLI and a programming language REPL.


Because the shell has a lot of room for improvement. Try powershell for a bit. Look at its design, look at the things it can do that the unix shell has problems with.

Everything's an object. Instead of passing lines of text, you're passing objects, with properties, types, etc. This becomes important because of the next point:

Library files are first-class citizens. Because the Unix shell is string based, you have to wrap everything up in functions, creating helpers, in order to reuse any sort of code. PowerShell makes this trivial, such that you can use functions you wrote elsewhere in your scripts as if they were builtins. Hell, it's trivial to write a webserver that forks multiple processes in PowerShell.

Yes, the unix shell was revolutionary when it was released, but it's been hamstrung by tradition. The shell is long overdue for letting in more interesting abilties.


Powershell is a great programming language and not a bad shell in and of itself, but using it in the same manner as a shell is quite painful. I tried to rewrite some of my ksh stuff in Powershell after getting a job at a mostly Windows-only shop. Basically every ten-line function ended up as a forty-line superthing that's subtly incomprehensible to anyone who doesn't know .net.

I get why Powershell is great for sysadmins (and especially on Windows, where so many system-level utils weren't built with the idea that you'd pipe their output in mind), but for me, it's a step back.


I agree that Powershell for a large part doesn't feel like a shell. Yes, it has a CLI, and it has short commands (actually aliases for longer names), but using it feels more like programming, at least to me.

Having said that, I can't vouch for your scripts, but my 10 line shell scripts are unlikely to handle 'edge' cases such as spaces, quotes and backticks in file names.

Also, "incomprehensible to anyone who doesn't know .net" may be true, but your average shell script that builds on tools such as sort, grep, ls, cut, cat, sed, awk, etc. isn't that particularly comprehensible to people who don't know UNIX, either.

Finally, I think it is likely that people who grow up on Powershell feel differently about this.


> Also, "incomprehensible to anyone who doesn't know .net" may be true, but your average shell script that builds on tools such as sort, grep, ls, cut, cat, sed, awk, etc. isn't that particularly comprehensible to people who don't know UNIX, either.

Absolutely. The point is, "you can use familiar .NET functions and everything is an object" is not something that makes the whole thing easier if you're not familiar with .NET and don't want objects in your shell scripts :-).

Powershell and Unix shell have vastly different descents, and it shows. It's more or less a historical accident that we're conflating them nowadays.


From my own experience learning bat scripting then sh scripting and then powershell, the 40-line monstrosity is what happens when you first learn another scripting language. Sometimes the paradigm can be so different that simply trying to port one script to another language ends up being more difficult (but better, from a learning standpoint, for me) than simply starting over.


Well, I never claimed I properly learned Powershell :-).

Like everything on the Internet, lack of success with a technology should be taken to mean a failure of that technology as much as a failure of the programmer attempting to use it. It may well have been a problem above the application layer.


Would you say the same about people who haven't been able to master sed? Success with any technology is as dependent on the technology as it is on the user. :)


Annoyingly they're "good enough". Default shell will always stay sh-compatible. Scripts will always default to bash. I'd love for something like powershell to take over the unix world (ok, not exactly powershell, it has its problems too), but I don't think that's going to happen for the next 20ish years.

Hybrids like ipython shell are quite interesting for personal use though.


In my experience Powershell requires a lot of .NET concepts to understand outside of C# syntax from what I recall, and the amount of typing was far greater than typical Linux shells.

If I'm spending twice the time typing on average, even more advanced features won't win me over when I get to use them occasionally (while, again, typing way more on average). It should be saving you time, in the end.


Well actually no. I am fluent in several shells (ksh, bash, fish, zsh) but I would not say that UNIX shells are awesome. Most of the CLI tools written for UNIX-like clones are fixing the shortcomings of shells. On the other hand you can do a lot (and I am not even fluent) with PowerShell while not needing a huge "coreutils" equivalent on Windows.

I used to work for a company that ported PowerShell features to UNIX so we could do lots of things that otherwise would have been impossible or very hard to do with only coreutils + shells. UNIX is old and it has many flaws, but it is also widespread and people think it is "awesome". In fact, if you look at other systems, for example VMS, those had better features than UNIX (and its clones). Again, I have something like 15+ years on Linux and other UNIX-like clones.


Yes, I don't understand it either. "No ugly DLLs" but you have to run everything in Node.js. There are a few technologies that have tried to do something better than Unix shells, like Microsoft's PowerShell. It has a lot of failings, but it gets a lot of things right. I am always stunned when I see Windows folks stumbling around in cmd.exe when PowerShell has been installed standard for a while now.

There's still a lot of room for improvement.


I use cmd a lot more than PS because:

1. Win+R -> cmd enter is a lot faster than Win+R -> powershell enter

2. cmd starts faster than powershell

3. I seldom require all the power from powershell


I mapped CTRL+ALT+T to start PowerShell. Very convenient.


"X sucks" doesn't say much, except you don't like X (for whatever reasons). You need to say what it is supposedly bad for, for whom, in what context, and compared to what.


There's a module available to put each command on your PATH:

  npm install cash-global -g


Machine code sucks. For the love, how can we ever get past that?


It's not just a separate shell. Each command can be enabled globally. It just might not be desirable for everyone who wants to use these.


Apart from this being of questionable usability, why do people feel the need to solve every problem with (Node)JS? I can think of a ton of languages better suited for systems programming off the top of my head.


Because JavaScript is the most widely used language in the world.

As a web developer, I know JavaScript very well. I have already encountered thousands of its idiosyncrasies and I am very good at making it work for me.

Therefore any tool written in JavaScript is automatically easy for me to understand and modify. If that tool could have been more elegantly written in another language, that's awesome for people who know that other language, but it's irrelevant to me. Even if I kind of know my way around the other language, I'm never going to be as effective with it as the language I use every day. And a lot of people use JavaScript every day. This is the reason why people feel the need to solve problems in JavaScript.


But JS can't solve every problem, nor can it solve certain problems well.

I understand people feel the need to solve problems in JS because they're scared or inconvenienced by learning a new programming language.

But why not learn a new language? Having a working knowledge of C won't kill you.

edit: could you explain how JS is the most widely used language in the world?


> Having a working knowledge of C won't kill you.

I do have some knowledge of C, and I could learn more. But unless I change career, I'll always be faster and more effective in JavaScript because I think in it all day, so I'll almost always choose JS over C. I'm not disparaging anyone who uses C, just answering @faaef's question about why people "feel the need" to make everything in JavaScript – because they can, they happen to know JS very well, and because there's automatically a huge potential contributor base.

> could you explain how JS is the most widely used language in the world?

I don't know, but I assumed it's the most widely used because websites :) Also it's the most popular language on GitHub


JS is definitely the wrong language for a number of programming projects. JS lacks a number of features that other languages have... on its own, that doesn't make it a bad language, but it means it's less useful for certain projects.

Like this one. Like the POSIX API that is actually compatible with other applications.

And by that definition of used, C and Java are pretty strong competitors, seeing as every SIM card and Bluray player and so many more run Java... and then there's Linux and its embedded RTOS cousins, etc., etc., that are written in C or C++. I think embedded devices beat out JS pretty well.


I'm not talking necessarily about elegance, but performance, safety, the ability to run on embedded devices, a widely-available toolchain and so on.

Apart from that, it's great to have a favorite language, but choosing a hammer for every task does not seem like the mark of a great craftsman to me.


I believe most people in every profession neither are, nor aspire to be, great craftsmen. And in the end, if their tools work, nobody cares outside of their immediate colleagues.


...and herein lies the problem.


What problem? I think we need a mixture of specialists and generalists


For fun and education? I am not a fan of the idea from a practical point of view, but it seems like implementing these things could be a nice learning experience.


Here is my language-choosing thought process...

Perl/Python/PHP/Ruby: Generally 10x or more slower than V8 Node.js, plus global interpreter lock = automatic no.

C/C++/etc? Fast, but too complex to work with build system / platform compat / 3rd party modules = generally no.

Java? Way too slow startup time; JVM install for users painful = automatic no

Scala/Clojure/etc: Fine languages but same practical problems as JVM languages = automatic no.

Go? Probably nice, but I just don't know it = maybe

Node.js: Fairly fast, easy to debug, fast startup time, easy to use 3rd party code = yes

Rust: safe, C perf, and easy to build and use 3rd party modules via Cargo = yes


Why would threading with GIL be worse than no threading at all?

As long as IO is your bottleneck the GIL shouldn't be much of a problem. And Python (and I'm sure Ruby, too) has excellent async support.

Not saying Node.js isn't useful, but I wouldn't automatically say no to Python or Ruby.


Fair point. I would probably use a multiprocessing approach in node.js or use https://www.npmjs.com/package/webworker-threads. If perf were very critical I would use Rust.


It's always mind-blowing how, when someone makes a neat project like this, the comments are nothing but complaining, vitriol, etc.


To me the problem with this project is more the puffed up description than anything else. Lots of "awesome" "wow" "woah" type self praise for their project, and a lot of "without the suck" "ugly DLLs" etc. for other solutions.

If it were described as just a simple humble Node.js shell without the puffery and unnecessary bash at Cygwin, I doubt you'd see the same vitriol.

(Personally, I could use a little less of this trend in computing for everything to be awesome and everyone to be rock star ninja guru superstars. For reasons like the responses here show.)


Lots of vitriol because it's yet another "I took something that works and rewrote it to be slower, less functional, and JavaScript"

Making computing worse, one fork at a time.


So... just don't use it? If it's useful for just one person, it's useful.


This is the attitude that drives people away from working on open source.


It's better to say nothing so people can be happy with what they've just done and continue to make the same mistakes in the future?


So I guess no one should write a toy kernel again in the future, because it won't be perfect? What kind of logic is that?


Toy kernels are usually presented as such.

This is presented as Unix Without the Suck, when in fact it's almost 100% suck.


I understand why you'd use Node.js for I/O intensive tasks like web development and scraping, but for a cross-platform program meant to do everything? None of Node.js's benefits shine.

Sure, Node.js has a big community with lots of (mostly linux-only) packages, but you get a mediocre language and pretty crap cross-platform support. This doesn't matter as a web development language, but seriously gets in the way when you're developing a program designed to do literally everything.

A more useful project might be coreutils in rust, which several projects are already doing.


I'm sitting here wondering why no one ever talks about PowerShell. It supports .NET reflection, which allows direct .NET API access. It's basically c# shell.


Because then I'd have to be running Windows, which is a deal-breaker.


I'm not 100% sure but I think much is due to its syntax. The idea is amazing though.


how cli tools implemented in node are "without suck" is beyond me. fucken hipsters


Absolutely... Replacing a single DLL with Node is "simplifying" it.... Jeez...


More validation of the Atwood principle. http://blog.codinghorror.com/the-principle-of-least-power/


For what purpose?


That would come in handy for cross-platform npm 'run' scripts.


Writing cross platform build scripts for web projects can be a pain when you want to rely on commands like cat, rm etc. This looks like it could relieve a lot of that. Nice job.

P.S. I'm constantly amazed by the number of non-JS developers that pile onto JS-related HN threads to tell us how much JS sucks. I'd like there to be less of it, it's getting a bit old. It seems like any amount of familiarity with the web, or the fact that someone once wrote a JS function sometime makes people feel entitled to weigh in on modern JS development. Look after your own communities and let us plough our own furrow ... we think lots of small modules is an interesting approach, and we'd like to see where we can get with it.


I never understood the hatred for Cygwin; it's a (set of) DLLs that form a POSIX->Windows translation layer.

All the bulk is just the utilities that people want to provide once they've got that layer (i.e. CoreUtils).


It is easier to port a shell than a shell script.


I'll wait for GNU/Windows.


That's roughly what Cygwin is, with actual GNU instead of pared-down replacements.


You know, some kind of GNU/NT using the ReactOS kernel would actually be really cool.


I was very interested in contributing to this project, but the code quality of some of the commands seems prohibitively low.


Mongo(DB) shell interprets JS and it's very useful. Modifying documents, reorganizing scheme, processing queries.

I can see the usefulness in a JS shell mainly because JS is a popular scripting language.

Running this on Linux will be difficult because node runs with a very limited set of permissions.


What the fuck I don't even



I hope the goals are

- short-term : POSIX compliant

- long-term : runs *nix binary / programs

P.S.: love anything CLI, so I may be biased


How would they run *nix binaries in a node.js environment if they call it "Cross-platform Linux without the suck"? You are talking about Cygwin here which is what these guys hate for some reason.


Cygwin can't run binaries compiled for non-Windows systems, can it?


well, I mean what's the point of the project if it's JUST to run linux commands? we already have Bellard's JS VM for that

OR

maybe I didn't see their long-term goal?




