This looks really neat. There's way too little experimentation happening in shells. Most shells are a few decades old, and they provide incredibly poor defaults and tooling.
For example, since I'm prone to making stupid mistakes, I always use a linter. But as far as I know, there weren't any well known options for linting until ShellCheck [0].
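To give a concrete (made-up) example of the kind of mistake I mean - ShellCheck flags the unquoted expansion in the first loop and suggests quoting it:

  #!/bin/sh
  # Hypothetical cleanup script; the directory name contains a space on purpose.
  dir="$HOME/my backups"

  # ShellCheck warns here: $dir is unquoted, so it word-splits on the space.
  for f in $dir/*.bak; do
    rm "$f"
  done

  # The quoted version it suggests instead:
  for f in "$dir"/*.bak; do
    rm -- "$f"
  done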
Personally, I've been using fish for a while now and I'm generally happy with it. It has great defaults, so I rarely have to tweak anything. Hacking together small scripts is easy, leaving very little room for error.
If I have to do anything even remotely tricky, I use node with shelljs [1]. Now that async functions are available, it results in some pretty clean scripts. And although I haven't tried it yet, the pkg [2] tool allows you to bundle things up into a single binary. That should make certain deployments incredibly easy.
Circling back to Elvish, one minor detail that immediately stood out to me was that they create another folder in your home folder. Since I dislike the ever-growing clutter in my home, I created a GitHub issue asking them to consider following the XDG base directory spec [3].
Their web site is beautiful. Static pages served over h2 from Cloudflare's regional POPs provide a seamless experience between pages, and that's with no javascript. After the first load everything is cached, and switching between pages literally takes less than a frame. It feels like an SPA.
Frontenders: take note! This is the next logical step in the frontend evolution.
All javascript can now be minified to zero bytes [0]. This can result in a significant improvement in asset load times as well as render times.
Furthermore, by following the process backwards developers can save time in the build process.
I'm also happy to announce that this new technology is supported by all major browsers that anyone will ever want to use: Firefox, Chrome, Opera, Edge, lynx etc.
(PS: there are also significant improvements waiting in accessibility and SEO.)
Hi, I am glad that you like it! In case you haven't noticed, the demos on the homepage (which are a slideshow with animations) also work without JavaScript; they degrade gracefully into a top-down layout.
>Use plain old HTML + CSS and your site feels like this.
That's just the point. Few people do this, yet the effect is vastly more usable, readable, and clean than "modern design", which ironically is much more difficult to pull off.
But how will I alert all eighteen analytic companies that someone from Malaysia who is between 18-24 and likes over ripe mangos has looked at my site so that they can be better served by one of my twenty ad providers?
I think beacons are potentially the only javascript asset you'd want to leave on a static web site. And it'd be best to load them asynchronously (which pretty much all the tags already do, by creating the <script> tag dynamically).
Certain tags also provide a noscript 1x1 gif fallback, so even your analytics degrades gracefully!
Web fonts are potentially another candidate for a necessary javascript inclusion (CSS fonts are load blocking, unfortunately), but that by itself is pretty excessive.
It's also missing the 0.8 s delay and locking of all inputs after every click, as well as the random breakage, broken browser navigation and broken deep linking.
It's a bit faster than some SPAs I've used :) I don't think it's set up quite as well as the GP suggests, though... most links 301 to add a trailing slash, and while it's served from a CDN it's not set up to use the browser cache.
That's true - I looked into why it's cached even though there's no Cache-Control header: when there's no explicit cache header, some browsers will heuristically cache anything with a Last-Modified header, for a small fraction of the time between that header's date and the current date.
As this is a static site, those Last-Modified dates are probably coming from the filesystem under the webserver. Obviously for dynamic sites the Last-Modified header won't be there unless you put it there.
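If anyone wants to check this for themselves, a quick way to see which caching headers a page actually sends is something along these lines (the URL is just a placeholder):

  curl -sI https://example.com/ | grep -iE 'cache-control|last-modified|expires'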
Easy to say when your site has almost no functionality besides displaying some info ;) - Elvish is software you download and install, not a web app.
Yeah, but that's the thing: these days, tons of sites that do nothing but display some info are horrifically bloated and slow to load, and full of Javascript, just to show a little bit of text.
Does not work for me in Safari: each click loads the page and it takes a fraction of a second, definitely not less than a frame. In Chrome it seems to work faster, but still far from a single frame. Full page reload, relayout, etc., and it's visible. A good SPA would be smoother. This site is fine as it is, though.
I love the structured pipes, but as mentioned in another discussion, replacing the shell is a big leap for a lot of people. And you can get quite far without it with tools like "jq". And when I saw this, I just had to tinker a bit to see what I could do with Ruby, based on the example on the homepage:
$ curl -s https://api.github.com/repos/elves/elvish/issues |
jr 'each{|issue| puts "#{issue["number"]}: #{issue["title"]}"} ' | head -n 11
Or (I expect pitchforks when you see the implementation for this):
$ curl -s https://api.github.com/repos/elves/elvish/issues |
jr 'puts{"#{$I["number"]}: #{$I["title"]}"}' | head -n 11
With "jr" looking like this[1]. Expect this to fall apart in all kinds of ways - it was just something I threw together. Not convinced the monkey-patched "puts" is worth it, but it's just an experiment..
I don't think there's a whole lot more we can achieve beyond what you see there... I mean, I guess you could dynamically assign global variables based on the keys, and enforce treating them as strings, so you could reduce it to
jr 'puts{$number+": "+$title}'
E.g by changing the "each" in my example with:
def each(&block)
  $J.each do |i|
    $I = i
    i.keys.each { |k| eval("$#{k} = $I[k].to_s") }
    block.call(i)
  end
end
Though this is error prone - e.g. if a key collides with any pre-defined Ruby globals like $stdin... It would probably be worth using a prefix etc. This is the downside of using a general-purpose language for stuff like this instead of a specialised tool like jq, which can ignore other syntax concerns.
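Since jq keeps coming up: for reference, the jq equivalent of my first example is roughly this (same endpoint, same output, give or take formatting):

  curl -s https://api.github.com/repos/elves/elvish/issues |
    jq -r '.[] | "\(.number): \(.title)"' | head -n 11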
If you're on Unix you may be interested in an object shell a guy I used to work with has been developing. It gives more objecty-ness imo: http://mash-shell.org/
One thing I'd want to see is support for standalone polyglot utilities, i.e. that you can write utilities in any language and they compose seamlessly into the system because they are all launched as independent processes. That is one thing I would like to retain from classic unix-style shells.
The problem of designing the shell becomes both much harder and simpler at the same time. Harder because you need to figure out a way to really pass objects from one process to another, which implies some sort of serialization protocol. On the other hand the shell becomes simpler again, because the processes will (most likely) do most of the heavy lifting of managing objects etc.
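As a rough sketch of what I have in mind - newline-delimited JSON is just one candidate protocol, and the ruby and jq stages below are stand-ins for arbitrary utilities written in any language:

  # Stage 1: a "utility" written in Ruby emits one JSON object per line.
  ruby -rjson -e '3.times { |i| puts({ id: i, word: "item#{i}" }.to_json) }' |
    # Stage 2: a separate process in a different language (jq) consumes and filters them.
    jq -r 'select(.id > 0) | .word'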
It might be me succumbing to stereotypes, but POSIX shell syntax feels like something that grew organically, out of use and necessity, with basic guiding principles but no set guidelines or rules.
Powershell feels like something designed by committee, it has an internal consistency and logic, but feels bloated and overcomplicated.
I see the point, and to a certain extent I agree in theory. But every improvement feels like it improves on something that just didn't need improving: some things become easier, but others get worse. Along the way the simplicity of text streams is lost.
I can see a case where passing around structured data could make sense, but Powershell definitely takes things too far by passing around full .net classes that have functionality.
This is a common complaint about PS/elvish/etc, but I’m not sure it makes sense. Is that “simple text stream” tab delimited data? CSV? JSON? XML?
At the end of the day, what you have isn’t just a “simple” text stream; you have a more complex data structure serialized into text, which will need to be parsed by each command in the pipeline for any non-trivial processing.
Quite often, yes it's just text. 90% of the time you're only a string split away from having the structure you need. Most of the more complicated systems are chasing that 10% but complicating the 90%.
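To make that concrete (generic examples, not tied to any particular shell): the 90% case is one field split away, and the 10% case is when the data contains the delimiter itself:

  # The 90% case: whitespace-delimited output, a split away from what you need.
  ps aux | awk '{print $2, $11}' | head -n 5

  # The 10% case: filenames with spaces defeat naive splitting -
  # "my file.txt" comes out as just "my". This is where structure would actually help.
  ls -l | awk '{print $9}'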
"Elvish has no concept of exit status. Instead, it has exceptions that, when thrown, interrupt the flow of execution." ... "Non-zero exit status from external commands are turned into exceptions"
But think of how useful this would be in a complex-hierarchy shell system. Some commands might fail when processing in the tree, and you might want to just remove that one leaf instead of halt further processing of data in another part of the tree.
There is no plan yet. However, there is a plan to eliminate cgo dependency, which can facilitate the possible Windows port. Any contribution is welcome! :)
> Pipeline is the main tool for function composition. To make pipelines suitable for complex data manipulation, Elvish extends them to be able to carry structured data (as opposed to just bytes).
Well that is definitely interesting.... Are there any examples? I was writing a compiler that worked by pipelining different stages to different parts of the compiler (tokenizer -> parser -> AST) all using unix pipes. I had to stop and do something productive once I realized I couldn't pipe complex datatypes easily without a lot of effort to parse the inputs.
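For what it's worth, the least painful convention I found with plain pipes was one JSON object per line between stages, so each stage only has to decode and re-encode JSON instead of inventing its own format. A toy sketch (the ruby one-liner is a stand-in tokenizer, not a real compiler stage):

  # "Tokenizer": emit one JSON token per line.
  echo 'let x = 42' |
    ruby -rjson -e 'STDIN.read.split.each { |t| puts({ type: "token", text: t }.to_json) }' |
    # "Parser" stand-in: slurp the token stream back into a single structure.
    jq -s '{ ast: [ .[].text ] }'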
The shell I've really been looking for is basically fish-but-bash. All the feedback and usability of fish, with bash compatible syntax. Bash's syntax may be a lecherous growth upon a lecherous growth, but at least it's a familiar leech.
Amen. The only thing oh-my-zsh adds is convenience, which is bought with awful performance. Even without any plugins activated, starting an oh-my-zsh-laden zsh takes forever.
Better use a homegrown .zshrc, with a simple plugin manager like antibody[1].
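For what it's worth, mine is only a handful of lines (the plugin list here is just an example, and antibody also has a static-loading mode that's faster still - check its docs):

  # ~/.zshrc - minimal setup, antibody handles plugin loading
  source <(antibody init)
  antibody bundle zsh-users/zsh-autosuggestions
  antibody bundle zsh-users/zsh-syntax-highlighting

  # a couple of sane defaults instead of a whole framework
  autoload -Uz compinit && compinit
  setopt HIST_IGNORE_ALL_DUPS SHARE_HISTORY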
[0] https://www.shellcheck.net
[1] http://documentup.com/shelljs/shelljs
[2] https://github.com/zeit/pkg
[3] https://github.com/elves/elvish/issues/383