Stronger Shell (odul.us)
105 points by mattbowen on Aug 12, 2015 | 77 comments



Just about everything I've learned about bash has come from the #bash IRC channel on Freenode. You'll see the same repeated warnings against learning from the public web, because misinformation spreads like wildfire, and some articles out there are just flat-out wrong (sort of like w3schools in the #css circle).

There is just one thing I will never understand: the tired argument of "...but it isn't portable". Features that are newer and make programming bash easier (such as "[[" vs "[") are looked down upon in some circles because they aren't portable. If I'm programming for bash, then upon deployment I'll be using bash. To use another shell and hope it just works is a bit insane.

I'd also like to second using the Fish shell. Far better than bash/zsh for everyday use, and I just can't go back to other shells.


Seconded. I can't live without fish, and I hear this "but what about my bash scripts?!" argument a lot. Unless you're actually sourcing the script, it has a magic "#!/bin/bash" line on top that makes it work correctly!


It's not that simple. Changes in the syntax for strings and environment variables are going to bite precisely the newcomers who would benefit the most from fish. And sourcing happens.

I do know that all these problems result from fish having a saner syntax than bash (hell, FWIW I'm still mad that globs are not regular expressions and have a different syntax), but every time someone points out these problems the reaction is "#!/bin/bash", which kinda misses the point.


> upon deployment, I'll be using bash

Unless you're on Ubuntu, in which case you will occasionally be using /bin/dash at entirely unexpected moments.
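
A quick check on a stock Ubuntu install shows why (typical output):

    $ readlink /bin/sh
    dash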


For the want of a shebang, the core was dumped.


Depends what you're going for, I guess. If it's something you're developing for public use and/or something you want to run on !Linux, portability might be a concern (FreeBSD, for example, doesn't ship with bash and, if you install it, it lands in /usr/local/bin instead of /bin).


I'd be remiss if I didn't use the comments to mention the amazing and totally free "Unix for the Beginning Mage", which the author of the article also didn't mention. It's a great book and a short read; it took me about 3 hours to work through completely. www.unixmages.com

If you don't know how to use your shell, block off an evening with this book and change how you use your computer forever.


I, too, was looking for this in his list. It's a really great resource.


> You must have spaces inside your test expressions (so [[-z $FOO]] won't work; [[ -z $FOO ]] is correct).

That's because '[[' is a special bash extension which started out as a better '[' - and '[' is actually a binary (/usr/bin/[). And, well, you have to have a space between an executable and its first argument.

(Nowadays '[' is a builtin in most shells - and '[[' is shell syntax rather than a command - but the external binary still exists.)
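
You can see the split directly; output from bash on a typical Linux box (paths may vary):

    $ type -a [
    [ is a shell builtin
    [ is /usr/bin/[
    $ type [[
    [[ is a shell keyword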


The core difference is that:

- `[` is treated as a statement

- `[[` is treated as an expression

Which, for example, is why you can't use `>` in `[`-based expressions: bash thinks you want to redirect the output of `[` somewhere else. This restriction is lifted for `[[`, which makes for much more natural-looking code.
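
A quick illustration (a sketch; the escaped form below is bash's builtin [ behavior):

    [ abc > xyz ]     # '>' is a redirection: creates a file named 'xyz', test is just [ abc ]
    [ abc \> xyz ]    # escaped: lexicographic string comparison
    [[ abc > xyz ]]   # inside [[ ]], '>' is the comparison operator itself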


I personally can't stand bash for writing scripts (it's fine for typing one-liners into terminal though). This question and its top answer capture the insanity of bash quite nicely http://stackoverflow.com/questions/3601515/how-to-check-if-a... .


Most bash scripts that I write are extended and more nicely formatted one-liners, packaged up with some help text and sanity checking for easier reuse. Rewriting in a completely different language rather than snipping out the already working text from the terminal seems a little pointless.

Your link doesn't really counteract this usage. Your stance seems a bit like giving up on writing JavaScript because of the JS WAT video. All languages have weird corners; bash has more than most because it's so old and has only reluctantly added real programming-language features. But the better you know the language, the more you tend to use it on the terminal, and I spend most of my non-editor, non-browser time in the terminal.

I still write bash scripts in preference to Ruby or Python even in less trivial situations, especially when multiprocess orchestration of heterogeneous executables is involved.


I mostly just have a problem with the default behavior of ignoring errors and continuing (yes I know it can be overridden). This coupled with the fact that you'll be working almost exclusively with strings, and all the quoting and escaping rules (which can be a huge pain themselves), can easily lead to bugs like this one https://github.com/ValveSoftware/steam-for-linux/issues/3671 .

I have written some bash scripts myself, but none that were longer than maybe 10-20 lines. Above that, I usually reach for other tools.


I've written some in the 200-300 line range, that took 500+ lines of Ruby to replace. Ruby was less efficient in wall-clock time, but easier to scale up in features. It required using a database for storage rather than line-oriented text files (the problem set was too large to fit in-memory as Ruby objects at least) - a source of performance loss, as I went with sqlite to keep things self-contained. I could have coded in a pipe / batch oriented way like bash, but then I wouldn't have gained any ease of feature implementation.

If you can express a problem as a set of filters and pipes, don't need to fork too much, and have efficient executables to run each stage of the pipe, bash can be hard to beat without breaking out a real programming language with threading support. That's because bash isn't doing the heavy lifting, just doing process orchestration - something it does better than any scripting language I've used. <() in particular is tedious to do in most non-shells (and many alleged shells).
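
For example, <() hands a pipeline to a program that expects filenames, so you can compare two command outputs without temp files (hypothetical files):

    diff <(sort a.txt) <(sort b.txt)
    comm -12 <(sort a.txt) <(sort b.txt)   # lines common to both inputs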


Use "set -ue -o pipefail" at beginning of the shell script for "strict mode". Use "bash-modules" library, which is designed for strict mode. Always quote variables unless you really want them to be parsed by bash.


> This is because it doesn't distinguish between a variable that is unset and a variable that is set to the empty string.

Yeah, this is insane, and the default settings of bash are quite horrible. You can set some bash options at the beginning to avoid such problems. For the StackOverflow problem, "set -o nounset" can be a good safeguard. Other options to look into to safeguard bash scripts from some insane behavior are "errexit" and "pipefail". Even though setting these is good practice, they don't stop some stuff you might expect them to stop: these options are not inherited by subshells invoked via command substitution and the like. They wouldn't have stopped this nasty Steam bug [1], for example. There are other nasty quirks too; it's really hard to unfck bash.

[1] https://github.com/ValveSoftware/steam-for-linux/issues/3671...
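
For the unset-vs-empty distinction quoted above, the usual workaround is the ${var+x} expansion, which is non-empty only when the variable is set at all (a sketch):

    if [ -z "${FOO+x}" ]; then
        echo "FOO is unset"
    elif [ -z "$FOO" ]; then
        echo "FOO is set but empty"
    else
        echo "FOO=$FOO"
    fi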



Some of these are programmer errors, though. "Quote everything" is a known best practice for bash; vim lints unquoted variable expansions in red. Also, bash scripts frequently interact with the filesystem, so it's no surprise that some bugs remove files that were not meant to be removed. I think "set -o nounset" would have stopped the RHEL bug, although it's not clear whether the variable is unset or really just an empty string.
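
The quoting rule in one picture (hypothetical filename):

    file='My Music.mp3'
    rm $file      # word-splits: tries to remove 'My' and 'Music.mp3'
    rm "$file"    # removes the single file 'My Music.mp3'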

I don't want to defend bash, though, but teaching bash scripting best practices would be nice. I learned these options from this site:

http://kvz.io/blog/2013/11/21/bash-best-practices/


I completely agree: it's awful to do even simple things in a shell script. A shell should just be used as a shell: an interface for a human to the system in a terminal. The syntax for if/else, switch, loops, and conditions is insane. I will never remember how to write a conditional expression: a single missing space can break everything.

I quit using bash for scripting a few months ago. Now I use Haskell with shell-conduit (http://chrisdone.com/posts/shell-conduit) and bash only for very, very simple things where Haskell would be overkill.


Is there a better shell? By which I mean more flexible, more consistent, and simpler. (Anyone who says 'zsh' is disqualified.)

I've briefly looked at rc, but while it's significantly simpler and more orthogonal than sh, it's got its own weirdnesses. The 'Design Principles' section here is worth reading, though: http://plan9.bell-labs.com/sys/doc/rc.html

Surely there must be a usable shell wrapped around an actual modern language?


There's fish: http://fishshell.com/

Also, a nice experiment: http://xonsh.org/ (Bash-alike with embedded Python)


I second fish. I've been using it as my primary shell for more than a few years now.

The syntax is better than bash's and familiar to anyone who has programmed in languages like Ruby or Python.

The line editor is actually useful. It handles indentation and highlighting. It will display an appropriate help page if you mess up some command.

It comes bundled with argument completion for most common Unix tools and is easy to extend for other programs.

    for file in (ls /home/me/mp3s)
        echo $file
    end
It will also highlight incomplete paths, bad syntax, etc, as you type. I know at a glance if I messed up a path or misspelled a program name.

Really useful shell.


With that example in particular, fish by default does not mess up file names containing spaces, as opposed to bash, which treats them as separate files. I'm not sure why, but it's certainly useful.
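
The bash pitfall being avoided is that command substitution gets word-split, so 'My Song.mp3' becomes two loop items; the glob form sidesteps it (a sketch):

    for f in $(ls /home/me/mp3s); do echo "$f"; done   # splits on whitespace
    for f in /home/me/mp3s/*; do echo "$f"; done       # one iteration per file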


What about files with newlines? evil smirk


I like xonsh.

http://xonsh.org/

It has two modes, which can be a bit confusing: a Python mode and a bash-ish mode.

    testdata = $(ls -l) # takes the bash command 'ls -l' and saves its output to the python variable testdata

    echo ${testdata.upper()} | grep XLIO

    $(ls).split("\n")
It uses some magic to figure out whether you're in a python context or a bash context, defaulting to python. If you save some stuff to a variable called "ls", you need to del(ls) before you can use ls as a command again.


> Is there a better shell? By which I mean more flexible, more consistent, and simpler.

Yes, PowerShell. And it may actually arrive on Linux/Unix soon, due to CoreCLR. The specification is already open, but until now not much progress has been made towards bringing it, uncrippled, to Linux/Unix.

PowerShell is extremely consistent. Examples: all commands strictly follow the verb-noun form, where there are only 40 or so "approved" verbs. The noun represents the topic, so if you are working with ACLs, the commands are Get-Acl and Set-Acl, and if you are working with network adapters, the commands are Disable-NetAdapter, Enable-NetAdapter, Get-NetAdapter, Rename-NetAdapter, Restart-NetAdapter, Set-NetAdapter. All (PowerShell) commands take parameters in the same way (no - and -- confusion), as it is actually the shell that does the parameter parsing.

The grammar is "modern" so it has no surprises for anyone used to context-free grammars (like C, Python, Java, ...).

PowerShell is certainly flexible, as it comes with the ability to invoke anything that has an exposed .NET API, or even a WBEM/CIM standard interface (like some network switches, etc.).


I had an opportunity to use PowerShell recently. The task was simple: on a bunch of freshly installed Windows 8 boxes, kill a running graphical program of a certain name, copy over new executables, and restart that graphical program. The kind of thing you do essentially constantly on Unix boxes.

I urge anyone at all with any interest in PowerShell to give it a shot. Spoiler: it's not possible without an epic level of hackery; but along the way, you will uncover that PS uses the same parameter to mean very different things for different commands, that the 'everything is an object' conceit doesn't work when the object you want doesn't exist, that PS is chock full of hacky, revolting cruft already despite its youth, hacky revolting cruft that makes oh-my-zsh look like /bin/rc, that most windows commands will fill your screen with banner(8) style output even on success, and that no two windows commands are alike.

Given which decade it was developed in, PowerShell represents the most absolute and fundamental failure of software I've had the pleasure of encountering in the last five years; and I've seen Windows 8.0. Anyone who thinks PowerShell is at all any good has, axiomatically, never seen any other shell besides cmd.exe before. You may think I'm exaggerating. Please, try the example above, then report back here with a list of what you had to do.


Using the non-aliased verbose form of the cmdlets:

    $hosts = 'host1','host2','host3'
    $source = '\\host0\c$\source'
    $dest = 'c:\dest'
    $executable = 'C:\Program Files (x86)\Notepad++\notepad++.exe'

    Invoke-Command -Computer $hosts -ScriptBlock { 
        Get-Process | Where-Object Path -eq $using:executable | Stop-Process -Force
        Copy-Item -Path $using:source -Destination $using:dest
    }
Explanation:

Lines 1-4: Set up variables to make the script more explanatory. Line 1 defines an array (the "," operator).

Line 6: Invoke-Command takes an array of (remote) hosts (the -Computer parameter) to execute the script block (the -ScriptBlock parameter).

Lines 7-9: The script to execute at each host simultaneously (the Invoke-Command executes the scripts in parallel at each host)

Line 7: Get the process list (Get-Process), pipe it through the filter (Where-Object), which selects only the process(es) executing the desired executable, and pipe those processes to the Stop-Process cmdlet, which will forcibly stop them.

Line 8: Copy files from the desired source at a UNC path to the destination on the local machine.

Now, the above script was the canonical way, using the long form. For casual scripting, I could have written just this:

    $hosts = 'host1','host2','host3'
    $executable = 'C:\Program Files (x86)\Notepad++\notepad++.exe'
    icm $hosts { ps | ? Path -eq $using:executable | kill -f; cp \\host0\c$\source c:\dest }


While likely to work (I don't use powershell), I think you missed the main constraint posed: kill a running GUI with a certain name.

Your solution finds the process to kill only if the executable path is at a known location.

How would you do this if you only know what the process name will be -- e.g., what if the executable path is on D:\stuff\gui.exe on host1 and E:\secretstuff\gui.exe on host2, if gui.exe always runs as a process named "Updatable GUI Thing"?


Silly me, I assumed that the executable path was the requirement. My bad. If you need to find a process just by its name, it is even simpler: just give the process name (possibly with wildcards) to Get-Process (alias ps):

    ps Notepad++ | kill

Actually, kill (alias for Stop-Process) takes a name parameter directly, so the "kill" line from the script could be written as simply:

    kill notepad++

But your question is actually really good, because what if we knew neither the process name nor the executable, but - say - only the window title? Get-Process (alias ps) will produce a sequence of objects describing the running processes. If I want to know what properties those objects have that I could filter on, I can pipe the objects through the Get-Member cmdlet (alias gm):

    ps | gm
This produces a table-formatted list like this (shortened):

    TypeName: System.Diagnostics.Process

    Name                       MemberType     Definition
    ----                       ----------     ----------
    Handles                    AliasProperty  Handles = Handlecount
    Name                       AliasProperty  Name = ProcessName
    ...
    MainModule                 Property       System.Diagnostics.ProcessModule MainModule {get;}
    MainWindowHandle           Property       System.IntPtr MainWindowHandle {get;}
    MainWindowTitle            Property       string MainWindowTitle {get;}
    MaxWorkingSet              Property       System.IntPtr MaxWorkingSet {get;set;}
    ...
    Site                       Property       System.ComponentModel.ISite Site {get;set;}
    StandardError              Property       System.IO.StreamReader StandardError {get;}
    StandardInput              Property       System.IO.StreamWriter StandardInput {get;}
    StandardOutput             Property       System.IO.StreamReader StandardOutput {get;}
    StartInfo                  Property       System.Diagnostics.ProcessStartInfo StartInfo {get;set;}
    ...
    Product                    ScriptProperty System.Object Product {get=$this.Mainmodule.FileVersionInfo.ProductName;}
    ProductVersion             ScriptProperty System.Object ProductVersion {get=$this.Mainmodule.FileVersionInfo.ProductVersion;}

Lo and behold, there is a property called MainWindowTitle. So to stop a process by its main window title, I could write:

    ps | ? MainWindowTitle -eq 'Deepthought Main Console' | kill
That is, find all processes, pipe them through a filter selecting only those where the MainWindowTitle equals the desired text, and pipe those processes to the Stop-Process cmdlet.


hey, this was a pretty good try! But here's what else you need to do:

https://rkeithhill.wordpress.com/2009/05/02/powershell-v2-re...

and if that's not hair-raisingly terrifying enough (note the super-intuitive 'set-item wsman://...' call), you'll quickly find that when you start a program (which you left off) using a script remotely, it doesn't show up anywhere on-screen, despite being authenticated as the logged-in user, but it does show up in the process list. Guess why THAT is.

There are some pretty parts to PS; but they fall apart, at the touch, instantly, like fairy buildings made of dew.


> hey, this was a pretty good try! But here's what else you need to do:

No I don't. To open up a machine for remote administration, all I have to do is run Enable-PSRemoting like so:

    Enable-PSRemoting -Force
For domain-joined machines the authentication from there is just automatic, i.e. when I use Invoke-Command it automatically creates an authenticated and encrypted connection for the duration of the script execution.

> you'll quickly find that when you start a program (which you left off) using a script remotely, it doesn't show up anywhere on-screen, despite being authenticated as the logged-in user, but it does show up in the process list. Guess why THAT is.

That has nothing to do with PowerShell and everything to do with your expectation of the same poor session separation as on your typical *nix.

With sufficient rights you can of course stop any process. But to reach in and control another user's session is something else.

On Windows, processes are separated not just by the account they run under, but also by the session under which they are created. Security barriers prevent a process in one session from interacting with processes in another session, even if they run as the same user.

This security boundary was raised to prevent compromised user processes from reaching into services and vice versa. This is part of the protection against shatter attacks and more.

I know a utility like psexec (sysinternals) may be able to launch a process in a foreign session, but I'm honestly not sure how it achieves this without some kernel support.

> but it does show up in the process list. Guess why THAT is.

That is because when you launch a process, it launches in your session. A session is associated with a Windows Desktop (an operating system object type), which is a namespace separation somewhat like namespaces in Linux. Windows in your (remote) session live on the non-visible desktop associated with that session. When you log off, the session is destroyed along with any processes running under it.

You may not agree with the extra security features built into Windows, but the separation they create has nothing to do with PowerShell.


> you will uncover that PS uses the same parameter to mean very different things for different commands

Example needed.

The claim is surprising, given that there exists a PowerShell Command Line Standard which includes a list of parameter names and their semantics: https://technet.microsoft.com/en-us/library/ee156811.aspx#EM...


> Anyone who thinks PowerShell is at all any good has, axiomatically, never seen any other shell besides cmd.exe before. You may think I'm exaggerating

Really. That feels like quite an uncharitable statement. The only reason someone could think something you don't like is good is that they just don't know any better?


Can you recommend a resource for learning powershell? (Assuming one has UNIX background)

[it does generate annoyingly long error messages!]


Long error messages: yes, they can be a little strange at first. If they annoy you (or you want errors to cause a little less scrolling), you can set the "error view":

    $ErrorView = "CategoryView"
(unlike bash, whitespace is not significant)
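
For contrast, this is the bash behavior the parenthetical refers to (a sketch):

    FOO=bar     # assignment
    FOO = bar   # runs a command named 'FOO' with arguments '=' and 'bar'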

As for learning PowerShell, if you lean towards online training, Microsoft Virtual Academy has a really good series of courses, even featuring Jeffrey Snover - who is surprisingly good at explaining stuff.

This is a good starting point:

https://www.microsoftvirtualacademy.com/en-US/training-cours...

I usually prefer books/articles etc where I can set my own pace, but those online courses are really that good.

Otherwise, links:

* https://technet.microsoft.com/en-us/library/cc281945(v=sql.1...
* http://learn-powershell.net/

And let's not forget the very friendly community at http://powershell.org/ with articles such as this one: http://powershell.org/wp/2015/07/31/introduction-to-powershe...


Is Microsoft Virtual Academy worth the time? Every time I've tried to watch a video it's been 99% fluff. The video you linked to doesn't even open Powershell until 18 minutes in.


That's true. I left MS-land years ago, and PowerShell is one of the few things I miss.

The only problem with PowerShell is that you need to write C# (or any other .NET language, of course) if you want to create a real cmdlet (if I remember the name correctly). Other than that, it's a quite nice shell and a sane procedural scripting language (again, with access to most of .NET).

The other, more general problem with command lines/shells under Windows is that they all work very differently from the terminals that talked to typewriters 40 years ago - which is what most console apps expect, for one reason or another, under Linux.


> The only problem with PowerShell is that you need to write C# (or any other .NET language of course) if you want to create a real cmdlet

What you say was true for PowerShell 1.0. Since 2.0 you have been able to create cmdlets and modules through scripting. An "advanced function" with [CmdletBinding] attribute is a full-featured cmdlet.

Now at version 5.0, PowerShell even has native syntax for creating classes and enums.

> The other, general problem with cmdlines/shells under Windows is the fact that they all work very differently than terminals that worked with type-writers 40 years ago

Not sure I'm following. The straight PowerShell engine is the most basic (typewriter-like) shell. But if you think in terms of support for terminal control characters - like VT220 - you'd be right. However, that is per specification not part of PowerShell, but rather part of the console/shell process hosting the engine. The engine does indeed feature a number of hooks to make rich interaction possible. That is why the same engine can be used in the (admittedly) rather basic console "command prompt" of Windows pre-10, as well as in the ISE and Windows 10's not-quite-so-embarrassing "command prompt".


> What you say was true for PowerShell 1.0.

Ok, I said I used it many years ago :) Thanks, it's good to know.

About the console/terminal emulation: PowerShell has a way saner architecture in this regard. I wasn't clear enough; what I wanted to say is that on Linux we have terminal emulators which try their best to emulate (hence the name) physical hardware from ages ago. I was both impressed and seriously disturbed when I read this: http://www.linusakesson.net/programming/tty/ In short:

> In present time, we find ourselves in a world where physical teletypes and video terminals are practically extinct. Unless you visit a museum or a hardware enthusiast, all the TTYs you're likely to see will be emulated video terminals — software simulations of the real thing. But as we shall see, the legacy from the old cast-iron beasts is still lurking beneath the surface.

PowerShell simply ignores all the legacy cruft and doesn't even try to be compatible with 1940s-era hardware. Which is, in my opinion, a sound technical decision that resulted in a much better architecture. The problem is that most Linux command-line apps expect to run under terminal emulators, not inside the modern environment PS provides.


PowerShell is more like a worse Python and its core functionality is not the same as that of Unix shells.

PowerShell's commands ("cmdlets") are .NET classes within PowerShell, not arbitrary executables as in Unix shells. PowerShell's "object pipeline" is function chaining like in Ruby, not using OS-level pipelines as in Unix shells.

PowerShell is really just a .NET CLI, analogous to Ruby's irb or Python's CLI, but dressed up to look like a Unix shell.


>Surely there must be a usable shell wrapped around an actual modern language?

AFAIK, the answer is no. I've spent a bit of time thinking about this problem and my conclusion is that the things that make a shell nice to use interactively are the same things that make it bad as a programming language. Rc may be as close as we will ever get to it.


> Anyone who says 'zsh' is disqualified.

I'm curious as to why you say that? This is a naive question; I have no skin in the game.


Zsh is largely compatible with bash's quirks, and then layers lots of weird things of its own on top of that. ZSH is a great, very powerful shell which I use and enjoy, but it's not any of: "simple", "consistent", "fast" or "lightweight".


OK, fair enough. I use zsh (only scratching the surface) because years ago its completion "seemed" much more powerful to me (e.g., fuzzy matching, substring matching, etc.), but I haven't invested the time to become a wizard. Do you (or anyone else here) endorse a shell that is worth becoming a wizard in? I'm thinking perhaps eshell, or simply zsh in M-x shell, is a good starting point, but somehow that's always felt awkward to me, notably because completion seemed weaker than plain xterm+zsh.


Don't go the eshell route unless you're prepared to learn Elisp very well and then write a ton of custom Elisp code. Take a look at its documentation:

> Eshell is _not_ a replacement for system shells such as ‘bash’ or ‘zsh’. Use Eshell when you want to move text between Emacs and external processes; if you only want to pipe output from one external process to another (and then another, and so on), use a system shell, because Emacs’s IO system is buffer oriented, not stream oriented, and is very inefficient at such tasks. If you want to write shell scripts in Eshell, don’t; either write an elisp library or use a system shell.

And yes, I never managed to make auto-completion work really well in M-x shell (or eshell).

One possible shell environment you might be interested in is IPython. It's a Python REPL with support for launching external programs, shell-like variable substitution, and much more. Coupled with qtconsole, it makes for a great shell-like experience. I used IPython as my shell under Windows for a while, before switching to PowerShell.


zsh is really powerful and flexible, but it gets there by being really, really complicated --- it's basically bash with all the weirdness and bizarreness turned up to 11. It's an uber-bash, which is very much the opposite of what I'm looking for.


> it's basically taking bash and turning all the weirdness and bizarity up to 11

While I certainly understand where you are coming from, I have found that zsh may be more complicated, but less "weird" (i.e. it is more consistent)

However, zsh is tons and tons of special cases that may be mostly consistent, but without any clear concept that would allow you to remember or deduce solutions. When you are a zsh wizard you can be really productive, but it is a strange skillset that takes time to achieve and requires constant refreshing.

IMHO the closest thing zsh has to a "concept" is that if something takes too many characters, then there's probably a special case which - if you knew it - could solve the problem in a shorter form.


Imagine if this one had been standard: https://wryun.github.io/es-shell/manpage.html


I've seen a lot of tools I use migrate from sh wrappers to python wrappers. My observation is that the result is nearly impossible to read and suffers extreme featureitis because "batteries are included".


Following the trend of transpiling modern linguistic ideas down to lower-level languages (see all the xyz -> js projects), I once asked why the same hadn't happened for the shell. Someone told me it "had": it's called awk.


How many people learned shell scripting while engaged in morally ambiguous activities? For me it was scraping for porn ~25 years ago. I imagine lots of people learned scripting while "penetration testing".


I've found that as soon as a bash script grows to more than a couple of lines, or as soon as it needs anything modestly complex, like "if" statements or functions, it is almost always more efficient to write it in ipython. If you know Python, then ipython is a really superior way to do any ops and administration tasks. I've also found it easier to use for server setup than alternatives like Ansible or Puppet. I can just use standard, reusable classes and functions to do all my various activities.


Perhaps it's just me, but flow control statements and functions in Bash have yet to scare me off. It's still faster, and readable, to write even moderately complex scripts in bash.

This is doubly true if you have to chain together external scripts - the "easy" ways to do it in Python can have some very major limitations.
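
For instance, a pipeline like this is one line of bash but a page of subprocess plumbing in most other languages (hypothetical log paths):

    grep -h ERROR /var/log/app/*.log | sort | uniq -c | sort -rn | head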

I liken Bash scripting to Perl scripting. You can write safe, beautiful, and readable code in both. However, you can also create a summoning circle to the 5th circle of hell if you don't spend the effort to make it safe, beautiful and readable.


Part of this was that I became an expert in Python before learning much bash, so it was just much easier to write in the language I already knew, and I found bash syntax strange, with a bunch of gotchas. I'm sure, though, that if I did a ton of bash scripting I would stop noticing the gotchas.

You are correct that the main weakness of ipython is that it doesn't have pipes and that chaining is very limited. But usually this does not impact me. And if all your scripting is in Python, then you just import functions and run them, rather than chaining scripts together.


What surprises me is how often bash is used as 'the shell' and software projects depend on it, even when they could simply call /bin/sh instead. This is particularly bad for people who port software to other platforms (BSD systems don't come with bash, for example) and have to deal with pure shell scripts calling bash. One recent example is CoreOS/etcd, which is currently dropping bash in favor of sh because they simply didn't need it.

Bash is a shell, but Bash is not the shell!


It may be better for those people to call /bin/bash and fail early and obviously for anyone who doesn't have it, than to call /bin/sh, attempt to write portably without really knowing the quirks of that variant of the language, and possibly fail subtly in obscure situations.
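
A couple of common failure modes, assuming /bin/sh is dash (a sketch):

    #!/bin/sh
    # these run fine wherever /bin/sh is bash, but break under dash:
    [[ -n $1 ]] && echo "got an arg"   # dash: '[[: not found'
    echo {1..3}                        # dash prints '{1..3}' literally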


Bash is just yet another dependency for a script. A Perl script needs perl, a Ruby script needs ruby, a Python script needs python, and a bash script needs bash. What's wrong with that?


Confession: The horrors of bash set back my programming hobby until high school. Even then, I didn't actually like interacting with my computer programmatically until I found a comfortable set of tools sometime in college. I'm lucky that my day job lets me use the tools I like and avoid systems like the bash shell.


Interestingly, my experience has been the opposite. My first programming adventure was automating some file validation using bash. I agree, though, that the silly and inconsistent syntax made me really appreciate Python when I started learning it.


If you're writing shell scripts, you might as well write Ansible playbooks. You can use Ansible playbooks locally without the SSH layer.

Bash/sh is a horrible programming language.


You don't need to learn bash to write shell scripts. Many other languages support executing commands with special syntax (PHP and Perl have backticks, for example).

I write my scripts in PHP if they're moderately complex, and only use Bash for dumb lists of commands.


Actually, [ is the same as test, and the trailing ] is just a required final argument that [ otherwise ignores, iirc.
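
In other words, the two forms are interchangeable apart from that closing bracket (a sketch):

    test -f /etc/passwd && echo exists
    [ -f /etc/passwd ] && echo exists    # same test; ']' is required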


Actually, you most probably have /bin/[ on your filesystem as a standalone 'test' executable or a link to 'test'. [ is a shell builtin in bash, though.


Interviewers tend not to test for shell skill, even though your shell skills are usually more relevant to daily tasks. I'm debating whether to improve my shell skills or my algorithm skills.


My interview for the job I have now actually had me do some hands-on shell activities and very little on algorithms, but that was partly because they wanted me to work on code deployment and other tasks where shell would come in handy.


One thing which greatly helped me was understanding that [ is a program, like any other, which takes arguments. It's a small thing, but it made the shell less magic.


Bash is good as a system shell. For an interactive shell, zsh/fish/... win.

That's why I only learn bash: one lesson covers both purposes ;)


Programmer time is better spent learning almost any language other than bash. My god, the centuries wasted on minutiae like the stupid "[ is a program, that's why you have to put spaces around it."

And system administrators should use Ruby if given a choice.


fish would be better than bash as a system shell as well. More consistent, for example. However, bash is available and usually preinstalled everywhere, in contrast to fish.


I agree. But "fish" is quite strange to me, especially its way to recall history. In "Bash", I can type "^ R" and browser the history very fast. In "fish", I have to type and select with up/down array; it's not easy to browser the history randomly (FIXME).

My #Bash" history has > 96k entries (woh, believe it or not; because of this too big number, "xterm" + "screen" always stuck when exitting; but "urxvt" + "tmux" work perfectly thanks to "urxvt" daemon mode.)

Porting this huge history database from my daily "bash" to "fish" is just a nightmare ... :D


Actually fish will import your ~/.bash_history automatically!


(I was talking about the interactive fish shell; but that's the point: I need to become familiar with its interactive mode first.)


Was worth reading if only to find out about ShellCheck.

My current project has a lot of bash. The program itself is just `foo | bar | baz | ...`, but the associated test script has grown quite long.

I've just added an extra test which calls shellcheck on each script, and it's spotted a bunch of redundant code for me :)
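
shellcheck is an ordinary command-line tool, so wiring it into a test is a one-liner (hypothetical paths):

    shellcheck scripts/*.sh tests/*.sh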


I generally avoid aliases. I prefer to rote-learn complex commands; if something fits an actual shell script, then I'll just bang one of those little guys out in no time.


Please don't write shell code that you ever intend to distribute to anyone. The future will thank you.


Nice! Thanks for sharing!



