And yet, the top search result is tldp. This is what #bash on freenode has to say about that site:
> The infamous "Advanced" Bash Scripting Guide should be avoided unless you know how to filter out the junk. It will teach you to write bugs, not scripts. In that light, the BashGuide was written: http://mywiki.wooledge.org/BashGuide
Now, that site is on the second "page" of the search results.
Well, for example: I just skimmed through it and I don't see an explanation of quoting, probably one of the first things I would want to cover, around the time variables are introduced. It also has examples like this:
#!/bin/bash
for i in $( ls ); do
echo item: $i
done
which both uselessly calls ls (you may as well just use a glob) and doesn't work as intended if the directory contains filenames with spaces.
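For comparison, here's a sketch of the glob-based version that handles spaces correctly (assuming you just want every entry in the current directory):

```shell
#!/bin/bash
# A glob expands to the directory entries directly -- no ls, no word
# splitting. Quoting "$i" keeps filenames with spaces intact.
for i in *; do
    echo "item: $i"
done
```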
You also have:
#!/bin/bash
for i in `seq 1 10`;
do
echo $i
done
Why is it using backticks here instead of $() as in the former example? Also, no mention that seq is not available on some systems. And you should be using bash's sequence expression[1] anyway (added in bash 3.0), so the guide is likely out of date.
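For the record, the sequence-expression version looks like this (a sketch; plain `{1..10}` needs bash 3.0+, and the step form `{1..10..2}` needs 4.0+):

```shell
#!/bin/bash
# Brace expansion: no external seq, no command substitution.
for i in {1..10}; do
    echo "$i"
done
```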
Using ls instead of a glob can be valid if you need to force an ordering for some reason (say you want to process things in date order; IIRC a glob expands in alphanumeric order in most places and arbitrarily in some). Your point about spaces in filenames is definitely valid, though my preferred solution to that would be to stab people with a needle every time they put a space in a filename!
There are a great many things on the command line (less elsewhere, but still some) that don't handle spaces in filenames well.
You are probably right that this is a technological fault rather than a problem with spaces themselves, but IMnsHO, as long as we regularly use spaces as list-item separators, using them inside the items being listed is asking for trouble.
You can easily deal with spaces in Bourne shell: quote your variables, or change IFS so it contains only a newline. Of course, newlines in filenames are perfectly valid too.
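A sketch of both techniques, assuming a directory with space-laden filenames:

```shell
#!/bin/sh
# Technique 1: quote your variables. "$f" stays one word, spaces and all.
for f in ./*; do
    printf 'quoted: %s\n' "$f"
done

# Technique 2: set IFS to a newline only, so unquoted expansions split
# at newlines instead of spaces/tabs (still wrong if a filename itself
# contains a newline, as noted above).
IFS='
'
for f in $(ls); do
    printf 'via IFS: %s\n' "$f"
done
```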
I use some systems without seq, and they have bash 2.x or 3.x on them (e.g., OS X). seq has the nice property that writing a shell or executable replacement is easy.
I think the 2nd example is fine. It might be cut and pasted from a POSIX example.
The TLDP bash guide is full of a range of submitted examples that run the gamut of good and bad practices. That said, it's still one of the better resources out there, and I have the reference page bookmarked. But this is Bash we're talking about, "best" practice is a relative term.
It seems like DDG is positioning itself to become WolframAlpha for tech. Hopefully, this will allow it to better catch the exact market segment that would be wary of search engine tracking (educated tech-savvy people).
Actually one of their guys approached me last time explainshell appeared on HN. I think he suggested I implement it as some sort of shortcut, but I haven't had time to look into it.
I have a branch somewhere that exposes the results as JSON, which should make it easy to integrate if they were interested.
I wonder if the developer of explainshell is reading. It looks like development has stalled. It's a really nice idea and could serve as a powerful backend for DDG.
Admittedly I haven't had a lot of time recently to improve explainshell. But I haven't deserted it; it's just a matter of finding a block of free days to focus on it.
This is pretty awesome, but I don't know if people often phrase their queries like that, or whether it's worth training yourself to. It doesn't seem to support more complex bash queries either, so I'm not really sure how often this will get used.
Nooo! Mistake #0 about shell scripting is thinking [ is syntax. Treating it as syntax makes it seem way more weird and inexplicable than it is. [ is a command, just like any other. That's why there needs to be a space between it and the next thing, and why you have to use weird-looking flags and such. This is why I prefer the command `test`. It does the same thing as `[`, but is more obviously a command. And once you see it as a command, it's obvious that any command can be used in an if directly; if is just checking the exit code. (Whenever I see someone doing `if [ $? -ne 0 ]`, it makes me cry.)
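To make that concrete, a small sketch (using true/false so it's self-contained):

```shell
#!/bin/sh
# test is an ordinary command; [ is the same command under another name.
if test -z ""; then
    echo "empty string"
fi

# The anti-pattern: run a command, then test $? separately.
false
if [ $? -ne 0 ]; then
    echo "failed (the long way)"
fi

# Idiomatic: if just checks the exit code, so use the command directly.
if ! false; then
    echo "failed (the short way)"
fi
```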
...or is that a bashism that I'm not aware of? If so, that's just horrible.
Personally, I prefer to write portable shell scripts. Entering "!posix test" into DuckDuckGo to obtain the man page for "test" yields far more meaningful information for the same problem.
[ -z hello ] is perfectly fine syntactically, but hello is not a variable, it's the string "hello". It will always return false. ddg doesn't seem to evaluate the command, just explain it.
"results to true if the length of 'hello' is zero."
I'm glad they cleared that up. Otherwise, I might have assumed that there was no case where 'hello' had a length of zero.
Joking aside, it is a useful syntax guide.
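To spell out the literal-vs-variable difference from above (greeting is just a hypothetical variable name):

```shell
#!/bin/sh
# Literal string: 'hello' has nonzero length, so this branch never runs.
if [ -z hello ]; then
    echo "never printed"
fi

# Variable: quote it, or an empty value disappears before [ ever sees it.
greeting=""
if [ -z "$greeting" ]; then
    echo "greeting is empty"
fi
```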
I've been reading through some of those goodies, and they all seem pretty cool, but I'm wondering how secure using the 4 word random passphrase generator[0] would be...
But of course you could run it ten times and pick one of them. Or modify the instant answer to return ten or twenty. Not ideal or optimal, but there it is.
That hardly helps at all. Now instead of knowing your exact password, your attacker knows that your password is one of these 10-20 entries, and it's easy to just try them all.
I'm always finding another bash magic variable. '$?' is 'exit code of last command'. Last week I found '$-', which tells you which flags are currently set (set -x, set -e type of thing).
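A tiny sketch of both (the exact contents of $- vary by shell and invocation, so I only check that the flag shows up):

```shell
#!/bin/bash
# $? holds the exit status of the last command.
false
echo "last exit code: $?"    # prints: last exit code: 1

# $- lists the single-letter options currently in effect.
set -e
echo "flags in effect: $-"   # includes 'e' after set -e
```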