In a dozen or so years of administering many different Unix machines, I've never had cron email be a problem, and I get a few (useful) emails from various cron scripts every day.
This interacts badly with many Unix commands, which often send status info to standard out. Some commands have a quiet option, but that can turn off all error output too.
Maybe it's that I've mostly been administering BSD machines, and so most of the tools follow old Unix guidelines like not printing anything unless it's necessary, echoing errors to stderr, using proper exit statuses, etc. (http://fmg-www.cs.ucla.edu/geoff/interfaces.html). I think it's a GNUism/Linuxism that commands are overly chatty, writing junk all over stdout (like author/license information - do we need to see this every single time?) and using ANSI colors by default.
Which tools are you talking about? The standard software suite is basically identical between the systems. And what differences there are lie in the implementations of the core stuff, whose behavior is specified by POSIX for the most part.
I'm sure there are exceptions somewhere. But in something you'd throw into a cron job? Frankly, that seems like a very weird snipe.
Debian has this as well. I do "unalias -a" whenever I log into an account I haven't configured to my liking yet (I want no ls coloring, but I do want a bash prompt in boldface).
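For example, assuming bash (the escape codes below are just one way of getting a bold prompt, not anything canonical):

$ unalias -a
$ PS1='\[\e[1m\]\u@\h:\w\$ \[\e[0m\]'

The first drops distro-supplied aliases like ls='ls --color=auto'; the second turns on bold for the prompt and resets the attributes afterwards.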
I remember getting an angry email from someone because of this back when I was at uni. I had just finished installing a Linux box at home but hadn't finished configuring everything, and I went to bed while it was busily running cron jobs throughout the rest of the night and emailing the output to someone else who had the same username at my ISP as the local one I'd set up on the machine. I don't think he saw the funny side of it, sadly.
Writing programs like this must be a rite of passage. About a year ago I wrote one (https://github.com/mlaiosa/cronwrap). One day when I googled "cronwrap" to try to find the GitHub page, there were a gazillion hits for other programs that did the same thing. I looked at a couple and I still like mine more - but I also have a moderate case of not-invented-here syndrome.
&1 refers to file descriptor 1. a>&b calls dup2(2) with oldfd=b and newfd=a. So ordering matters. STDOUT starts out attached to fd=1, but if you dup something else to fd=1, the association "fd 1 is STDOUT" is forgotten.
And that's exactly what usually happens: we replace fd 1 with one open to /dev/null, and then when we say 2>&1, that means to replace fd 2 (stderr) with whatever's at fd 1 (now /dev/null). When you write "command 2>&1 >/dev/null" it means something else: "send fd 2's output to what's currently at fd 1 (stdout)", and then "send what's currently at fd 1 to /dev/null". In other words, the source of a redirection is "by reference", but the destination of a redirection is "by value". If that makes any sense...
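A quick way to see the difference is a throwaway subshell that writes one line to each stream:

$ ( echo out; echo err >&2 ) >/dev/null 2>&1
$ ( echo out; echo err >&2 ) 2>&1 >/dev/null
err

The first form silences both streams; the second has already copied the terminal into fd 2 before fd 1 gets replaced, so "err" still shows up.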
Think of it as saying "take fd 2 (stderr) and send it to the same place fd 1 is going now." So:
$ command >file.txt 2>&1
first redirects fd=1 to file.txt and then has fd=2 go to the same place. Whereas:
$ command 2>&1 >file.txt
first has fd=2 go to the original place stdout was going, and then redirects fd=1 to file.txt. Usually not what you want. If you really wanted file.txt to get only the stdout while simultaneously sending what used to be stderr to stdout, I think you'd have to use another file descriptor, like:
$ command 3>&1 >file.txt 2>&3
That is, save the original stdout as fd=3, redirect stdout, then make stderr go the same place fd=3 is going.
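For instance, with a throwaway subshell that writes one line to each stream:

$ ( echo out; echo err >&2 ) 3>&1 >file.txt 2>&3
err
$ cat file.txt
out

Only "out" lands in file.txt, while "err" comes back out on the terminal via the saved descriptor. (Appending 3>&- closes the spare descriptor so the command doesn't inherit it.)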
I think this[1] is probably what you're looking for. If I'm understanding you correctly, the 2nd example will have mapped 2/stderr to &1 (stdout), before pointing stdout to the file, so you end up with both in the file.
Siblings have already responded to this well, so I'll just add an example:
$ ls x
ls: cannot access x: No such file or directory
$ ls x >/dev/null 2>&1
$ ls x 2>&1 >/dev/null
ls: cannot access x: No such file or directory
The first command shows us trying to list a non-existent file, raising an error. The second sends stdout to /dev/null and then sends stderr to the same place, suppressing all output. The third sends stderr to the original stdout before redirecting stdout to /dev/null, so the error still appears; any output on stdout would have been suppressed (can you come up with a way to verify this?)
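One way to verify it: give ls something that does exist alongside the missing name, so there's output on both streams:

$ ls x . 2>&1 >/dev/null
ls: cannot access x: No such file or directory

The listing of "." is discarded, but the error still reaches the terminal through the old stdout.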
Your second example puts stderr into the old stdout, then makes a new stdout go into the file. You have to do the redirections in the proper order to get the desired behavior. It is quirky if you don't expect it.
I have this problem: I only want to be emailed upon failure. However, the crontab of the account is shared by several different jobs run by different people, and I don't have control over that. Perhaps I could set the email in the crontab line?
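If it's Vixie cron or cronie, MAILTO isn't per line but applies to every job below it until it's set again, so you can bracket your own entries with it; something like this (addresses and path made up):

MAILTO=me@example.com
0 3 * * * /home/me/bin/my-nightly-job.sh
# put back whatever address the rest of the crontab expects
MAILTO=shared-address@example.com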
That's not too bad, and to be honest I think I'd like to see it inside cron.
So we'd just..
* * * * * blahscript.sh
and it'd do what cronic does without any extra wrapper. I know, I know, "one tool, one job", but I believe this job is really cron's job (word play unintended).
I use a wrapper similar to this. Sometimes you just need to call a script someone else wrote whose output you have no control over. I never want an email unless the script exits with a non-zero rc.
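A minimal sketch of that kind of wrapper, assuming a POSIX shell and mktemp (the name and the details are just my choices, not cronic itself):

#!/bin/sh
# Run the given command, capture all of its output, and only print
# anything (and thus trigger cron's mail) if it exits non-zero.
out=$(mktemp) || exit 1
trap 'rm -f "$out"' EXIT
"$@" >"$out" 2>&1
rc=$?
if [ "$rc" -ne 0 ]; then
    echo "failed with exit status $rc: $*"
    cat "$out"
fi
exit "$rc"

Then the crontab entry is just the wrapper in front of the other person's script, e.g. 0 4 * * * /usr/local/bin/quiet /path/to/their-script.sh.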
Unfortunately the FDA has ruled that this treatment would require approval. The approval process is expensive, and there is no way that anyone can recoup their expenses for doing so. Therefore this treatment will never be approved in the USA.
But I know someone who had severe Crohn's disease who went to Canada to receive a mail order of hookworm. Thanks to that he's been totally off drugs for over a year, and shows no signs of the disease.