This is non-portable, as OS X has no sha256sum out of the box. But it does have shasum from Perl, and Linux distros typically come with Perl, so 'shasum -a 256' should work...
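One way to paper over the difference is a tiny probe-and-wrap helper. This is just a sketch; the helper name `sha256` is made up for the example:

```shell
# Sketch: use sha256sum where available (typical on Linux), otherwise fall
# back to Perl's shasum (shipped with macOS / OS X).
# The function name `sha256` is hypothetical, chosen for this example.
if command -v sha256sum >/dev/null 2>&1; then
    sha256() { sha256sum "$@"; }
else
    sha256() { shasum -a 256 "$@"; }
fi

printf 'hello\n' | sha256
```

Callers then use `sha256` everywhere and never care which backend was picked.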
I wonder if there is any concise way to do this without needing to save a file to disk. I can't think of one, as it requires splitting the input in two, which bash can do with >(command) but not sequentially or with the ability to communicate from the subshell to the outer one.
You don't get your hash without consuming the entire input.
You can't start feeding the input to the shell until you verify the hash.
You don't want to download it twice, in case it comes back different the second time (or in case your network is slow).
At the point that you are verifying the hash, the entire file must exist on your system somewhere. Disk, memory, OCR-friendly printout, whatever.
Buffering to memory can only work if the input is "small", whatever that means. And pipelines aren't meant to do this, so you'll have to do something a bit odd (i.e., confusing to anyone trying to understand your code) to make it work.
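For a small, text-only payload, the "something odd" can be as simple as buffering the download in a shell variable, checking the hash, and only then feeding it to sh. A sketch, with `fake_download` standing in for `curl -fsSL "$url"` and the pinned hash computed on the spot purely for demonstration (in real use it would be a published constant). Caveats: shell variables can't hold NUL bytes, so this only works for text, and `$( )` strips trailing newlines unless you guard with a sentinel:

```shell
# fake_download is a stand-in for the real network fetch (e.g. curl).
fake_download() { printf 'echo ran-verified-script\n'; }

# In real use this is a pinned, published hash; computed here only so the
# example is self-contained.
expected="$(fake_download | sha256sum | cut -d' ' -f1)"

# Buffer the whole payload in memory. Append a sentinel char so command
# substitution doesn't eat trailing newlines, then strip it back off.
data="$(fake_download; printf x)"
data="${data%x}"

actual="$(printf '%s' "$data" | sha256sum | cut -d' ' -f1)"
if [ "$actual" = "$expected" ]; then
    printf '%s' "$data" | sh      # only now does the payload reach the shell
else
    echo 'hash mismatch; refusing to run' >&2
    exit 1
fi
```

It's not pretty, but it satisfies all the constraints above: one download, full verification before any byte reaches the interpreter, and nothing written to disk.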
> Buffering to memory can only work if the input is "small", whatever that means.
Well, increasingly it means "up to 8GB or more". Or put another way, RAM is increasing faster than network speed (or the speed of light for that matter...). So, I'd say that, yes, for many things you'd want to download from the internet, caching in RAM is absolutely an option?