Taking the sum is itself expensive, though (in terms of logical operations). I don't think that realization really buys you much of anything - in particular, the least significant bit of the sum is by definition the parity of the set, so calculating the rest of the sum will only increase the number of operations required.
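To make that concrete, here's a minimal Python sketch (the function names are my own, just for illustration): both routines return the parity, but the sum-based version has to do full carry-propagating addition just to throw away everything except the lowest bit, while the XOR fold is one Boolean operation per element.

```python
def parity_via_sum(bits):
    # Compute the full sum, then keep only the least significant bit.
    return sum(bits) & 1

def parity_via_xor(bits):
    # XOR-fold: one Boolean operation per bit, no carries to propagate.
    p = 0
    for b in bits:
        p ^= b
    return p

bits = [1, 0, 1, 1, 0, 1]
assert parity_via_sum(bits) == parity_via_xor(bits) == 0
```

The point is that XOR *is* addition with the carries discarded, so summing and then masking does strictly more work than XOR-folding directly.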
"Taking the sum is itself expensive, though (in terms of logical operations)"
"the least significant bit of the sum is by definition the parity of the set, so calculating the rest of the sum will only increase the number of operations required"
Can you expand on these two statements please? Examining the sum seemed like a more elegant solution when thinking in terms of higher-level languages, but that's where the extent of my knowledge ends. I'd really like to understand how sums compare in terms of actual operations.
Well, at their base, sums are just extensions of Boolean operations. You can implement simple logic circuits which compute sums. You can check out http://en.wikipedia.org/wiki/Adder_%28electronics%29 for more information; it describes some simple (and not so simple) circuits for adding two binary numbers.
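The circuits on that page can be mimicked in a few lines of Python, just to show the structure (this is my own sketch, not taken from the page): a half adder is one XOR and one AND, a full adder chains two half adders, and a ripple-carry adder strings full adders together, one per bit position.

```python
def half_adder(a, b):
    # Sum bit is XOR, carry bit is AND.
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    # Two half adders plus an OR to merge the carries.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def ripple_add(xs, ys):
    # Add two equal-length little-endian bit lists with a ripple-carry chain.
    carry = 0
    out = []
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 6 ([0, 1, 1] little-endian) + 3 ([1, 1, 0]) = 9 ([1, 0, 0, 1])
assert ripple_add([0, 1, 1], [1, 1, 0]) == [1, 0, 0, 1]
```

Notice that the sum bit of each adder is just XOR; that's exactly why the least significant bit of a sum of single bits is the parity, and why computing the higher bits (the carry chain) is pure extra cost if parity is all you want.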