
I think what's surprising to a mathematician is when the fact that x can change is not stated prior to witnessing it. So you might say that immutability is a reasonable default for mathematics, but, contrary to what the author claims, it is not necessary.

While I think programmers might find it surprising, in mathematical proofs context is often the primary tool one uses to figure out what the hell is going on locally in some expression. For example, indices like time and the size of a problem (in combinatorics) are often dropped because the reader understands that the context is asymptotic.




> I think what's surprising to a mathematician is if the fact that x can change is not stated prior to witnessing it

This is precisely what happens in languages without immutability / control of mutation. They effectively say "this may or may not mutate. See for yourself. Oops! It just mutated. Sorry!"
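
To make that concrete, here's a toy Python sketch of the kind of surprise I mean (the function and names are made up for illustration, not from the article):

    def normalize(scores):
        # Looks like a pure computation from the call site...
        scores.sort()  # ...but this silently mutates the caller's list
        return [s / scores[-1] for s in scores]

    data = [3, 1, 2]
    normalize(data)
    print(data)  # [1, 2, 3] -- the argument changed behind your back

Nothing in the signature tells you (or the interpreter) whether the argument will be mutated; you have to read the body to find out.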

Also, isn't what you talk about in your example just a shorthand to avoid cluttering a proof or whatever? It's not that the variables actually mutate; it's that by convention you elide the indices that would show each value is distinct. That makes a lot of sense: the writer and the reader assume some things and go on with a compact notation that eliminates noise. It's a useful convention, and if the reader loses track of the notation, the worst that can happen is that he/she won't be able to understand the proof.
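
To illustrate my own reading of that convention (a toy example, not the author's): the "mutation" you write in a proof is usually shorthand for introducing a fresh value, with the index elided.

    # What the proof casually writes, "let x := x + 1":
    x = 0
    x = x + 1

    # What is actually meant: a new value x_{k+1} = x_k + 1,
    # each x_k fixed once it is defined.
    x0 = 0
    x1 = x0 + 1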

The problem with computer programs is that both the reader and the computer have to interpret them. Because variables may or may not mutate, it's harder for the computer to perform certain optimizations. For the same reason, it's harder for humans to understand the code ("is this going to use the convention that it won't mutate variables? I have to carefully read the code and consider all paths in order to be sure").
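
A small sketch of what "consider all paths" means in practice (hypothetical code, assuming Python-style uncontrolled mutation):

    def running_total(items, log):
        s = 0
        for x in items:
            s += x
            log.append(s)  # harmless... unless log aliases items
        return s

    xs = [1, 2, 3]
    running_total(xs, [])   # fine: returns 6
    # running_total(xs, xs) # now the loop grows the very list it iterates;
    #                       # neither the reader nor the compiler can rule
    #                       # this out without checking every call site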

In a sense, programming with uncontrolled mutation is like switching the default convention to its most confusing setting :)


I'm not trying to make a point about what is or isn't good program design. I actually like functional programming. I'm criticizing the author's claims about mathematics, which are clearly a justification for his views.

If you're reading a paper with theorems, proofs, and algorithms, and the author says at the beginning that the algorithms will be described using mutation (or even if the reader has to figure this out on the fly), then I argue you can't use that as a reason to say what you're reading somehow isn't math. You can criticize it if you think it's unclear, sure, but it's still math.





