That's interesting. I'm not quite sure what I would call those issues, but I agree they can be tricky.

Many of those abuses are for a very good cause. Leibniz notation is very powerful, for example, but it's hard to master and physicists really go nuts with it.

For E[X|Y], all you have to remember is that conditional expectation yields a function f(Y) of the thing you conditioned on, and f(Y) is itself a random variable that you can take the expectation of. This property is canonical and baked into the formal definitions. It's not an abuse.
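That property is easy to see numerically. Here's a quick sketch (the joint distribution is made up for illustration): f(y) = E[X | Y = y] defines a function, f(Y) is a random variable, and taking its expectation over Y recovers E[X] — the tower property E[E[X|Y]] = E[X].

```python
# E[X|Y] as a function f(Y) of the conditioning variable, shown on a
# small discrete joint distribution (values chosen for illustration).
from fractions import Fraction as F

# joint[(x, y)] = P(X = x, Y = y)
joint = {
    (0, 0): F(1, 8), (1, 0): F(3, 8),
    (0, 1): F(2, 8), (1, 1): F(2, 8),
}

def p_y(y):
    # marginal P(Y = y)
    return sum(p for (_, yy), p in joint.items() if yy == y)

def f(y):
    # f(y) = E[X | Y = y] = sum_x x * P(X = x | Y = y)
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y(y)

# f(Y) is itself a random variable; its expectation over Y equals E[X].
e_of_f = sum(f(y) * p_y(y) for y in {y for (_, y) in joint})
e_x = sum(x * p for (x, _), p in joint.items())
assert e_of_f == e_x  # tower property: E[E[X|Y]] = E[X]
```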

However, I do fault some machine learning types a bit for abusing probability and statistical notation. For example, the Elements of Statistical Learning book extensively overloads E[], P(), and other symbols and operators in ways that it doesn't even bother to define. The authors throw subscripts and decorations onto all sorts of symbols without telling you whether those decorations mean marginalizing, conditioning, or something else. The book has no glossary of symbols and no preliminary chapter setting out notation, which is unusual for such an enormous book full of hundreds of equations. A glossary would arguably be impossible anyway, because the book is a hodge-podge of symbols that change from paragraph to paragraph.
