
I actually agree with your premise: mathematics and its notations (plural) are ancient. The current notations are those that centuries of practice by expert practitioners found most convenient.



This is of course very true, but it doesn't take into account the invention of the computer, which is relatively recent.

Since a computer in principle never makes mistakes (in practice there are of course bugs, but they are usually different in nature from human errors), it changes what is possible and most convenient. You no longer have to optimise as heavily for simplicity, for example. On the other hand, computers basically can't deal with ambiguity, so rules and statements have to be stated very precisely and clearly.

EDIT: One example that comes to mind is indices of functions. Usually they are just additional arguments that differ somehow from the "main" arguments, for example by often being non-negative integers. For humans it is easier to think about and work with indices separately from the rest of the arguments. But for the computer it is all the same: every argument is just an argument (depending on the implementation, of course), so there is no need to treat indices specially.
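
To illustrate (a minimal Python sketch, with made-up names): where notation might write f_i(x), setting the index i apart from the "main" argument x, the code below treats both as ordinary parameters of the same function.

    # In notation the index i is written as a subscript, visually separate
    # from the argument x; in code both are just parameters.
    def f(i: int, x: float) -> float:
        return x ** i

    print(f(3, 2.0))  # "f subscript 3, applied to 2.0" -> 8.0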

I believe computers can change the landscape of what the best notation is. This is an interesting, interdisciplinary topic to explore.


I think APL was originally created as a fix for this problem: a completely revamped mathematical notation, better suited to the computer as a medium than to pen and paper.


If it was created thus, then it could hardly be considered successful. To all but the dedicated and obsessive cognoscenti, APL is nothing but gobbledygook.



