
I agree and this was my first thought too. Yet I feel like I do know the mathematical notation. I know what each "thing" does and yet the whole feels needlessly dense.

I definitely agree that once you are "used" to reading the notation it will be faster. That said, just because it's possible to get faster at it, does that mean it's the best possible notation for the job?

In creating programming languages and using them for tens of thousands of man-hours every day, we have envisioned and tested a multitude of ways to express logical ideas, formulas, and algorithms.

By contrast, how many kinds of mathematical notation has humankind tried and actively used?

If the former number is substantially higher than the latter, how can we be so certain that academic mathematical notation is the best, clearest way to express algorithms and formulas?
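To make the comparison concrete, here is a small illustrative example of my own (not from the thread): the same formula written once in conventional mathematical notation and once as ordinary code. Whether one form reads better than the other is exactly the question at issue.

```python
# The summation  s = Σ_{i=1}^{n} i²  expressed as code instead of notation.
def sum_of_squares(n):
    """Sum of the squares of the first n positive integers."""
    return sum(i * i for i in range(1, n + 1))

print(sum_of_squares(4))  # 1 + 4 + 9 + 16 = 30
```

Both versions carry the same information; the code spells out the iteration explicitly, while the Σ form compresses it into a single symbol a reader must already know.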

I would wager that the amount of logic expressed in programming languages vastly exceeds the amount expressed in mathematical notation every single day. More logic is both written and read in programming, so the medium we use to express it gets iterated on far more rapidly.

Again, I'm not a mathematician. I'm a programmer, so of course my opinion about what is easy to read and write is subjective and biased. Yet even looking at it objectively, in terms of which notation we have worked harder to refine, I think there's some food for thought here. Wouldn't the rapidly iterated and widely used forms seen in programming languages be more finely honed, and further along in their evolution, in terms of readability and writability?



