Hacker News
Ask HN: Why is math so obsessed with ancient syntax and frameworks?
13 points by diminium on Dec 5, 2012 | 19 comments
In the world of programming, the syntax and form of a language changes with the people who use it. Everything about the language is redesigned so people can understand it better.

Math (well, math up to the undergraduate college level) doesn't. To even have a chance of understanding a concept in math, you are forced to use an Ancient Greek symbol system. Once you're done with that, the formulas are written in a very specific form and a very specific pattern that you're forced to learn in order to understand it at a higher level.

In a way, it's like learning the concepts of object-oriented Java while only having access to assembly language syntax. Yes, it's possible to learn it that way, but there are better ways to understand it.

Why is math so obsessed with ancient syntax and conceptual frameworks? Why isn't there a Python for math? Why aren't new conceptual frameworks for math being created and used out there? Why does math feel so obsolete? Are the methods used for math today really the best way to present math to modern-day people?




[Mathematician here]

Greek letters: It's a feature, not a bug! The advantage of Greek letters is that they don't collide with normal variables, so you can use short (one-letter) names for symbols. That is also the advantage of using special characters for numbers (and not something like a->1, b->2, ...).

An additional wrinkle is that some letters have an expected behavior: for example, \epsilon is usually small, M (capital M) is usually a big number, p is a prime number, and z is usually a complex number. The problem is not that they are Greek letters; the problem is that you have to learn the expected behavior. (It's like using i for a counter in a for loop. Obviously i is an integer.)

The problem with the syntax is that mathematicians (and humans in general) are too good at parsing DSLs (domain-specific languages). So each mathematical branch has its own DSL (we call it "notation"), and the DSLs can barely be combined, but usually a calculation doesn't combine many areas of mathematics. (For example, there is special notation for function integrals, for modulo congruences, and for field towers, but I hope I never have to combine all three in the same calculation.) (A similar phenomenon explains why regular expressions are so popular: they can be implemented with normal constructions, but sometimes it's easier to use regular expressions. Or the special syntax for here-strings.)
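The regular-expression point can be seen in miniature in Python (a toy comparison of my own, not from the comment): extracting the numbers from a string with the `re` DSL versus with "normal constructions":

```python
import re

text = "order 66, aisle 7, bin 42"

# The DSL way: one line of regex.
regex_numbers = re.findall(r"\d+", text)

# The "normal constructions" way: a hand-rolled scanner.
manual_numbers = []
current = ""
for ch in text:
    if ch.isdigit():
        current += ch
    elif current:
        manual_numbers.append(current)
        current = ""
if current:
    manual_numbers.append(current)

print(regex_numbers)   # both print ['66', '7', '42']
print(manual_numbers)
```

Both produce the same result; the regex DSL just compresses the loop into a notation you have to learn once.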

The downside is that you have to learn all the DSLs, but usually the specific details of a DSL encode some insight into the main ideas of the area. There is a (perhaps apocryphal) quote from Gauss: "A good notation is half of the solution."

The problem is not that you must use Assembler to write Java classes. The problem is that you must use 50 almost-incompatible Lisp packages, each of which defines its own macros and reader syntax (that hides almost all the parentheses) and each of which defines a very useful but limited DSL that compiles to the JVM.


I don't know how to explain it but - thank you. You answered so many questions I had about math that I didn't even know I had until you put it this way.


Is programming really so much better off for having so many more languages and means of describing algorithms? At least the language of math that one writes today can be expected to be understandable and readable a hundred or more years from now. A program's source code, on the other hand, is unlikely to have that kind of long life unless you get lucky and write it in the right language.

Also, math is declarative and rigorously manipulable whereas programming is usually imperative and you're lucky if you get any kind of refactoring/manipulable tools.

I was introduced to programming before I really got a handle on math, and I definitely preferred programming initially. When I did learn math I thought about it imperatively. I saw summations as for loops that were just written strangely, but I now see math as being much more interesting than programming. And I can get a lot more work done by working with math's symbolic language, more quickly, and more rigorously than I can with a programming language.
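The "summations as for loops" reading is quite literal; here is a minimal Python illustration (my example, not the commenter's) of the summation of i² for i from 1 to 10, both ways:

```python
# The summation sign, read imperatively: a for loop with an accumulator.
total = 0
for i in range(1, 11):
    total += i ** 2

# The same summation, read declaratively: describe the set of terms.
declarative_total = sum(i ** 2 for i in range(1, 11))

print(total, declarative_total)  # both are 385
```

The declarative form is much closer to how the sigma notation reads: a description of the terms, not a recipe for accumulating them.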


I find this extremely interesting. I'm a programmer who has had some mental block when it comes to "real math" (whatever that means). I'm very eager to change how I think about math to try to break through the mental barrier. Do you have any books or blogs or anything to recommend that reflect your change in thinking?

The summations as for loops is what got my attention, as I am guilty of that. Also the idea of math as more manipulable than programming is intriguing.


Historically, my biggest obstacle to understanding pieces of "real math" was the intimidation factor of the very dense material and the fear that I wouldn't ever be able to understand it.

The best tip I have, one that helps me not be intimidated by "real math," is to find a method of forcing yourself to slow down while reading. I've found that I get intimidated and frustrated only when I'm trying to rush through something. The best method I have for slowing myself down is to keep a notebook open, and write down definitions as I go (and lemmas, and theorems, but from here on I'll just call them definitions). I try to write a definition that is more clear (likely only to myself) than how it was presented in the text I'm reading. I also try to write down important implications of the definitions.

I think the procedural way of thinking about mathematics is natural for someone without much exposure, and is indicative that you're still trying to work out minor details of each definition you encounter. That's fine. If you have to think about a summation as a for-loop to understand it, go for it. After seeing a bunch of summations, eventually they'll carve out their own mental space. I think that writing down definitions will also help to solidify the declarative nature of mathematics, and definitely helps me to move through math more fluidly.

When I'm learning new math, I find that I'm least frustrated when I plan on moving very slowly. I feel that I'm making good time through a mathematics text if I read a page an hour. Next time you encounter some "real math", grab a pen and paper and really dig in to the text.


You might try a functional language such as Haskell. It allows you to feel denotational semantics (common in mathematics), rather than operational semantics (common in imperative programming).
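A rough flavor of that distinction, sketched in Python rather than Haskell (my analogy, not the commenter's): defining a value by what it *is* versus by the steps that produce it:

```python
# Operational style: a recipe of state changes carried out over time.
def squares_operational(n):
    result = []
    for i in range(n):
        result.append(i * i)
    return result

# Denotational style: the value is defined by what it denotes,
# closer to how a mathematician would write {i^2 : 0 <= i < n}.
def squares_denotational(n):
    return [i * i for i in range(n)]

print(squares_operational(5) == squares_denotational(5))  # True
```

Haskell pushes the second style much further, since everything is an expression denoting a value.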


We use Greek characters simply because it's convenient to have additional symbols at hand. The choice of Greek is historic, but one also sees a little bit of Cyrillic and Hebrew.

Notation has evolved since ancient Greece. Euclid, for example, is quite readable in translation, but he did not use modern shorthand. See for yourself:

http://aleph0.clarku.edu/~djoyce/java/elements/elements.html


That assumes single characters are the only appropriate variable names. There's no reason maths couldn't use a * symbol as multiplication instead of assuming multiplication for any string of characters.

People who enjoy the current state of mathematics will probably think the OP is trolling, but it's a reasonable question.

Perhaps it could be better put as 'Why doesn't maths concern itself with readability like programming does?'


It only assumes that single characters are desirable variable names. Especially if (as in Lisp) one does not draw a conceptual distinction between variables and functions, it is common to see multi-letter variable names.

Here is a typical example of a modern research paper:

http://math.stanford.edu/~conrad/papers/locchar.pdf

Most variables are one letter, but notice, e.g., GL in the fifth line (the General Linear group, not G times L), ker in the seventh, etc.

One reason for brevity is that we like to write on paper and/or the chalkboard.

And to answer your last question, I think math does concern itself with readability. Many mathematicians are obsessive about making sure that their work is as readable as possible, although unfortunately this is not universal.


An important difference is that when you manipulate terms, it's much easier to work with short names. If a term is added on the right side of an equation and you want to move it to the other side as a subtraction, you must copy the whole equation (and Ctrl-C Ctrl-V doesn't work on paper).

A typical program is written one, read a few times (perhaps years later) and executed many times.

A typical math calculation is typed once and read only a few times (a few minutes later) and discarded ("archived").

In a draft, I usually find myself replacing sin(x) and cos(x) with s and c, because it's only a small calculation, in the following 10 minutes I will remember what they mean, there is no other s or c in sight, and I get bored of writing all the letters.


> Perhaps it could be better put as 'Why doesn't maths concern itself with readability like programming does?'

Readability is a personal opinion in some ways. I guess I have grown up using mathematical symbols in Physics classes since middle school so those symbols didn't feel arcane to me when I hit my first real math classes.


Math is not obsessed with ancient syntax but, being written by hand, it is obsessed with short, two-dimensional syntax.

If you did math with a keyboard, you could use bigger identifiers, and you would use a more linear syntax.

See for example Gerald Sussman's works eg. in SICM http://en.wikipedia.org/wiki/Structure_and_Interpretation_of...
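To make the "keyboard math" point concrete, here is a deliberately verbose sketch in plain Python (stdlib only; the long identifiers and strictly linear syntax are the point, not the numerics), approximating the integral of sin(x) from 0 to 1 with a midpoint Riemann sum:

```python
import math

def approximate_integral(function, lower_bound, upper_bound, number_of_steps=10000):
    """Midpoint Riemann sum: long names, linear one-dimensional syntax."""
    step_width = (upper_bound - lower_bound) / number_of_steps
    running_total = 0.0
    for step_index in range(number_of_steps):
        midpoint = lower_bound + (step_index + 0.5) * step_width
        running_total += function(midpoint) * step_width
    return running_total

# Exact value is 1 - cos(1), roughly 0.4597
print(approximate_integral(math.sin, 0.0, 1.0))
```

Compare this with the two lines of blackboard notation it replaces; on paper, the short 2D form wins.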

Now this won't really happen, because while theorems and algorithms are basically two faces of the same thing, programs are boring while theorems are deep: a program contains a lot of small, trivial theorems that don't bring anything new to human knowledge in general (just some green printed paper to their "owners"), while in mathematicians' work, theorems are scarce, hard to prove, and sometimes bring revolutionary new knowledge to humanity.

This means that, apart from the still exceptional case of computer-assisted theorem proofs, math involves much more thinking time relative to typing/writing time than programming does. Therefore mathematical writing is done by hand, and short notations are not a problem.


I'm afraid you are overstating the importance of alphabets. For example, try this: http://nedhardy.com/2012/04/25/learn-korean-in-15-minutes-ye... It's not really hard to learn alphabets.

I feel the traditional notation achieves a good tradeoff between clarity and conciseness. I don't want to write the string "integral(sin(x),0,1,x)" on paper - 22 symbols - traditional notation is much more concise. I hate to write parens in (a+b)/c; traditional fractions are clearer. On the other extreme, think about programming in APL.
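For comparison, the traditional notation for that same integral, written in LaTeX, renders as the familiar compact two-dimensional form:

```latex
\int_0^1 \sin(x)\,dx
```

A handful of strokes on paper, versus the 22-character linear string.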

Also see http://terrytao.wordpress.com/career-advice/there%E2%80%99s-...


When mathematicians investigate new types of problems, they often develop new notation. Computer programming languages are an example of this in a very strong sense.

The proliferation of programming languages relative to other systems of mathematical notation may be due to programming being more frequently applied to new types of problems - including the problem of developing new types of notation using computers.


there is a more general problem here. why does <insert specialty> insist on using domain specific language when ordinary words do fine?

derivative or gradient = slope

is a classic example. mention the word calculus and someone panics in a fit of imagined complexity. describe it in english, clearly, and it's very easy...
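The "derivative = slope" claim is easy to check numerically; a toy sketch in Python (the names are mine), estimating the slope of f(x) = x² at x = 3 with a symmetric difference quotient:

```python
def slope_at(f, x, h=1e-6):
    """Estimate the derivative as rise over run around x."""
    return (f(x + h) - f(x - h)) / (2 * h)

# For f(x) = x**2 the calculus answer is 2x, so the slope at 3 is 6.
print(slope_at(lambda x: x * x, 3.0))  # ~ 6.0
```

"Slope" in plain English really is the whole idea; the notation f'(x) just names it.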

however the classic counterargument is that it is easier to invent words with context specific meanings for clarity. this makes more sense if you try and describe a functional integral for instance... you need the concept of a function, and an integral which themselves rely on other concepts etc. and would require several sentences to explain well...

also, you have made a faulty assumption about programming - new syntax and language is invented, but the old syntax and language is not replaced - we are stuck with the old stuff too. there are still plenty of areas where C is the only choice - nobody has yet come up with an alternative which addresses the same problems - so 30 years on we still use C - and its use pollutes all of the terminology associated with programming.

(tangent: i still don't understand why we can't compile e.g. Python, C#, Ruby, JavaScript etc. to native code with the benefits of C [Haskell does this quite well for instance - but is held back by general quality of compiler vs. C compilers having billions of dollars and thousands of man years poured into them] - VM or JIT should be a choice if you like spunking away performance - it would be nice if C would die... as much as I enjoy it - GCC, LLVM etc. make the same code gen available for any language with a little work - this feels to me an infinitely better solution than any VM or JIT can hope to be...)


I agree that this can be frustrating. One of the great benefits of programming languages is that, even though I don't know VisualBasic, for example, I can probably look at the code and know what values the variables represent.

It's very difficult to jump into a new mathematical structure, because very often you have to learn the lexicon from scratch.


I'd say that is because when you see a structure you already understand written in a new language (e.g. VB), you can use your existing mental model.

But when you meet an entirely new structure, as you say, you are starting from scratch.


It's not about the structure. It's about the sigils. I have no idea what Xi means in the context that it's used, but if it's named in English it's very easy to understand.


well, there's mathematica/octave/matlab

Using single-letter variables is for convenience; they're set in a different script to avoid confusion with the surrounding text, and the script sometimes varies to encode information (real value vs. vector vs. matrix).

There's plenty of work to be done making math better but I don't think you'll get away from these conventions any time soon.



