
Math in general is incredibly sloppy with this. I believe it's similar to why churches only read the Bible in Latin: there's a sort of perverse incentive in keeping arcane symbols around that make the life of the practitioner easier at the expense of the layperson.

Most formulas students need to learn are actually quite intuitive when you replace the symbol with a one or two word description. But they'd rather force you to memorize the symbol.

It's like forcing you to read obfuscated code. I find it draining with zero benefit to most learners.




Math is first and foremost a language whose symbols are generally well-defined, although they differ between contexts in some cases. In many of those differing contexts, the symbols are re-used to give an intuitive sense - for example, re-using "×" for the cross product.

Reducing math to words means English math is different from Chinese math. If I, a native English speaker, were to look at such math, I would have to more or less take it on faith that the translation I'm reading is accurate. And god forbid I'm trying to read a translation of a work done jointly by (say) a Romanian and a Chinese.

The symbols generally remove ambiguity. Consider the English word "bi-weekly." It's so muddied that it is almost useless. It means either twice a week or once every two weeks. It means, literally, both multiplication and division; and context is rarely helpful for this particular word.

You can certainly use the symbols poorly and ambiguously, as many viral "Solve this math problem!" memes exemplify. But used properly they're generally clear, concise, and well-defined.

I do, however, have one nitpick about symbols, and that's with the use of the ellipsis in math. You'll often see things like {1, 2, ..., n}, which is generally intended to mean (say) the natural numbers up to and including n. But does it? Couldn't it also represent powers of 2? Or the set of Fibonacci numbers? And since it is a set, order doesn't matter, so almost anything at all could be in that set. We can only be sure it's not something like "all the odd numbers" because 2 is in the set. Frustratingly, it's not even necessary in most cases, since we have set-builder notation and other tools.
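To make the nitpick concrete, here's a sketch (my own notation choices) of how set-builder notation pins down two readings that both start with 1, 2:

```latex
% The ellipsis leaves the rule implicit:
%   \{1, 2, \ldots, n\}
% Set-builder notation states the rule outright. The "first n naturals" reading:
\{\, k \in \mathbb{N} : 1 \le k \le n \,\}
% versus a "powers of 2" reading that matches the same first two elements:
\{\, 2^k : k \in \mathbb{N},\ 2^k \le n \,\}
```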


Sure, but this is also exactly why Latin was used for as long as it was: it was a language that the educated elites of many different and diverse countries used as common ground.

I'm not saying that math shouldn't use symbols; I'm saying it's nearly intentionally hostile to those first picking up the subject. If the goal of math education is to provide a groundwork of understanding to the general populace... cater to the general populace. The vast majority of those folks are not going to be discussing complex math across several languages.

They're going to be using it for accounting, taxes, construction, cooking, etc. They don't need to memorize an entire language to do that effectively, and we shouldn't be wasting their time in school trying to make them.

Those that choose to specialize are welcome to. In my opinion that doesn't justify the use of specialized language in basic education settings.


Can you give an example of which symbols you object to and what you anticipate as a replacement? The most difficult things I can think of that someone doing accounting or cooking might have to do is compound interest or converting units; so we're talking about ()+-x÷ and exponents. I'm guessing you have something else in mind or I'm not getting it.

I do have criticisms of symbols insofar as how math is taught, which roughly fall in line with "A Mathematician's Lament" by Paul Lockhart [0]. Loosely: that much math is taught as symbol manipulation rather than... actual math. But that's not a problem with the symbols themselves so much as with their use to obfuscate what's actually being done mathematically.

You might be interested in the book Burn Math Class by Jason Wilkes [1]. I don't really recall what his arguments against symbols were. He ends up inventing his own notation as he goes along (which is what ultimately turned me off the book about halfway through -- it just became an exercise in translating notation for me).

[0] - https://www.maa.org/external_archive/devlin/LockhartsLament....

[1] - https://www.amazon.com/Burn-Math-Class-Reinvent-Mathematics-...

Edit: Fixed list of links.


I think you're vastly overestimating the capability of many students (particularly young students without previous exposure).

It's easy to fall into a trap where you forget what it's like to be completely new to a subject. It's particularly hard when you tend to surround yourself with educated communities where this "in-knowledge" becomes assumed and standard (e.g. Hacker News).

Take just less than ( < ) and greater than ( > ). Think about how many stupid rhymes or memorization techniques you see in classrooms to help learners memorize which is which.

Here are three separate sites, with an entire page dedicated to helping students remember which is which:

https://math.wonderhowto.com/how-to/remember-greater-than-le...

https://numberock.com/lessons/comparing-numbers-to-100/

https://myhomeworkdone.com/blog/greater-than-less-than-sign/

One of them includes a whole damned song for the purpose. All to avoid writing out smaller/bigger.

Like any language, once you learn it, it's hard to remember all the places you struggled.

Why make "change" Δ?

Why make "square root of -1" i?

In how many classes do we see rote memorization of the quadratic formula, with no context around why you should even bother to learn it? (I've seen quite a few).

Now, not all of those are really the fault of the language (math), but using the language for each of those problems facilitates lazy teaching, and it changes the goal from "understand how math relates to the world" to "memorize this language construct". One is much more helpful than the other.


Because once you learn it, notation gets out of the way rather than in the way. I can't imagine doing physics and having to write out words rather than symbols to express rates of change.

It makes sense that specific knowledge uses specific language to be more easily used and manipulated. Imagine writing (or even proving) Euler's equation without using i or pi or e. Imagine what mess math would be if we never used Greek symbols.
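To illustrate what the shorthand buys (my own wording of the spelled-out version), here's Euler's identity with and without the symbols:

```latex
e^{i\pi} + 1 = 0
% Spelled out in words: "the base of the natural logarithm, raised to the power
% of the square root of negative one times the ratio of a circle's circumference
% to its diameter, plus one, equals zero."
```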

Sure, it would be easier for middle schoolers. But if you're arguing that we should make it easier for them since they're not gonna need ease of manipulation for math that they're not gonna use... then just don't teach it to them. Maybe middle schoolers don't need < as a concept any more than they need it as a symbol.

But if you're going to solve equations and inequalities, then yeah, you need = and <. Everything else would be needlessly verbose and get in the way of actually manipulating concepts you know.


I've tutored many college students (of various ages) in basic algebra, almost all of whom were convinced they couldn't ever possibly learn it, and only one of whom didn't end up getting an A in their course. I'm pretty aware of the struggles some students can have with it and none of them really had trouble memorizing the handful of symbols that are actually used at that level. Some struggled with the concept of a variable, but many just struggled with understanding the relationship between the actual concepts and the manipulation of symbols. Off the top of my head, equations being balanced can be a difficult one, but certainly not the only one.

As for younger students, I have much less experience, but some; and it's interesting you mentioned less than and greater than, since I actually remember learning those symbols. We learned that the "alligator" always eats the "bigger" number. It's not surprising there are countless ways of learning it, including song. That's true of almost any abstract concept. The idea is to link a metaphor the person understands to the abstract concept. Not every metaphor will work for every person; and this is true of all abstract concepts, not just math symbols.

Of course, it's not actually true that the alligator is eating the "bigger" number, and it actually demonstrates why we need the symbols. ">" and "<" actually refer to "greater than" or "less than" which we much later learned is a way of saying "which number is further right on the number line"; which, of course, requires the abstract concept of the number line and accepting the more-or-less arbitrary decision of a left-to-right number line. "Bigger" means "has a greater distance from zero on the number line in either direction" which we'd represent as a comparison of absolute values. I don't recall when I learned about absolute values, but it was definitely years after learning about < and >. Using the proper symbols lets us be explicit, concise, and precise and avoid issues like using English synonyms (bigger, greater) or whatever pitfalls exist in other languages.

The choice of the symbols < and > is, of course, largely arbitrary, apart from the symmetry between them. (We could have, for instance, always put the greater number underneath the smaller number so the structure is more stable in an imaginary gravity; but that, too, would be arbitrary.) But so is the letter S, or the digit 9. They're all arbitrary symbols that have particular meanings in particular languages. "9" is interesting, because it's a positional digit, versus the Roman numeral system. The Roman numeral system could arguably be called non-arbitrary. "I" clearly represents a single thing, "II" two things, etc. That works until you get up to "IV". What? "IV"? So if a lesser value is in front of a greater value, you subtract it? And how does "V" represent five anyway? It's arbitrary! But the Romans found it much more useful to be able to write VII + VI = XIII rather than IIIIIII + IIIIII = IIIIIIIIIIIII, which pretty quickly gets unruly. It turns out memorizing the digits 0-9 makes this (and more complex math) even easier: 7 + 6 = 13; which is why the entire world now uses positional digits instead of Roman numerals.
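The subtractive quirk ("a lesser value in front of a greater value is subtracted") is small enough to sketch in code. This is a minimal illustration, not any standard library API; the function name is my own:

```python
# A minimal sketch of Roman numeral parsing, illustrating the arbitrary
# subtractive rule: a smaller value written before a larger one is
# subtracted (IV = 4); otherwise values are simply summed.

VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(s: str) -> int:
    total = 0
    # Pad with a space so each character can peek at its successor.
    for cur, nxt in zip(s, s[1:] + " "):
        v = VALUES[cur]
        if nxt != " " and VALUES[nxt] > v:
            total -= v  # subtractive rule: IV, IX, XL, ...
        else:
            total += v
    return total

# The comparison from the comment: VII + VI = XIII
print(roman_to_int("VII") + roman_to_int("VI"))  # prints 13, i.e. XIII
```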

(We could also have a side discussion on why base 10 and not something like base 12, binary, a mixed-radix system that uses the prime numbers, or the sexagesimal system used by the Sumerians. The answer is basically that it's mostly arbitrary, simpler than some systems, and we have ten fingers/thumbs.)

> Why make "change" Δ?

> Why make "square root of -1" i?

Largely historical reasons, expediency, and lack of better alternatives. Why represent the sound "ssss" with the symbol "s"?

You could swap i for √-1 and people would understand you, but you'd very quickly wish there was a shorthand that you could use to represent this rather special value.

> In how many classes do we see rote memorization of the quadratic formula, with no context around why you should even bother to learn it? (I've seen quite a few).

You won't see me objecting to this; but this is not an issue with mathematics. It is an issue with the teaching of mathematics and is part of the "lament" I linked to above. This is quite a different issue than the issue of symbols. The symbols, while arbitrary and arcane, actually make the mathematics more manageable and precise. Saying that "facilitates laziness" is like saying a clothes washer facilitates laziness since it removes the need to manually provide friction and agitation. It's true, in a sense, but I'll keep my washer.

Mathematics is taught very poorly in many places; but making it hopelessly verbose and less precise by removing the symbols of the language is not going to help. I learned algebra, officially, my freshman year of high school. Yet many high school graduates come out without even a rudimentary understanding of algebra (and, actually, of basic mathematics - I've tutored a few of those as well). Many of them learn it in college, so they're obviously capable of learning it. Those high schools failed those students. Many university professors equally fail their students.

But blaming this on the symbols of the language is too far of a stretch for me. Blame the teachers.

Edit: Fixed display of symbols.


> Couldn't it also represent powers of 2?

You'd use a different notation then: `{1, 2, 4, ..., 2^n}` or `{2^0, 2^1, ..., 2^n}` or something similar which indicates what you mean. There's nothing wrong with the ellipsis if you spend a second thinking about what you're writing.

If someone wanted to be obnoxious and confusing, they could say: Haha, `n` is not a symbol either; I'm using base 50 and it's the digit 23 (in base 10). So you can't really stop bad or intentionally misleading communication.


Like I said, it's a nitpick. I can usually tell what a person means, but it's imprecise, and technically the ellipsis could stand for almost anything depending on the explicit values/symbols. In the cases where the pattern is clear, it's clear. Where it's not, using proper set-builder notation in the first place would have made it clear. I've also seen it used in contexts other than sets, including for summations. In those cases, just writing the summation in proper form often makes consequences and manipulations plainer and easier.


In general, at least through high school, probably way too much teaching involves basically memorization of trivia.

Yes, some foundations of facts are needed. For example, there's some very basic operator precedence that students should internalize. And history inevitably does involve names and dates. But too much attention is probably devoted to remembering whether a battle was fought in 1746 or 1750.

It's somewhat understandable because testing for those things is easy and unambiguous. But it's unfortunate anyway.


Pedantic point: the whole "reading the Bible in Latin" thing is a very Catholic thing (and maybe Orthodox?). The Protestant churches I've been in have read from the Bible in the native tongue, and it was indeed one of Luther's criticisms of the Catholic church.


I'm not Catholic, but I think they stopped requiring the reading of scripture in Latin in the early 1960s, with the Second Vatican Council, a.k.a. Vatican II.

I think Latin is still used as part of the traditional service, but it's mostly ritualistic. What I remember from attending Catholic services years ago is a lot of standing up and sitting down while chanting in Latin, followed by a sermon delivered in English.


It's not universal either. I grew up in a Catholic-obsessed country, and not a single mass I've been to had Latin elements or chanting. All in the native language, in the mid '80s.


> Most formulas students need to learn are actually quite intuitive when you replace the symbol with a one or two word description. But they'd rather force you to memorize the symbol.

Can you give some examples of this?



