
>Mathematically, an n-dimensional space is R^n, that is, the Cartesian product of the real numbers (R) with itself n times. Each copy of R lets you specify a number independently of all the others.

See, that's exactly what I'm talking about. This gibberish makes sense to no one except a mathematician. The explanation I've given - while maybe the same in essence - conveys the meaning in a comprehensible way that's not completely removed from reality.

My point is that mathematicians often neither realize that they are talking gibberish that makes no sense to anyone but themselves, nor accept that math without proper context is devoid of any meaning (outside of math, that is).

NOTE: I would like to clear up that I'm not attacking math itself. I neither think math for math's sake is bad, nor do I think math is useless. I respect math as a self-contained field. The problems I'm pointing out are all happening where there is application within other fields - such as CS.




I certainly agree that an intuitive understanding of a concept can be helpful as a guide and emotionally satisfying. However, I don't think it is at all sufficient, and it is precisely the "mathematical gibberish" that resolves the problem.

An isolated concept is worthless. It is only when you apply it by reasoning with it that it becomes valuable. The problem with intuitive explanations is that they don't nail down enough details to allow a person to reason with them.

"I think of points in an n-dimensional space as objects holding n different types of information"

Imagine walking into a room and drawing a straight line on the floor. What is the dimension of that line?

Answer = One dimensional. Proof: We can describe each point by one type of information. Point = (Distance of that point from the start of the line.)

Answer = Two dimensional. Proof: We can describe each point by two types of information. Point = (Distance of that point from the East wall, Distance of that point from the North wall.)

Answer = Three dimensional. Proof: We can describe each point by three types of information. Point = (Distance of that point from the East wall, Distance of that point from the North wall, Distance of that point from the roof.)

Answer = Four dimensional....

So by the intuitive explanation we can make this single line any dimension that we want.
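
(To make the ambiguity concrete, here is a toy Python sketch of the three descriptions above - the 45-degree angle and the 3m ceiling are assumptions I've made up for illustration, not part of the original example:)

  import math

  # The same straight line on the floor, described three ways.
  def by_line_distance(t):       # "one type of information"
      return (t,)

  def by_wall_distances(t):      # "two types of information"
      d = t / math.sqrt(2)       # assume the line runs at 45 degrees
      return (d, d)

  def by_walls_and_roof(t):      # "three types of information"
      d = t / math.sqrt(2)
      return (d, d, 3.0)         # assume a 3m ceiling; same for every point

  # All three functions sweep out the same set of points as t varies;
  # the line never changes, only how many numbers we report per point.

Note that every description is driven by the single parameter t - which is exactly the hint the intuitive explanation fails to capture.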

This isn't just a problem within mathematics. For a simple programming example:

Question: How does a computer program work?

Intuitive Answer: You give the computer a list of instructions for it to carry out.

Result: The guy opens up Notepad and types in "Make a computer game where I walk around shooting Zombies."
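
(For contrast, an actual "list of instructions" looks more like the hypothetical Python fragment below - invented here purely to show the level of detail the intuitive answer glosses over:)

  # Even a trivial "shoot zombies" loop is a list of small,
  # explicit steps - nothing is left for the computer to interpret.
  zombies = ["zombie_1", "zombie_2", "zombie_3"]
  ammo = 2

  while zombies and ammo > 0:
      target = zombies.pop()        # step 1: pick a zombie
      ammo -= 1                     # step 2: spend a bullet
      print("You shoot", target)    # step 3: report the result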


>So by the intuitive explanation we can make this single line any dimension that we want.

And if this makes sense in the given context - sure, why not?

You seem to be missing my point. What I'm saying is that it's useless to have totally generalized abstractions (outside of pure math) since, more often than not, they are so far removed from reality that most people can no longer make any connection to use cases.

My entire argument is that there's a heavy communication failure between mathematicians and scientists of every other field where math is used as a tool. Sure, it's convenient for mathematicians to be able to use shorthand gibberish to talk to other mathematicians. It doesn't justify pushing this jargon on other fields.

Besides which, convenience is no excuse for making something hard to understand. Sure, it's convenient to name all your variables in a program a, b, c, etc., but you're going to get lynched by any programmer who tries to read your code later, including yourself.

When it gets to the point where gibberish becomes the only way to explain mathematical abstractions, you should step back and ask yourself, "where the hell did this go wrong?".

>This isn't just a problem within mathematics. For a simple programming example:[...]

I believe in giving simple explanations and expanding them whenever there's a loophole that needs to be fixed. You don't make programs that cover every single niche use case, either (if you don't have to, that is). That's the problem with math - the generalizations, while useful in math itself, are sheer overkill in many situations outside of math.

Lastly, sorry for taking this out of order, but:

>An isolated concept is worthless.

So is a generalized abstraction without any context. I'm saying that the sweet spot is somewhere in between for most people to understand and apply concepts, and that it's better to generalize upwards from reality and actual use cases instead of starting utterly removed from reality and trying to apply the generalizations downwards.


The way you are talking about jargon in mathematics suggests you have a limited experience of what mathematicians do. Here's an illustrative example of mathematics as done by mathematicians.

~~~~~~~~~~~

Define: An integer n is `even' if there exists some integer m such that n = 2m.

Theorem: For any two even integers n and a, the sum n + a is an even integer.

Proof: Since n and a are even there exist integers m and b such that n = 2m and a = 2b. Now,

  n + a = 2m + 2b; by assumption
        = 2(m+b); by the distributive property
        = 2z; for the integer z = m+b
Therefore there exists some integer z such that n+a = 2z. Hence n+a is even.

~~~~~~~~
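
(As an aside: the definition above is precise enough to be checked by a machine. Here's a minimal sketch in Lean 4 - the name Even' and the reliance on only core Lean, no extra libraries, are my own choices for illustration:)

  -- Define: an integer n is even if n = 2m for some integer m.
  def Even' (n : Int) : Prop := ∃ m : Int, n = 2 * m

  -- Theorem: the sum of two even integers is even.
  theorem even_add_even {n a : Int}
      (hn : Even' n) (ha : Even' a) : Even' (n + a) := by
    cases hn with
    | intro m hm =>                -- n = 2m, by assumption
      cases ha with
      | intro b hb =>              -- a = 2b, by assumption
        -- n + a = 2m + 2b = 2(m + b), by the distributive property
        exact ⟨m + b, by rw [hm, hb, Int.mul_add]⟩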

That is to say, in mathematics we introduce some definitions/gibberish/jargon (in this case `even') and then we use logic to reason about the implications of our choice of definition (the sum of even integers being even).

The important thing is that the definition plays an essential role; definitions are the building blocks on which all of mathematics operates. To emphasize: if you strip away the definitions you literally have nothing to build on - we can't apply logic to nothing and arrive at something.

This leads to the point I made in my earlier comment: the reason we need definitions rather than intuitive explanations is that you can't logically reason about a concept unless you nail down the relevant details of exactly what that concept is. Without that, we can't do the 'prove the theorem' part of the above example.

So how does mathematics then fit into application?

Guy 1: In this basket I have as many stones as I have fingers and in that basket I have as many stones as I have toes. For each basket I can pair up the stones so that each has a partner. Will this still be the case if I combine the stones from each basket?

Mathematician: Well, let's represent the number of stones in each basket with the integer 10. Pairing stones corresponds to the integer being even, and combining the baskets corresponds to adding the two integers. I note that 10 is even since 10 = 2x5, and so I can apply my theorem to conclude that the sum 10+10 is even. Thus I conclude that when you combine the baskets you will still be able to pair each stone with a partner.

Guy 1: Wait, wait, wait! I don't understand this 'even' jargon. Do it again without the jargon.

Mathematician: The definition of 'even' was central to my whole process. Without it I can't even set up the problem, let alone apply the theorem used to justify the answer. I could perhaps just give you an answer, "MATHEMATICS SAYS YES", but then you wouldn't be able to repeat it yourself for different numbers of stones.
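
(For the programmers in the thread, here's the mathematician's reasoning as a toy Python sketch - note that the modelling choice that "pairable" means "even count" is exactly the jargon step Guy 1 objected to:)

  fingers, toes = 10, 10            # stones in each basket

  def pairable(stones):
      return stones % 2 == 0        # "even" = every stone gets a partner

  assert pairable(fingers)          # 10 = 2 x 5
  assert pairable(toes)
  assert pairable(fingers + toes)   # the theorem in action: 20 = 2 x 10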

If the above is understood then I can quickly address the claims you have made.

> it's better to generalize upwards from reality and actual use cases instead of starting utterly removed from reality and trying to apply the generalizations downwards.

Mathematics is generalisation utterly removed from reality. This is why we have "Adding integers" and not "Adding together collections of dogs" and "Adding together collections of apples" and "Adding together collections of hats" and ...

> Sure, it's convenient for mathematicians to be able to use shorthand gibberish to talk to other mathematicians.

Mathematics is the practice of defining new gibberish and then reasoning about that gibberish. The gibberish isn't a shorthand for something, it is the thing.

> It doesn't justify pushing this jargon on other fields.

Mathematics is definitions/gibberish/jargon. Applying mathematics to a field thus means applying definitions/gibberish/jargon to that field.

> When it gets to the point where gibberish becomes the only way to explain mathematical abstractions, you should step back and ask yourself, "where the hell did this go wrong?".

At least since Euclid's formulation of geometry.

>> So by the intuitive explanation we can make this single line any dimension that we want.

> And if this makes sense in the given context - sure, why not?

The problem is that it doesn't. Your explanation of an n-dimensional space is more a description of the larger space in which our space of interest is embedded.

In all instances the space (the line) remains unchanged; the only thing that changes is how we are describing it. For the dimension of the space to be a property of the space, it must not depend on how we choose to describe it.


Thank you! I've had this same feeling for a long time. It strikes me as odd that many of the same people who would find it unacceptable to write cryptic C code with one-letter variable names find typical math notation/jargon completely fine and legible. If code is so important that we go out of our way to make it comprehensible to future maintainers, why don't we have the same feelings about math? (which I would argue is much more important for people to comprehend, seeing as it's the core foundation of all of science)

Personally, I'm really into physics, so I've always really wanted to like math so that I could delve deeper into the subject more easily. It's not that I'm terribly bad at it, but most math texts are so obscenely terse and cryptic that it makes you wonder whether the authors are even trying to teach people what they're talking about...


I was writing it mathematically because I thought you would appreciate seeing how your intuitive grasp of n dimensions is described mathematically.

The terminology (gibberish) I used is nothing more than a convenient but very precise shorthand for communicating abstractions. The reason these particular abstractions are given special names by mathematicians is that they occur over and over again. There is a lot of conceptual leverage to be gained if you are able to see the same patterns in seemingly unrelated problems.

As Poincaré put it, "Mathematics is the art of giving the same name to different things". I'd argue it's worth knowing some of those names.



