I feel like I have some kind of mathematics dyslexia. I can understand all the concepts, but I find formal mathematical notation almost completely incomprehensible. Expressions with all kinds of strange symbols, often variables introduced that are either completely undefined or defined in ad hoc locations remote from their usage, integrals and sums with no subscripts / unclear scoping, etc. Clearly other people just read these, and I even know people who will ignore all the text in a paper and just look for the equations. I wish I could master it, because in some areas it is a genuine barrier to me achieving my goals.
Don't feel too bad about it; there are far too many notations for the same concepts, depending entirely on the background of the author.
This becomes especially noticeable in hodgepodge fields like Machine Learning / Deep Learning, where you essentially have scientists from all walks of life and science trying to convey the same ideas, each using their own standard of notation.
Point is - he listed 18 different notations, on the spot, all "belonging" to different fields. Now imagine 18 different scientists, from their respective fields, writing research papers using their preferred notation.
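As one small, familiar illustration (my own example, not the list he gave): the derivative alone already comes in several competing notations, e.g.

  % Four common ways to write "the derivative" (Leibniz, Lagrange, Newton, Euler/operator).
  % Newton's dot is normally reserved for derivatives with respect to time.
  \frac{\mathrm{d}y}{\mathrm{d}x}, \qquad f'(x), \qquad \dot{y}, \qquad D_x f

Each field tends to settle on one of these and then quietly assumes the reader shares the habit.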
Sure, there's going to be a lot of overlap - but there's still room for confusion, even for seasoned engineers and scientists.
(I just used ML and DL as an example because that's where I've noticed this the most, given the nature of those fields.)
> integrals and sums with no subscripts / unclear scoping, etc
That just seems like bad writing. Of the math subjects I've studied (discrete math, linear algebra, real analysis, statistics), only in the last one can it be unclear what's going on: whether you are dealing with a random variable or its realization, and the arguments of random variables are almost always omitted, so they don't look like functions.
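A sketch of what I mean, using the standard convention that capital letters denote random variables and lowercase letters their realizations:

  % X is formally a function on the sample space, X : \Omega \to \mathbb{R},
  % but the argument \omega is almost always dropped, so X stops looking like a function:
  F_X(x) \;=\; P(X \le x) \;=\; P(\{\omega \in \Omega : X(\omega) \le x\})

When a text writes p(x) for everything, you have to infer from context whether x means the random variable or one observed value.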
Does the sum include both terms or only the first? I happen to know what covariance is, but if I didn't, it wouldn't be clear to me. I'd probably guess right, but scroll down a bit to this:
I absolutely do find it infuriating when math is written down vaguely/confusingly/poorly, and ML papers are a great example of this happening way too often.
BUT, for your specific examples (in case that is an actual question):
The first one can be parsed using the fact that y_i in the second term wouldn't make sense if it were outside the summation, since the index i is only defined inside the sum.
The second (and actually, the first one, too) uses the very well established convention that multiplication always takes precedence over addition, in every kind of expression. So the 2 multiplies the second sigma, and that product is then added to the first sigma. This rule should make a lot of cases much clearer.
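To make that concrete (these are stand-in expressions in the same style, not necessarily the exact ones you posted):

  % First kind: the y_i in the subtracted term is only meaningful inside the sum,
  % because the index i is bound by the summation, so the whole expression is summed:
  \frac{1}{n} \sum_{i=1}^{n} x_i y_i - \bar{x}\, y_i
    \;\equiv\; \frac{1}{n} \sum_{i=1}^{n} \left( x_i y_i - \bar{x}\, y_i \right)

  % Second kind: multiplication binds tighter than addition, so the 2 multiplies
  % only the second sum, and that product is then added to the first sum:
  \sum_{i=1}^{n} a_i + 2 \sum_{i<j} b_{ij}
    \;\equiv\; \left( \sum_{i=1}^{n} a_i \right) + 2 \left( \sum_{i<j} b_{ij} \right)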
> Expressions with all kinds of strange symbols, often variables introduced that are either completely undefined or defined in ad hoc locations remote from their usage, integrals and sums with no subscripts / unclear scoping, etc.
This is what mathematical notation looks like if you are lacking some of the fundamental knowledge to read it. I know what that's like because I've been there. Learning mathematics like you would programming doesn't really work - because often you can't just look up something you don't understand (how can you Google a mathematical symbol when you don't even know its name?).
Pretty much the only way to learn mathematics is from the ground up.
> and I even know people who will ignore all the text in a paper and just look for the equations
Mathematical notation is mostly a substitute for words and that's why those equations can be embedded in natural language and "read". Ignoring the words in a mathematical paper and just looking at the "funny symbols" doesn't make any sense. You're only seeing tiny glimpses of the actual reasoning and are probably missing most of it.
A mathematician may choose to write "B contains all elements of A, and A contains the element x.", or they may choose to write "A ⊆ B and x ∈ A". Same thing. Also note how the second example still has an English word in the middle of it. If that "and" were an "or", the sentence would have an entirely different meaning. Can't just ignore that stuff.
Sure, you could replace that "and" with yet another symbol, and in something as simple as that nobody would really care, but in anything more complex the mathematician would just be producing an unreadable mess on purpose.
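For instance, fully symbolized with the logical conjunction:

  % The same statement with the English "and" replaced by the conjunction symbol:
  A \subseteq B \;\land\; x \in A

Fine at this size, but stack a dozen of those into one displayed formula and the prose version starts looking pretty good.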
I understand needing to learn the basics from the ground up, but I always get bogged down by going back too. I have started to wonder if it's worth trying to self-learn mathematics at all, since even when I do make progress it's so hard to remember. If there were a well-categorized Leetcode for math problems, some more might stick, but I feel like that's not really the point, since people always say higher math is about proving or understanding things, not mechanically solving problems.
Yeah, I have a similar thing - I'm trying to read a computational geometry textbook (for a code project) and I end up skipping the equations and trying to glean the meaning from the prose... it's a slow process. There must be a resource out there for decoding unfamiliar equations but I don't know what it is
> There must be a resource out there for decoding unfamiliar equations but I don’t know what it is
Honestly? To me this has always been an indication that I am being over-optimistic/reading ahead of my paygrade. If I can't understand the equations, then I'm not really grasping the fundamental concepts and I'm lying to myself.