The professor who wrote this is Jean Gallier, and I had him for advanced linear algebra at Penn. I am also pretty close to him, insofar as a student can be close to a professor. On a personal note, he is one of the funniest professors I've had, and all math professors are characters.
For the people who are interested in ML, the thing to remember here is that he is a Serious mathematician, and he values rigor and in-depth understanding above all. A lot of his three-star homework problems were basically impossible. He writes books first and foremost so he can understand things better. In math books, there's the book you first read when you don't understand something, and then the book you read when you understand everything. This is the book in the link.
As someone who never took an undergraduate linear algebra course, videos of Strang's lectures got me through a couple of graduate machine learning courses.
"In the following four chapters, the basic algebraic structures (groups, rings, fields, vectorspaces) are reviewed, with a major emphasis on vector spaces. Basic notions of linear algebra such as vector spaces, subspaces, linear combinations, linear independence, [...], dual spaces,hyperplanes, transpose of a linear maps, are reviewed."
If anyone needs to start even earlier than this, I've actually found "3D Math Basics for Graphics and Game Development" to be a good true intro for linear algebra-related stuff. I think this would probably hold even if your primary interest is something other than graphics/game dev. Some of the text in that book's intro is a little cringey with its reliance on kind of juvenile game references, but that sort of writing doesn't continue into the actual text. So just push past that stuff.
I got a copy of it to act as a refresher before diving into Real-Time Collision Detection since it's been quite a long time since formal math for me (as in, high school, because I'm self-taught in CS). I've managed to make up a lot of ground by working hard and finding classes to audit online (Strang's linear alg course on OCW is a good one), but I have found that depressingly few math texts which claim to be "introductory" are actually truly introductory.
This isn't a slight against the linked work, I absolutely love when profs make resources such as this freely available.
"How to Prove It" and "Book of Proof" are also great intros to formal math, if less immediately practical.
> If anyone needs to start even earlier than this, I've actually found "3D Math Basics for Graphics and Game Development" to be a good true intro for linear algebra-related stuff.
Did you mean to write "3D Math Primer for Graphics and Game Development" [1]? If you did, I agree 100%. I got a lot out of this book and was able to put it to good use for several projects.
I would disagree about the gamedev book reference, unless you are referring to the real basics of linear algebra.
The really important concepts for ML are least squares, eigenvalues and eigenvectors, and SVD. Those concepts are not very relevant to game programming.
Well, least squares can be solved with projection, which is relevant for converting between coordinate spaces. But game dev isn't going to give you that intuition.
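For anyone who wants that projection intuition made concrete, here is a minimal NumPy sketch (the data is made up purely for illustration): it solves least squares via the normal equations and checks that the residual is orthogonal to the columns of the design matrix, which is exactly the projection picture.

    import numpy as np

    # Toy design matrix and observations, purely illustrative
    A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 2.0, 2.0])

    # Normal equations: solve A^T A x = A^T b
    x = np.linalg.solve(A.T @ A, A.T @ b)

    # Same answer from the library routine
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
    assert np.allclose(x, x_lstsq)

    # The residual b - Ax is orthogonal to the column space of A
    print(A.T @ (b - A @ x))  # approximately [0, 0]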
I believe the person you're replying to was attempting to help people like me, who haven't had a math lesson since leaving high school (in my case over 16 years ago) and whose level of math is roughly "You want me to multiply something? Let me get my phone."
So while the book in question might not be the best resource, it probably is a better starting point than the linked doc.
I wish someone would put together the "basic math" resource for people like that. I didn't do any maths beyond 16. I was a maths idiot at school (me? the teacher? I dunno) but since then I've worked 20 years in programming and had to learn more and more maths just to make a living. However, my maths is a rickety hodge-podge house, with no real foundations. I can't be a total idiot, because I've done some decent work in distributed systems, even research in CRDTs (which made me learn some math), but still, I'd love a self-study plan that took me from 16 to today with the _real basics_.
Thanks for your comment, it made me feel less alone.
Such a thing exists
https://m.youtube.com/playlist?list=PL5A714C94D40392AB. He uses nonstandard notation, but it doesn't matter. His lectures are like learning Lisp to better understand programming in general, but for math. Combine them with a Sheldon Axler college algebra text or something.
Honestly, this is what Khan Academy is. Start at high school algebra and work your way up the skill tree. There is no shortcut though, so you won't find a single book that will teach you undergraduate level maths in a week.
Not to look a gift horse in the mouth, but I'm always irritated by math books that include practice exercises but no answers in the back of the book to check your work against.
I believe they do this because the real target audience is other professors. The author wants those professors to use their book for their own courses, and a bunch of problems with no answers saves a lot of time. There seem to be very few books written with the autodidact in mind. Sometimes you can find the solutions manual via eBay or torrent though.
“Math Basics” is quite the misnomer—it gives the impression that one would need to study all of the contents of this book to be an effective practitioner in CS or ML. Memorizing every definition and theorem in this book would be neither necessary nor sufficient for that purpose.
Keep in mind it can take an hour, and sometimes way more, to really absorb a single page of a math book like this (do the math). This is more of a reference text.
I think it is a matter of perspective. The book covers stuff you learn in the first two semesters when studying maths. So for someone who studied maths (probably the author) these are the basics.
I learned the spectral theorem with a couple applications in the first semester of my electrical engineering graduate studies. First year graduate material is commonly called basics in my experience. Compared to the state of the art, it is. Compared to what an undergraduate freshman knows, not so much.
Right, it's material I covered in my senior year of undergrad taking HPC grad courses. If it were covered anywhere in the second semester of undergrad, I would be shocked.
Actually, many "advanced" texts are rather short (perhaps because they are more specialized or they do not need to be verbose or "entertaining" like many elementary texts are).
That reminds me of the scene in the first Antonio Banderas Zorro movie. Anthony Hopkins' character asks the wannabe Zorro how to fence, and wannabe Zorro says you stick the pointy end in the other guy. The "basic" stuff is actually everything you have to master to avoid getting killed in your very first duel.
It reminds me of Introduction to Algorithms, a classic book by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest. It is over 1000 pages long. But they call it "Introduction..." :)
To be honest, Cormen et al. is an actual introductory book. While it is humongous, it covers all of the basic math, logic, and algorithms you see in the first 2 or 3 years of a computer science undergraduate course. It is an introduction in the sense that knowing and mastering the tools the book provides will set you up for more advanced topics in the many areas of computer science. For instance, a large number of important algorithms in machine learning, computational geometry, and other topics use the basic strategies of greedy, divide-and-conquer, or dynamic programming algorithms.
> “Math Basics” is quite the misnomer—it gives the impression that one would need to study all of the contents of this book to be an effective practitioner in CS or ML.
To me, "basics" means that even if you study this entire book, you still won't be able to understand ML from it alone; otherwise it would say "comprehensive." Furthermore, the math presented in this book is all taught in 1st-year courses for most CS programs I've encountered.
> Keep in mind it can take an hour, and sometimes way more, to really absorb a single page of a math book like this.
Learning is a personal experience and happens at differing rates for different people. While I do agree this book is rather terse and would serve as a good reference, any added explanations around the proofs would force a split into multiple publications, so I can see why the authors chose to present it the way they did.
Overall I have found this text easy to digest and well formulated and thank the authors and poster.
Which CS program? I think you're exaggerating, to put it mildly. They don't teach topology, FEM, and abstract algebra in "most" CS programs. These are just three random examples; most of the book is wayyy more advanced than what I would expect to learn in freshman CS.
This book seems to be designed as a primer for PhD research students - which might include some talented final year undergrads heading in that direction.
If that's your level, it might reasonably be called "basic."
> Furthermore the math presented in this book are all taught in 1st year courses for most CS programs I’ve encountered.
From what I see this book is much more complete than a first year course, or even the whole curriculum of a classical CS education.
I'm quite familiar with math, but I never encountered wavelet theory, Gauss-Seidel method, Rayleigh-Ritz theorem, and many more. My knowledge about other subjects such as Hermitian spaces, quaternions, finite elements is quite superficial.
I have been writing software for over 20 years, and I very regularly find myself presented with a challenge to learn new math, yet still rewarded mightily even if I do not engage with it.
I think it's a good time to mention a couple of nice books (related)
1. Elementary intro to the math of machine learning [0]. Its style is a bit less austere than that of the OP's. It also has a chapter on probability. It could possibly serve as a great prequel to the book linked in the OP.
2. A book on probability-related topics of general data science: high-dimensional geometry, random walks, Markov chains, random graphs, various related algorithms, etc. [1]
3. Support for people who'd like to read books like the one linked in the OP, but have never seen any kind of higher math before [2]. This book has a cover that screams trashy book extremely skimpy on actual info (anyone who reads a lot of tech books knows what I am talking about), but surprisingly, it contains everything it says it does and in great detail. Not even actual math textbooks (say, Springer) are usually written with this much detail. The author likes to add bullet-point-style elaboration to almost every definition and theorem, which is (almost) never the case with the gazillions of books usually titled "Abstract Algebra", "Real Analysis", "Complex Analysis", etc. Some such books sometimes attach words like "friendly" to their title (say, "Friendly Measure Theory For Idiots") and still do not rise to the occasion. Worse yet, a ton (if not most) of these books are exact clones of each other with different author names attached. The linked book doesn't suffer from any of these problems.
[0] Mathematics For Machine Learning by Deisenroth, Faisal, Ong
[1] Foundations of Data Science by Blum, Hopcroft, and Kannan
[2] Pure Mathematics for Beginners: A Rigorous Introduction to Logic, Set Theory, Abstract Algebra, Number Theory, Real Analysis, Topology, Complex Analysis, and Linear Algebra by Steve Warner
Very first sentence of 2.1 is full of notation, symbols and terms that I, as a prospective student, might not understand.
So many teachers seem incapable of stepping outside their sphere of knowledge and seeing what they know and others do not. And so much work went into this.
Even as someone who did their undergrad in math, it's a bit tough to digest. I'd have to look up a lot of the terms again. It's been five years since I was in college. Shows how little I've used it all since I graduated.
It definitely looks more like, "math basics" for x field. Kinda like "automotive basics" for Honda or Ford vehicles. Where it's presumed that you know a lot of automotive lingo to begin with and you just need to know what spark plugs go with what engine. And not, "what is a spark plug?"
It sounds like you correctly identified a mismatch between audience and material, but I don’t think it’s appropriate to describe the teachers/authors as having made an error. Why would you assume that you are the intended audience of this text and thus that the authors have made some mistake?
That's a valid question, but the title of the post begins with "Math Basics..." The title of the paper is totally different. Maybe my point should have been directed at OP rather than the authors.
As mentioned in the paragraph above it, the chapter is a review, i.e. it's trying to quickly refresh the reader's memory on things they're already supposed to know; it's not trying to teach something new. So the sentence (“The set R of real numbers has two operations +: R × R → R (addition) and ∗: R × R → R (multiplication) satisfying properties that make R into an abelian group under +, and R − {0} = R∗ into an abelian group under ∗”) seems fine for that purpose.
If you look at any of the later chapters that are trying to teach something new, they are much more gentle and motivate the topic of that chapter: see e.g. “24.1 Affine Spaces” on page 759, or “26.1 Why Projective Spaces?” on page 823, etc.
Other chapters that are meant as a review are similarly terse and quick to the point (like Chapter 2), e.g. Chapter 37 “Topology” on page 1287.
I think it's good when books make conscious choices about what they're teaching versus assuming as a prerequisite (and communicate it to the reader, by using terms like “reviewed” — presumably the yet-to-be-written Introduction chapter will also mention this more explicitly).
I purchased this. I've been trying to brush up on CS fundamentals (it's been a long time since college), but I get stuck just on trying to understand what I'm being asked to learn.
Here's what he is doing. He wants to start with the set of real numbers, intuitively the points on the line, usually denoted by R, maybe typed in some special font.
Then he wants to define, say, addition of real numbers. So, given two real numbers, x and y, that might be equal, he wants to define x + y.
So, here he wants to regard addition, that is, +, as an operation. Then, as is usual for defining operations, he wants an operation to be just a special case of a function. So, he wants to call + a function. So, + will be a function of two variables, say, x and y. With usual function notation we will have
+(x,y) = x + y
The set of all (x,y) is the domain of the function, and the set of all x + y is the range.
So, that defines the function + except commonly in pure math we want to be explicit about the range and domain of the function.
For the function +, the domain is just the set of all pairs (x,y) with x and y in R. That set is also the set-theory Cartesian product of the set R with itself, written R x R. So, the domain of + is R x R. The range is just R. Then to be explicit about the domain and range of the function +, we can write
+: R x R --> R
which says that + is a function with domain R x R and range R.
We learned how to add in, what, kindergarten? So, why make this so complicated?
Well, he wants to regard the real numbers as just one example of lots of different algebraic systems, e.g., groups, fields, vector spaces, and much more, with lots of operations and, possibly, more that could be defined. E.g., later in his book he will want to add vectors and matrices, take an inner product of two vectors, and multiply two matrices.
So, back to addition on the real numbers, he wants to regard that as just a special case of an operation on an algebraic system.
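As a throwaway illustration of this operation-as-function view (my own sketch in Python, not anything from the book):

    # Addition regarded as a function  +: R x R -> R.
    # The domain is the set of pairs (x, y); the range is R.
    def plus(x: float, y: float) -> float:
        return x + y

    # Any binary operation on a set has the same shape; e.g.,
    # multiplication is also a map  *: R x R -> R.
    def times(x: float, y: float) -> float:
        return x * y

    print(plus(2.0, 3.0))   # 5.0
    print(times(2.0, 3.0))  # 6.0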
IMHO there's not much benefit in making adding two real numbers look so complicated.
Whatever he did in that chapter for defining addition on the reals, soon he is discussing matrix multiplication with no definition at all -- assuming the reader already understands something that is defined and discussed only many pages later in his book.
So, in his notation
+: R x R --> R
and matrix multiplication, he is using material before he has defined it, indeed before he has motivated, explained, exemplified, or indicated the value of it. In good math writing, and in good technical writing more generally, that practice is, in non-technical language, a bummer.
But from the table of contents, it appears that the book has quite a long list of possibly interesting narrow topics. And maybe for the routine material, his proofs and presentation are good -- maybe. I thought enough of the book to keep a copy of the PDF. It's there; if someday I want a discussion of some narrow topic, maybe I'll try his book!
In mathematical writing, it used to be common for the word processing to be much more work than the mathematics! Now with TeX and LaTeX, and I'm assuming that the book used one of these two, the flood gates are open!
And you've summed up almost perfectly everything I hate about how math is taught, how math is discussed, and how ideas about math are communicated, when it really should be one of the most beautiful, insightful, and rewarding subjects of study in the known universe.
Most of the best math is not trivial and, thus, usually takes some effort to understand.
There are some good authors of math for, say, calculus, linear algebra, differential equations, advanced calculus, advanced calculus mostly for applications, real analysis, optimization, probability, some topics in stochastic processes, introductory statistics, various more advanced topics in statistics.
Mostly the books are short on motivation and applications, and as a result it is too easy to spend time on material likely not worth the time unless you can be sure both to live forever and remember forever.
For calculus I liked Johnson and Kiokemeister. I taught from Protter and Morrey, and it was easier than J&K. Lots of people liked Thomas.
For linear algebra, I liked E. Nering and, then, P. Halmos, Finite Dimensional Vector Spaces which really is baby Hilbert space theory. Take Nering seriously -- he was a student of E. Artin at Princeton. His treatment of linear algebra is balanced and polished. For one of his editions, he has some group representation theory in the back, good, and some linear programming, really bad.
A lot of people like the MIT Strang book.
For advanced calculus to help when studying physics, especially electricity and magnetism and engineering, I very much liked
Tom M. Apostol, 'Mathematical Analysis: A Modern Approach to Advanced Calculus', Addison-Wesley, Reading, Massachusetts, 1957.
He has more recent versions, but for physics and engineering I like the 1957 version and don't like the later versions at all.
For ordinary differential equations, I liked
Earl A. Coddington, 'An Introduction to Ordinary Differential Equations', Prentice-Hall, Englewood Cliffs, NJ, 1961.
He makes variation of parameters look really nice -- then you can understand the remark in the old movie The Day the Earth Stood Still. Ordinary differential equations is a huge, old field, and there is some question about how much of it deserves study now. Do notice that for systems of ordinary differential equations, you get to apply some linear algebra in cute ways.
For advanced calculus for applications, there is the old MIT Hildebrand -- he knows what he is talking about, is easy enough to read, and a good place to go if you need one of his topics.
In recent decades, the pure math departments wanted to teach advanced calculus as the theorems and proofs for freshman calculus. So there is Rudin, Principles of Mathematical Analysis, third edition (not the first two, maybe a later edition if there is one). Here's what is going on: he wants to develop the Riemann integral, which is the one in freshman calculus. For that he wants to integrate over a closed interval on the real line, that is, some [a,b], which for real numbers a <= b is the set of all real numbers x such that a <= x <= b. Rudin will argue that [a,b] is a compact subset of the reals, and continuous functions on compact sets turn out to be uniformly continuous. So the first chapters are big on compact sets. Then he talks about functions that are continuous and then ones that are uniformly continuous. With uniform continuity, the Riemann integral follows right away.

Later he does some infinite sequences and series and then uses these for careful treatments of some important results: the exponential function, the number e, the sine and cosine, etc. Later he does integration of functions of several variables on manifolds for Stokes' theorem and the related divergence theorem, a fully careful treatment of theorems used in E&M. He does the Cartan exterior algebra. What is going on is that he wants to integrate a function g: M --> R where M is a manifold, that is, the range of some function f from some box, triangle, etc. into the space containing M. For this you need the formula for change of variables when integrating in several variables, and that involves the determinant of a square matrix. This integration is a multidimensional version of the line integral, where the direction of integration is important; the exterior algebra is the multi-dimensional version of that. You can see this again in some treatments of general relativity in physics.
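To make the compactness and uniform continuity story slightly less abstract, here's a tiny numeric illustration (mine, not Rudin's): midpoint Riemann sums on the compact interval [0,1] converging to the integral of x^2, whose exact value is 1/3.

    # Midpoint Riemann sums for f(x) = x^2 on [0, 1]; exact answer is 1/3
    def riemann_sum(f, a, b, n):
        dx = (b - a) / n
        return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

    for n in (10, 100, 1000):
        print(n, riemann_sum(lambda x: x * x, 0.0, 1.0, n))
    # The sums approach 1/3 as n grows, as the theory guarantees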
I like Rudin's third edition: once you know what the heck he is driving at and how he is getting there, say, as above, then his high precision is welcome.
For statistics, I suggest using some popular elementary book as a start. Then learn probability really well and from then on study particular topics in statistics as needed. The current directions in machine learning promise to make lots of particular topics important.
For a first book on statistics, consider
George W. Snedecor and William G. Cochran, 'Statistical Methods, Sixth Edition', ISBN 0-8138-1560-6, The Iowa State University Press, Ames, Iowa, 1971.
My wife did really well with that. So, get a good start on statistics and, then, get to learn some analysis of variance (experimental design), an underrated topic.
For a second book on statistics, consider
Alexander M. Mood, Franklin A. Graybill, and Duane C. Boes, 'Introduction to the Theory of Statistics, Third Edition', McGraw-Hill, New York, 1974.
Here, go quickly and get only the high points and don't expect the math to be very good -- in places it's pretty bad.
For regression analysis and linear multivariate statistics more generally, there are several books: Maurice M. Tatsuoka; Donald F. Morrison; William W. Cooley and Paul R. Lohnes; N. R. Draper and H. Smith. So, in particular, get enough to understand that regression is a perpendicular projection and, thus, get the Pythagorean theorem again.
For more on such statistics aimed at machine learning, get the Breiman CART -- Classification and Regression Trees, maybe much of the start of ML.
With that much in statistics, you will have seen a lot of applied probability and may be ready for the real stuff. For that, you need measure theory, e.g., the first half of Rudin, Real and Complex Analysis, or Royden, Real Analysis. Then read Breiman, Probability. After that you might read some of Chung, Loeve, Neveu, and maybe some more. Then return to applications, including statistics, with a really solid foundation in probability, random variables, the classic limit results, and much more. Then you can read and/or write lots of advanced topics in statistics.
For optimization, a similar review is possible, but it's getting late.
I love that they have problems you can solve as well at the end of (almost) every chapter.
This IS a lot of math (1,962 pages), and it's missing a preface/introduction, which would have been helpful for understanding whether I need to read it linearly or whether a la carte is okay. At the moment I'd assume each major section is independent.
I'm wondering - is this a genuine request, or a snarky, implicit reference to an online resource for learning math somewhere?
I'd love to know about the existing resource, if it exists.
(The only thing that comes to mind is Wolfram Alpha, which didn't seem 'systematic' the last time I skimmed the main page)
Whoah, this covers a lot. I was expecting some linear algebra, calculus, and discrete math, but there's actually some stuff in there I don't know after doing a masters in math.
That makes me feel somewhat better - I saw the title of 'Maths Basics' and thought 'Great!'... then I saw it's 1,962 pages - if that's the basics, how much is the intermediate and advanced bit?
Thought the same. 2000 pages - Basics was probably an understatement...
Just looked at a few pages and it seems really illustrative. I am just a lightweight mathematician as a computer scientist, but I really would have liked such a comprehensive script for studying. I hate it when profs reduce everything to minimal definitions and expect students to make sense of it. There are countless books, but it is always a gamble whether they focus on the topic at hand and don't suffer from the same problems.
This even gives you "motivational examples" which are extremely helpful for comprehension in my opinion.
I love Professor Gallier! He's an incredible person.
That being said, this is faaaaaar beyond basics. It'd be more appropriate to call this an incomplete (aiming to be comprehensive) guide to almost everything you need to know in computer science (related to math).
That's almost 2,000 pages of math...I don't know why and how, but somehow I forgot most of the Statistics knowledge I obtained as a graduate student (in Stat) 10 years ago.
I remember that I took an advanced course on Bayesian inference, and one course on multivariate statistics (PCA, factor analysis, these kinds of things), and my project was about Bernstein polynomials. That's it...
You forget complex things that you don't use regularly. Math, spoken languages, written languages, coding...of course you can re-learn it, and re-learning is faster than learning it for the first time.
Based on speaking to my managers in the past, it seems like a year-long lapse is enough for you to lose an incredible amount of retained knowledge/skill. But it's not a permanent loss.
Yeah agreed, sometimes reading a research paper from the DS team would actually ring a bell somewhere and I know where to look. I'm re-learning statistics from the bottom up at the moment lol, but this 2,000-page book really looks daunting. I'm pretty sure I didn't take any advanced optimization course back in university.
In basic calculus one can burn countless hours memorizing mechanical rules to differentiate and integrate different function forms, or one can just plug the function into something like Wolfram Alpha and get, for a lot of useful cases, a symbolic answer, or at least some approximate answer for a point or interval.
The point is, understanding integrals and derivatives doesn't require one to memorize all the mechanical rules. Using software to compute those functions can be a huge time saver. No one should go with pen and paper double-checking whether that polynomial integral is correct or not!
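For instance, with SymPy standing in for Wolfram Alpha, the symbolic answers are one call away (a trivial sketch, just to show the idea):

    import sympy as sp

    x = sp.symbols('x')
    # Symbolic integration and differentiation, no memorized rules needed
    print(sp.integrate(3 * x**2 + 2 * x + 1, x))  # x**3 + x**2 + x
    print(sp.diff(sp.sin(x) * sp.exp(x), x))      # exp(x)*sin(x) + exp(x)*cos(x)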
With a book almost 2000 pages long, I wonder if this book leans more heavily on the mechanical-rules side of math. In my mind, it is the difference between writing a book such that you could write your own Wolfram Alpha, and writing a book so you can just use it.
You don't need to memorize rules when studying math. Just like you don't need to spend any time to memorize syntax for programming languages. You automatically remember things you use a lot.
Once you have spent countless hours doing exercises to the extent that you understand the math, you already remember the rules. If you have not spent countless hours doing exercises, you don't understand anything at this level.
You don't hire a programmer who has read all the books and 'understands' programming but has never programmed. It's the same with math. You don't just read a math book from start to finish. You can use wolfram alpha for visualizing functions, not for learning math.
Programmers have spent countless hours practising programming to the point where they have forgotten how difficult it was in the beginning. A non programmer might think of programming as "memorizing hundreds of rules" to get anything done, but one doesn't learn programming by sitting around explicitly memorizing hundreds of rules and then begin to program.
Actually writing programs with a minimal set of 'rules' memorized and then adding more as needed is how one typically learns programming.
I've been teaching programming for five or six years now. I always start with HTML, then add CSS, and then add JavaScript. That way they experience mastery all the way, and see how they can be creative with the code. It's so great to see a pupil "get it" -- and sometimes even pupils that "suck at math" or even pupils who have problems spelling the most basic sentence correctly. In fact I've found that there's a strange correlation where pupils who have dyslexia often seem to be better than others at programming.
I wrote something similar to what you are saying. For instance, I said: "understanding integrals and derivatives doesn't require one to memorize all the mechanical rules".
I'm not against memorization though. Memory is very useful when studying math or programming or any other subject. You don't want to have to "reason your way through" every time; shortcuts are very important! I think of this like brain-memoization. Without it, it would be very inefficient to make progress. A lot has been said about this relationship [1]. Also, I think this is how some breakthroughs happen, "connecting the dots", so to speak.
Maybe when you say: "...spending any time memorizing syntax...", you are thinking flashcards or something like that? Sure, you don't need flashcards, anything you do often enough is gonna be easier to remember.
My comment was more in line with the fact that, with 2000 pages, maybe the author elaborates a lot on things that are very mechanical in nature and require a few pages to describe (and are very inefficient for humans to compute? Just use a computer! :-). Say, Gaussian elimination; couldn't one be told: this is a matrix, this is a determinant, this is the relationship between them, this is what it means to invert the matrix, etc., and skip the full description of Gaussian elimination? (Put it in an appendix? In a second book? Fewer pages!) I don't think it is super helpful to, say, spend a lot of time inverting matrices with pen and paper in order to get proficient in linear algebra.
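To make the "just use a computer" point concrete, a trivial NumPy sketch (mine, not from the book) that solves a linear system without any hand row-reduction:

    import numpy as np

    # Solve A x = b without ever doing Gaussian elimination by hand
    A = np.array([[2.0, 1.0], [1.0, 3.0]])
    b = np.array([3.0, 5.0])

    x = np.linalg.solve(A, b)  # LU factorization under the hood
    print(x)                   # [0.8 1.4]
    print(np.linalg.det(A))    # ~5.0, nonzero, so A is invertible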
I'm coaching my son through high school math. One of the things I'm trying to impress upon him is focusing on understanding _why_ these formulas work rather than just memorizing the formulas themselves - if you understand why they work, you can always re-derive them if you need to, and you may forget the details of what they do, but you'll never forget the details of why they work once you understand them.
The best strategy for this, in my opinion, is to always ask "what's the picture" (as in visualization)? I can't remember the mechanical rules for, say, Newton's method, but I have the picture of what Newton's method does in my head. From that, I can quickly work out the algorithm.
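And indeed the picture is enough: follow the tangent line at the current guess down to the x-axis, then repeat. A bare-bones Python sketch of that (illustrative only):

    # Newton's method from the picture: step to where the tangent
    # line at the current guess crosses the x-axis, then repeat.
    def newton(f, f_prime, x0, steps=10):
        x = x0
        for _ in range(steps):
            x = x - f(x) / f_prime(x)  # x-intercept of the tangent line
        return x

    # Square root of 2 as the positive root of x^2 - 2
    print(newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0))  # ~1.41421356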
> No one should go with pen and paper double-checking whether that polynomial integral is correct or not!
Hm - maybe I'm fortunate that I studied calculus before there were (accessible) software packages that could just do this stuff for you, because back then, the only way to solve these was to do them on paper. I'm sure I would have been tempted to just "skip ahead" to letting the computer do it for me, but I definitely learned a lot more going through all of the steps myself than I would have if I had just gotten a high-level understanding of what was going on and plugged the rest into a computer. Because, honestly, integrating polynomials is really, really easy - if you know how to do it, you can do it on paper faster than you can load up wolfram-alpha, type it in, and wait for an answer.
This one does not. It's proof-heavy, and there isn't really a mechanical way to do proofs that's efficient. The understanding comes first, and often all in bunches when "the light turns on"; then the proof follows.
I suspect there are better resources for each topic covered (e.g Gilbert Strang books and OCW lectures for Linear Algebra), but it is definitely interesting to peruse and get a sense of relevant topics.
Nice to see wavelets in here - but it's a shame that he seems to be encouraging people to actually use Haar wavelets.
They're fine for teaching - but there are usually better choices in real life. Daubechies wavelets are a good default.
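For anyone who wants to try that advice, here is a minimal sketch assuming the PyWavelets (pywt) package: one level of the discrete wavelet transform on a toy signal, with Haar versus Daubechies-4.

    import pywt

    signal = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]  # toy signal

    # One level of the DWT: approximation and detail coefficients
    cA_haar, cD_haar = pywt.dwt(signal, 'haar')
    cA_db4, cD_db4 = pywt.dwt(signal, 'db4')

    print(cD_haar)  # Haar: simple, blocky basis
    print(cD_db4)   # Daubechies: smoother, usually better on real signals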
The writing style of this book, i.e., rigorous math notation and proposition/proof presentation, is going to put off the great majority of potential CS and ML readers. At almost 2000 pages, it sure makes a great doorstop though.
Basics should be concepts that get you to 80% and tell you where to look for the remaining 20%. This book tries to get you directly to 95% and is best treated as a reference book.
The vast vast vast vast vast majority of this book (which is more of a reference and encyclopedia than an actual book for learning) is not required for implementing most ML libraries.
Okay, so this looks potentially awesome. But given that it is a reference work rather than some introductory "basic" little quick read-through, I'd prefer to have it in paper form.
This is an incredible reference for a machine learning researcher who wants to fill in some gaps in their existing mathematical knowledge.
But I would be shocked if this would be of any use for someone trying to learn a little linear algebra in order to play with neural networks. For that I think you still want Strang.
I think "foundations" might have been a better word than "basics" here. "Basics" in any case is not in the printed title, only in the filename.
Second this. If it has exercises, even better. About to undertake my MSc Comp Sci and would like to learn Discrete Math as I'm coming from a non-math background (did B.IT at school and only took two math courses).
No, you can be an ML practitioner with just an intuitive understanding of how, say, gradient descent works, and you would do fine. You can even pick up that intuitive understanding on a strictly need-to-know basis, when it's needed for learning an ML technique. That's what fast.ai teaches.
For being more than a practitioner, like an implementer of new ML libraries or a researcher, of course you'd need to know more.
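That intuitive understanding fits in a few lines, for what it's worth. A bare-bones gradient descent sketch (illustrative only, minimizing f(w) = (w - 3)^2):

    # Repeatedly step downhill along the negative gradient
    def grad(w):
        return 2 * (w - 3)  # derivative of (w - 3)^2

    w, lr = 0.0, 0.1        # initial guess and learning rate
    for _ in range(100):
        w -= lr * grad(w)
    print(w)                # ~3.0, the minimizer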
No, but there are some good fundamentals, e.g., the optimization bits for dealing with SVMs, kernel methods, etc. All parts that college ML courses cover in depth.
For linear algebra, this: https://www.amazon.com/Introduction-Linear-Algebra-Gilbert-S...