Introduction to Linear Algebra for Applied Machine Learning with Python (pabloinsente.github.io)
425 points by Anon84 on Nov 11, 2020 | 52 comments



Hi, I wrote that article :)

https://twitter.com/CodeBug88

Greetings


Is the 'sente' in your GitHub handle the opposite of 'gote', or something else entirely?


Oh yes, I used to play a lot of Go/Baduk :)


Hello


I like that this touches on some of the algebra aspects. I always find it strange when I see something about “linear algebra” that never mentions linear maps, vector spaces, or bases. Perhaps the mechanics of computing with matrices is a better introduction for people who are only interested in machine learning, but the mathematician in me feels it is important to get across that, in some specific sense, the numbers in a matrix are arbitrary and that the essence of the thing is simpler.

I’m a bit surprised that this doesn’t mention suffix notation, especially as multilinear maps are so common in machine learning and suffix notation is such a good way to work with them.
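
As a concrete aside, NumPy's einsum is essentially suffix notation in code; here's a small sketch of my own (not from the article) showing a matrix product and a bilinear map written index-by-index:

    import numpy as np

    A = np.arange(6).reshape(2, 3)    # a 2x3 matrix
    B = np.arange(12).reshape(3, 4)   # a 3x4 matrix

    # Matrix product in suffix notation: C_ik = A_ij B_jk (sum over j)
    C = np.einsum('ij,jk->ik', A, B)

    # A bilinear (multilinear) map: s = x_i M_ij y_j
    x = np.array([1.0, 2.0])
    y = np.array([0.5, -1.0])
    M = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    s = np.einsum('i,ij,j->', x, M, y)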


I just recently finally bought Strang's introduction to Linear Algebra. The book itself is very dense and, dare I say, scary to a newcomer, but his videos are amazingly clear in combination with the book.

I'm only at the beginning, but so far I'd recommend the book. Though the steep price (almost 100 USD over here) makes it a bit difficult to recommend. I might buy Boyd’s and Vandenberghe’s Introduction to Applied Linear Algebra later and write a comparison.


Jim Hefferon, who is also an HN member, has a great free Linear Algebra textbook. People all have their own preferences, but I actually prefer his to Strang's.

https://hefferon.net/linearalgebra/


+1 for Hefferon's book, which was posted here 2 weeks ago [0]. He also published all the solutions to the exercises! [1]

[0] https://news.ycombinator.com/item?id=24892907

[1] http://joshua.smcvt.edu/linearalgebra/jhanswer.pdf


As a heads up, the Boyd book is available free on his website.

http://vmls-book.stanford.edu/


My sense is that Vandenberghe is great for getting your feet wet and building intuition, but it's not as deep as Strang in its coverage and has fewer problems. Definitely still worth reading through, especially since the problems lend themselves well to working through with Julia or Python or similar.

For further reading, try Matrix Analysis and Applied Linear Algebra by Meyer.


I was scrolling through to see if you explain how you typeset the math (notoriously hard on webpages), but I happened to instead find a small mistake. You write: "Elements of $\mathbb{R}^n$ are sets of real numbers." That is an incorrect definition.

- It is inaccurate because n doesn't enter into the definition (and thus all Euclidean spaces would be the same).

- You want ordered n-tuples, not sets. If you actually defined elements of R^n as sets of n real numbers, you would not be able to represent the diagonal (or, if you fix that, not be able to represent the difference between certain points on the diagonal of R^n and certain points on the diagonal of certain subspaces).

Moreover, figure 5 talks about "overlapping vectors". This is highly non-standard, and definitely would need to be defined.

Further, you're setting your readers up for trouble by defining vector arithmetic in terms of bases.


Totally agree with this post... however, to be extremely pedantic (and thus not suitable for the article in question), a tuple in the foundations of mathematics is typically defined as a set. The standard Kuratowski definition is (x,y) := {{x},{x,y}}, i.e. the set containing the singleton {x} and the set {x,y}. That is how one goes from axiomatic set theory to tuples of numbers.


There is a big difference between your type of pedantry and the OP's... I don't think the OP was being pedantic. It is simply not a set of real numbers. He did say that it was a "small mistake".

The first is more an error of understanding (for lack of a better term), whereas talking about tuples as being de facto sets is truly about being pedantic.


Absolutely! In that sense you are right, but a reader who knows how to construct such tuples (i.e. fill in the blanks in the article) in all likelihood also knows the content of the article. And I bet readers who don't already know that will not realize that those are the sets the author means.

After stumbling across the author's twitter where he already complains that people are "being mean on HN", and seeing his responses there, I have some serious doubts about whether he is fit to be teaching people mathematics. I honestly applaud him for writing the article, and don't hold the math mistake against him at all. But the incredible defensiveness when confronted with a small mistake is absurd.


Who starts from axiomatic set theory at all, if their goal is not to be a logician or type theorist? One does not need to know axiomatic set theory to study linear algebra any more than one needs to know about semiconductors in order to program in Python.


To be even more pointlessly pedantic: sure, but that doesn't give you "sets of real numbers", in the sense of sets where every element can meaningfully be interpreted as a real number.


Oh and perhaps more immediately: with this definition of R^n, (x,y) and (y,x) are the same point. No good.
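
Python's own set and tuple types make the point tangible; a tiny illustration of my own (not from the article):

    # As unordered sets, {0, 1} and {1, 0} are the same object...
    print({0, 1} == {1, 0})    # True

    # ...but as ordered pairs, i.e. points of R^2, they differ.
    print((0, 1) == (1, 0))    # False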


Keep in mind those are just notes I made for myself. I decided to put them out there just in case someone found them useful, and to signal my skill... From practicing ML/DS over the last couple of years, I came to realize that what I put there is probably more than what you need to know for applied ML/DS, and conceptual inaccuracies like those have little to no relevance.


Well, wrong is wrong :-)

I mean no offense by saying that, and it's human nature to be wrong.

However, this is not a "conceptual inaccuracy". It's wrong. Straight up, old-fashioned wrong. And don't tell me it's of "little to no relevance" that your definition of R^2 does not distinguish between (0,1) and (1,0).


Great resource. I also want to post this series by Rachel Thomas of Fast.AI.

https://www.youtube.com/playlist?list=PLtmWHNX-gukIc92m1K0P6...

She's a phenomenal teacher, and she has a grounding in the practice of applying it to ML and deep learning.


Does anyone know of a website for these types of open-source curricula? It feels like it would be nice to have a place where people could post them, so you could consult it when you wanted to learn something in the correct order and with the best resources.


They’re much narrower in scope than a full curriculum (or even a course, for that matter), but Better Explained [0] has some very good overviews of math topics. It's a useful supplement I’ve come across repeatedly over the years while searching for topics I found challenging.

On the topic of open source learning, I take every chance I can to heartily recommend fast.ai’s course [1]. It’s a good intro to Deep Learning that leaves you informed enough to build things, and equips you to ask follow-on questions and dive deeper when/where you need to.

[0]: https://betterexplained.com/

[1]: https://course.fast.ai/


Perhaps more relevant for this topic is the computational linear algebra course from fast.ai:

https://www.fast.ai/2017/07/17/num-lin-alg/

It has a lot more detail on stuff like floating-point storage, memory layout, sparse matrices, iterative methods, etc. than most linear algebra courses, but doesn't go much into proofs, geometric interpretations, and other stuff that's less needed for algorithm design and implementation.

(Disclaimer, I'm from fast.ai.)


For AI/ML, I think https://madewithml.com/ does the job. It has curated open source projects, courses, papers and topics.


Something like Khan Academy (https://www.khanacademy.org/)?


Khan Academy is great, but I mean for all resources on a subject. In other words, a place where people can post curations of resources, not actual content. So, say I wanted to learn linear algebra: it would point to Khan Academy for the basics and then to other places as well.



The list of "awesome lists" on GitHub [1] is generally quite helpful (I haven't checked them all myself, obviously).

[1] https://github.com/sindresorhus/awesome


Thanks for this, Pablo. I find this a good intro to LA as well as a tutorial on NumPy! Just one piece of feedback: it would be good to have the TOC handy as we click around, or a way to jump back to the top.


This looks like an excellent resource. Thank you.

The rendering of some of the LaTeX code does seem to be a bit of a challenge; I see some intermingled code/text (on the latest Mozilla on Ubuntu 18). The CPU usage also shoots up.

What scripts are rendering the LaTeX code?


It usually takes a while to render the LaTeX code in my browser. If you scroll down immediately after entering the site you may find it looking weird. I just used MathJax syntax, which supposedly renders on most browsers (my blog is hosted on GitHub, using Jekyll as the site generator). My knowledge of web development is very basic, so I might be doing some things wrong ( ˘︹˘ )


Linear algebra: everything is happy la-dee-dah for x and y. The moment we get into matrices, it goes downhill for me from there. I saw a very cool YouTube video on Tesla Dojo and machine learning. I recommend it.


One of the fun things about linear algebra is that you’re working with lots of simple equations all at once. It’s like moving from single variable calculus to multi-variable calculus. You realize that the additional complexity was there all the time, and you were just studying a special, restricted case before.


Yes, and it's super useful. The first time you find yourself working with several equations and you 'just' put them into matrix form and solve the system, there's an amazing feeling of 'truth' having been hiding in plain sight.
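
For anyone who hasn't had that moment yet, a minimal sketch of what it looks like with NumPy (my own illustration, not from the article):

    import numpy as np

    # The system  2x + y = 5  and  x - 3y = -8  written as A @ v = b
    A = np.array([[2.0,  1.0],
                  [1.0, -3.0]])
    b = np.array([5.0, -8.0])

    v = np.linalg.solve(A, b)   # v == [x, y]
    print(v)                    # [1. 3.]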


Absolutely. It’s easy to forget how useful it is to be able to solve a system of equations like that. It lets you find optimal settings for complex systems with multiple constraints. Fitting a machine learning model is just one application for this... there’s a whole academic discipline called Operations Research that focuses on using optimization to find solutions to practical problems.

Just another example of the huge power in creating accurate digital models of the world.


We touched on a bit of that. When I was a student, we also had an applications module called "Numerical Analysis". We went through a bunch of algorithms (Cholesky, Seidel, Newton-Raphson, Gauss, etc.) and ran them by hand. We were basically "human computers". The exams were also about solving problems using these algorithms, and given the iterative nature of many of them, if you made a tiny mistake it would ripple and ruin everything for you.

As an aside, my background is in optimal control/control theory and instrumentation, and operational calculus, state-space representations, and matrices were a huge part of the "Jiu-Jitsu" we did. We tinkered with RST control, robust systems, and finding optimal systems and sequences (Hamilton-Jacobi-Bellman-Pontryagin), with matrices flying around on our exam papers as we solved these systems. It was a nice abstraction.

Very, very powerful tools that command immense respect for Laplace, Lagrange, and people like them who invented things to solve problems we're facing now.


It's not on the syllabus, but what application does Jordan canonical form have to machine learning?


Probably none, since AFAIK the Jordan normal form is numerically very unstable. It's quite useful in proofs, however (e.g. you can theoretically construct the matrix exponential out of it).
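
A quick way to see that instability, assuming NumPy: perturbing a defective matrix by 1e-12 moves the eigenvalues by about 1e-6 and destroys the Jordan block structure (my own sketch):

    import numpy as np

    # A single 2x2 Jordan block for eigenvalue 1 (a defective matrix)
    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    # Perturb one entry by 1e-12: the eigenvalues become 1 +/- 1e-6,
    # and the matrix now has two 1x1 Jordan blocks instead of one 2x2 block.
    A_eps = A.copy()
    A_eps[1, 0] = 1e-12

    print(np.linalg.eigvals(A))      # [1. 1.]
    print(np.linalg.eigvals(A_eps))  # ~[1.000001, 0.999999]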


Here's a linear algebra course from Columbia University with a bit of a computer science emphasis: https://tonydear.github.io/teaching/coms3251


Are there resources where linear algebra is discussed in the context of deep learning?


Accompanying the book in the sibling comment is Strang's MIT 18.065 course which is exactly what you're after (it specifically covers linear algebra with a focus on data science and ML): https://ocw.mit.edu/courses/mathematics/18-065-matrix-method...

OCW so it's free(!), including the videos and handouts.


And Strang (more than) knows what he's talking about. Unlike OP, who treats minor corrections as personal attacks and gets all defensive. This is part of the reason why it's dangerous to replace education from established institutions with random blogs on the internet.


> This is part of the reason why it's dangerous to replace education from established institutions with random blogs on the internet.

Please don't gatekeep education like this. I've caught teachers from established institutions in mistakes hundreds of times in my life, and they were often just as defensive. And I've found "random blogs" on the internet that explain things far more clearly than the school-recommended book. The only thing that is important is the quality of the resource, not where it came from.

Strang's free videos are great. That's the important thing to share.

(I edited to clarify which part of the parent comment I took issue with).


> Please don't gatekeep education like this. I've caught teachers from established institutions in mistakes hundreds of times in my life, and they were often just as defensive. And I've found "random blogs" on the internet that explain things far more clearly than the school-recommended book.

Absolutely! And I've had medical doctors make lots of mistakes, while random blogs on the Internet have been right. Buuuut, you know.

> The only thing that is important is the quality of the resource, not where it came from.

Right. And the blog author we're discussing is flat out refusing to accept that his material has basic mistakes. That's a gigantic red flag. Again: it's not about being wrong. It's about refusing to accept that he's wrong.


> Buuuut, you know.

No I don't. Could you please explain? If random blogs have helped you catch doctors making mistakes (like they have me) then it would seem to strengthen my point that gatekeeping sources of valuable information based on their origin instead of their quality is a mistaken approach.


What I mean is: even after experiencing that, I'm gonna see a medical doctor for medical problems. I bet you will too, if the problem is serious enough.


We are discussing how and where we learn things. So the fact that I go to a mechanic to fix my car has little to do with gatekeeping about where I should learn about how to fix my car, or learn enough to keep my mechanic honest.


Well, if the blogpost that taught you how to fix your car has an elementary mistake that the author refuses to acknowledge and gets super defensive about, it's sure time to take a step back. Does it mean all blogs are bad? Of course not. But if you have the option of listening to a world-renowned mechanic instead, for free, I'd say you should.


> This is part of the reason why it's dangerous to replace education from established institutions with random blogs on the internet.

> If you have the option of listening to a world-renowned mechanic instead.

With the note that "world-renowned" has almost nothing to do with "education from established institutions", I feel like we've made progress here, so I'll leave it at that.


"Linear Algebra and Learning from Data" by Gilbert Strang [1] has a section on that.

https://math.mit.edu/~gs/learningfromdata/


This is so wonderful. Thank you for putting this up :)


This is sick! I'm saving this for sure.

I also really appreciate the links to other learning materials at the beginning (both free and non-free).



