Seriously, I gained a newfound appreciation for linear algebra after going through these lectures. You should go over some of these lectures even if you already know linear algebra - it might give you insights you never had before (it did, for me). An absolute must-watch if you are into machine learning or related areas.
I'm a freshman in university right now, how is linear algebra helpful in computer science? I'm finding it hard to stay motivated as I can't think of any uses outside of graphics. Maybe I'm just not far along in the course though.
Linear algebra comes up all the time in advanced courses. Certainly in graphics, but also if you do any machine learning, optimization, probabilistic algorithms, or any sort of scientific computation. I never took a linear algebra course, convincing myself that I could just pick it up as I went along, and getting into grad-level computer science courses was ultimately a rather painful experience because of this. If I were you, I'd take it seriously and really try to develop an intuitive feel for linear algebra, because (depending on what courses you want to take) it can really save you time and headaches in the future.
Well, if I remember my university's beginner linear algebra course, there were many topics on the syllabus only due to historic accident, ancestor worship, and theoretical necessities: I remember parallelepipeds, Cramer's rule, solving eigensystems by solving for a polynomial's zeros... Let me tell you how many times I've used parallelepipeds, Cramer's rule, or found eigenvalues via the quadratic formula in the 12 years since linear algebra (and a career in statistical signal processing and machine learning): zero, zero, and zero. Most things in college are important only to the extent that they're gatekeepers for what's really important. What hiddencost says is true, but very little of what you're learning in class is relevant to those important and interesting things. Sorry: college sucks.
It's not that eigenvalues per se are useless--they're plainly not--but that no one finds eigenvalues in practice by computing the characteristic polynomial and solving for its roots. Unfortunately, that computation is often found in HW and exams in US undergraduate linear algebra courses.
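A quick sketch of the point in NumPy (my own example, not from the thread): library eigensolvers use iterative factorization methods under the hood rather than the characteristic polynomial, though for a tiny matrix the two routes of course agree.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# What production code does: an iterative eigensolver,
# never the characteristic polynomial.
vals = np.sort(np.linalg.eigvals(A).real)

# The textbook route, for comparison: det(A - t*I) = t^2 - 4t + 3,
# whose roots are 1 and 3.
poly_roots = np.sort(np.roots([1.0, -4.0, 3.0]).real)
```

Both give eigenvalues 1 and 3 here, but the polynomial route becomes numerically hopeless long before the matrix gets large.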
This is especially tragic because matrix factorization algorithms are so deep and interesting, theory and programming-wise! LU, Cholesky, QR, eigendecomposition, SVD, mmmm. Round-off error tolerance, convergence criteria, stability, yum.
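For the curious, most of these factorizations are one call away in NumPy (a sketch of mine, not from the comment; LU lives in SciPy rather than NumPy, so it's omitted here):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])  # symmetric positive definite

L = np.linalg.cholesky(A)    # A == L @ L.T, L lower triangular
Q, R = np.linalg.qr(A)       # A == Q @ R, Q orthogonal
U, s, Vt = np.linalg.svd(A)  # A == U @ diag(s) @ Vt
w, V = np.linalg.eigh(A)     # A == V @ diag(w) @ V.T (symmetric eigendecomposition)
```

Each one-liner hides exactly the round-off, convergence, and stability machinery the comment is praising.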
What is interesting is obviously personal, and I know you don't mean to denigrate some topics in general, but I want to caution people that characteristic equations and root finding are complex and far from useless. Numerical computation of the eigendecomposition starts from finding the roots of the characteristic equation (in altered form), i.e. the eigenvalues, without which there is no computation of the SVD. How the behavior of roots changes as coefficients vary is fundamental in control engineering. Newton's method holds up half of numerical optimization. Undergraduates don't have to learn the details only because others have worked them out and implemented them in software.
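Since Newton's method came up: here is a minimal sketch of it for root finding (my own toy helper, `newton`, not anything from the thread), applied to computing sqrt(2).

```python
# Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n).
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# sqrt(2) as the positive root of f(x) = x^2 - 2
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

The same iteration, applied to the gradient of an objective, is the backbone of much of numerical optimization.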
By the way, Cramer's rule is useless for numerical computation, but it is immensely useful in theoretical work. It belongs to the vast body of work dealing with determinants before the rise of linear algebra. The determinant is the only obvious connection to algebra left in an undergraduate linear algebra course, so I can understand why people are turned off by it.
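For reference, here is what Cramer's rule looks like on a 2x2 system (a toy sketch of mine): each unknown is a ratio of determinants, which is fine at this size but scales terribly and is numerically poor compared to a solver.

```python
import numpy as np

# Solve A x = b by Cramer's rule: x_i = det(A_i) / det(A),
# where A_i is A with column i replaced by b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

det_A = np.linalg.det(A)
x_cramer = np.array([
    np.linalg.det(np.column_stack([b, A[:, 1]])) / det_A,
    np.linalg.det(np.column_stack([A[:, 0], b])) / det_A,
])

x_solve = np.linalg.solve(A, b)  # what you'd actually use numerically
```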
Not to mention that when your matrix is 5x5 or larger, there isn't even a general closed-form solution for the roots if for some reason you're still insisting on the matrix->polynomial->eigenvalues route.
Sure there are (in many reasonable senses). Polynomial root extraction just isn't expressible in terms of addition, subtraction, multiplication, division, and nth roots alone. But that's ok; there's nothing magic about that particular set of operations, so as to make it the end-all, be-all.
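There's a nice irony here worth making concrete (my own sketch, not from the thread): NumPy's `np.roots` finds polynomial roots by building the polynomial's companion matrix and computing its eigenvalues, i.e. it runs the matrix->polynomial->eigenvalues route in reverse.

```python
import numpy as np

# x^5 - 32: degree 5, so no general formula in radicals (Abel-Ruffini),
# yet numerical root finding is routine. np.roots does it by taking
# eigenvalues of the companion matrix.
coeffs = [1, 0, 0, 0, 0, -32]
roots = np.roots(coeffs)

# The single real root is x = 2; the other four are complex.
real_root = roots[np.isclose(roots.imag, 0)].real[0]
```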
People also don't only solve for eigenvalues computationally. Knowing about all the perspectives of eigenvalues helps. I agree the computations are dumb, but if you propose an intro linear algebra class with no computations to soak up test scores you will get far more protests.
Seconded. Same for discrete mathematics and differential equations. Interesting to learn about, but pretty much worthless as soon as you set foot off of campus. I'd love to see comments from anyone who has practically used any of the information from those classes as a part of their daily duties as a programmer of any kind.
I watched Gilbert Strang's video lectures on linear algebra (the MIT freshman course) for preparation for my PhD qualifier exam, and as a third year grad student, I could appreciate the relevance of almost every single topic in the class. That is, seven years after freshman linear algebra and with countless applications programmed, papers read and implemented, and theoretical/applied classes taken, it "all made sense" (don't ask me what a freshman is supposed to make of that material, other than to acquire it at a very abstract superficial level).
The early classes are the prerequisites for anything and everything you might wind up doing with math. Including becoming a math prof, or a web dev, or dropping out. Nobody tells you, for every section of every textbook you have to read, what its myriad applications might be, and you can't get a customized build of just the topics you want.
But we're all startup people here right? Can this shortcoming be fixed? Can we make a detailed dependency graph of topics in applied mathematics, which could potentially be used to generate custom learning builds?
I use it every single day. In the past I did all kinds of simulations for avionics and flight. I've used it for epidemiology studies, signal processing, augmented reality, machine vision for factories, and nowadays I'm using it in computer vision. I couldn't get anywhere without differential equations, linear algebra, and so on.
Discrete mathematics includes graph theory, discrete statistics, topology, operations research, and so on.
College isn't meant to be votech. It's meant to expand your horizons. How would you go off and build a robot, write code for the NIH, work for a VR firm, write code for oil&gas exploration, program a drone if you didn't know this math? My only regret is that I didn't take more math.
I guess it is all taste, but I want the ability to just go and do what I want, and frankly, this sort of work is deeply interesting because it requires you to solve interesting problems. By that I mean that learning the API to Qt or Unity or something is not deeply interesting - it is difficult only to the extent that it is opaque or poorly documented. Once you learn the pattern to put something together in those frameworks, the work becomes quite pedestrian (can you put a button here that ... yes, I can do that, yawn).
Sure. I work in ML, in industry. Singular value decomposition and related methods are huge, as is understanding basis vectors. Most of the notation I use every day comes from linear algebra, along with the intuitive understanding of it and the ability to read papers that rely on it. A lot of ML relies on understanding data as points in high-dimensional space.
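As a concrete taste of why SVD is huge in ML (my own sketch, not the commenter's code): truncating the SVD to the top k singular values gives the best rank-k approximation of a data matrix, which is the core of PCA, latent semantic analysis, and many recommender baselines.

```python
import numpy as np

# Build a 50x20 data matrix of exact rank 3.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 20))

U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep only the top k = 3 singular triplets: a rank-3 reconstruction.
k = 3
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]
```

Since X truly has rank 3 here, the rank-3 reconstruction recovers it exactly; on noisy real data, truncation is what separates signal from noise.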
A lot of the stuff you're mentioning is required for what I'd consider the really interesting topics in CS, stuff like ML, operations research, scientific computing.
I've used differential equations building a physics-based optimization system for an industrial process. Symbolically solving parts of the system really increased its accuracy and stability.
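The original system isn't shown, but the general point can be sketched on a toy equation (entirely my own example): dy/dt = -k*y has the closed form y(t) = y0*exp(-k*t), and using the exact solution where you can beats stepping everything numerically.

```python
import math

k, y0, t_end, steps = 2.0, 1.0, 1.0, 100

# Crude explicit Euler integration of dy/dt = -k*y
y = y0
dt = t_end / steps
for _ in range(steps):
    y += dt * (-k * y)

# The symbolically solved part: exact to machine precision
y_exact = y0 * math.exp(-k * t_end)
euler_error = abs(y - y_exact)
```

Even on this trivially stable equation, Euler carries a visible discretization error; in a stiff industrial system that error compounds, which is exactly where solving parts symbolically pays off.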
A lot of people don't appreciate the utility of Linear Algebra because they fail to see how the core theorems generalize beyond Cartesian coordinates. Sure, the basic operators of a vector space such as "+" and "*" look boringly familiar, but these operations can be overloaded to carry out other calculations just like in computer programming. The fact that you can represent any "vector space" with a basis set and that every possible basis set for that vector space will have the same dimensionality is pretty cool and useful.
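To make the overloading analogy concrete (a toy class of my own, `Poly3`, not from the comment): polynomials of degree < 3 form a vector space with basis {1, x, x^2}, and "+" and scalar "*" are literally overloaded operators on coordinate triples in that basis.

```python
class Poly3:
    """A polynomial c0 + c1*x + c2*x^2, i.e. a vector in span{1, x, x^2}."""

    def __init__(self, c0, c1, c2):
        self.coeffs = (c0, c1, c2)  # coordinates in the basis {1, x, x^2}

    def __add__(self, other):
        # Vector addition: add coordinates componentwise.
        return Poly3(*(a + b for a, b in zip(self.coeffs, other.coeffs)))

    def __rmul__(self, scalar):
        # Scalar multiplication: scale every coordinate.
        return Poly3(*(scalar * a for a in self.coeffs))

    def __call__(self, x):
        c0, c1, c2 = self.coeffs
        return c0 + c1 * x + c2 * x * x

p = Poly3(1, 0, 1)      # 1 + x^2
q = 2 * Poly3(0, 1, 0)  # 2x
r = p + q               # 1 + 2x + x^2 = (1 + x)^2
```

Nothing here mentions arrows or Cartesian coordinates, yet every vector-space theorem (basis, dimension, linear maps) applies unchanged.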
I write a blog on math and programming and I see linear algebra applied every day.
1. Ranking in search engines (more generally, any kind of random walk analysis) [1]
2. Fourier analysis, and as a consequence most signal processing involves some understanding of linear algebra because integrals are linear. [2]
3. Regression [3] and more generally linear modeling of anything.
4. Facial recognition [4]
5. Community detection [5], where most leading methods analyze the spectrum of a graph to find communities. In fact, applied network science in general has a ton of linear algebra.
6. Greedy algorithms are characterized by a kind of generalization of linear systems [6]
7. Linear programming, perhaps the most applied piece of mathematics ever, needs a strong foundation of linear algebra [7]
8. All of quantum computing is literally just linear algebra [8].
9. Cryptography has a ton of linear algebra in it, and a large portion of the techniques are reasoned about with linear algebra.
10. Most of calculus relies on linear algebra, most importantly optimization [9]
11. Recent data analysis techniques based on topology do so through linear algebra [10]
12. Coding theory, including the algorithms used to correct errors on DVDs. Basically, any time you want to encode data so that you can recover from white noise, you're going to use a linear code. [11] This includes compression techniques.
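Item 1 above can be sketched in a few lines (a hypothetical 3-page toy web of my own, not from the post): PageRank-style ranking is the stationary distribution of a damped random walk, i.e. the dominant eigenvector of a stochastic matrix, found by power iteration.

```python
import numpy as np

# Toy adjacency: entry [i, j] = 1 if page i links to page j.
links = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [1, 1, 0]], dtype=float)
P = links / links.sum(axis=1, keepdims=True)  # row-stochastic transition matrix

d, n = 0.85, 3
G = d * P.T + (1 - d) / n  # damped, column-stochastic "Google matrix"

# Power iteration: repeatedly apply G to converge on its dominant eigenvector.
r = np.full(n, 1.0 / n)
for _ in range(100):
    r = G @ r
r /= r.sum()  # normalize to a probability distribution
```

Page 0, with the most inbound links, ends up ranked highest; the whole algorithm is one eigenvector computation.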
I absolutely enjoyed learning Linear Algebra from these beautiful lectures by Prof. Gilbert Strang (MIT): https://www.youtube.com/watch?v=ZK3O402wf1c&list=PLE7DDD9101...