Graphical Linear Algebra (graphicallinearalgebra.net)
287 points by guerrilla on Aug 18, 2020 | 82 comments



The short version appears to be that the author has managed to replace standard linear algebra notation with an alternative (but mathematically equivalent) graphical notation, which is harder to do computations and write programs in.

This submission is the latest in a trend of what I would classify as "math-lite" category theory and Haskell articles that reach the HN front page, which purport to explain something interesting but end up just rehashing standard mathematics in opaque ways.

I really wish this would stop. Fortunately (unfortunately?) I've been down this road a few times and know to avoid getting sucked in, but I can easily see a bright, curious person wasting a lot of time before realizing that the content is merely linguistic, not mathematical.

Is (for example) category theory a useful organizational tool in certain abstract branches of mathematics? Sure (those branches being, basically, algebraic topology and algebraic geometry). Is it a grand unified theory of math? No, it's just some useful vocabulary. Most mathematicians go their entire lives without writing a paper mentioning categories. Getting excited about category theory is like getting excited about matrix notation – useful, sure, but not where the meat is.

I also find any claims that category theory is relevant to the average working programmer to be dubious at best.

(And yes, I realize the graphical notation presented in this article is not category theory. I am making a broader point about certain kinds of articles I see, which also applies here.)


I often go back to this post by John D Cook

Category theory can be very useful, but you don’t use it the same way you use other kinds of math. You can apply optimization theory, for example, by noticing that a problem has a certain form, and therefore a certain algorithm will converge to a solution. Applications of category theory are usually more subtle. You’re not likely to quote some theorem from category theory that finishes off a problem the way selecting an optimization algorithm does.

and

I had been skeptical of applications of category theory, and to some extent I still am. Many reported applications of category theory aren’t that applied, and they’re not so much applications as post hoc glosses.

At the same time, I’ve seen real applications of categories, such as the design of LINQ mentioned above. I’ve been a part of projects where we used category theory to guide mathematical modeling and software development. Category theory can spot inconsistencies and errors similar to the way dimensional analysis does in engineering, or type checking in software development. It can help you ask the right questions. It can guide you to including the right things, and leaving the right things out.

https://www.johndcook.com/blog/applied-category-theory/


Here are some other applications. Look at: http://zxcalculus.com/ & https://www.youtube.com/watch?v=iC-KVdB8pf0


> ...the content is merely linguistic, not mathematical.

I disagree with the "merely" here. Half of mathematical practice seems to be finding the right way to think about a thing. If the author were claiming this was a new grand theory, I'd be right there with you, but they seem to be claiming that it's just another way to look at things. I don't see any claims of importance. There's always room for new viewpoints that are content-wise equivalent to old viewpoints but different in how you work with, interpret, and think about them.

> ...but end up just rehashing standard mathematics in opaque ways.

I'd say it's rehashing standard mathematics in strange, interesting ways, making it a very fitting curiosity for a site where things like a history of methods of communicating with submarines get a ton of upvotes.

It's a bit surprising to realise that it seems like we completely agree on what it is and what its contribution to existing standard material is, but for some reason I think it's interesting while you "wish it would stop."


I kind of agree with you (especially on "math-lite category theory" being upvoted by people who read it and feel smart for understanding it), but I think the point has slightly been missed. The blog is a sort of side project, a popular treatment of the linear algebra stuff coming out of the "real research", which is applying the string diagrams to find graphical, compositional axiomatisations of concurrent systems. The programme has had some success with signal flow graphs (even making it onto the blog: https://graphicallinearalgebra.net/2016/09/07/31-fibonacci-a... ) and they were starting to look into Petri nets as my study was wrapping up.

(source: I was a postgrad supervised by Pawel)


What "success" has this research program had with signal flow graphs? What problems has it helped solved?

My impression is that it ends up being largely some kind of linguistic translation project, or abstraction for the sake of abstraction, but I'm happy to be corrected on this point.


We have a way to write traditional SFGs as string diagrams and an isomorphism to behaviours which respects the kind of graphical reasoning used in the blog. So you can show that two SFGs are behaviourally equivalent, or, using some more recent work, that all the behaviours of one are also behaviours of another, i.e. inclusion of sets of behaviours. String diagrams are also more general than SFGs, but can still be mapped to behaviours, so you could have an "unimplementable" string diagram to serve as a specification, and use the axiomatisation of inclusion to show that a particular SFG is a valid implementation of that spec.

As for whether anyone outside the research programme is actually doing that, I don't know. But in principle it's a useful formal-methods-style theory.


> "math-lite"

The series is intended for a general audience and starts by laying the groundwork necessary for exposition of this paper: [1]. Article 4 [2] explains why it is introduced this way.

> end up just rehashing standard mathematics in opaque ways

It's just an explanation of string diagrams [3][4], pretty transparent and standard.

I find it funny that your response to this is so similar to that of the abacists to Fibonacci.[5]

[1]. https://arxiv.org/abs/1403.7048

[2]. https://graphicallinearalgebra.net/2015/04/29/dumbing-down-m...

[3]. https://ncatlab.org/nlab/show/string+diagram

[4]. https://en.wikipedia.org/wiki/String_diagram

[5]. https://graphicallinearalgebra.net/2015/04/26/adding-part-1-...


I don't believe the exposition is written well (judged as either an exposition of linear algebra or just the graphical calculus the article develops). But this is perhaps a subjective point, and others here have already commented on this in detail.

More importantly, I would like to remark that string diagrams are not standard. The vast majority of mathematicians have never read the definition of a string diagram (or even a monoidal category), precisely because it is not standard and not needed for most (any?) useful mathematical work. They also have obvious disadvantages when compared to the usual notation, as I pointed out earlier.

When I read articles by category theory boosters, I get the sense (rightly or wrongly) that they think the world revolves around them and they have stumbled onto some deep and fundamental truths. This is not the case. It's a niche of a niche.


It's interesting to suddenly get all these hits... I haven't touched the blog in a long time.

I'm sorry that you didn't find it well-written -- it wasn't written with you in mind. Originally I wanted to write about my research in a way that was understandable to a lay person, but I quickly abandoned that and went for the mythical "second year undergrad" level.

You have pretty strong thoughts about what is "useful mathematical work". Are you the high priest and decider of the usefulness of mathematics? To be honest, it almost sounds like some category theorist was super mean to you...

One more thing. Arguments like "niche of a niche" are sociological - just because something is niche doesn't mean it's not important (or, god forbid, fun!). Maths is one of the most conservative fields in this sense; what is and is not considered "standard" is extremely political.


I'm enjoying it so far. Thanks for taking the time to do this!


"Are you the high priest and decider of the usefulness of mathematics? To be honest, it almost sounds like some category theorist was super mean to you..."

Interesting and totally not ad hominem response... I believe Kevin Buzzard (an actual mathematician) had a few words about this last year: https://youtu.be/Dp-mQ3HxgDE?t=1039


Note that Kevin Buzzard is talking in the context of convincing "mainstream" mathematicians to use a proof assistant. He specifically clarifies somewhere (not sure where in this video, but I've seen other videos where he clarifies it) that when he talks about "proper mathematics" he is doing so in kind of a tongue-in-cheek, British way and does not mean to suggest that other kinds of mathematics are not proper (don't remember the exact wording).

In the context of that video, his comments about category theorists and type theorists make a lot of sense: type theory people and (perhaps to a more limited extent) category theory people tend to be more easily sold on using a proof assistant, since the kind of mathematics they do translates more easily into current proof assistants than, say, analysis or topology.

I definitely do not think that Kevin Buzzard is suggesting that type theorists and category theorists are not doing interesting and/or useful work (after all, the proof assistant he is advocating for is based on type theory). At best, he is making a sociological observation that there is a gap between the type theory/category theory community and "mainstream" mathematicians making the widespread adoption of proof assistants in mathematics more difficult.


Given the tone of the poster who this reply was for, it seems entirely warranted.


feanaro: fine, if you think so.

I find it interesting that actual mathematicians working in Category Theory, such as Tom Leinster (who wrote a lovely little introductory text on Category Theory [not covering monads though] and made it freely available on arXiv: https://arxiv.org/abs/1612.09375) are able to engage in polite discussion about the contentious viewpoints on the use of CT without resorting to personal attacks (see 2-3 minutes from his talk from a few years back: https://youtu.be/UoutGluNVlI?t=410) ... which stands in sharp contrast to some of the evangelists, whose attitude in response to criticism often reeks of arrogance and puts people off taking CT seriously, which I think is a great shame.


> More importantly, I would like to point out that string diagrams are not standard.

They're standard in category theory, which is what the series is about.

> precisely because it is not standard and not needed for most (any?) useful mathematical work.

Uh oh, you better let everyone using it know, especially all those pesky type theorists.

> When I read articles by category theory boosters, I get the sense (rightly or wrongly) that they think the world revolves around them and they have stumbled onto some deep and fundamental truths. This is not the case. It's a niche of a niche.

I think this says more about you than the series. You seem upset that a category theorist is presenting their work in the language of category theory and that people are interested in it. Nobody mentioned anything about utility, depth, fundamentality or generality. In fact, he refers to some of his work as "hard and useless." [1]

> There has been intense political pressure on the academy to stop working on hard and useless things over the last 30 years or so. The result is a new generation of academics who do easy and useful things, things that impact the economy in the 5-10 year scale, and which bring in research funding. This is by design, because research funding is closely tied to academic career progression. Unfortunately, it’s not very hard to disguise easy and useless things as easy and useful. This has resulted in a totally out-of-control epidemic of easy and useless research. A symptom of this disease is the ever expanding use of increasingly ridiculous buzzwords. Easy and useless never leads to hard and useful. I’d much rather the government invest my tax money in the hard and useless.

[1]. https://graphicallinearalgebra.net/2015/04/30/spoilers-addin...


> When I read articles by category theory boosters, I get the sense (rightly or wrongly) that they think the world revolves around them and they have stumbled onto some deep and fundamental truths.

(Professional mathematician here)

My sense, from talking to category theory boosters, is not typically that they regard category theory as "deep and fundamental truths". Rather, they often find it a useful way to declutter and describe their work, allowing them to more easily focus on deep truths without getting bogged down in details.


Indeed. A high-level programming language (e.g. Python) does not allow you to do _more_ than assembly but _less_. But it helps you declutter the presentation to such a great extent that realistically you could not write a Python program in assembly.


Funny thoughts have truth in them. Can you share what truth you realised?


Have you ever done much electronics? An electronic circuit diagram is a wonderful thing - a huge page of paper that abstracts the entire operation of a circuit that you can trace through. I'd be very excited to have something similar for mathematics, I don't know if this implementation is that, but its certainly worth striving for.


This graphical notation for linear algebra isn't like the usual diagram notation of category theory at all.

> Is category theory a useful organizational tool in certain abstract branches of mathematics?

Yes. Abstracting math in a very general way lets you explore patterns and a result in category theory (e.g. Yoneda lemma) generalizes immediately to other instances of categories.

> I also find any claims that category theory is relevant to the average working programmer to be dubious at best.

I would say there is value in knowing some category theory, especially if the programmer is working in a strongly-typed functional language. Knowing what a functor is, what a catamorphism is, and so on can help you write more declarative code and teach about abstractions in the same way OOP uses Design Patterns to teach programmers about reusable code.
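For the catamorphism point, a minimal Haskell sketch (cataExpr and the small expression type are just illustrative names, not from any library): collapse a recursive structure by supplying one function per constructor, so evaluation and pretty-printing are just two different "algebras":

  -- Illustrative example: a tiny expression type and a hand-rolled catamorphism over it.
  data Expr = Lit Int | Add Expr Expr

  cataExpr :: (Int -> r) -> (r -> r -> r) -> Expr -> r
  cataExpr lit add (Lit n)   = lit n
  cataExpr lit add (Add a b) = add (cataExpr lit add a) (cataExpr lit add b)

  eval :: Expr -> Int
  eval = cataExpr id (+)

  render :: Expr -> String
  render = cataExpr show (\a b -> "(" ++ a ++ " + " ++ b ++ ")")

  main :: IO ()
  main = do
    let e = Add (Lit 1) (Add (Lit 2) (Lit 3))
    print (eval e)        -- 6
    putStrLn (render e)   -- (1 + (2 + 3))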


I realize, obviously, that the diagrams presented in this article are different from the ones found in category theory textbooks. Moreover, my later comments were not really aimed at this article so much as others that are posted here. The criticisms just happen to fit this article, too.

Also, I don't really see how learning the abstract definition of a functor is going to help someone write better OCaml programs. But it is possible I am just not imaginative enough. Do you have a concrete example of this?


> But it is possible I am just not imaginative enough. Do you have a concrete example of this?

Well, in the case of functor it's not terribly interesting, but here's an example. A functor F has the property F g . F h = F (g . h), so this means that if you were writing

  map f (map g l)
you could write

  map (f . g) l
instead (in Haskell the compiler optimizes this anyway). But if your compiler would otherwise actually traverse the list twice, now it traverses it once.

I could list other things like how monoids, monads, etc. can help make programs more efficient (monoids allow exponentiation with squaring) or structure programs better (monads allow composition of partial/stateful/exceptional functions).
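To make the monoid point concrete, here's a minimal sketch (mtimes is a made-up name for illustration, not a library import): exponentiation by squaring works for any Monoid, so repeated combination costs O(log n) uses of <> instead of n:

  import Data.Monoid (Sum(..), Product(..))

  -- Illustrative helper: combine a monoid value with itself n times, by binary splitting.
  mtimes :: Monoid a => Int -> a -> a
  mtimes 0 _ = mempty
  mtimes n x
    | even n    = half <> half
    | otherwise = x <> half <> half
    where half = mtimes (n `div` 2) x

  main :: IO ()
  main = do
    print (getSum     (mtimes 10 (Sum 3)))      -- 30: "exponentiation" here is repeated addition
    print (getProduct (mtimes 10 (Product 2)))  -- 1024: same code, different monoid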


Your example for the application of the "functor" concept is trivial enough that it doesn't need category theory.

Monoids are very simple objects, and you don't need to use category theory to define them.

Monads might be a more meaty example. But I'm not sure if knowing CT makes it easier to use monads in a programming context.


They're trivial constructions, yes. But the difference is the shift in vocabulary: instead of understanding functor fusion in an ad-hoc way, the concept can be made precise. It can also lead you to find other "non-trivial" functors, for instance the covariant functor Hom(a,-) or the contravariant Hom(-,a); equipped with the right vocabulary, these things are easy to express.

Sure, monoids are trivial, because they are, but once you're in the categorical setting you can begin to talk about the category of monoids, monoid homomorphisms and so on.

This isn't to say that CT is the answer, but I'm pointing out that it provides a way to understand more complicated concepts.

Functors are trivial, but if you take their fixpoint you get ADTs. Monoids are trivial, but if you apply them to the category of endofunctors you derive monads, and so on.
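To make the fixpoint remark concrete, a minimal sketch (Fix, ListF and cata are defined inline here, following common recursion-scheme conventions rather than any particular library): a list type recovered as the fixpoint of a non-recursive base functor, with a generic fold falling out for free:

  newtype Fix f = Fix (f (Fix f))

  -- Non-recursive "base functor" for lists; Fix ties the recursive knot.
  data ListF a r = NilF | ConsF a r

  instance Functor (ListF a) where
    fmap _ NilF        = NilF
    fmap f (ConsF a r) = ConsF a (f r)

  type IntList = Fix (ListF Int)

  -- Generic catamorphism over the fixpoint of any Functor.
  cata :: Functor f => (f r -> r) -> Fix f -> r
  cata alg (Fix x) = alg (fmap (cata alg) x)

  sumList :: IntList -> Int
  sumList = cata alg
    where alg NilF        = 0
          alg (ConsF n s) = n + s

  main :: IO ()
  main = print (sumList (Fix (ConsF 1 (Fix (ConsF 2 (Fix NilF))))))  -- 3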


> Your example for the application of the "functor" concept is trivial enough that it doesn't need category theory.

That's kind of the point.

> "Perhaps the purpose of categorical algebra is to show that which is trivial is trivially trivial." That is, category theory aids in making the softer bits of mathematics look utterly natural and obvious, so that one can more easily isolate the harder nuggets and attack them with abandon ... This is a tremendous service that category theory provides"

https://mathoverflow.net/questions/28788/nontrivial-theorems...

In isolation applications of category theory to programming are going to be trivial, typically. Category theory gives a language and framework for extracting trivialities so one can "attack the harder nuggets" without distraction.


> I can easily see a bright, curious person wasting a lot of time before realizing that the content is merely linguistic, not mathematical.

While I rarely find graphical methods useful for performing calculations, I do appreciate visual interpretations of how mathematical concepts work. I personally find that whenever I have to learn a new topic in math, good visualizations help me form an intuition regarding how things fit together, and can often be useful when reasoning about how to attack a problem (back-of-the-envelope sketch before a more rigorous calculation). For actually solving the problem, on paper or in code, I agree that a symbolic approach is superior.

I also think that purely linguistic improvements in math can have a positive effect: a mathematical notation that is easy to learn, understand, and use, will clearly make the math it represents more impactful. I can't find the exact quote at the moment, but I think it was Dirac who said something along the lines that "a good mathematical notation should make correct statements obvious and wrong statements impossible".


These types of notations have been around for a long time. In fact, the traffic first went in the other direction: geometric objects (such as knots and braids) were studied with algebraic tools. The graphical notation does the opposite - it represents algebraic objects by geometric ones. These may have some benefit for human eyes, but for machines they are quite useless indeed.

Incidentally, I partially agree with you on category theory, especially in computer science. In abstract maths, however, category theory has more significance than pure language. It can serve more or less like an algebraic tool. There are abstract mathematicians who see deeper meaning in it too.


> In abstract maths, however, category theory has more significance than pure language. It can serve more or less like an algebraic tool. There are abstract mathematicians who see deeper meaning in it too.

One of the things that has made me very successful as a problem solver and software developer has been the ability to apply category theory abstractly to projects and work that comes in. IOW, Using category theory well in day-to-day software engineering isn't about which monad to pick, but which abstraction (or series of abstractions) are the best fit to a problem.


> an alternative (but mathematically equivalent) graphical notation, which is harder to do computations and write programs in.

But this does not need to be the case. Sometimes a graphical notation is a great help. For example, many tensor computations in differential geometry become straightforward using Penrose's graphical notation. Is that the case here? It is difficult to say, as the authors fail to present that notation separately from the results.


Yes, it is the case here as well. Reducing terms can be tricky because of distributive laws. The graph-like representation makes reduction algorithmically easier.


As someone who knows some Category Theory, Algebraic Topology, and a smidgen of Algebraic Geometry, I totally agree. Category Theory is a very useful and essential tool in those subjects.

However, it reminds me of String Theory, in that it has been sold as a Theory of Everything for Mathematics and adjacent fields.

Category Theory might even be somewhat useful for language designers, but, I agree that it offers little benefit for programmers even with languages like Haskell.

Even Bartosz said that Adjoints aren't that interesting in Haskell because every functor is an endofunctor.


> Even Bartosz said that Adjoints aren't that interesting in Haskell because every functor is an endofunctor.

Every functor (in the standard library) is an endofunctor, but is also a functor to a specific flavor of full subcategory in Hask. It might be useful to support other flavors of subcategory, and I think there's some ongoing work on this in the SubHask module.


Very cool! I get excited about Category Theory just like the rest.


> This submission is the latest in a trend of what I would classify as "math-lite" category theory and Haskell articles that reach the HN front page,

That's pretty arrogant and rude. This is some mathematicians' PhDs and research, not a blog put in front of overpaid programmers to provide them with category-theory porn. Admittedly, the dumbed down language of the blog is misleading and makes it very hard to read.

https://dl.acm.org/doi/pdf/10.1145/3290338

https://www.sciencedirect.com/science/article/pii/S089054011...

Fabio Zanasi. 2015. Interacting Hopf Algebras: the theory of linear systems. Ph.D. Dissertation. Ecole Normale Supérieure de Lyon


With the caveat that I learned this material the old-fashioned way and probably can't adequately put myself in the shoes of a new learner, this feels overly pessimistic. A few general points come to mind:

(1) The human mind is pretty good at visual pattern recognition. For the same reason that people advocate for a big status monitor that every developer can see for your live service (since you can quickly spot anomalies your monitoring might miss) and for the same reason visualizing a data set is one of the first things you ought to do on your way to understanding it, creating a diagrammatic representation of linear algebra seems helpful for immediately getting a sense of where something feels "off" or for quickly grasping the intuition of a problem.

(2) The right notation (pictures in this case -- not that the idea I'm about to highlight couldn't be presented textually) can help highlight the portions of a problem that matter. Thinking about matrices as boxes of elements that you can individually manipulate is IMO actively harmful to understanding them and puts you in a situation where you miss the forest for the trees.

(2a) The matrix wiring diagram presented here naturally extends to higher order tensors, and for the life of me I can't find the source right now, but it was precisely that representation that made tensors click for me and gave me a foothold into other material about them.

(3) I think the current consensus is that multiple representations of a problem (via some mechanism -- call it "magic") are actively beneficial for learning a topic. From that point of view alone a diagrammatic representation doesn't seem especially bad.


Yeah, I have to agree. It's frustrating because CT and these kinds of diagrams can actually be kind of interesting... They are just not nearly as grand or as useful as their proponents would have us believe.

(My spicier opinion is that the machinery of CT is often completely unnecessary for many applications, and actually only serves to obfuscate simple concepts.)


ZX-calculus is a rigorous graphical language that has applications in quantum circuits.


I think category theory is a language you need to learn at some point if you are serious about math or computer science. I've refused to do so for quite a long time, but there is just too much interesting material out there that uses this language. The biggest problem is not falling asleep while learning it :-)

The most interesting part of category theory is topos theory, in my opinion. This is a generalisation of set theory, and can thus claim to be indeed useful for foundations.


> The short version appears to be that the author has managed to replace standard linear algebra notation with an alternative (but mathematically equivalent) graphical notation, which is harder to do computations and write programs in.

> This submission is the latest in a trend of what I would classify as "math-lite" category theory and Haskell articles that reach the HN front page, which purport to explain something interesting but end up just rehashing standard mathematics in opaque ways.

This is, coincidentally, the ambition of APL.


> I really wish this would stop

> Getting excited about category theory is like getting excited about matrix notation – useful, sure, but not where the meat is

Patience. This kind of work isn't aimed at verbal learners, but for visual learners and visual creators it is very educational. It's the work of a visual learner, and this kind of work will not stop.

> replace standard linear algebra notation with an alternative (but mathematically equivalent) graphical notation.

Yes. Modern computation is based on AI, which is based on matrices, which run on GPUs, which do computation on graphical input. This work is a masterpiece of simple solutions to complicated computations.


Good visual explanations are useful to all learners. I'm sorry, but the concept of "visual learner" when used in contradistinction to "verbal learner" is just "lacks attention span to concentrate on something hard".


God forbid we show any kindness to people with attention deficits.


We should definitely be kind, to as many groups of people as humanly possible. But erecting artificial categories merely because someone thinks it will make some other subset of individuals feel better (a) is patronising to those people, and (b) introduces confusion for everyone.


When I took a course on linear algebra, the professor would introduce the notion of vector spaces and linear transformations first before talking about matrices, and actually one can derive the matrix multiplication algorithm from the specification of how it must correspond with composition of linear transformations. I thought that the algebraic notation is very concise and appropriate for linear algebra.
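As a rough sketch of that derivation (illustrative Haskell, nothing from the actual course): if the product matrix is required to agree with composing the two linear maps, the usual rows-times-columns rule is forced:

  import Data.List (transpose)

  type Matrix = [[Double]]
  type Vector = [Double]

  -- Apply a matrix to a column vector: one dot product per row.
  apply :: Matrix -> Vector -> Vector
  apply m v = [sum (zipWith (*) row v) | row <- m]

  -- Defined so that apply (matMul a b) v == apply a (apply b v).
  matMul :: Matrix -> Matrix -> Matrix
  matMul a b = [[sum (zipWith (*) row col) | col <- transpose b] | row <- a]

  main :: IO ()
  main = do
    let a = [[1, 2], [3, 4]]
        b = [[0, 1], [1, 0]]
        v = [5, 6]
    print (apply (matMul a b) v)  -- [16.0,38.0]
    print (apply a (apply b v))   -- [16.0,38.0], the same by construction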

While new notation like this is interesting, I do wonder how useful it is from a pedagogical standpoint for linear algebra. Category theory OTOH is one of the areas of mathematics where diagrams rule over equations, because a single diagram can encode so many equations.

For an excellent introduction to linear algebra I recommend Linear Algebra Done Right[0], which Springer offered for free for a while (it doesn't seem to be free anymore, though; contact me if you want the PDF).

[0] http://linear.axler.net/


Regardless of the usefulness of category theory or not, the author seems to imply that "traditional linear algebra" is about matrices.

This is not true. This may be true from an engineering perspective, or maybe if you've studied in the US, but over here, linear algebra starts with fields, vector spaces, homomorphisms. We establish pretty early on that matrices and linear transformations are in essence the same thing (up to choice of basis), and then we go on to prefer coordinate-free proofs. But, of course, we do discuss the computational aspect (i.e. the Gauss algorithm) because this is exactly what makes linear algebra so useful: many questions can be answered extremely efficiently.

Whether you want to frame things in the language of category theory or not, is up to anyone, but the "standard mathematical treatment" is quite beautiful and intriguing already (if I want to feel inspired, I open up my copy of Halmos's "Finite Dimensional Vector Spaces", not necessarily the best introduction for someone who is new, but very beautifully written for someone who has already seen linear algebra).


> This is not true. This may be true from an engineering perspective, or maybe if you've studied in the US, but over here, linear algebra starts with fields, vector spaces, homomorphisms. We establish pretty early on that matrices and linear transformations are in essence the same thing (up to choice of basis), and then we go on to prefer coordinate-free proofs. But, of course, we do discuss the computational aspect (i.e. the Gauss algorithm) because this is exactly what makes linear algebra so useful: many questions can be answered extremely efficiently.

This is how I was taught linear algebra in the USA too. Almost to a fault, where I lacked practical intuition for how matrices and vectors of real numbers work and had to re-learn that stuff later.


Just curious, what country would this be? I know many European countries favor a more theoretical approach.

I definitely learned linear algebra from the numerical and matrix perspective (though yes, we did cover the idea of vector spaces and linear transformations too). In engineering school, the focus tended to be on the craft rather than the theory, and in retrospect I think it was the right approach for engineering majors, whose primary concern was execution. It helped engineers use MATLAB to solve problems. The syllabus looked like this:

https://ocw.mit.edu/courses/mathematics/18-06sc-linear-algeb...

And this:

https://math.nyu.edu/media/math/filer_public/a4/00/a4008ffa-...

A syllabus for math majors might look quite different.

The US approach to linear algebra continued to be useful for me in grad school as an engineering major (I have an engineering Ph.D.). To be honest, for what I was/am doing, I don't think I would have derived much benefit from a more theoretical approach (but that's just me. People who work closer to theory may have a different opinion).

In my work, math has a retrospective role -- I first build something and later recognize patterns that fit a certain theory.

For theory-builders, however, math has a constructive role. Theory-builders, though, are relatively rare.


I studied engineering with a European-style curriculum, which starts with groups, rings, vector spaces, linear transforms, eigenvalues, etc - and only then proved that matrices can be used to represent any finite dimensional to finite dimensional linear transformation, and mostly went on from there. It did seem roundabout at the time.

But then, everything was using the same terms, and was relatively simple and straightforward. Fourier transforms? Laplace transforms? They are all just linear transformations. Functional analysis? It's a lot of inner products, Hermitian forms and eigensystems, but we were familiar with all the properties, so we only concerned ourselves with what's different about it (e.g. the spectral theorem). Coding theory? It's finite fields; we did those in linear algebra, now it's just applications. Stationary distributions on Markov processes? It's an application of linear algebra + probability, but works mostly the same whether it's discrete (and representable by a matrix) or continuous (when it isn't).

The syllabus for engineering students was basically the same as for math students, except we only covered proofs in class & homework (but did not have to be able to reproduce them in tests), and we mostly went through a simple logical progression of ideas rather than a historical one (e.g., Cauchy's theorem on analytic complex functions was essentially a two-line application of Green's theorem in the plane, and we spent ~30 minutes discussing it; math students spent ~6 weeks proving it the way Cauchy historically did, progressing from simpler to more complex structures).


Germany, but I think it's similar across much of Europe.

The US has a different university structure: the way I understand it, many people don't necessarily choose a major early on or can easily switch it which is why engineers, mathematicians, biologists, etc. often take the same course. Please correct me if I'm wrong.

Over here, this is much rarer. Sometimes computer scientists or physicists take some of the same courses as mathematicians, but hardly engineers. Therefore, we can probably afford to be more theoretical right away (although the computer scientists sometimes complain xD).

I can totally understand that for someone with a focus on application theory is less important, and that some people can get more excited about what you can do with matrices e.g. in computer graphics or machine learning than about abstract morphisms and rigorous proofs.

Incidentally, over here I sometimes feel we learn stuff the other way around: first we hear all the theory, but it's only in the later ODE and numerical methods courses that we actually get substantial practice in calculating more complicated integrals or learn about e.g. QR factorisation or Simpson's rule (although we do prove error bounds by then).


You’re not wrong.

OTOH the other day I heard someone in the Netflix doc about Alberto Nisman’s murder describe a terrorist network as a “matrix”.

I grumped as card-carrying pedants are duty-bound to, but then realized that if there’s a 1:1 correspondence between graphs and (adjacency, incidence) matrices, then there’s very little loss of meaning in referring to graphs as “matrices”. Maybe in some communication contexts it’s even clearer.


That's funny, I would have just chalked it up to "they are using a different sense of the word matrix", knowing that most dictionaries contain multiple definitions of the same word. But nice work making this mental leap, I have never thought about it!

But it's true, you can envision it as an NxN matrix of N terrorists, where the number in cell i,j represents the strength of the relationship (it's a weighted graph!), and that might actually be useful for something. I don't doubt that "matrix" might actually exist in numeric form in some CIA or military computer somewhere.
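For concreteness, a minimal Haskell sketch (toy data, made-up names) of that weighted-graph-as-matrix view, where cell (i, j) stores the strength of the tie between person i and person j:

  type Weight = Double
  type AdjMatrix = [[Weight]]

  -- A toy 3-person network; 0 means no known connection.
  network :: AdjMatrix
  network =
    [ [0.0, 0.8, 0.0]
    , [0.8, 0.0, 0.3]
    , [0.0, 0.3, 0.0] ]

  -- Strength of the tie between person i and person j.
  tie :: AdjMatrix -> Int -> Int -> Weight
  tie m i j = (m !! i) !! j

  main :: IO ()
  main = print (tie network 0 1)  -- 0.8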


Maybe the term Disposition Matrix was more precise and less euphemistic than we were concerned it was. I guess you could compute a kill list from a genuine matrix of dispositions.

https://en.wikipedia.org/wiki/Disposition_Matrix


A few months ago I was imagining having a square matrix representation for fractions/attenuation of viral loads (e.g. per week) being transferred among a large population. I pictured a sum of isolated groups, after a sort, causing this larger matrix to be composed of a set of on-diagonal square submatrices.


Yes I would think that you are dealing with sparse matrices with little clusters of nonzero values in a model like this.

One could say that I'm "part of a terrorist matrix", and that my value in the matrix is zero. We are all part of a terrorist matrix. I hope the CIA is not reading this, because they probably are not mathy enough to know what I'm talking about, and they may just take my quoted statement at face value, but in a mathematical sense we probably all actually have some nonzero value in our rows and columns of the terrorist matrix. You know someone, who knows someone, who knows someone, who is a terrorist. It really just depends how many people and how many degrees of separation you want to include in your matrix.


Meh. Sorry.

Start with [Essence of Linear Algebra](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...) by [3 Blue 1 Brown](https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw) for an intuitive overview of all the major concepts.

While working through a full Linear Algebra course, it is important to develop your own software to do things. That is, some sort of command tool to enter matrices and run operations on them. As you go through a course, you will keep adding operations.

You should work on some of the most common tricks, like using 4 by 4 matrices for computer graphics and understanding what a PCA does.


As much as I like 3blue1brown, watching those videos is not a good way to learn. It's maybe a good way to get some intuition about a topic and much closer to the "infotainment" category. A good textbook + solving problems + maybe writing code works much better for actual learning.


> While working through a full Linear Algebra course, it is important to develop your own software to do things. That is, some sort of command tool to enter matrices and run operations on them. As you go through a course, you will keep adding operations.

That's definitely possible, but not everyone approaches linear algebra from such a number oriented perspective.

In fact, my linear algebra prof in Uni tried to avoid coordinates (i.e. numbers) as much as possible. E.g. talking about abstract linear transformations instead of emphasising that you can represent them as a rectangle of numbers.


The usual notation for a linear transformation

  X' = aX + bY
  Y' = cX + dY
minimizes irrelevant information. But the diagram for one seems to bring it to the fore: a diagram is basically a system of fully parenthesized, unsimplified expressions eg

  X' = (3X + (2(X+4Y) - 6X)) + Y
  ...
In fact, it contains even more information than that, since it also tells you if, in (X+Y)+(X+Y), you are supposed to compute (X+Y) once and reuse the value or compute it twice.
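For concreteness, the example expression above collapses by just collecting terms:

  X' = (3X + (2(X+4Y) - 6X)) + Y
     = 3X + 2X + 8Y - 6X + Y
     = -X + 9Y

so the diagram carries the whole computation history, while the matrix form keeps only the final coefficients.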


Depends on what you mean by relevant information. GLA exposes the many compositional natures of linear transformations. The fact that there is a compact matrix representation is a wonderful treat.


Can you be more concrete?


A lot of category theory is just about composition, how different structures compose and how compositions of those structures continue to be composable.

So in this case, GLA builds the theory of linear transformations from trivial/simple pieces and their various ways of being composed. It also discusses the mechanisms for proof and reduction (you noted how GLA is clear about whether a sub-computation is reused or not, it'll also be clear about how those two choices are equivalent).

So it's really not studying "just" linear transformation (which are, in finite cases, summarized by a matrix of numbers/field elements) but also the theory of their construction, manipulation, simplification. It gives you a rich language for talking about how two linear combinations might be related to one another, something that's more challenging to access from a matrix.


Well, that wasn't very concrete. How about this: can you give an example of a linear-algebraic fact that is more easily shown by reasoning about these diagrams than by just using the numbers-moving-along-wires interpretation to immediately turn it into regular equations?

For example, the only thing I can think of is maybe (AB)^T = B^T A^T.


It would be nice to have a single-page complete summary of this notation. The whole text is way too verbose and boring (at least the first two chapters).


There's a paper linked to here which he says contains all the "spoilers": http://arxiv.org/abs/1403.7048

> If you are impatient and want to get to the conclusion quickly, here’s the link to our paper where the basic theory is explained (Spoiler warning!). We will also discuss several applications of graphical linear algebra that have been developed in recent years.


Diagrammatic notations seem to pop up all over the place, but in my mind they all share a difficulty that interesting complex graphs tend to not be planar (can't be drawn without criss-crossing lines). That makes it hard to organize in a skimmable/glanceable way. I'd love to see what a future with ubiquitous 3D volumetric displays or VR or whatever could do for diagrammatic notations. You might suddenly be able to organize larger diagrams and I wouldn't be surprised if diagrammatic notations became more useful and started popping up more in that hypothetical future.


I would also invite people to check out Chu spaces (http://chu.stanford.edu/). They are essentially matrices with closure under norm (meaning you always have an inverse). This puts them on the same side as probability, linear logic, C*-algebra, and constructive mathematics.


As a software developer, I remember basically nothing about Linear Algebra from my time at University :( Besides graphics, what are some applications for linear algebra in software?


I believe you'll find some applications here: https://codingthematrix.com/


Wow, that's exactly what I was asking for :) Thanks!


This is not nearly interesting enough to justify the Stephenson-esque asides on "English snobbishness" and the bits imagining Fibonacci having to apply for a grant for Arabic numerals.


I loved the Makelele story in the intro. I hardly remember him, and didn't recognize that Real was so much worse without him. Linalg really is the most underappreciated technique.


I would like to read this but I really need a version without the verbiage and humour. I'm really not convinced it's necessary to try to make these things so "accessible".


Then read the paper that it's based on, which is linked from the blog and in my other comments.


Thanks, I didn't mean to offend. Which is the best starting point? Based on the blog post I was hoping to get some introduction by way of familiar concepts such as linear algebra and homomorphism, before confronting applications to theoretical computer science. Anyway, is the best starting point "Diagrammatic Algebra: From Linear to Concurrent Systems" or "The Calculus of Signal Flow Diagrams I: Linear relations on streams" or something else?


Reminds me of Petri nets.


Not entirely a coincidence - see this paper (coauthored by the guy behind the blog in the OP): https://dl.acm.org/doi/10.1145/3290338


And coauthored by you obviously :)

Thanks a bunch, very interesting. I've stumbled upon Petri nets as a state machine alternative when looking for graphical modeling tools for functional programming (I'm a designer learning Elixir). Here's a talk: https://www.youtube.com/watch?v=aWnGPaputGE



This looks interesting! I get the impression there would not be much friction to implement some of these mechanisms in Haskell.


I think it would be a challenge without dependent types.


Linear algebra is often taught so poorly. So I am happy someone took the time to create such relevant content. Thanks.



