The Wikipedia page on Algebraic structures is rather more detailed than I usually want, but the page "Outline of algebraic structures"[1] is a great quick reference.
Interesting seeing this defined from a functional programming/category theory perspective. Personally I like examples for my algebraic structures, along with some simple proofs of why they form that structure. What was extremely useful for my Algebra class was a Wikipedia note detailing the hierarchy of rings via set inclusion:
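From memory (the actual note may differ), the chain of inclusions is roughly:

    commutative rings ⊃ integral domains ⊃ integrally closed domains
      ⊃ GCD domains ⊃ UFDs ⊃ PIDs ⊃ Euclidean domains ⊃ fields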
I really think names without a fairly obvious way of saying them out loud should just be avoided/deprecated. I can't pronounce this as "ring" (because then you'd think I was referring to a ring, not a rng). Do I just spell it out? R-N-G? I'd rather just call it a non-unital ring. That involves more letters but is much more descriptive (to a mathematician, anyway).
"If we may speak frankly, the rig and rng nomenclatures are abominations. Nevertheless, you may see them sometimes, but we will speak of them no more."
It's "ring without i". i is of course "identity", which is conventionally denoted e, because i is usually the square root of unity. This works pretty well if you pronounce i as in French so it's a homophonic with English e. Isn't math a wonderful language of use for computer programming? It's so international, breaking the hegemony of English.
> I really think names without a fairly obvious way of saying them out loud should just be avoided/deprecated. I can't pronounce this as "ring" (because then you'd think I was referring to a ring, not a rng).
But 'nearring' doesn't bother you? If I'm at a talk on rngs where the distinction from rings is important, then I'll know that's the kind of talk that I'm attending, and hear accordingly; or else the speaker will make a huge deal of it (and then probably not use the word).
As mentioned, it should be pronounced "rung," but in any case, removing the i and suddenly making it a word and pronouncing it in a very unexpected way is really pretty stupid. I think, if one's really insisting on using "ring" but without the i, at least call it "rung", not "rng", so that its pronunciation is clear.
If it's so important to have a "rng" instead of a "ring", then it would probably be better to demonstrate the name in some example of its usage.
Well, words, terms, notations and conventions come super cheap in math. You can call/denote "rng" anything you want. You just have to let your readers/interlocutors know about your conventions at the outset, which is standard practice in math anyway. For another thing, to me "r(u)ng" is no different from writing "colonel" and reading it "kernel" (historical accident aside) :) There's also "lieutenant", which is pronounced "leftenant" in the UK.
I pronounce it /r@N/ -- i.e. like 'ring' but with the 'a' in comma, or the o in button. Although if it was a Rng formed by a pseudo-random sequence, then R-N-G would be apt ;).
It's a ring without (i)dentity. Makes perfect sense to me. Besides, this is a mathematical object, not a programming thing - math has its own culture and conventions, and pronunciation isn't a cultural value.
That is beautifully written and a nice recap on those things, having studied maths a long, long time ago. The Heyting algebra is quite interesting. I learned about that more recently in some lectures about logic and category theory online (I forget where now).
I really like this, but one change I'd like to see: given how many structures are defined in terms of simpler structures (which are defined in terms of yet simpler ones), it would be nice if a structure listed all of the laws, not just the ones that simpler structures don't. That way, you wouldn't have to jump around the page to get a complete picture of a single structure. Maybe have them be a different color, so you can see what is new vs. what is inherited?
It's somewhat confusing to see it this way the first time, but it's absolutely the right way to think about these structures. It's better to do the work to get used to it early on.
It's the right way to think about the structures, sure. But when you want to look up the laws for what makes something a Ring, for instance, having them all in one place makes it a lot more useful as a reference document, instead of having to jump between Monoid, Semigroup, Group, Abelian Group, Quasiring, Nearring, and Ring (and some of those laws occur twice, for different operators).
That's like saying when implementing a function you shouldn't call another fn, you should copy and paste its implementation so you don't have to flip to another location in the code.
The right way is to learn what the function means, so you can see calls to it and know what it does.
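To run with that analogy, here's a toy Haskell sketch of the layering (my own example; classes renamed so they don't clash with the real Semigroup/Monoid in base):

    -- MyMonoid only *adds* an identity law on top of MySemigroup's
    -- associativity, rather than restating everything.
    class MySemigroup a where
      op :: a -> a -> a   -- law: op x (op y z) == op (op x y) z

    class MySemigroup a => MyMonoid a where
      unit :: a           -- laws: op unit x == x, op x unit == x

    -- Lists form a monoid under concatenation with [] as the identity.
    instance MySemigroup [a] where
      op = (++)

    instance MyMonoid [a] where
      unit = []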
If the page had arithmetic structures, then obviously yes. Groups don’t include those properties, which is why they are not listed, and that is a very important distinction.
Oh wow, this looks great. I had started working on my own FP glossary, but this has a ton of information and is clear enough that I'm sure many will find it very very helpful.
I'm learning category theory to understand haskell better.
I'm finding category theory to be almost a theory about program structure in the context of composition. It's giving me a whole new perspective on one of the least concrete things about programming, namely design.
How relevant is abstract algebra to programming? Will it change my perspective on everything related to programming? How much of a mind bender is it compared to category theory?
> I'm learning category theory to understand haskell better.
I would consider myself a fairly expert Haskell programmer, and I have (for fun/curiosity) spent some time reading up on/studying category theory, and I can say without a doubt or hesitation that if your goal is to either 1) understand Haskell better and/or 2) become better at writing Haskell, then studying category theory is a MAJOR waste of your time. I would advise you to spend that time instead on reading up on the lambda calculus, type theory, and some basic algebra (like this post).
Haskell is not/has never been based on category theory (and I keep being baffled by how many people on social media will claim that it is, given how well documented its origins are), and the common terms Functor/Monad that have been pilfered from CT via Wadler have only a passing resemblance/relation to their CT friends.
Some things that I instead would recommend reading up on are:
- Type theory (Benjamin Pierce's "Types and Programming Languages" is the de facto introduction to this. It covers everything from untyped lambda calculus to things way more complex than standard Haskell, including example implementations of type checkers, etc.)
- Computer assisted proofs/formal verification of programs (the Software Foundations book series, co-authored by Pierce are a good (and free!) intro: https://softwarefoundations.cis.upenn.edu/)
- The Spineless Tagless G-machine (if you are a more low level/C minded person, this talks about how we compile a lazy functional language like Haskell to an Intel CPU: http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.53.37...)
- The Typeclassopedia (which talks about how various CT inspired classes relate to each other and their laws: https://wiki.haskell.org/Typeclassopedia)
EDIT: All of the above is not to say that you shouldn't learn category theory, but that you should have realistic reasons/expectations (even if that reason is just "I'm curious and it's cool"). I just hate seeing people get burned out trying to "get" category theory and (as a result) deciding Haskell must not be for them...
Your insight is valuable and interesting. I'm certainly not an expert on Haskell and I'll take a look at your book recommendations. Thank you.
I have to say though I can't agree with you on category theory yet. Especially the part about functors. The fmap for the functor f looks to me to be 100 percent the definition of a functor's action on morphisms: it maps morphisms in an arbitrary category to morphisms in the image of f. The resource I am reading on category theory makes it seem highly relevant to Haskell and software design in general. I'm curious as to your opinion on why the functor in Haskell only has a passing relationship to functors in category theory.
> I'm curious as to your opinion on why the functor in Haskell only has a passing relationship to functors in category theory.
If we assume Hask, the universe of Haskell types and the functions between them, is a category (which is hotly debated with some regularity, as there is no formal definition of Hask), then the Functor class in Haskell really only describes one specific subset of functors: endofunctors on Hask (i.e. mapping objects (types) in Hask to objects in Hask).
This is a rather limited subset of all functors. You can also see this in the laws, for example. You probably know the 2 functor laws "fmap id x = id x" and "fmap f . fmap g = fmap (f . g)", but Haskell's Functor is limited enough that the second is actually redundant: having a proof of the identity law gives you the composition law for free, as a free theorem (Wadler's "Theorems for Free!" is a good intro; it basically covers "why is more generic code easier to reason about").
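To make those laws concrete, here's a tiny check at a single type (my own toy example; a real test would use QuickCheck over random inputs rather than one hand-picked value):

    -- Checking the two Functor laws at the type Maybe Int.
    identityLaw :: Bool
    identityLaw = fmap id (Just 3) == id (Just 3)

    compositionLaw :: Bool
    compositionLaw =
      (fmap (+ 1) . fmap (* 2)) (Just 3) == fmap ((+ 1) . (* 2)) (Just 3)

    main :: IO ()
    main = print (identityLaw && compositionLaw)  -- prints True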
> The resource I am reading on category theory makes it seem highly relevant to Haskell and software design in general.
I'm not saying there aren't some useful ideas to be gleaned at both the superficial and deeper levels. The whole lawfulness of functors is great for being able to reason about code without knowing what it actually does, and all of lens is deeply inspired by category theory.
All of CT is about composition and reasoning about it, which we could certainly use more of in programming. So I encourage anyone curious to look into it. I'm just trying to provide some nuance to the...enthusiastic overhyping of CT, because I don't want people to be scared away from wonderful tools and languages like Haskell just because they think they're not smart enough to get all that fuzzy math :)
>If we assume Hask, the universe of Haskell types and the functions between them, is a category (which is hotly debated with some regularity, as there is no formal definition of Hask).
Isn't the consensus that "Hask is a category as long as you wave your hands profusely and pretend that exceptions and non-terminating functions never happen"?
I was wondering about something. If I have the sum x + x + x + ... + x (n times), then that is the same as x * n. If I have the product x * x * x * ... * x (n times), then that is the same as x ^ n.
What is it called when this is generalized? E.g. call + op1, call * op2, call ^ op3. What would op0 be? And what would op0.5 be?
How does the unit element for these operations behave?
And the rules for associativity, commutativity, for increasing order of the operation?
The generalisation that you're hinting at is known as Knuth's up-arrow notation [0].
There's a number known as Graham's number [1] which is defined in terms of up-arrow notation and was for a while the largest specific positive integer to have been used in a mathematical proof.
The operation beyond exponentiation is known as tetration: towers of exponentials. It's not associative, and therefore it's difficult to work with and hasn't received much interest from the mathematical community at large.
This might sound harsh, but unfortunately it does tend to attract 'cranks'. I think the reason for this is that there's a clear pattern (as you picked up on) that doesn't require formal mathematical training to spot. Amateurs get excited about the prospect of discovering something 'new', without realising how hard it is to say anything deep about the topic.
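If you want to play with the hierarchy yourself, here's a toy Haskell definition (my own sketch, using the standard hyperoperation numbering and assuming non-negative arguments). Level 0 is the successor function, which answers the op0 question above; 1 is addition, 2 is multiplication, 3 is exponentiation, 4 is tetration. As far as I know there's no standard answer for op0.5.

    -- Hyperoperations: hyper n is "opN" in the question's numbering.
    hyper :: Integer -> Integer -> Integer -> Integer
    hyper 0 _ b = b + 1   -- op0 is just the successor
    hyper 1 a 0 = a       -- a + 0 = a
    hyper 2 _ 0 = 0       -- a * 0 = 0
    hyper _ _ 0 = 1       -- a ^ 0 = 1 (and likewise for higher levels)
    hyper n a b = hyper (n - 1) a (hyper n a (b - 1))

    -- e.g. hyper 2 3 4 == 12, hyper 3 2 4 == 16,
    --      hyper 4 2 3 == 2 ^ (2 ^ 2) == 16 (tetration)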
My favourite abstract algebra textbook is Fraleigh's A First Course in Abstract Algebra. Unlike the article here, which just dumps a bunch of definitions on you, the book introduces each structure with proper motivation. It's not a CS approach though.
If you want a CS approach I suggest learning basic Haskell, then tackling the fantastic Typeclassopedia. The downside is you'll be missing out on structures/theorems that are super useful in math but not that useful in programming.
Wow, if the kindle edition looks anything like the preview, I understand the 1-star reviews - it's not just bad, it's a travesty. The pages aren't even in order, and some of the diagrams are missing.
'Topics In Algebra' by I.N. Herstein. Back in my college years this was my first book. Despite many other books (which might look easier), I always came back to it because of the care it takes to walk through a well crafted progression of concepts. (Not a 'computer science' book, but you will need something like this for the Algebra part)
Just as a warning: most beginner maths algebra textbooks will focus on groups, rings and fields, and won't talk about most of the other structures mentioned (including Fraleigh and Herstein, recommended by others - they have incredibly different styles, and although I have extremely fond memories of Herstein, there are better books than those two, especially if you aren't actually doing a maths degree; my knowledge of textbooks is 20 years out of date, though). On the other hand, once you have a good grasp of a few structures, learning about and understanding others becomes much easier. I guess it all depends on why you want to learn about them and what you want to do with them.
Many of them feature prominently in Stepanov's books, Elements of Programming and From Mathematics to Generic Programming. 'Elements' is a technical monograph, while the other is a more lightweight read, halfway between semi-pop-science and a textbook.
There's also an old textbook by Birkhoff & Bartee called 'Modern Applied Algebra' that I like. It's not super duper computer science-y, but might be worth browsing through the table of contents at least.
Edit: I snapped some photos of the table of contents since you can't see it on the amazon page:
Check out "Seven Sketches in Compositionality" by Spivak and Fong. You can find a free copy provided by the authors online. It is an introductory book to applied category theory.
[1] https://en.wikipedia.org/wiki/Outline_of_algebraic_structure...