
You (or others) may like Struik’s A Concise History of Mathematics [1]. It’s also quite easy to read cover-to-cover. My other favorite, Stillwell’s Mathematics and Its History, is excellent as well.

In my opinion, Stillwell’s book is better to read as a way to motivate a particular topic, whereas Struik’s book really tries to illustrate the arc of mathematics through history—of course, Struik’s approach has its limits.

History can motivate otherwise inscrutable or dry mathematics in a way that would probably interest many students. Why isn’t the history of math a serious part of secondary school? I don’t think US students remember that much of whatever we do teach there anyway.

[1]: https://www.google.com/books/edition/A_Concise_History_of_Ma... or https://archive.org/details/concisehistoryof0000dirk


That sounds like a great idea. It might help reduce the Dunning-Kruger effect, whereby people who don't know what they don't know are overconfident about the knowledge they (think they) possess.

Teaching the history of math would tell the students what they don't know and would no doubt motivate many to learn more.

I had a short course on the History of Philosophy in high school, which I liked but which was kind of fuzzy, because that's the way philosophy is. It was hard to discern any progress or direction in the study of philosophy over the centuries.

Whereas with math, there is no fuzziness about it, except in Fuzzy Logic of course.


> Why isn’t the history of math a serious part of secondary school?

Because 4 years already isn't enough time.


First of all, many rolling release distros don’t follow upstream as fast as possible—there’s often some testing window.

People go on and on about backported security patches and stability, but I’ve had to handle so many buggy patches, and so many issues that never got a backported fix, that I now think this is basically a fantasy. The distro maintainers just don’t have the time (or experience with all the software they ship!) to backport patches for every single issue. I’d really rather get a fix from the actual software maintainer than run a year-old mystery-meat version that still has a bunch of known non-security bugs that are fixed upstream but that the distro maintainers don’t care about.

Even worse, being able to stay on essentially outdated software puts a lot of organizations into a tough spot when their LTS version finally becomes unsupported. Practice makes perfect, and I think lots of small, regular updates result in a lot less pain than a mega-update every few years (really: I’ve had to manage one of these more than once, and it’s a total nightmare figuring out which of 1000 changes in the new LTS version caused a performance regression or something).


> distros don’t follow upstream as fast as possible

Well yes, the testing is of course part of the 'as fast as possible' part. But I could've made that more clear indeed.

> many buggy patches or issues that never got a backported fix

I have yet to see those in CentOS. But I guess we won't be seeing a lot of CentOS at all in the foreseeable future :p

> many buggy patches or issues that never got a backported fix

I feel like that's mostly on them. There is a huge temporal overlap between LTS versions, so you should have plenty of time to test. I think I'd rather dedicate one month every year to fully test and then roll out a new LTS version than be interrupted by unexpected updates at random intervals.

That being said: What a boring job it must be to backport security patches all day.


For something with “math” in the title, this article sure has a lot of basic statistical mistakes (like, things you learn not to do in an intro course).


You can think of it as a factor graph with linear residuals and Gaussian noise in the factors connecting a chain of variables, with all but the most recent variable marginalized out. It’s a well-known fact that linear, Gaussian factors yield a closed-form expression for the optimal maximum a posteriori estimate; the Kalman filter exploits this very special case. You can also write an LQR down as a factor graph (as the parent commented, the KF and LQR are duals).
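
For anyone who wants to see that special case concretely, here’s a minimal one-step sketch in plain NumPy (the model matrices A, H, Q, R and the function name are my own illustrative choices, not any particular library’s API):

    import numpy as np

    # One step of a linear-Gaussian Kalman filter (illustrative sketch).
    # Assumed model: x_k = A x_{k-1} + w,  w ~ N(0, Q)   (dynamics)
    #                z_k = H x_k + v,      v ~ N(0, R)   (measurement)
    def kalman_step(x, P, z, A, H, Q, R):
        # Predict: push the Gaussian belief through the linear dynamics.
        x_pred = A @ x
        P_pred = A @ P @ A.T + Q
        # Update: condition on the measurement z. Because every factor is
        # linear and Gaussian, this closed-form update is the exact
        # posterior (and its mean is the MAP estimate).
        S = H @ P_pred @ H.T + R              # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

Carrying only (x, P) forward between calls is exactly the marginalization of all but the most recent variable.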


That's the same as the first point in the parent comment's list: a factor graph is a visualisation of the conditional probability distribution. But yes, it is very helpful to draw out the factor graph (or Bayesian network) for the Kalman filter, probably more useful than just writing out the equations.


By the way (as if my original comment above isn't already nitpicky enough, this is even worse...):

It bugs me when people use the word "optimal" in the Gaussian / Bayesian formulation. As the top-level comment above says, if you assume the various prior and conditional distributions are Gaussian, then the posterior distribution is Gaussian too. This is not optimal, it's exact, just as you wouldn't say x=2 is the optimal solution to x+1=3.

It is the optimal solution in the quadratic optimisation formulation, as the top-level comment also correctly said.
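
To make the "exact" point concrete, here is the scalar conjugate-Gaussian computation (the notation is mine, not from the thread):

    x \sim \mathcal{N}(\mu_0, \sigma_0^2), \qquad z \mid x \sim \mathcal{N}(x, \sigma^2)
    \implies x \mid z \sim \mathcal{N}\!\left( \frac{\sigma^2 \mu_0 + \sigma_0^2 z}{\sigma_0^2 + \sigma^2},\ \frac{\sigma_0^2 \sigma^2}{\sigma_0^2 + \sigma^2} \right)

Nothing is being optimized; the right-hand side simply is the posterior. It happens to coincide with the MAP / least-squares answer, which is why the word "optimal" gets attached.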


I'm not a mathematician at all (mechanical engineer), but to me, "exact" sounds like "deterministic", as opposed to "stochastic".

I thought "optimal" conveyed the idea of "literally the best possible solution, but you're still in the presence of a fully random system here".

Which might be the wrong interpretation, but hopefully it explains why some people (who aren't necessarily familiar with rigorous mathematics) use optimal.


I do see your point. But if you're talking about a probability or probability distribution, it can still be an exact solution to a model. For example, if I throw two standard dice, what is the probability of throwing two sixes? The answer is 1/36. To me, it sounds odd to describe 1/36 as the "optimal" solution to that problem, even though it's stochastic. Even "exact" solution is a bit odd, I'll concede, but a lot less so. "The solution" or "the answer", with no more qualification needed, sounds best to me.


That's a known probability distribution. The "optimal" is about being the minimum-variance unbiased estimator for an unknown probability distribution.


1/36 is indeed the optimal estimator of the expected frequency of two sixes; it's not odd at all.


1/36 is an estimate; it is not an estimator at all. An estimator is a formula based on data from the rolls.


It's an estimator in this case. A fixed number is an estimator too; it's just not going to be desirable in most cases. But single numbers are absolutely and unquestionably valid estimators.

In any case, what I'm trying to get at here is that in estimation theory there is a concept of optimality for an estimator over a distribution.
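
To illustrate with a quick hypothetical simulation (the setup and names are mine): the constant 1/36 is a perfectly valid estimator, and when the dice really are fair it even beats the empirical frequency in mean squared error, precisely because it secretly encodes the true parameter:

    import numpy as np

    rng = np.random.default_rng(0)
    p_true = 1 / 36        # fair dice: true probability of a double six
    n_rolls = 1000         # pairs of dice rolled per experiment
    n_experiments = 10_000

    # Estimator 1: the constant 1/36, which ignores the data entirely.
    const_est = np.full(n_experiments, 1 / 36)

    # Estimator 2: the empirical frequency of double sixes in the data.
    double_six = rng.random((n_experiments, n_rolls)) < p_true
    freq_est = double_six.mean(axis=1)

    # Mean squared error under the true (fair-dice) distribution.
    print("MSE, constant 1/36 :", np.mean((const_est - p_true) ** 2))  # exactly 0
    print("MSE, empirical freq:", np.mean((freq_est - p_true) ** 2))   # ~ p(1-p)/n

Change p_true to a loaded-dice value and the constant's MSE becomes a fixed squared bias while the empirical frequency still converges, which is why a data-ignoring estimator is rarely the one you want.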


Sure, a single number is a trivial example of a procedure/formula.

But an estimator estimates an unknown parameter from data (or, in such a trivial case, possibly without data), and I believe this is central to the confusion.


Oh yes, of course.


Also, why run all of Microsoft’s spyware and deal with their forced updates just so you can get through to a slightly slower and less compatible version of Linux?


Simple answer: being forced to use Windows as the primary OS by corporate policy.


This website never fails to amuse me…

What is the intention of this comment? That Debord is “far right” or a reactionary? That people who read him are? That it’s interesting that a Marxist philosopher is read by reactionaries?


Just giving context as to why Debord is a part of the zeitgeist these days. And The Society of the Spectacle is certainly a reactionary text; no surprise it is read by other reactionaries.


An explanation for recent popularity. I was confused out of my mind when the entire country started talking about Saul Alinsky all of a sudden, until I heard that an alcoholic right-wing crackpot was screaming about him while drawing on a blackboard nightly on television.

This happens a lot if you read books that aren't bestsellers. Something that for decades you couldn't find a single person who had heard of suddenly shows up in television commercials and image macros, because a character mentioned it in a movie, or some Twitch streamer decided to rail about it for an hour, and it caught fire.


I am about the age of the people this article is talking about. This is laughably out of touch. Take a look at the right communities on TikTok or Instagram or Discord (they’re not small) with people aged around 13-20 and you’ll see that Marx to Mao to Deleuze to Fisher are alive and well in the minds of today’s young people. There are people my age who are reactionaries, but there are more who want out of the late-capitalist hellscape that’s been foisted on us.


I’m not even sure where to start with this because it’s such a vulgar misreading of Hegel… Hegel generally rejected the “thesis, antithesis, synthesis” triad (which mostly comes from Fichte). Dialectics, and his overall system of immanent critique, have considerably more subtlety and complexity than the determinism and teleology that you suggest. Even a cursory reading of any of Hegel’s major works reveals how entirely misplaced most of your post is. I agree that a lot of Hegel’s political philosophy is marked by naivete, but this doesn’t mean his system can be rejected wholesale. That we should avoid this wholesale rejection in the face of apparent contradictions is actually one of Hegel’s basic points!


"vulgar" is a much classier insult than I'm used to here


You must know you're advancing a recent idea about Hegel? I believe your viewpoint is only about fifty years old, and few observers made these claims before that. Now, as popular as this 'recovery' of Hegel has been, it has been strongly argued against, with very manifest arguments from the major Hegelian works. I guess it could be that people misinterpreted Hegel for a century and then we managed to figure him out correctly in the postwar era... but what I'm convinced actually happened is that Hegel was so popular and influential that, after his dialectical methods became seen as silly, a hasty revision was developed in order to preserve the myriad fields that had been founded by their use, so that we didn't have to throw away all that his ideas had spawned.


It’s good to have something like this. Last time I worked with libav, the only good documentation available was the official examples and the source code of ffmpeg (the tool) itself, along with the comments in the headers.


Yep. Animals are the most common way that diseases are introduced to humans, and industrial animal agriculture brings these animals into close proximity with one another and with us. Removing animal products from our diets not only mitigates this huge disease reservoir but also removes a huge source of greenhouse emissions, land use, pesticide use (for feed), and animal and labor exploitation (the meatpacking industry in the United States is extremely exploitative).

