There is no way to overestimate the impact of Dr. Dongarra and LINPACK. If you're active in the high performance computing world you know them both quite well. Well done Jack!
Linpack helped me solve a problem 15 years ago that is intractable in general, and kick-started my career shortly after I got my degree in CS. It turned out that the real-world instances of the problem were indeed tractable, but that was not revealed until I tried Linpack. Needless to say, I am very grateful, and I am very happy to see Dongarra get this well-earned award.
Ever since I started using R and Pandas/NumPy, I always wondered who wrote and maintained all those linear algebra libraries. Embarrassingly I didn't know until now.
There's a very good chance that you're actually using third-party modern high-performance reimplementations of these libraries, like MKL or OpenBLAS. The original Netlib libraries are often a compatibility fallback. For example, the Windows installer of Octave asks you which one you want to use.
While I do think the parent commenter’s callout was unnecessary, it’s also true that if someone’s pronouns deviate from the norm, you’ll find them, meaning not finding them likewise has high signal.
In the absence of knowledge, you can guess or you can use the default ("they").
There are a few ways to read mpfundstein's post, and I was responding to those on the more charitable end, such as "I personally know the author, who goes by 'he'" or "'Austin' is clearly a dude's name".
I visited Tennessee once, walked by a bar and saw a sign saying «Jack lives here». Fun story.
Also, well deserved. I still recall the one talk by Jack Dongarra I ever saw. He threw out there that the iPad 2 had (at the time) the highest performance-per-watt CPU in the world. I’ve waited for the «Apple M1 moment» ever since, and now, suddenly, here it is. Fantastic talk covering the whole HPC topic!
Sadly he's not some kind of local hero. The area much prefers sports to academics. I studied CS there for undergrad and knew of him but rarely heard his name mentioned or celebrated.
I mean, if you asked random passersby in Stanford (my alma mater) about Donald Knuth, 4 out of 5 wouldn’t have heard of him. Or 9 out of 10, more pessimistically. Approximately nowhere are academics hailed as local heroes.
> Dongarra led the field in persuading hardware vendors to optimize these methods, and software developers to target his open-source libraries in their work
I’ve got a gut feeling that this persuasion work might have been far harder and taken far longer than all the math.
I also wonder how much of the work was done collectively by a team rather than as individual contributions.
"The whole math" in the libraries in question is several centuries of work by history's greatest minds, such as Newton and Gauss, including long decades of blind alleys like quaternions, so it depends on what you mean by "harder and longer". Surely Gauss wouldn't've swung much weight in the IBM boardroom, but Sam Palmisano couldn't've made much progress on solving large systems of linear equations either.
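To make "solving large systems of linear equations" concrete: the core algorithm at the heart of LINPACK/LAPACK is Gaussian elimination with partial pivoting, which goes back to Gauss and earlier. Here is a minimal pure-Python sketch for illustration only; the real libraries implement blocked, cache-aware variants of the same idea (e.g. LAPACK's `dgetrf`/`dgesv`).

```python
# Toy Gaussian elimination with partial pivoting -- the centuries-old
# core algorithm that LINPACK/LAPACK implement in heavily optimized form.

def solve(A, b):
    """Solve A x = b for a small dense system (lists of lists / list)."""
    n = len(A)
    # Build the augmented matrix [A | b] so rows carry b along.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        # Partial pivoting: swap in the row with the largest |pivot|
        # to avoid dividing by a tiny (or zero) pivot.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate entries below the pivot.
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    # Back substitution.
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        s = sum(M[k][j] * x[j] for j in range(k + 1, n))
        x[k] = (M[k][n] - s) / M[k][k]
    return x

# Example: 2x + y = 5 and x + 3y = 10  ->  x = 1, y = 3
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```

The hard part Dongarra's libraries solve is not this textbook loop nest, but making it fast and numerically robust on every machine ever built.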
This man is indirectly responsible for Python becoming the lingua franca of data science. Without Scipy/numpy wrapping his libraries, Python would not have achieved the success it has today.
Actually, all of the scripting languages developed packed arrays and BLAS bindings around the same time -- Python with numpy, Ruby with narray, Perl with PDL, and I think Common Lisp and some of the Schemes did, too. Python won that race for other reasons, I think mostly because its syntax looked pretty.
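For readers unfamiliar with what a "packed array" means here: a toy illustration using only Python's stdlib `array` module (not NumPy itself). The point is the contiguous memory layout that BLAS kernels require, versus a plain list of boxed Python objects; NumPy's ndarray is the same idea plus shape and stride metadata.

```python
# A stdlib packed array: doubles stored contiguously, 8 bytes each.
from array import array

xs = array("d", [1.0, 2.0, 3.0, 4.0])

print(xs.itemsize)          # bytes per element (8 for a C double)
print(xs.buffer_info()[1])  # number of elements in the buffer (4)

# A contiguous buffer like this can be handed directly to C/Fortran
# code (e.g. via ctypes) without copying element by element -- which
# is exactly what the BLAS bindings in these languages do.
```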
I'm not so sure that the underlying libraries have much to do with it, compared to the well-designed interfaces of Scipy/Numpy and Python's expressive syntax and batteries-included standard library.
It's a two-way street. 100% agree that python lent itself to being a great wrapper language - but there has to be something valuable to wrap for python to be useful.
Ah, I personally preferred GotoBLAS to LINPACK, as it was faster (keep in mind this was a few years ago when I did research; no idea how it has kept up). I also enjoyed the story of how an unknown Japanese student, Mr. Goto (cool name), outdid all the top linear algebra implementations by hand-writing everything in assembly. It was an amazing story.
This was when everyone said assembly was dead.
I'd guess that previous machine-specific BLAS used assembly. The fundamental feature of the GotoBLAS approach is the blocking structure; that can provide >80% of the performance of hand-tuned assembly/intrinsics-based kernels with plain C in BLIS with vectorizing compilers. Not to minimize Goto's work, but I wonder if van de Geijn, who I think was Goto's supervisor, deserves more credit than he gets.
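The blocking structure mentioned above can be sketched in a few lines. This is a toy pure-Python version (no actual speedup here; the point is only the loop structure): compute C = A·B one small block at a time so each inner loop's working set fits in cache. The block size `bs` is an arbitrary illustrative choice.

```python
# Toy illustration of the cache-blocking idea behind GotoBLAS/BLIS.

def blocked_matmul(A, B, bs=2):
    n, m, p = len(A), len(B), len(B[0])
    C = [[0.0] * p for _ in range(n)]
    for ii in range(0, n, bs):          # block row of C
        for jj in range(0, p, bs):      # block column of C
            for kk in range(0, m, bs):  # block of the shared dimension
                # Micro-kernel: a bs x bs x bs update. In GotoBLAS this
                # innermost part is the hand-tuned assembly; everything
                # around it is "just" the blocking structure.
                for i in range(ii, min(ii + bs, n)):
                    for j in range(jj, min(jj + bs, p)):
                        s = C[i][j]
                        for k in range(kk, min(kk + bs, m)):
                            s += A[i][k] * B[k][j]
                        C[i][j] = s
    return C

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
print(blocked_matmul(A, B))  # same result as an unblocked multiply
```

The `min(...)` bounds handle matrices whose dimensions are not multiples of the block size, so the result is identical to a plain triple loop.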
That's the first time I've heard his name. For people in the same situation as me, from the article:
> For over four decades, Dongarra has been the primary implementor or principal investigator for many libraries such as LINPACK, BLAS, LAPACK, ScaLAPACK, PLASMA, MAGMA, and SLATE.
My CS professor at MTSU, Ralph Butler (who helped write the MPD process manager used with MPI), was good friends with Jack; IIRC Butler brought him in for a talk. Very knowledgeable in all things HPC.
I think LAPACK and later work were much more important than his work in the 80s. But those were intensely collaborative and required enormous amounts of collective work. Well deserving of the Turing, but how can it be attributed to one person?
One person gets the award each year. Gotta find somebody who deserves it. Fortunately the problem is in choosing who among the deserving gets it this year, not in identifying anybody who might.
The field has gotten much much bigger since then. There were so many foundational contributions to the field of computer science that we’re still recognizing them today. There are subfields within subfields of computer science that attract as many researchers as were in all of computer science fifty years ago (think about some of the ML subfields like graph neural networks or style transfer for great examples of this).
This means that new researchers not only have a lot more to learn (and also bigger shoulders to stand on), but also that it’s a lot harder to make your research generally applicable across enough of the breadth of computer science.
That’s not to say that the work that Knuth performed wasn’t worthy of the Turing award, but we’re in a world now where you could easily be 37 and only recently have been awarded tenure (if you’re even lucky enough for that), making him an outstanding exception.
I mean, the prize for passing the actual practical test is now you're a person.
I like being a person, but it seems like some people I know don't so much, so maybe whether this is a good prize is a matter of opinion.
But yes, people have offered prizes for numerous toy protocols similar to Alan Turing's "parlour game" idea. Many cognitive scientists doubt this is an effective protocol for testing personhood, and not all of them in ways I agree with (e.g. Professor Harnad and I disagree vehemently about whether his big-T test is necessary), but clearly at some level "passing" as a person is satisfactory, because that's all everybody else is doing.