And maybe the only person to ever successfully troll Linus Torvalds and then get an apology from him (ok just an excuse to link the epic debate in [1]).
Thanks Mr. Tanenbaum, your various works have been a huge inspiration as well as incredibly interesting to read or tinker with.
"Don`t get me wrong, I am not unhappy with LINUX. It will get all the people who want to turn MINIX in BSD UNIX off my back. But in all honesty, I would suggest that people who want a MODERN "free" OS look around for a microkernel-based, portable OS, like maybe GNU or something like that."
It's easy to be smug now knowing how things turned out, but lots of really brilliant people agreed with Professor Tanenbaum at the time.
The difficulties that microkernel projects encountered were not easy to forecast and took virtually everyone by surprise.
It was in the spirit of progress towards better ways of architecting software that Tanenbaum and Stallman (as well as many others) chose to try a new architecture rather than just build yet another monolithic OS kernel. Being on the pointy end of technology means you often end up being the one to discover what doesn't work.
Ya... part of the design goal is that a microkernel as envisioned by AST should be able to have interchangeable userlands, including multiple different userlands running at the same time. So in that sense, AST was spot on.
"My point is that writing a new operating system that is closely tied to any particular piece of hardware, especially a weird one like the Intel line, is basically wrong. An OS itself should be easily portable to new hardware platforms. When OS/360 was written in assembler for the IBM 360 25 years ago, they probably could be excused. When MS-DOS was written specifically for the 8088 ten years ago, this was less than brilliant, as IBM and Microsoft now only too painfully realize. Writing a new OS only for the 386 in 1991 gets you your second 'F' for this term. But if you do real well on the final exam, you can still pass the course."
Au contraire, Tanenbaum was 100% right. Porting Linux to the first new architectures turned up a ton of hard-to-fix dependencies on x86; fortunately, the UNIX system call interfaces were all copied from more mature instances and lived on unaffected.
Heh, I can explain that. :) I wrote Prof. first but then thought it sounded like I had actually attended his classes, so I removed it. Then, in Spanish there is no different wording for PhD vs MD, so we normally only use "doctor" as a title for physicians, or in extremely formal circumstances for other academics. I have always known him as "Andrew Tanenbaum" since the late 80s, so I thought Mr. would be a proper form to show respect.
"I still maintain the point that designing a monolithic kernel in 1991 is a fundamental error. Be thankful you are not my student. You would not get a high grade for such a design :-)"
Since every microkernel in use (Windows NT and XNU, really) has taken much of its code monolithic, I think this is the true legacy of Tanenbaum's career.
Flamebait at its worst, but acceptable because he was a respected member of the academic elite.
If you think that this is the bulk of AST's legacy, or that real microkernels aren't in general use, you really should get out and learn a bit more about computer science.
And in all honesty, Linux has adopted a number of elements over the years that would not have flown with the original fully monolithic kernels.
No, we don't have userspace drivers. (Apart from proprietary GPU drivers and a number of enterprise hardware systems .. and at least Canon printer drivers, and and and and ...) They're not isolated in the sense that microkernels would enforce, but they provide a shim for the kernel and then do a lot of their processing in userland.
We can load and unload drivers at runtime. That's what insmod/rmmod do.
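To make that concrete, here is a minimal sketch of a loadable module; the names (hello_mod, hello_init, hello_exit) are my own and a standard kbuild setup is assumed. Its init/exit hooks are exactly what insmod and rmmod trigger:

    /*
     * hello_mod.c -- toy illustration of runtime loading/unloading.
     * (Hypothetical example; assumes a standard kernel build environment.)
     */
    #include <linux/module.h>
    #include <linux/kernel.h>
    #include <linux/init.h>

    /* Called when the module is loaded with `insmod hello_mod.ko`. */
    static int __init hello_init(void)
    {
        pr_info("hello_mod: loaded\n");
        return 0;   /* returning non-zero would make insmod fail */
    }

    /* Called when the module is unloaded with `rmmod hello_mod`. */
    static void __exit hello_exit(void)
    {
        pr_info("hello_mod: unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);

    MODULE_LICENSE("GPL");
    MODULE_DESCRIPTION("Toy module illustrating insmod/rmmod");

Build it with a two-line kbuild Makefile (obj-m += hello_mod.o), then insmod hello_mod.ko and rmmod hello_mod; dmesg shows the two messages.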
We don't do message passing between kernel components, but in order to prevent messaging from becoming a bottleneck we have signaling mechanisms, we have netlink, and we have $deity knows what else.
We DO have userspace filesystems: FUSE. I'm still waiting for userspace block device drivers but that's probably not going to happen. :)
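For anyone who hasn't seen it, here's roughly what a tiny FUSE filesystem looks like. This is a hedged sketch of my own (the file name hello.c, the "/hello" path and its contents are made up), assuming libfuse 2.x, exposing one read-only file entirely from userspace:

    /* hello.c -- minimal read-only FUSE filesystem (illustrative sketch). */
    #define FUSE_USE_VERSION 26
    #include <fuse.h>
    #include <sys/stat.h>
    #include <string.h>
    #include <errno.h>

    static const char *hello_str  = "Hello from userspace!\n";
    static const char *hello_path = "/hello";

    /* Report a directory for "/" and a small regular file for "/hello". */
    static int hello_getattr(const char *path, struct stat *st)
    {
        memset(st, 0, sizeof(*st));
        if (strcmp(path, "/") == 0) {
            st->st_mode = S_IFDIR | 0755;
            st->st_nlink = 2;
        } else if (strcmp(path, hello_path) == 0) {
            st->st_mode = S_IFREG | 0444;
            st->st_nlink = 1;
            st->st_size = strlen(hello_str);
        } else {
            return -ENOENT;
        }
        return 0;
    }

    /* List the single file in the root directory. */
    static int hello_readdir(const char *path, void *buf, fuse_fill_dir_t filler,
                             off_t offset, struct fuse_file_info *fi)
    {
        (void) offset; (void) fi;
        if (strcmp(path, "/") != 0)
            return -ENOENT;
        filler(buf, ".", NULL, 0);
        filler(buf, "..", NULL, 0);
        filler(buf, hello_path + 1, NULL, 0);
        return 0;
    }

    /* Serve the file's contents from the in-memory string. */
    static int hello_read(const char *path, char *buf, size_t size, off_t offset,
                          struct fuse_file_info *fi)
    {
        size_t len = strlen(hello_str);
        (void) fi;
        if (strcmp(path, hello_path) != 0)
            return -ENOENT;
        if ((size_t) offset >= len)
            return 0;
        if (offset + size > len)
            size = len - offset;
        memcpy(buf, hello_str + offset, size);
        return (int) size;
    }

    static struct fuse_operations hello_ops = {
        .getattr = hello_getattr,
        .readdir = hello_readdir,
        .read    = hello_read,
    };

    int main(int argc, char *argv[])
    {
        /* fuse_main mounts the filesystem at the mountpoint given in argv. */
        return fuse_main(argc, argv, &hello_ops, NULL);
    }

Compile against the fuse library (e.g. gcc hello.c $(pkg-config fuse --cflags --libs) -o hellofs), run it with an empty mountpoint directory, and cat mountpoint/hello is served without a single line of in-kernel filesystem code.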
What do we have with Linux then? Not a microkernel - just a very modular and runtime-modifiable mostly monolithic kernel instead. A hybrid? Ish?
I think you have to straight up call Linux a monolithic kernel. It's got some nifty modularization features, but none of them (other than maybe FUSE) are really what the microkernel folks are going for. And that's fine. Both approaches have obviously evolved to fill niches the other doesn't satisfy as well. What's clear is that neither approach is a panacea for all compute needs.
He's written several books on operating system design that are considered by many to be excellent resources; he's taught a lot of people how an OS works, beyond the decision of microkernel vs. monolithic. He's a respected professor, seen as an authority on the topic, his less-popular opinion on microkernels notwithstanding. Most OS's today may have gone in the monolithic direction, it is true, but as other commenters have pointed out, Linux is actually very dynamic and modular, and Minix 3 has some very advanced functionality. All of this validates the vision he had for microkernels, even if it didn't pan out to be commercially optimal.
Xen, Minix 3, QNX, several L4 implementations, etc. OKL4 alone has been shipped on over 1.5 billion devices. QNX was huge. Xen is running everything at AWS, the largest web hosting company in existence (never mind everywhere else it is used).
QNX is huge. If you took QNX out of the world, the world would stop functioning just about immediately. So much stuff runs on it that you typically wouldn't even guess. In that sense it is just like other soft real-time/hard real-time OS's: one of the unsung success stories of IT, simply because it works so well it tends to disappear.
Most things that use QNX or other OS's like that inside simply work, rather than requiring constant upgrades and bug fixes. Reliability by design is so much better than reliability by trial and error, and the OS is a huge factor in that. It's a world where bugs are felt as 'egg on your face' rather than a 'wontfix'.
No...based on your reaction I think I phrased it pretty much spot on.
I added a new question to my interview process a number of years ago. Anyone who puts "computer science" on their resume gets asked "What do you think of Andrew Tanenbaum?" It's a free-form question intended to see what about computer science interested them enough to remember; I have similar questions about Turing & Knuth. Our team requires a lot of broad knowledge and original thought.
If they don't know who AST is, the interview is probably over. We continue if they can discuss pretty much any of the 6-7 seminal, award-winning CS textbooks he wrote, the other major projects he led like the Amsterdam Compiler Kit or the Amoeba distributed operating system, or the other contributions he made to networking, distributed systems, languages, computer architecture, operating systems or information security (he did publish nearly 200 journal papers over 43 years as a professor). If they know about Electoral-vote.com, bonus points.
If all they can come up with is "Minix" and "that pissing contest with Linus", then I might see if the Linux devops guys have an opening. If they're that incurious, they'll do fine there; those guys think the world begins and ends with Linux, too.
Continuing in the "let me Google that for you" vein, both QNX and various L4-family microkernels are in use in a variety of embedded systems; QNX is also in the new BlackBerry products. There are a number of very mature security-oriented research microkernels (like seL4 and K42) that could very well show up in commercial products eventually. But that's back to needing to know more about computer science than Windows and MacOS.
> If all they can come up with is "Minix" and "that pissing contest with Linus", then I might see if the Linux devops guys have an opening. If they're that incurious, they'll do fine there; those guys think the world begins and ends with Linux, too.
Do you actually believe that someone is incurious simply because they don't share your own interest in Tanenbaum? Perhaps they've focused their curiosity on one of the many other luminaries in CS, or perhaps they're more interested in the topics themselves than the personalities behind them.
Your contempt for your own devops team is also disquieting. Based on your comment your company sounds like a toxic place to work.
That's why we ask the question about 3 people in broadly different areas. Frankly, if you don't recognize at least one of those names and understand the foundational contributions they made, then yeah I'd call that a kind of incurious.
As to devops, you may think whatever you want. I give them shit about the "all the worlds Linux" attitude, they give me shit about "fucking research projects" (e.g. anything that isn't Linux). We understand our respective views, and it works.
> That's why we ask the question about 3 people in broadly different areas. Frankly, if you don't recognize at least one of those names and understand the foundational contributions they made, then yeah I'd call that a kind of incurious.
Well that's a lot more reasonable! Your original comment left no ambiguity that candidates insufficiently knowledgeable about Tanenbaum would be shunted over to devops.
> As to devops, you may think whatever you want. I give them shit about the "all the worlds Linux" attitude, they give me shit about "fucking research projects" (e.g. anything that isn't Linux). We understand our respective views, and it works.
That could be the basis of some good-natured ribbing, which would be OK. What's not OK for a healthy company culture is the suggestion that devops people are inherently incurious, and the strong whiff of intellectual elitism which came across in your original comment.
I'm totally with you that questioning Tanenbaum's legacy is pretty poor form, but your interview questions sound designed to filter out anyone who doesn't share your exact interests, which is a real shame. A better follow-up than ending the interview when a candidate doesn't know who he is would be to describe his achievements (as you did here) and then ask the candidate to tell you what they know about someone else interesting whom you may or may not already know all about.
This has nothing to do with cultural bias. It's just basic CS stuff that anyone with "CS" in their resumes should know about. Heck, even undergraduate students will probably have "AST" tattooed inside their brains in the first semester alone.
I'm sorry to be picking on you but this is one of the things that is absolutely wrong in our field: we don't learn anything from history. We don't know what was being researched in the 70's and proceed to reinvent the wheel over and over thinking we somehow have magical brains that are unearthing some concepts for the first time in human history.
The traditional CS curriculum should adopt a mentality of "ok, you now understand at which point in history we are in CS? Know most of the past inventions? Fine, now proceed to build on top of them and stop wasting everybody's time with your rediscoveries".
I don't think you're picking on me, and I wasn't trying to suggest that such a question is cultural bias. What I meant is just that people in general tend to think the things they know about are the most interesting things, and that people who don't know about those things are deficient. But by definition they can't know about things they don't know about, which may be just as interesting. So my proposed replacement question just acknowledges and tries to work around that phenomenon. I doubt it is actually critical for people to be super familiar with Tanenbaum's work specifically; it is just an indirect way of assessing intellectual curiosity and CS chops, which I think my question would also achieve.
I pretty much agree with everything else you said, and I wish I knew more about the history of computing myself, since I've lost a lot of my memory of my college course on it to the sands of time. I wonder if there's a good survey book. Maybe AST wrote one...
Another good interview technique is "did you actually read what I said". Things like when I said it's an open-ended question, that it asks about 3 of the seminal minds in CS (at least one of whom a CS graduate would have bumped up against), and that the candidate can focus on whatever they want to. Just a tip for your next interview.
> If they don't know who AST is, the interview is probably over. We continue if they can discuss pretty much any of the 6-7 seminal, award winning CS textbooks he wrote, the other major projects he lead like the Amsterdam Compiler Kit or the Amoeba distributed operating system, or the other contributions he made to networking, distributed systems, languages, computer architecture, operating systems or information security (he did publish nearly 200 journal papers over 43 years as a professor).
> Continuing in the "let me Google that for you" vein, both QNX and various L4 family microkernels are in use in a variety of embedded systems; QNX is also in the new Blackberry products. There's a number of very mature security oriented research microkernels (like L4se and K42) that could very well show up in commercial products eventually. But that's back to needing to know more about computer science than Windows and MacOS.
Let's be fair here. Your claim was that microkernels are "in general use". Ongoing research, however mature, does not support this claim. And Blackberry is hardly the heavy hitter they used to be. Meanwhile, the major OSs for computers as computers -- and as phones -- have backed away from the microkernel design. Maybe they shouldn't have; regardless, they did.
That leaves embedded systems. And there you have a point. So: microkernels are in common use in embedded systems. But let's not overstate their successes.
Let's be fair here, embedded systems are computers like any other, and there are far more of them than there are regular computers. If you walk into any slightly larger production plant and take QNX and the other soft real-time or hard real-time controllers out, that plant becomes so much scrap metal.
Besides, QNX works very well on PC hardware and is used extensively in the communications industry.
Please do not take your own limited exposure to the world of IT as proof that certain things are true, especially when they are emphatically not. I know of several thousand QNX installs within 10 km from where I'm sitting.
Denying the success of microkernels such as QNX by disqualifying their applications is like claiming Linux is a failure by excluding mobile devices.
Let's be fair here. If you get to redefine "computer" to exclude embedded systems, even though they vastly outnumber "computers as computers" (whatever that really means), then you get to be right. But really all that does is show your limited view of the field. For example, QNX runs tens of millions of cars alone, and who knows how many Cisco routers running IOS-XR. If that is "overstating" success, I'm not really sure what success looks like.
> Let's be fair here. If you get to redefine "computer" to exclude embedded systems, even though they vastly outnumber "computers as computers" (whatever that really means), then you get to be right.
Actually, I don't think I disagreed with you, except to note that a research kernel that might be used someday does not count as "general use".
You might count the insinuation of overstatement as a disagreement. The point of that is that context matters. When I talk about choice of OS, the Mac in my living room has rather more weight in my mind than the embedded controller in my garage; I know I'm not alone in this. So if we say only that microkernels are heavily used, then we are correct, but we will be misunderstood. It is better, I think, to make a statement that is both correct and understandable, than to make one that is merely correct, while looking down on those who misunderstand.
EDIT: A quote[1] from me, giving an example from a rather different topic:
> If I open up a restaurant that serves General Tso's chicken and chop suey and sweet and sour pork and fortune cookies, and I advertise that I serve "American food", then my description is accurate, but my customers will be confused.
> When I talk about choice of OS, the Mac in my living room has rather more weight in my mind than the embedded controller in my garage
This is pretty much exactly what I'm selecting against. It's not that your concept of "general use" in computer science excludes embedded systems (frightening, considering you're apparently teaching this stuff). It's that when more than one poster tells you that you're wrong, and provides concrete examples of why, your response isn't "that's something I need to consider" or "perhaps my knowledge of the field isn't what I thought it was" or, best of all, "I've got more to learn". Nope, you decide the "context" of the discussion is whatever you want it to be and trot out a contrived bit of sophistry which boils down to "I might be wrong, and I'm not saying I am, but because lots of other people would be wrong, I get to be right". Or something. Doesn't matter. We weren't opening a restaurant.
Double bonus points for focusing on a throwaway, tangential comment and pretending it's a central flaw of argument. This clearly isn't your first specious Internet argument.
I can't really emphasize enough how well-written his books are. I picked up some of Tanenbaum's books early in high school (specifically: Structured Computer Organization, Modern Operating Systems, and Computer Networks). They were so well-written and engaging that I could hardly take my eyes off of 'em. They really cultivated an interest in and love for Computer Science in me.
I always liked how he would crack the occasional joke in his books. "Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway."
"It would be nice if we could just plug a crystal ball into a free PCI slot to help out with the prediction, but so far this approach has not borne fruit. Lacking such a peripheral, various ways have been devised to do the prediction. [...]"
It amazes me that the majority of computer related books I've read are really not fit for human consumption. It's really refreshing to see that he understood this and made his work and lectures - I've seen a few on YouTube - as engaging and entertaining as possible while still being clear and concise.
Code is for computers, but in the end, programming is for humans.
The witty humor scattered all over in his books really did make them a lot more fun to read! :-)
I remember him describing Intel's primary design consideration in developing the Pentium processor series as being "1. Backwards compatibility. 2. Backwards compatibility. 3. Backwards compatibility."
On another note, Linus Torvalds described "Operating Systems: Design and Implementation" as "the book that launched me to new heights".
Or explaining the OSI network protocol stack by two philosophers talking about their love for rabbits in different languages via translating secretaries: "Ik hou (sic) van konijnen" and "J'aime les lapins".
Jokes don't have to be false — it's actually in vogue these days for jokes to be accurate observations of reality stated in funny ways. For example, Louis CK's joke about the unfairness of banking fee structures:
Ever get so broke that the bank charges you money for not having enough money? The bank calls up and says "Hi, we're calling because you don't have enough money." And you go, "I know!" So they say, "You have insufficient funds." "Yep, that's a good way of putting it too, I'd agree with that. I find my funds to be grossly insufficient." "You only have $20, so we're going to have to charge you $15." Fifteen dollars, that's how much it costs to own $20. But here's the fucked-up part — now I only have five. Now I don't even have the money that I paid to have.
I think Tanenbaum's joke is a good example of this sort of humor.
My small Tanenbaum story. When Prof. Tanenbaum came to speak at my college I showed up a bit late to a very packed room.
Many people were standing in the back, but it looked like there was one seat left in the first row, a few seats in. I walked up, forced everyone to stand, and politely asked them to move so I could use the last seat. Some crusty old professor didn't hear me, so I had to explicitly ask him to move.
About 1 minute later the crusty old professor stood up, walked back behind me and started his speech :). Oops. All my classmates asked me afterward WTF was wrong with me...
Reminds me of when I was at Dark Carnival in Berkeley browsing books. There was this comfy chair in the middle of the room, so I sat down to read a little of the book I was browsing. It was something by Poul Anderson, from the display next to the chair.
So, I'm sitting there reading and a crowd starts to gather. I look up and a bunch of people are looking at me. There's this older gentleman closer than the others. Then, it dawned on me, it was Poul Anderson. Oops. There to sign copies of his new book.
My MOS book was stolen from my car when I went to college. I like to think a drug addict somewhere will go on to develop the next great operating system.
I put a few outdated textbooks outside with the trash in Amsterdam, and when I came back after getting groceries, there was this homeless guy sitting on the curb, enthralled by Structured Computer Organization.
When I was sixteen, I attended open house classes for computer science for both universities in Amsterdam, the UvA, and the VU. At the UvA, they let us draw Mickey Mouse in Pascal on an ancient Mac.
At the VU, Andy Tanenbaum gave a lecture on operating systems, and handed out floppy disks with Minix at the end. All I knew then was DOS. Mind blown. Thank you ast!
In the US, university administrations encourage even distinguished professors to retire in order to hire new faculty. It allows them to hire another distinguished professor who will bring in more research money and quality students. Full professor slots are very limited even at the richest US colleges. A Supreme Court decision a couple of decades ago prevents forced retirement of faculty at any age, but the administration guilts the professor about hurting department prestige. And the retiring professor can often negotiate some aspects of his replacement. It resembles the debates about when US Supreme Court justices, who also have lifetime tenure, should retire.
Some context: Andy Tanenbaum was born in March 1944, so he's already 70. That's five years past the common retirement age for his age group in The Netherlands (assuming that he paid his social taxes here).
So, in every respect a well-deserved retirement... and I guess that retired or not, it won't stop him contributing to the CS field. Well, he will have fewer Ph.D. students, so more time for writing, programming and lecturing!
So is this the type of OS book that gives technical people fantastic knowledge about how their computers really work, or is it aimed mostly at those who want to move into OS development?
Operating Systems: Design and Implementation, the Minix book that is sort of famous for inspiring Linus Torvalds, is more targeted at those who want to move into OS development.
Modern Operating Systems is an amazing book but it works at a somewhat higher level. You'll basically read about stuff that happens, or might happen, in an operating system. You'll learn about threading, the elevator algorithm, dining philosophers, how virtual memory works, how a memory allocator works, and a bunch of other stuff.
It's a very well written textbook. The thing that struck me was the way it introduces things at a theoretical or high level and then trusts students enough to present them with real C code and real problems to solve on the next page. Just an excellent textbook.
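To give a flavour of the sort of classic problem the book covers, here is a small sketch of my own (not code from the book) of the dining philosophers, using POSIX threads and avoiding deadlock by always acquiring the lower-numbered fork first:

    /*
     * Toy dining-philosophers sketch (hypothetical, illustrative only).
     * Deadlock is avoided by locking the lower-numbered fork first,
     * which breaks the circular-wait condition.
     */
    #include <pthread.h>
    #include <stdio.h>
    #include <unistd.h>

    #define N 5

    static pthread_mutex_t fork_mutex[N];

    static void *philosopher(void *arg)
    {
        long id = (long) arg;
        int left   = (int) id;               /* fork on the left  */
        int right  = (int) ((id + 1) % N);   /* fork on the right */
        int first  = left < right ? left : right;   /* lower-numbered */
        int second = left < right ? right : left;

        for (int round = 0; round < 3; round++) {
            usleep(1000);                           /* think */

            /* pick up forks in a global order to avoid circular wait */
            pthread_mutex_lock(&fork_mutex[first]);
            pthread_mutex_lock(&fork_mutex[second]);

            printf("philosopher %ld is eating (round %d)\n", id, round);

            pthread_mutex_unlock(&fork_mutex[second]);
            pthread_mutex_unlock(&fork_mutex[first]);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t threads[N];

        for (int i = 0; i < N; i++)
            pthread_mutex_init(&fork_mutex[i], NULL);

        for (long i = 0; i < N; i++)
            pthread_create(&threads[i], NULL, philosopher, (void *) i);

        for (int i = 0; i < N; i++)
            pthread_join(threads[i], NULL);

        return 0;
    }

Compile with gcc -pthread; the ordering trick breaks the circular wait that would otherwise let all five philosophers grab their left fork and starve.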
Find a used one; mine cost $7, albeit an older edition.
It's an extraordinarily well-written textbook. For me there are 2 types of technical books - the ones that I have to plow through with sheer willpower, and the ones that I read kinda like an interesting novel. Tanenbaum's book falls in the latter category.
In my opinion a well-rounded developer needs a good understanding of what happens at the OS level, even if he is never going to write a device driver. Tanenbaum's books are excellent for that.
I have a clear memory of my first Minix boot. It was on an old PC-XT (Intel 8088) with 640KB of RAM and a 10MB hard drive. Coming from MS-DOS, this was pure black magic to me.
I will never forget minix. This tiny unix booted from one floppy and ran on the Atari ST (among others). It even had a C compiler. Tanenbaum built it as an OS education tool.
Minix3 has adopted/is adopting parts of the NetBSD codebase (packages, libc, coreutils, etc etc) and is closer to the *BSD platforms than ever before. It can be considered a BSD microkernel nowadays.
There's still a lot to do and packages to port/features to add, but it's getting there. Since the release of Minix3 (I think it was in 2007) the project has gone more towards the pragmatic and practical world, instead of just academia.
Heh, I don't think parent's point is that minix is the most practical small UNIX option now so much as that it was a big deal for them back in their Atari ST days.
As someone who was really trying to get 386BSD when I was a kid, I had no idea that Minix already ran on my Amiga 500. :( Time-machine me would have loved to run this.
Ah... I really enjoyed the courses by 'ast' at the VU Amsterdam: Computer Architecture, Computer Networks and Modern Operating Systems. With respect for the other teaching staff, but Andy somehow was the modest, funny and inspiring hero of the CS department. The clarity and completeness of the books, the relevant assignments and the humor and kindness in his lectures... let's hope that his retirement doesn't stop him making contributions to the field. Unbelievable that he's been at the VU Amsterdam for 43 years.
Thank you very much Andrew, I won't ever forget our talks at your house in Buitenveldert/Amstelveen they came at a very important point in my life and made a positive and lasting impact. You were amazingly accessible even to 'outsiders'. I regret never to have been able to study at the VU, I'd have happily done so. One thing you did is to instill a life-long habit of looking at the world as an asynchronous hard real time affair and to appreciate micro-kernels as a very elegant solution to a lot of problems.
I don't doubt it. When I worked at a university, some of the very old profs still had very old computers dating back to the 90s. I was pretty amazed that they even worked, but most of them only used them once every few weeks or months.
Perhaps he's using it just as an X server. Around 15 years ago, some of the other people on my systems programming team didn't care to run a fat Linux PC desktop -- they just had an X machine that booted over the network from a "big" UNIX server and logged into their FVWM or similar environment. That was handy, as you could go and log in on any other X workstation and get the same desktop environment.
Looking at the Wiki page, the Sun Blade 100 dates from 2000 to 2006. PCs from that era were reasonably powerful by most modern standards; they just lacked 1) a lot of cores and 2) shitloads of memory. For most uses they are still great for daily tasks, and at a university he probably had access to some remote supercomputer if he really needed it :P
My ageist perspective here is that what you're describing takes a bit of work and most "older" profs can't be arsed to do that work, so the simplest explanation is that he just still runs an old computer :)
The man responsible for explaining OS and networking topics to the masses. The professor who "attacked" the programmer.
From the DNS section of the Computer Networks book:
"Take the pro domain, for example. It is for qualified professionals. But who is a professional? Doctors and lawyers clearly are professionals. But what about freelance photographers, piano teachers, magicians, plumbers, barbers, exterminators, tattoo artists, mercenaries, and prostitutes? Are these occupations eligible? According to whom?"
This is one of the guys that I have learned a lot from.
Even though I have never seen him, nor ever met him.
But his words still echo in my mind. Modern Operating Systems and Computer Networks, both extremely well written. They certainly did teach me a lot.
Here's to Prof. Tanenbaum. Thank you, sir, for teaching me. Even though I've never met you, nor may I ever meet you in this life, you should know that some guy from South Asia is thanking you from the bottom of his heart.
Feel privileged to have studied at his department, and his lessons on microkernels and distributed operating systems are at the heart of what I do today.
"In 1991, while attending the University of Helsinki, Torvalds became curious about operating systems[31] and frustrated by the licensing of MINIX, which limited it to educational use only. He began to work on his own operating system which eventually became the Linux kernel.
Torvalds began the development of the Linux kernel on MINIX and applications written for MINIX were also used on Linux. Later, Linux matured and further Linux kernel development took place on Linux systems. GNU applications also replaced all MINIX components." http://en.wikipedia.org/wiki/Linux#Creation
His Computer Network book is one of the few that I never sold back to the bookstore. Mine is less than 6 feet from me right now, should I need to consult it.
It says everyone is invited, so I'd assume it's not just for VU students.
The Aula at the VU is pretty big and I hope there will be enough space for everybody, but regardless they won't complain if non-students join as well (at least I assume so).
I really wanted to attend as well, but unfortunately I will be out of the country on those dates; you can take "my place" if you want :P
The Aula at the VU Amsterdam has 900 seats. With this kind of announcement (and showing up at HN) I guess the seats will be filled quickly. Registration is free, and it is polite and appreciated so the organizers know how many people they can expect, but registration is not required. Anyhow, if you want to attend the farewell lecture: show up early :-)
And it would not surprise me if they relocate the lecture to a bigger auditorium nearby, if interest or registration goes way beyond 900.
Never saw the guy face to face, just read his books. Thought that when I go to the US I'll meet him and sit in on his lectures. Don't think it will happen now. :(
[1]: https://groups.google.com/forum/?fromgroups=#!topic/comp.os....