If you think that this is the bulk of AST's legacy, or that real microkernels aren't in general use, you really should get out and learn a bit more about computer science.
And in all honesty, Linux has adopted a number of elements over the years that would not have flown with the original fully monolithic kernels.
No, we don't have userspace drivers. (Apart from proprietary GPU drivers and a number of enterprise hardware systems... and at least Canon printer drivers, and on and on...) They're not isolated in the sense that microkernels would enforce, but they provide a shim for the kernel and then do a lot of their processing in userland.
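Linux's UIO framework is the stock version of exactly that pattern: a tiny in-kernel shim exposes the device, and everything else runs as an ordinary process. Here's a minimal sketch of the userspace side (mine, not from the post), assuming some device is already bound to a UIO driver that exposes its registers as mapping 0 on /dev/uio0 - that binding is an assumption, not something you get for free:

```c
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    /* Assumes a device is bound to a UIO driver and shows up as uio0. */
    int fd = open("/dev/uio0", O_RDWR);
    if (fd < 0) { perror("open /dev/uio0"); return 1; }

    /* Map the device's first memory region (its registers) into this
     * process; for UIO, mapping N is selected by offset = N * page size. */
    long page = sysconf(_SC_PAGESIZE);
    volatile uint32_t *regs = mmap(NULL, page, PROT_READ | PROT_WRITE,
                                   MAP_SHARED, fd, 0);
    if (regs == MAP_FAILED) { perror("mmap"); return 1; }

    /* All register pokes now happen in userland. */
    printf("first register word: 0x%08x\n", regs[0]);

    /* A blocking read on the fd waits for the next interrupt; the kernel
     * shim hands back the total interrupt count as a 32-bit integer. */
    uint32_t irq_count;
    if (read(fd, &irq_count, sizeof(irq_count)) == sizeof(irq_count))
        printf("interrupts so far: %u\n", irq_count);

    munmap((void *)regs, page);
    close(fd);
    return 0;
}
```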
We can load and unload drivers at runtime. That's what insmod/rmmod do.
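For what it's worth, the module side of that is tiny; this is roughly the canonical hello-world module (names are just for illustration). Build it against the running kernel's headers with a one-line `obj-m += hello.o` Kbuild makefile, then `insmod hello.ko` and `rmmod hello`:

```c
/* hello.c - minimal loadable kernel module to show runtime load/unload */
#include <linux/init.h>
#include <linux/kernel.h>
#include <linux/module.h>

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Minimal demo module");

static int __init hello_init(void)
{
    pr_info("hello: loaded into the running kernel\n");
    return 0;
}

static void __exit hello_exit(void)
{
    pr_info("hello: unloaded again\n");
}

module_init(hello_init);
module_exit(hello_exit);
```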
We don't do message passing between kernel components, but in order to prevent messaging from becoming a bottleneck we have signaling mechanisms, we have netlink, and we have $deity knows what else.
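As one concrete example of that kernel-to-userspace messaging, here's a small sketch (mine, not from the post) that listens for the kernel's device uevents over a netlink socket - the same channel udev consumes; depending on the system you may need root to bind to it:

```c
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/socket.h>
#include <linux/netlink.h>

int main(void) {
    /* Subscribe to the kernel's uevent multicast group (group 1). */
    struct sockaddr_nl addr = {0};
    addr.nl_family = AF_NETLINK;
    addr.nl_groups = 1;

    int fd = socket(AF_NETLINK, SOCK_RAW, NETLINK_KOBJECT_UEVENT);
    if (fd < 0) { perror("socket"); return 1; }
    if (bind(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    /* Each datagram is a series of NUL-separated KEY=VALUE strings
     * describing a device event (add/remove/change, devpath, etc.). */
    char buf[8192];
    for (;;) {
        ssize_t len = recv(fd, buf, sizeof(buf) - 1, 0);
        if (len <= 0) break;
        buf[len] = '\0';
        for (char *p = buf; p < buf + len; p += strlen(p) + 1)
            printf("%s\n", p);
        printf("---\n");
    }
    close(fd);
    return 0;
}
```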
We DO have userspace filesystems: FUSE. I'm still waiting for userspace block device drivers but that's probably not going to happen. :)
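And FUSE really is "a filesystem as an ordinary process". Below is a trimmed-down take on the classic libfuse hello example, assuming libfuse 3 is installed (compile with `gcc hellofs.c $(pkg-config --cflags --libs fuse3)` and run it with a mountpoint argument); it's a sketch of the API, not anything from the post:

```c
#define FUSE_USE_VERSION 31
#include <fuse3/fuse.h>
#include <errno.h>
#include <string.h>
#include <sys/stat.h>

/* One read-only file, /hello, served entirely from userspace. */
static const char *hello_str  = "Hello from userspace!\n";
static const char *hello_path = "/hello";

static int fs_getattr(const char *path, struct stat *st,
                      struct fuse_file_info *fi)
{
    (void) fi;
    memset(st, 0, sizeof(*st));
    if (strcmp(path, "/") == 0) {
        st->st_mode = S_IFDIR | 0755;
        st->st_nlink = 2;
        return 0;
    }
    if (strcmp(path, hello_path) == 0) {
        st->st_mode = S_IFREG | 0444;
        st->st_nlink = 1;
        st->st_size = strlen(hello_str);
        return 0;
    }
    return -ENOENT;
}

static int fs_readdir(const char *path, void *buf, fuse_fill_dir_t filler,
                      off_t offset, struct fuse_file_info *fi,
                      enum fuse_readdir_flags flags)
{
    (void) offset; (void) fi; (void) flags;
    if (strcmp(path, "/") != 0)
        return -ENOENT;
    filler(buf, ".", NULL, 0, 0);
    filler(buf, "..", NULL, 0, 0);
    filler(buf, hello_path + 1, NULL, 0, 0);
    return 0;
}

static int fs_read(const char *path, char *buf, size_t size, off_t offset,
                   struct fuse_file_info *fi)
{
    (void) fi;
    if (strcmp(path, hello_path) != 0)
        return -ENOENT;
    size_t len = strlen(hello_str);
    if ((size_t) offset >= len)
        return 0;
    if (offset + size > len)
        size = len - offset;
    memcpy(buf, hello_str + offset, size);
    return size;
}

static const struct fuse_operations fs_ops = {
    .getattr = fs_getattr,
    .readdir = fs_readdir,
    .read    = fs_read,
};

int main(int argc, char *argv[])
{
    /* e.g. ./a.out -f /tmp/mnt   (then `cat /tmp/mnt/hello`) */
    return fuse_main(argc, argv, &fs_ops, NULL);
}
```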
What do we have with Linux then? Not a microkernel - just a very modular and runtime-modifiable mostly monolithic kernel instead. A hybrid? Ish?
I think you have to straight up call Linux a monolithic kernel. It's got some nifty modularization features, but none of them (other than maybe FUSE) are really what the microkernel folks are going for. And that's fine. Both approaches have obviously evolved to fill niches the other doesn't satisfy as well. What's clear is that neither approach is a panacea for all compute needs.
He's written several books on operating system design that are considered by many to be excellent resources - he's taught a lot of people how an OS works beyond the decision of microkernel vs. monolithic. He's a respected professor, seen as an authority on the topic, his less-popular opinion on microkernels notwithstanding. Most OS's today may have gone in the monolithic direction, it is true, but as other commenters have pointed out, Linux is actually very dynamic and modular, Minix 3 has some very advanced functionality, and all of that validates the vision he had for microkernels, even if it didn't pan out to be commercially optimal.
Xen, Minix 3, QNX, several L4 implementations, etc. OKL4 alone has been shipped on over 1.5 billion devices. QNX was huge. Xen is running everything at AWS, the largest web hosting company in existence (never mind everywhere else it is used).
QNX is huge. If you took QNX out of the world, the world would stop functioning just about immediately. So much stuff runs on it that you typically wouldn't even guess. In that sense it is just like other soft/hard real-time OS's: one of the unsung success stories of IT, simply because it works so well it tends to disappear.
Most things that have QNX or similar OS's inside simply work, rather than requiring constant upgrades and bug fixes. Reliability by design is so much better than reliability by trial and error, and the OS is a huge factor in that. It's a world where bugs are felt as 'egg on your face' rather than a 'wontfix'.
No...based on your reaction I think I phrased it pretty much spot on.
I added a new question to my interview process a number of years ago. Anyone who puts "computer science" on their resume gets asked "What do you think of Andrew Tanenbaum?" It's a free-form question intended to see what about computer science interested them enough to remember; I have similar questions about Turing & Knuth. Our team requires a lot of broad knowledge and original thought.
If they don't know who AST is, the interview is probably over. We continue if they can discuss pretty much any of the 6-7 seminal, award-winning CS textbooks he wrote, the other major projects he led like the Amsterdam Compiler Kit or the Amoeba distributed operating system, or the other contributions he made to networking, distributed systems, languages, computer architecture, operating systems or information security (he did publish nearly 200 journal papers over 43 years as a professor). If they know about Electoral-vote.com, bonus points.
If all they can come up with is "Minix" and "that pissing contest with Linus", then I might see if the Linux devops guys have an opening. If they're that incurious, they'll do fine there; those guys think the world begins and ends with Linux, too.
Continuing in the "let me Google that for you" vein, both QNX and various L4-family microkernels are in use in a variety of embedded systems; QNX is also in the new Blackberry products. There are a number of very mature security-oriented research microkernels (like seL4 and K42) that could very well show up in commercial products eventually. But that's back to needing to know more about computer science than Windows and MacOS.
> If all they can come up with is "Minix" and "that pissing contest with Linus", then I might see if the Linux devops guys have an opening. If they're that incurious, they'll do fine there; those guys think the world begins and ends with Linux, too.
Do you actually believe that someone is incurious simply because they don't share your own interest in Tanenbaum? Perhaps they've focused their curiosity on one of the many other luminaries in CS, or perhaps they're more interested in the topics themselves than the personalities behind them.
Your contempt for your own devops team is also disquieting. Based on your comment your company sounds like a toxic place to work.
That's why we ask the question about 3 people in broadly different areas. Frankly, if you don't recognize at least one of those names and understand the foundational contributions they made, then yeah I'd call that a kind of incurious.
As to devops, you may think whatever you want. I give them shit about the "all the worlds Linux" attitude, they give me shit about "fucking research projects" (e.g. anything that isn't Linux). We understand our respective views, and it works.
> That's why we ask the question about 3 people in broadly different areas. Frankly, if you don't recognize at least one of those names and understand the foundational contributions they made, then yeah I'd call that a kind of incurious.
Well that's a lot more reasonable! Your original comment left no ambiguity that candidates insufficiently knowledgeable about Tanenbaum would be shunted over to devops.
> As to devops, you may think whatever you want. I give them shit about the "all the worlds Linux" attitude, they give me shit about "fucking research projects" (e.g. anything that isn't Linux). We understand our respective views, and it works.
That could be the basis of some good-natured ribbing, which would be OK. What's not OK for a healthy company culture is the suggestion that devops people are inherently incurious, and the strong whiff of intellectual elitism which came across in your original comment.
I'm totally with you that questioning Tanenbaum's legacy is pretty poor form, but your interview questions sound designed to filter out anyone who doesn't share your exact interests, which is a real shame. A better follow-up than ending the interview when a candidate doesn't know who he is would be to describe his achievements (as you did here) and then ask the candidate to tell you what they know about someone else interesting, who you may or may not already know all about.
This has nothing to do with cultural bias. It's just basic CS stuff that anyone with "CS" in their resumes should know about. Heck, even undergraduate students will probably have "AST" tattooed inside their brains in the first semester alone.
I'm sorry to be picking on you but this is one of the things that is absolutely wrong in our field: we don't learn anything from history. We don't know what was being researched in the 70's and proceed to reinvent the wheel over and over thinking we somehow have magical brains that are unearthing some concepts for the first time in human history.
The traditional CS curriculum should adopt a mentality of "ok, you now understand at which point in history we are in CS? Know most of the past inventions? Fine, now proceed to build on top of them and stop wasting everybody's time with your rediscoveries".
I don't think you're picking on me, and I wasn't trying to suggest that such a question is cultural bias. What I meant is just that people in general tend to think the things they know about are the most interesting things, and that people who don't know about those things are deficient. But by definition they can't know about things they don't know about, which may be just as interesting. So my proposed replacement question just acknowledges and tries to work around that phenomenon. I doubt it is actually critical for people to be super familiar with Tanenbaum's work specifically; it is just an indirect way of assessing intellectual curiosity and CS chops, which I think my question would also achieve.
I pretty much agree with everything else you said, and I wish I knew more about the history of computing myself, since I've lost a lot of my memory of my college course on it to the sands of time. I wonder if there's a good survey book. Maybe AST wrote one...
Another good interview technique is "did you actually read what I said". Things like when I said it's an open-ended question, that we ask about 3 of the seminal minds in CS, at least one of whom a CS graduate would have bumped up against, and that the candidate can focus on whatever they want to. Just a tip for your next interview.
> If they don't know who AST is, the interview is probably over. We continue if they can discuss pretty much any of the 6-7 seminal, award-winning CS textbooks he wrote, the other major projects he led like the Amsterdam Compiler Kit or the Amoeba distributed operating system, or the other contributions he made to networking, distributed systems, languages, computer architecture, operating systems or information security (he did publish nearly 200 journal papers over 43 years as a professor).
> Continuing in the "let me Google that for you" vein, both QNX and various L4-family microkernels are in use in a variety of embedded systems; QNX is also in the new Blackberry products. There are a number of very mature security-oriented research microkernels (like seL4 and K42) that could very well show up in commercial products eventually. But that's back to needing to know more about computer science than Windows and MacOS.
Let's be fair here. Your claim was that microkernels are "in general use". Ongoing research, however mature, does not support this claim. And Blackberry is hardly the heavy hitter they used to be. Meanwhile, the major OSs for computers as computers -- and as phones -- have backed away from the microkernel design. Maybe they shouldn't have; regardless, they did.
That leaves embedded systems. And there you have a point. So: microkernels are in common use in embedded systems. But let's not overstate their successes.
Let's be fair here, embedded systems are computers like any other, and there are far more of them than there are regular computers. If you walk into any slightly larger production plant and take QNX and the other soft or hard real-time controllers out, that plant becomes so much scrap metal.
Besides, QNX works very well on PC hardware and is used extensively in the communications industry.
Please do not take your own limited exposure to the world of IT as proof that certain things are true, especially when they are emphatically not. I know of several thousand QNX installs within 10 km from where I'm sitting.
Denying the success of microkernels such as QNX by disqualifying applications is like claiming Linux is a failure by excluding mobile devices.
Let's be fair here. If you get to redefine "computer" to exclude embedded systems, even though they vastly outnumber "computers as computers" (whatever that really means), then you get to be right. But really all that does is show your limited view of the field. For example, QNX runs in tens of millions of cars alone, and who knows how many Cisco routers running IOS-XR. If that is "overstating" success, I'm not really sure what success looks like.
> Let's be fair here. If you get to redefine "computer" to exclude embedded systems, even though they vastly outnumber "computers as computers" (whatever that really means), then you get to be right.
Actually, I don't think I disagreed with you, except to note that a research kernel that might be used someday does not count as "general use".
You might count the insinuation of overstatement as a disagreement. The point of that is that context matters. When I talk about choice of OS, the Mac in my living room has rather more weight in my mind than the embedded controller in my garage; I know I'm not alone in this. So if we say only that microkernels are heavily used, then we are correct, but we will be misunderstood. It is better, I think, to make a statement that is both correct and understandable, than to make one that is merely correct, while looking down on those who misunderstand.
EDIT: A quote[1] from me, giving an example from a rather different topic:
> If I open up a restaurant that serves General Tso's chicken and chop suey and sweet and sour pork and fortune cookies, and I advertise that I serve "American food", then my description is accurate, but my customers will be confused.
> When I talk about choice of OS, the Mac in my living room has rather more weight in my mind than the embedded controller in my garage
This is pretty much exactly what I'm selecting against. It's not that your concept of "general use" in computer science excludes embedded systems (frightening, considering you're apparently teaching this stuff). It's that when more than one poster tells you that you're wrong and provides concrete examples of why, your response isn't "that's something I need to consider" or "perhaps my knowledge of the field isn't what I thought it was" or, best of all, "I've got more to learn". Nope, you decide the "context" of the discussion is whatever you want it to be and trot out a contrived bit of sophistry which boils down to "I might be wrong, and I'm not saying I am, but because lots of other people would be wrong, I get to be right". Or something. Doesn't matter. We weren't opening a restaurant.
Double bonus points for focusing on a throwaway, tangential comment and pretending it's a central flaw of the argument. This clearly isn't your first specious Internet argument.