Hacker News new | past | comments | ask | show | jobs | submit login

I'm having a hard time dating this rant (despite its historical importance, I can't seem to convince WikiWikiWeb to cough up a page history), but I would guess that it's from ~1996. If so, it is a time capsule that represents the stagnant OS thinking in the mid-1990s: the belief was that Unix was dead (!), that Windows was going to be everywhere (!!) -- or that both of these systems were essentially useless and what we _actually_ needed was radical surgery like object-oriented operating systems or microkernels or exokernels or orthogonal persistence (or all of the above!).

If it needs to be said, this time was a terrible atmosphere in which to aspire to OS kernel development, a suffering that I expanded on (at far too much length!) in a recent BSD Now.[1] I do not miss the mid-1990s at all, and even now, I find that this ill-informed screed fills me more with resurrected anger than it does any sense of vindication or nostalgia...

[1] http://www.bsdnow.tv/episodes/2015_08_19-ubuntu_slaughters_k...




If you read the "WimpIsBroken" article linked from the same rant, you will see it reference (negatively, of course) the Windows 8 design, so it can't be all that old; or at the very least, it has been updated recently.

I generally dismiss a lot of the things the author is saying for 3 reasons:

1. The author lists a lot of problems without ever explaining why they are in fact problems. For instance, he states that neither Windows nor Unix can teach you how to program. Why would my OS need to be able to teach me how to program? That's like complaining that my car doesn't teach me how to rebuild the transmission, or that my refrigerator doesn't teach me how to recharge its refrigerant. That's just one obvious example that jumped out at me; there are many others.

2. The author points out a lot of problems without offering any solutions. People who point out problems but provide no solutions are generally useless. If you have ever worked on a project with a person like that, you know what I am talking about.

3. c2.com lacks any sense of design or taste. The author also lacks any ability to communicate his point of view in a logical and consistent manner. I wouldn't trust the author to design a Ski Free game, let alone an operating system. In reality, if tasked with writing a Ski Free game, the author would instead return a 30-page paper on the problems with modern game design.


"c2.com lacks any sense of design or taste. The author also lacks any ability to communicate his point of view in a logical and consistent manner. I wouldn't trust the author to design a Ski Free game, let alone an operating system."

The content on c2.com is not the work of any particular author - it's a wiki whose articles are collaboratively created and edited. Each paragraph may be the work of one or more anonymous or named writers.

In fact, c2.com happens to be the world's very first wiki, designed by Ward Cunningham. (The wiki's content has been frozen and is no longer editable. I seem to remember that Ward is working on a next-generation replacement for it.)


c2.com lacks any sense of design or taste.

It has the very best design! http://motherfuckingwebsite.com/


Personally I agree, but shiny pretty shit seems to sell.


There is a difference between shiny shit, minimal design, and no design. A Ford F150 Raptor is shiny shit; a Jeep Wrangler or an Icon FJ is pretty minimal design; and a subframe and four wheels is just a subframe and four wheels. Sure, the latter can get you from point A to point B, but come on, it's not a proper car. I would argue that c2.com, and the wonderful diatribe by the fellow with a few too many "fucks" in his opus, lack design entirely rather than keeping it minimal; there is a difference. The lazy lack of design, much like lazy over-design, should not be confused with a proper balance of minimal design.


I think I agree with you. The c2 website could be less painful to read. However, the diatribe is near perfect. It fits the content very well.


I'm watching Bryan's interview, it's awesome.

For some history, the Jeff that he keeps talking about is a former student of mine at Stanford. I wasn't a big deal at Stanford; I was a TA for a Xerox PARC guy, and when he retired, Stanford asked me if I'd teach the class.

This was the papers class in operating systems. If you don't know what a "papers class" is, it means there are no textbooks; this is where you go to learn at the cutting edge. Pretty advanced class, and Stanford let me teach it. To this day I wonder what they were thinking. That said, I'm proud of how I taught that class. The students learned a lot.

Jeff Bonwick was a student majoring in statistics. Somehow he ended up in that OS class. I could tell he was sharp, and since I was working at Sun in the kernel group, I recruited him as hard as I could. He asked me why I would want him; his exact words were "I can't program in C". I told him that I could teach him how to program, but I can't teach people how to be smart. He was very, very smart. I told him that when he came to Sun he would go far higher than I ever did, and I was right: he did.

Bryan is cut from the same cloth. He's an OS geek. There aren't many of those geeks around these days. Go him.


240? Wow, indeed. I am surprised an "outsider" would teach that.


Yup, CS240, still have the class notes.


If anyone cares, I learned a lot teaching that class. Happy to share.


We care! Please do.


I don't have a blog, can I just do a new post here?

Mostly what I learned was about people, it wasn't about OS.


seconded. please share.


what we _actually_ needed was radical surgery like object-oriented operating systems or microkernels or exokernels or orthogonal persistence (or all of the above!)

Do we not?

I watched your interview, but I don't recall you saying anything about research ideas. You even made a jab against Spring which rubbed me the wrong way.

An orthogonally persistent microkernel-based operating system, even another Unix, would be great, which is why I've been looking into MINIX 3 and Hurd. The latter has its anachronisms, but it's still a step forward.


I think we need a functional userland (like nix) and a microkernel (like minix), at least.

Sadly, we will get a userland full of containerized apps (via systemd) and a monolithic kernel (still Linux). Worse is better.


Well, radical surgery was needed -- it just wasn't at the level of the operating system interface, but rather much deeper in the implementation. For example, ZFS post-dates this rant and certainly represents a radical rethink of filesystems -- but it did this without breaking extant applications. That extant applications were (and are) a constraint on the problem is something that's entirely unappreciated by mid-1990s OS research, which was hellbent on throwing out existing abstractions rather than building upon them.

Finally, in terms of Spring: based on the fact that you are implicitly defending it, I would guess that you never had to run it. As one of the very few outside of Sun inflicted with that flaming garbage barge, I can tell you that it was not at all a pleasurable experience -- and that whatever novelty it represented was more than offset by its horrifically poor implementation. If I ever harbored any illusions about Spring, they certainly didn't survive my first encounter with a machine running it...


I imagine it wasn't that pleasant being one of Jenner's cowpox inoculees, either. But that was an important step towards eradicating smallpox nine generations later. And the electricity I'm writing this with is produced by a steam engine descended from those that blanketed London in deadly smog for ten generations. Lots of beneficial innovations go through an early stage where they're counterproductive or unpleasant.


We've had the microkernel versus monolithic kernel debate _over_ and _over_. I'm not against research but I've yet to see a convincing argument that microkernels are better. If you decouple the logical bits of the kernel from each other all you'll get is greater impedance and message passing overhead. And for what? A "clean" architecture? Sometimes perfect is the enemy of the good, and I think that applies in this case. We've also got the tools to make monolithic kernel development scale now. Basically, I'll believe it when I see it.


So they can be better. QNX is the one example I know of that is better. It's also the only microkernel that is actually micro. When I was looking at it, the whole kernel, all of it, fit in a 4K instruction cache.

Most "micro"kernels aren't micro at all; they are bloated crapware. QNX was not like that: they actually had a micro kernel and it worked. I ran 4 developers on an 80286 (no VM) and it worked just fine. Far, far better than the VAX 11/780, which had more memory and more CPU power. The VAX was running 4.1 BSD.


We've had the microkernel versus monolithic kernel debate _over_ and _over_. I'm not against research but I've yet to see a convincing argument that microkernels are better.

Sure, if all you do is read Linus Torvalds ranting.

If you decouple the logical bits of the kernel from each other all you'll get is greater impedance and message passing overhead.

Debunked many times, as early as 1992 in fact: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.51.1...

Microkernels have come a long way since Mach.

And for what? A "clean" architecture?

You clearly have no idea what the advantages are.

Basically, I'll believe it when I see it.

You're seeing it in the billions of L4 deployments worldwide in wireless modem chipsets. You're seeing it in QNX being everywhere.

But then again, you might not see it, because you are willfully ignorant.


How many of those L4 deployments end up running 90% of their code in the "Linux" or "posix" process?


That might still be an argument in favor of microkernels if the Linux process can't crash the machine or cause it to miss hard-real-time deadlines. Or if you can use it to confine malicious code in the Linux process.


Fair, but when people talk about "real systems" built with micro kernels, as often as not, it could also describe Linux on xen with a watchdog restart. It's not especially compelling evidence that micro kernels are practical. Want to convince me that micro kernels are the bomb? Tell me about a system where no single service exceeds 40% of the code/runtime.


"Unikernels" is the misnomer being promoted by MirageOS for applications compiled to run on Xen, which they are currently doing on Amazon EC2 hosts alongside Linux instances. There are EC2 hosts that host a number of instances, including "unikernels" and Linux instances, and although I don't have details, the pricing on the smaller burstable instance types makes me think that some of the EC2 hosts are hosting actually quite a large number of instances, in which no instance exceeds 5% of the code or runtime. Is that what you're talking about?


No. What I mean is a micro kernel pace maker where 40% of the code is the beep beep service, and 30% is the meep meep service, and 30% is the bop bop service. As opposed to 99% of the code in the Linux service and 1% of the code in the realtime watchdog restart service.

Calling EC2 a micro kernel success story also seems like quite the definitional stretch.


Incidentally, I'd love to hear bcantrill rant on unikernels sometime, if he hasn't already in a talk or interview somewhere. I imagine he's not a fan, since they can never have the observability or performance of OS containers running on bare metal.


> you are willfully ignorant

No personal attacks, please.


Just so you know. I'm not going to debate you further, because I refuse to debate someone who calls me willfully ignorant.


What would you call someone who has strong opinions about something they refuse to research?


"Unconvinced". The assumption in your question is that anyone who has done any research can only possibly agree with you. News flash: it is possible for informed people to disagree.


The assumption in your question is that anyone who has done any research can only possibly agree with you.

I'm not assuming that. "Any amount of research" could mean no research, or close to none. People who have done very little research shouldn't opine. People who have done lots of research are more likely to be correct.

News flash: it is possible for informed people to disagree.

Well of course. Someone who's barely informed can disagree with someone who is far more informed. That would be two informed people disagreeing.

If someone has done adequate research and I'm correct, they must agree with me; otherwise either the research wasn't adequate or I must be wrong. I can avoid being wrong by avoiding strong opinions on things I don't understand or know much about.



