The explosion of interest in virtualization was started by VMware, a commercial entity, which rather makes Pike's point. Xen was started in response to the lack of an academia-friendly platform for virtualization research; "paravirtualization" was not initially a design goal, but rather an expedient choice to avoid the difficulties of correctly virtualizing the x86.
And while I'm biased, VMware's solution to this problem (dynamic binary translation from supervisor x86 to user-level x86) is massively more inventive, interesting, and useful than Xen's solution (hack up the kernel). I emphasize "useful" because a lot of VMware's customers were most interested in running Windows, which was not paravirtualization-ready until 2008, a full ten years after VMware's products were available.
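For anyone who hasn't seen the two approaches side by side, here is a deliberately toy sketch of the binary-translation idea: scan each guest basic block before it runs, let innocuous instructions through unchanged, and rewrite the sensitive ones into calls to emulation routines, caching the result. Every name here (the instruction strings, the vcpu fields) is invented for illustration; a real translator like VMware's operates on raw x86 machine code, not Python strings.

    # Toy sketch of dynamic binary translation (DBT), illustration only.
    # Real DBT rewrites raw x86 machine code at runtime; here "instructions"
    # are strings and the privileged set is invented.
    PRIVILEGED = {"cli", "sti", "mov_to_cr3"}  # hypothetical sensitive ops

    def emulate(insn, vcpu):
        # Stand-in for the monitor's software emulation of a sensitive op.
        if insn == "cli":
            vcpu["virtual_if"] = False
        elif insn == "sti":
            vcpu["virtual_if"] = True
        elif insn == "mov_to_cr3":
            vcpu["shadow_cr3"] = vcpu["pending_cr3"]

    def translate_block(block):
        # Rewrite one guest basic block: safe instructions pass through,
        # privileged ones become calls into the emulator.
        out = []
        for insn in block:
            if insn in PRIVILEGED:
                out.append(lambda vcpu, i=insn: emulate(i, vcpu))
            else:
                out.append(lambda vcpu, i=insn: vcpu["ran_natively"].append(i))
        return out

    # Translate once, then execute the cached translation.
    vcpu = {"ran_natively": [], "virtual_if": True,
            "pending_cr3": 0x1000, "shadow_cr3": None}
    for step in translate_block(["add", "cli", "mov_to_cr3", "sti"]):
        step(vcpu)

The point of the contrast: under paravirtualization it is the guest kernel's source that gets edited to make those calls itself, which is why unmodifiable guests like pre-2008 Windows were out of reach for Xen's original approach.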
VMware's awesome... but its roots are also in academia: the company was started by a Stanford professor and his PhD students.
It seems like a circular argument if the criterion for an academic project "successfully influencing industry" is widespread adoption, but that adoption cannot be facilitated by a commercial entity.
[As a side note -- the impetus for Xen, and also somewhat for VMware, was in a vision of global appliance-based computing, not in "virtualization research." I think that is really interesting given what just happened in this area in the last 3 or 4 years.]
The following passage from that second piece is fascinating (I had never heard of the "Unix room"... it sounds like an illustration of Conway's Law, and a lot of fun too).
One odd detail that I think was vital to how the group functioned was a result of the first Unix being run on a clunky minicomputer with terminals in the machine room. People working on the system congregated in the room - to use the computer, you pretty much had to be there. (This idea didn't seem odd back then; it was a natural evolution of the old hour-at-a-time way of booking machines like the IBM 7090.) The folks liked working that way, so when the machine was moved to a different room from the terminals, even when it was possible to connect from your private office, there was still a `Unix room' with a bunch of terminals where people would congregate, code, design, and just hang out. (The coffee machine was there too.) The Unix room still exists, and it may be the greatest cultural reason for the success of Unix as a technology. More groups could profit from its lesson, but it's really hard to add a Unix-room-like space to an existing organization. You need the culture to encourage people not to hide in their offices, you need a way of using systems that makes a public machine a viable place to work - typically by storing the data somewhere other than the 'desktop' - and you need people like Ken and Dennis (and Brian Kernighan and Doug McIlroy and Mike Lesk and Stu Feldman and Greg Chesson and ...) hanging out in the room, but if you can make it work, it's magical.
Sure, I agree with you, there was no "huge paradigm shift" (that is so extremely rare in CS).
But there was a virtualization efficiency advance that got an impressive number of companies to actively jump into the project (with time and money), which (along with it being free) made it explode, mature, and eventually made things like commodity computing rental via EC2 possible to implement just a few years later.
I agree with him that there is so much established infrastructure/standardization out there that doing something completely game-changing is hard, if not impossible. Which is why I think that, weighed against such a behemoth industry, Xen's story is impressive.
It did what he wanted: "Make the industry want your work." But it's because it is useful, not just because it is cool. As for grant money's role in all this, it's the same issue. Like arts and humanities vs. energy/science, the lion's share of research funding is always going to go towards what will most likely advance the country's economy/technology.
> because it is useful, not just because it is cool
Seems like Pike's own opinions migrated towards an acceptance of this, from jackchristopher's slashdot link [1]:
"Applications - web browsers, MP3 players, games, all that jazz - and networks are where the action is today, and aside from irritating little incompatibilities, the kernel has become a commodity. Almost all the programs I care about can run above Windows, Unix, Plan 9, and on PCs, Macs, palmtops and more. And that, of course, is why these all have a POSIX interface: so they can support those applications."
> But there was a virtualization efficiency advance
You are just reinforcing Rob's point. Xen doesn't do anything new; it is simply an optimization. That doesn't mean it is not useful (as he also points out), but it confirms his point that all 'research' these days is just optimizations and 'phenomenology', and nothing that is and feels truly different and new.
In the end Xen is a textbook illustration of the problems he is pointing at.
People across the globe are doing something that is and feels truly different and new: they're thinking and arranging their entire stacks in terms of remote virtual machines.
We could argue about the definition of novelty all day, and in the end I think it's simply a judgement call what one deems exciting or worth talking about. If you're looking for something as profoundly different as Einstein and Heisenberg were for physics, I agree with him through and through: nothing in systems will probably happen like that given the current funding paradigms (I can't actually think of anything like that in the history of computer science -- and you can, for example, be as reductionist as you want if you want to take Pike's side of the argument: "oh, it's all just a collection of logic gates"...).
> People across the globe are doing something that is and feels truly different and new: they're thinking and arranging their entire stacks in terms of remote virtual machines.
Except that there had been people doing precisely that for ages. Seems that computer science is condemned to eternally reinvent wheels... sigh
Well, in the seventies IBM mainframes had the Control Program (CP) and what is now the z/VM hypervisor, so yes, precisely the same paradigm was dominant in architecting systems back then.
Sure, squint hard enough and it's "precisely the same." Having worked in virtualization/utility computing for the last five years, I've heard this kind of quip over and over, and I agree with it on the surface, but it's true only in the most general ways.
I guess I was not specific enough above in simply saying "remote virtual machines." With a credit card, almost anyone can rent thousands of machines programmatically (with redundancy across multiple datacenters if you want) and be using them in minutes. And make them go away just as quickly. And construct the VMs themselves programmatically, or by hand on a laptop. And the people renting out the hardware don't even need to trust you. All of this is a new development that changes the way business is done. Like I said above, you can say everything is "just" logic gates plus electricity and claim nothing new ever happens (and I'll agree with you, too, I just won't think the conversation is going anywhere interesting).
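To make "rent thousands of machines programmatically" concrete, here is a minimal sketch of that workflow against Amazon's EC2 API, using the boto3 Python client; the image ID, instance type, and counts are placeholders, and you'd need real AWS credentials for it to run.

    # Minimal sketch: rent a fleet of VMs with one API call, then make
    # them go away just as quickly. Placeholders throughout.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Launch ten identical instances from a machine image.
    resp = ec2.run_instances(
        ImageId="ami-12345678",   # placeholder image ID
        InstanceType="t2.micro",  # placeholder instance type
        MinCount=10,
        MaxCount=10,
    )
    ids = [inst["InstanceId"] for inst in resp["Instances"]]

    # ...do the work, then release the hardware back to the pool.
    ec2.terminate_instances(InstanceIds=ids)

Spreading across datacenters is just an availability-zone parameter on the same launch call; that's the sense in which the rental is programmatic end to end.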
Agreed. My research: http://v3vee.org/. I don't even know what to say about this article - it's just so poorly thought out and researched that I can't believe it.
In that case, you're just wrong. Look at what has come out of systems research: Xen and its ilk created a huge jump in virtualization research. Microsoft and VMware now ship substantial paravirtualization support as part of their virtual machine products, which we are seeing more and more uses for every day.
There are a number of other things that come to mind, mostly in high performance computing. Parallel FORTRAN is one of the most common HPC languages, and I believe the primary development was based on work at Rice University.
The problem is that people too often get caught up on implementation: most academic departments do not develop "products," they work to develop ideas. Many of these ideas have been built on recently to produce successful products.
If Xen is the best you can come up with as a result of research impacting industry in the last ten years... IBM had been doing basically the same for decades, so I'm not sure one could consider Xen particularly revolutionary.
Edit: Also note that Xen in no way fulfills the definition of Systems Research; it is simply a minor new twist on a very old idea.
"Irrelevant: Does not influence industry"
Two years later came Xen, a systems research project that has definitely influenced industry.