You can't really blame Apple/Microsoft for not experimenting to the same extent that a GNU/Linux desktop environment or window manager can. In GNU/Linux you have countless options ranging from small fast tiling window managers to more complex things like Unity and GNOME Shell which are discussed in the article. For any window manager you then additionally have a huge number of panels that you can choose from. All of this freedom and choice is awesome and it lets people find the solution that fits their needs and style of computing if they're willing to do so.
I can navigate around windows and launch applications on my computer using a customized wmii/dmenu combo much faster than is possible on a mac or in windows and I love that. If Apple switched to a similar setup in Lion, however, they would make some extremely small percentage of their users happy while turning away the vast majority of their customers. Apple and Microsoft can't be what they are without catering to the lowest common denominator and part of that is playing it safe. Every radical decision that they make is a huge risk with the potential for a negative impact on their business. If somebody wants to make some experimental window manager for GNU/Linux that attempts to change the paradigm then the worst possible scenario is that they wasted their time writing code that people don't end up using. If Windows tried the same thing then they could lose real business.
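For anyone who hasn't used dmenu, the core trick is almost embarrassingly simple: scan $PATH, filter as you type, exec the match. A rough Python sketch of the idea (the interactive filtering loop is collapsed to a single prefix query here, purely to show the shape; all names are mine):

```python
import os

def path_commands():
    """Collect executable names from every directory on $PATH."""
    seen = set()
    for d in os.environ.get("PATH", "").split(os.pathsep):
        if os.path.isdir(d):
            for name in os.listdir(d):
                if os.access(os.path.join(d, name), os.X_OK):
                    seen.add(name)
    return sorted(seen)

def launch(query):
    """Exec the first command matching the query, dmenu-style."""
    matches = [c for c in path_commands() if c.startswith(query)]
    if matches:
        os.execvp(matches[0], [matches[0]])  # replaces this process with the app

launch("fire")  # e.g. starts firefox, if that's the first match on your PATH
```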
The beauty of the Linux ecosystem is that users have unparalleled freedom in deciding how they want their computers to work. This freedom allows people to create new programs without much risk to the community, and it allows multiple projects with different goals and principles to coexist and thrive simultaneously. This is a major strength of GNU/Linux and I think it's part of the reason why we see so much innovation within that community.
I think it's really awesome that Linux is finally taking the lead with innovation on the desktop; Gnome-Shell and Unity are just the beginning. And they're not playing "catch-up": they shipped earlier, and with more interesting changes, than Lion or (what we've seen of) Windows 8.
However, what Cupertino and Redmond are up to isn't "nothing", it's just not the desktop. The Windows Phone "Metro" UI is downright amazing when you use it, and iOS isn't bad either. Sadly, and I was hoping this would be better this time around, the open source world is still years behind on mobile.
(I'm not counting Android here, and that's a shame. Honestly, I've used it, and the innovation there, especially in user experience, isn't anywhere near the levels in iOS, Windows Phone, or webOS. I would include Android 3.0, but that's not open source and won't be for a while.)
I feel this article has a strong tinge of exaggeration (see: "... ALL desktop ..."), though you make some good points too.
I haven't tried Gnome 3 extensively; day to day I use Snow Leopard (on my MBP while I'm out) and Win7 (main desktop machine). Gnome 3, and more specifically Ubuntu 11.04 with Unity, hasn't really caught my attention in any huge way.
Unity's new sidebar reminds me of the Dock and the new Win7 taskbar, with some differences; they've now adopted a similar MacOS top bar; 'Spaces' aren't new; quickly launching applications isn't new or unique (the Start menu since Vista, Spotlight on Mac); and so on.
Even Compiz, whenever I use it, is basically there to emulate features available on Win or Mac: the window-snapping functionality of Win7 via the Grid plugin, the spacebar preview of Mac via another plugin. Actually, I never use most of Compiz's default plugins; I always seem to need to install the extras, or rather, the copied/heavily-inspired ones.
I absolutely agree in regards to Android too. I do love it (I've had a Samsung Galaxy S since last year) and I do suggest it to others, but it seems most likely my next phone will be a WP7 device or an iPhone, not an Android.
Linux isn't MacOS or Windows and never will be. I like using both of those OSes for different reasons, and I like Linux too when I boot into Arch with a nice tiling window manager going and do some work/play.
And I just read the linked Reddit comment by the other guy. Spot on.
They aren't new, but they've been fairly common in Linux desktop environments since the '90s, while OS X didn't introduce them until 2007. I'm not sure where they first appeared, but at the very least they've certainly become a common idea because of Linux.
It's true that the Unity UI was influenced by the Mac OS dock, but there have been a lot of UNIX UIs using docks all the way back into the 90s.
Also, it's not true that Compiz stole the Windows 7 snapping feature; that was in Beryl even before Windows Vista came out. They _did_ steal the Windows 7 auto-tiling and MacOS Expose, though, so you do have a point. It's sort of nice to see all the cross-pollination between the different platforms, and nice to think that UNIX is contributing a lot to pushing forward the state of the art.
I own both a Galaxy S II (Android 2.3) and an iPhone 3GS. IMHO, Android has the lead on the user interface.
Reuse of components and events provided by other apps is at the heart of Android, the system notifications are better, it has desktop widgets that are actually useful, keyboard interaction for writing text is better (I love copy/pasting or moving the text-cursor around on my Android) and you can use the back-button everywhere (just as on the web).
The only problem with Android is that many apps available in Google's Market are really shitty. Skype itself, which works fine on the iPhone, will bleed my battery dry in only 3-4 hours, while that same battery can last for 3 days with 3G + email sync enabled.
If they get the Market story straight, I'm sorry for the competitors, but Android is a competition killer.
I actually don't understand this. Those are all features — good features — but features alone don't make a good user experience. Android, when I use it, is full of strange and unintuitive choices. The most obvious: a third of the OS in green/black, another third in orange/gray, and the last third in black/orange. But there are others: an unlock mechanism that acts before you lift your finger (unlike the rest of the OS); scrollbars in most apps that seem to "inchworm" along; over-used and confusingly-placed drop shadows in the Gmail app; gray, thin titlebars and thick "action bars" in other places; orange glows around the icons in the launcher but an orange, square backdrop on the home screen; the unintuitive "glow" effect at the end of lists (after modeling physical motion for the list above it, abruptly stopping is disconcerting: it feels like the list has not reached its natural end, but smashed into something).
There's more, but while Android does have a lot of really awesome features, the interface and UX don't look like anyone even tried to design them.
Android is a better mobile web/cloud device, for sure. Multimedia in general is better on iOS, but unfortunately for Apple, that's the same marginally profitable niche they dominated in the desktop era. For most people a mobile phone is a window into the web world and that's just more deeply encoded into Google's DNA than it ever will be in Apple's.
"I think it's really awesome that Linux is finally taking the lead with innovation on the desktop..."
The key word is "finally."
The proponents of the open source development model and, in particular, the proponents of X-Windows, promised that their way would lead to superior innovation in the user interface and user experience areas. That innovation has been a long time coming.
Now that those promises are finally coming true, momentum has switched from desktop computers to mobile computers, an area where everyone is still trying to catch up with Apple.
Agreed - I certainly don't see anything that I'd call 'innovation', and there is nothing on any open source desktop that would send me off to my friends' and relatives' houses to switch them from OS X or Windows.
Sadly, the article completely fails to mention the amazingly useful innovations made by tiling window managers like wmii, ion3 and xmonad.
Granted, they're not as new as Unity et al, but they offer a compelling, complete and fairly radical departure from tradition, and one that I really do miss on OS X.
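For the unfamiliar, the heart of these tiling WMs fits in a dozen lines. Here's a toy version of the classic master/stack layout (the default arrangement in dwm and xmonad), just computing plain (x, y, w, h) rectangles:

```python
def master_stack(n, screen_w, screen_h, master_ratio=0.6):
    """Tile n windows: one master column on the left, the rest stacked on the right."""
    if n == 0:
        return []
    if n == 1:
        return [(0, 0, screen_w, screen_h)]
    master_w = int(screen_w * master_ratio)
    stack_h = screen_h // (n - 1)
    rects = [(0, 0, master_w, screen_h)]   # master window fills the left column
    for i in range(n - 1):                 # the rest split the right column evenly
        rects.append((master_w, i * stack_h, screen_w - master_w, stack_h))
    return rects

print(master_stack(3, 1280, 800))
# [(0, 0, 768, 800), (768, 0, 512, 400), (768, 400, 512, 400)]
```

No overlapping, no dragging: every pixel is accounted for, which is exactly the departure from tradition being praised here.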
Windows 1 used tiled windows (MS programmers didn't learn how to overlap windows until Windows 2), and Emacs has had them since the 70's. I'm not sure tiling windows can be called innovation.
After using modern smartphones and tablet computers I'm starting to think that the last thing we need is more innovation in how to manage lots of little windows on a medium sized screen.
I don't want another window manager, even if it auto-tiles for me or has a really gosh-darn nifty way of hiding and revealing windows. I want innovation in devices and apps that are designed to operate full-screen on all of them. It's as if I want a "tear off" function that literally tears a window right off the desktop monitor so I can put it in my pocket or shoulder-bag, or hang it on a wall, or prop it up next to a book. This is what the iPad and the iPhone have done to me.
Tabs, windows, task-bars, Expose, Spaces, ribbons and others are now making me feel like geeks are congratulating themselves on discovering a really nifty new way to organize their sock drawers and lunch-boxes.
The future of "the desktop" is cheap, wireless, mobile displays. Window managers should be replaced with pockets, wall hooks and stands.
That is all well and good if you are a passive user, but when it comes time to do some work on a computer, you will likely need to use more than one application at a time, and some way to manage all those applications easily so that you can keep tasks organized and switch between them quickly.
That's not really true. If I'm writing a book, I don't need anything more than a word processor on that device. If I'm browsing research material on another device, there's nothing breakthrough about the idea of transferring that material to the word processor wirelessly: highlight the material in the browser and "bump" it to the device with the word-processor on it.
Having each app on a separate device would actually make it worlds easier to perform real work, because I'm no longer Alt-Tabbing or Expose-ing back-n-forth between programs. I hate doing that because it forces me to stop thinking about my work in order to think about how to manipulate boxes of pixels into the configuration I want.
Right now I'm considering spending another $1K to get a 27" Cinema Display to go next to my $2k 27" iMac. But I'm wondering if I should have spent that $3k on a stack of wireless tablet computers of various sizes and leave some Bluetooth keyboards around the house instead.
Edit: Now that I'm thinking about it more, I believe this is the trend and the reason desktop innovation has stalled. In a decade we'll all have solid state hard drives and ten times more RAM, so every app can run in its own VM. You could suspend an app's runtime state to disk, transfer it to another device and resume where you left off. "Bump" your Final Cut session to your big-screen when you need to, then "Bump" it back to a tablet when you wanna continue editing on a park bench.
I don't know why we can't do this with web apps already. It transcends the whole idea of a desktop manager.
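A cooperating app could already approximate this today by serializing its session state and rehydrating it on another device. A minimal Python sketch, with an invented `EditSession` class standing in for whatever the app's real model would be:

```python
import pickle

class EditSession:
    """Hypothetical app state: whatever is needed to resume where you left off."""
    def __init__(self, clips, playhead):
        self.clips = clips
        self.playhead = playhead

# On the big screen: suspend the session to disk...
session = EditSession(clips=["intro.mov", "interview.mov"], playhead=42.5)
with open("session.bump", "wb") as f:
    pickle.dump(session, f)

# ...transfer session.bump to the tablet, then resume:
with open("session.bump", "rb") as f:
    resumed = pickle.load(f)
print(resumed.playhead)  # 42.5 -- carry on from the park bench
```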
What you are describing does not sound at all practical in the long run. The complexity of syncing up many devices is far worse than dealing with an occasionally fussy desktop manager. Running them as dumb terminals might work if it were actually possible, but I'm still not sure it would be practical in most cases (though I have thought that being able to send a workspace to another device would be cool). I realize everyone has their own tastes, but it sounds to me like you just need a second monitor.
The second monitor still requires me to spend a large percentage of my time manipulating rectangles.
Nor is there anything impractical about process migration. I suspend and resume Windows VMs all the time, and Windows wasn't designed for that. Think of what's possible when the programming language and OS API make it easier to write programs that can re-orient themselves.
And even if process-migration doesn't become a feature, there is still the inherent advantage of manipulating "windows" in physical space. It's more intuitive, it's more convenient, and it's direct. When people didn't "get" the iPad and dismissed it as just a big iPod Touch, it's because they didn't grasp the benefit of manipulating the UI directly instead of through a mouse.
When you manipulate windows with a desktop manager, you're two steps abstracted: you use the mouse to manipulate the widget that manipulates the windows. THAT is what's impractical.
This article didn't really get into the part I most care about, the incredible range of custom window managers you can find that work for various *NIXes. The fact that the way X is set up allows you to roll your own window manager so easily has really allowed a froth of new ideas to be tried out - and the best ideas to be incorporated into the big desktop environments and Mac and Windows.
I was sort of worried that the move to Wayland might hurt all of this. Compiz seems to be preparing to do double duty as an X window manager and a Wayland display server depending on what plugins you have loaded, but I wonder what might happen to other window managers in the transition? On the other hand, nobody is forcing anyone to give up X.
I agree that the incredible range of custom window managers (WMs) is a very interesting aspect of *NIXes and a source of innovation. I can't however agree with your initial worries about Wayland.
Wayland actually makes it easier to roll out your own window manager than X11. The Wayland protocol is a lot more straightforward and has fewer extensions. You no longer have to deal with XRender, XDamage, XComposite, etc. You also don't need to think about setups where some of these extensions are missing.
It also does away with network transparency and replaces the protocol with a simpler, more unified OpenGL ES based solution. You have more freedom to develop something original because you aren't stuck in X11's rendering system, which is a glorified painter's algorithm. You can use XComposite to obtain more freedom under X11, but so far few window managers use it (Compiz, KWin, Metacity). The default for Wayland is very similar to what you find with XComposite but is faster (given a decent video card driver) and simpler.
The only drawback to Wayland is that it is rather new and the protocol hasn't been finalized. As soon as it gains wider adoption and there are a few example window managers, expect to see some interesting and innovative WMs.
What's Linux got to do with anything? Gnome and KDE run pretty nicely here on FreeBSD, and I'm sure Unity could fairly easily if anyone actually wanted it.
Well, the portability between kernels is nice on Unix and Unix-like systems, but you would agree that nearly all of the developers and users of these larger desktop environments and GUIs are on Linux specifically. Technically, you can even run them on Darwin.
Linux-style package management is a lot more useful for servers, where you have fairly tightly controlled distribution requirements. I find it an encumbrance on desktops where I often want to update individual apps or keep several versions of a single app side-by-side.
The number of people I run into with outdated third-party programs with security flaws under Windows and OS X suggests it is a pretty killer feature for the desktop to me. Some of these folks are even rather technical, but for whatever reason don't approve the updates from the variety of checkers that pop up randomly or late.
I have Python 2.5, 2.6, 2.7 and 3.x, but I am a programmer who writes lots of Python. And they were all very easy to install (2.6 and 3.x from packages, 2.5 and 2.7 from source). I also have two different releases of Eclipse (I also write lots of Java).
Package management frees you from managing the software you don't want to manage, like MySQL or Apache. I want to manage my languages.
As a desktop user, you might not need this a lot. Sometimes you have a new version of some software that's really cool but a bit buggy, so you keep the old version around to get work done. Case in point: Blender 2.4 vs. 2.5.
As a programmer, I need this all the time. Sometimes you need several versions of a certain programming library or language implementation, like Python or Ruby interpreters. I also need several versions of GCC: one native compiler for C++0x, a cross compiler for x86_64-pc-elf for my hobby operating system project, and another cross for arm-eabi-none that I use to hack system-on-chips at work. I get my GCC from Git sources.
Nix (http://nixos.org/) is a package manager that allows you to install multiple versions of the same software. NixOS is a Linux distro based on that package manager. You can also use Nix in your home directory on top of another distro.
Unfortunately I have not had the time to try Nix. Anyone else tried it?
PS. I've been envisioning a "versionless package manager" that downloads sources from Git repos, builds and installs them, and keeps the build files for fast updates via incremental builds. I only need a name for the thing; which is better: "vpm" or "dll hell 2.0"?
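For fun, a minimal sketch of what that could look like in Python. Everything here is hypothetical (the ~/.vpm directory, the vpm_install helper, the example repo URL), and it assumes each repo builds with plain make:

```python
import os
import subprocess

REPO_DIR = os.path.expanduser("~/.vpm")  # hypothetical: build trees kept for incremental rebuilds

def vpm_install(name, git_url):
    """Clone (or update) a source repo, then (re)build and install it in place."""
    os.makedirs(REPO_DIR, exist_ok=True)
    tree = os.path.join(REPO_DIR, name)
    if os.path.isdir(tree):
        subprocess.check_call(["git", "pull", "--ff-only"], cwd=tree)
    else:
        subprocess.check_call(["git", "clone", git_url, tree])
    # Incremental by construction: make only rebuilds what changed since the last pull.
    subprocess.check_call(["make"], cwd=tree)
    subprocess.check_call(["make", "install"], cwd=tree)

vpm_install("dwm", "https://git.suckless.org/dwm")
```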
Portage on Gentoo has this kind of functionality, although it needs to be enabled per package, producing multiple 'slots' which can each hold a different version of the same software. GCC works like this by default, as does Python between major versions. It can also compile from git, although I don't think it keeps the build files around; with ccache you can somewhat mitigate the recompilation time, though.
FreeBSD has the ports collection [1] and pkg_add [2] with --remote. The range of software I can get from those is wider than with many Linux distributions, and it's often much more up to date. And yes, they can automatically handle dependencies.
It's unfortunate that KDE doesn't get the attention it deserves. It has long been dismissed as a Windows clone.
But, KDE's Plasma Desktop is highly innovative (through a wide range of interesting plasmoids). The fact that it can be easily scripted in JavaScript makes it even more awesome.
Plasma is just a framework for building little desktop widgets, isn't it? Except that everything on the desktop (including icons and taskbar) is a widget?
Unity is very innovative, and so is Gnome 3, but that does not equal "all innovation". In fact, about 70% of Unity's innovations are features copied from the mac: indicators, launcher, panel.
Gnome Shell, while innovative, doesn't seem aimed at the typical end user; instead it's aimed at more technical users.
gtk2 is still present in Unity, and it's still a major anti-innovation setback.
To be honest, I like Unity as a desktop better than OS X, but for applications, OS X is still ahead. (As a Unix I still like Ubuntu better, but that's a different subject.)
Gnome Shell is not aimed at technical users. It has simplified or removed many features that were present in Gnome 2, along with a lot of the customization. The tagline, "Made of easy", and the usability testing they've done suggest that Gnome 3 is trying to be a general purpose environment that anyone can use.
They may think so, but the Gnome guys, imo (no offense), are a bit too technical to really understand the needs of non-technical end users. From what I've seen so far, Gnome 3 seems very confusing.
I must mention Quicksilver for OS X, which apparently was the inspiration for Gnome Do. I haven't had much of a chance to use Gnome Do since I'm a wmii user but from appearances it looks about the same.
Why is that? As I said, I would like to see a few more representative numbers. (I remember seeing a statistic in 2008 that the then-nascent iPhone had overtaken Linux desktops in internet traffic. That was an astonishing eye-opener.)
It lists the reasons for and barriers to using Linux. When you compare these across systems, you'll note that, on the one hand, the reasons to adopt Linux have become less convincing. For instance, stability is not that relevant anymore, since competitors are often sufficiently stable too.
On the other hand, there was nearly no change concerning the barriers for adoption: Android made a dent in the 'pre-installed' barrier, but that was it, basically. On netbooks, Linux had a lead initially but lost it to Windows.
While the article states that Linux is on par concerning application and peripheral support, I have my doubts. It has no third-party ecosystem that can compete with its competitors', which is probably due to the lack of a viable business model for desktop applications, the lack of a decentralized installer, higher fragmentation in general, and consequently less public support.
Can we just get something simple like the ability to undo window changes? Undo: z-order, window activate/focus change, position, window moves, resizing. How many times have you brought a bunch of windows to the front and then accidentally clicked on a window in the background? We could use an easy-to-remember shortcut like Windows/Apple/Super + Z, with + Y to redo.
Hell yes to this. The lack of this functionality has been bugging me for a while. I finally wrote it down two and a half weeks ago:
> Every application should have an undo/redo stack related to view operations.
> The way I see it, applications currently focus on allowing users to undo operations performed on the model. I submit that applications need to provide a similar mechanism to undo operations that affect the application's view of the model.
> E.g., if you have a sidebar open and are deleting the items listed there, you can generally undo the deletion. But suppose you close the sidebar. Generally, you can't undo the sidebar close.
You're talking about window management changes, and what I've written about mentions only application-level changes, but the example I gave doesn't preclude the desktop-level change management you're talking about. Once you've established keyboard shortcut scoping conventions <http://news.ycombinator.com/item?id=2495838>, accomplishing this is straightforward from a software design perspective.
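Mechanically it really is just a pair of stacks of invertible operations, the same command pattern editors use for model undo. A minimal sketch in Python, assuming every view change is registered together with its inverse (the Super+Z/Super+Y bindings from the parent comment appear only as comments):

```python
class ViewHistory:
    """Undo/redo stack for view operations (focus, position, z-order, resize...)."""
    def __init__(self):
        self._undo, self._redo = [], []

    def record(self, do, undo):
        """Apply a view change and remember how to reverse it."""
        do()
        self._undo.append((do, undo))
        self._redo.clear()  # a fresh action invalidates the redo branch

    def undo(self):  # e.g. bound to Super+Z
        if self._undo:
            do, undo = self._undo.pop()
            undo()
            self._redo.append((do, undo))

    def redo(self):  # e.g. bound to Super+Y
        if self._redo:
            do, undo = self._redo.pop()
            do()
            self._undo.append((do, undo))

# The WM (or app) wraps each operation as it happens:
history = ViewHistory()
history.record(do=lambda: print("raise terminal"),
               undo=lambda: print("lower terminal"))
history.undo()  # prints "lower terminal"
```

The design point that matters is recording the inverse at the moment the change happens; window state is too global to reconstruct after the fact.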
Correct me if I'm wrong, but the "more than one user" thing is a Unix feature; OS X basically gets it for free. It's not copied from Windows; if anything, Windows tried to copy it from Unix, and it was horrible before WinXP.
No. It's the desktop login thing (fast user switching). IIRC, you couldn't do that under X. Windows offered the lock screen and logging in as another user in XP. OS X decorated that with a 3D cube, and Linuxes started getting it a couple of years later.
While (I believe) you're right about Apple adding the 3D cube effect for multiple logins, X was explicitly designed to allow multiple concurrent users on the same machine. While many Linux distros did not have that configured by default in the early days, some did and it was certainly doable on the rest.
(That's one of the things that display managers like xdm/gdm/kdm do -- handle starting and stopping of X servers for user logins.)
Here are some of the iOS features that Apple has "shoehorned into Mac OS X" Lion which I'm quite sure will appear in free Linux desktops in the future: auto-save (system-integrated save versioning, integrated with backup); resume; AirDrop (local file sharing without having to be on the same WLAN). Even the full-screen feature, which seems pretty funny on the face of it, is actually more akin to Opera's "kiosk mode" than a regular full-screened application, and I am sure it will show up elsewhere.
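None of these are magic, either. Take save versioning: at its core it's just "never destroy the previous revision on save". A toy Python sketch of the idea; the `.versions` sidecar directory is my own invention for illustration, not how Lion actually stores revisions:

```python
import os
import shutil
import time

def versioned_save(path, data):
    """Write data to path, snapshotting the previous contents first."""
    if os.path.exists(path):
        versions = path + ".versions"                      # hypothetical sidecar store
        os.makedirs(versions, exist_ok=True)
        stamp = time.strftime("%Y%m%d-%H%M%S")
        shutil.copy2(path, os.path.join(versions, stamp))  # keep the old revision
    with open(path, "w") as f:
        f.write(data)

versioned_save("draft.txt", "first version")
versioned_save("draft.txt", "second version")
# "first version" now lives on under draft.txt.versions/, ready for restore or backup
```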
Honest question: as someone who dropped Linux in favour of OSX largely to get a nicer user experience, what is in the new generation of Linux desktops to tempt me back?
The videos linked from the article show a Spaces-esque virtual desktop and a more interactive Growl-esque notification box. Both are very slickly done, but not really anything mind-blowing. The features in the other videos on the Gnome 3 site seem equally derivative: clones of the window snapping from Win7, the icon view from Lion, etc.
Am I missing something? Should I be looking somewhere else? KDE?
What I find exciting is the increasing integration of Telepathy into Gnome apps. The idea is to enable easy networking for all applications so the user can just pick contacts out of their address book and never have to worry about ip addresses or nat traversal. Things like collaborative editing or music streaming can be built on top of Telepathy rather than every application doing its own networking.
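To make that concrete: the pitch is that application code addresses a contact rather than a socket, and the framework (Telepathy's real mechanism is "Tubes" over D-Bus) supplies presence, transport and NAT traversal. A sketch of that shape in Python; every name below is hypothetical, not Telepathy's actual API:

```python
class ContactChannel:
    """Hypothetical: a socket-like channel to a person, not to an IP address."""
    def __init__(self, contact):
        self.contact = contact  # the framework resolves presence, transport, NAT

    def send(self, payload):
        print(f"sending {len(payload)} bytes to {self.contact}")

def open_channel(contact_name):
    """Stand-in for asking the framework for a tube to a contact."""
    return ContactChannel(contact_name)

# A collaborative editor never touches addresses or port numbers:
channel = open_channel("alice@example.org")
channel.send(b"insert 'hello' at offset 120")
```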
Before, you got a low-grade UX but high customizability with Linux.
Now you get a very comparable experience (customizable, and still evolving with 11.10) along with all of the customizability and flexibility of Linux.
I don't mind the UI/UX of my MBP, but I prefer Unity, GNOME 3, and my heavily customized GNOME 2 installs. The app indicators and control center in Ubuntu are actually easier to use than in OS X. ICS to my Xbox took 10 seconds in Ubuntu; it took 10 minutes of Googling and editing a configuration file to get it working in OS X. My phone works out of the box as a router in Ubuntu; it's not detected as an Internet device at ALL in OS X, who knows why. (And those tasks took less than a minute or two to access directly from the app indicators or control center. Very excitingly, there is a whole new connection center planned for 11.10, which ought to be even cooler/faster.)
I wish more people would come back and really spend the time to give modern DEs a chance. I personally can't stand KDE even after a lot of customization, but even Qt apps are indistinguishable from GTK+ apps in Ubuntu (yes, even file selection dialogs, etc).
It's also funny that no one seems to know the history of these UX elements. Spaces have been a *nix feature since the late '80s/early '90s. The Grid plugin existed in Compiz as an unstable plugin for a long time before Windows 7 came out; they simply refined it for Natty.
Also, anyone on an MBP should try Unity just to see the power of the uTouch API they've built. Compiz is actually multitouch enabled. You can move and resize windows with gestures, as well as expose the dock.
I too prefer a Linux desktop, but too many of the apps I rely on still aren't available. The day Photoshop and Ableton Live get ported is the day I switch back.
I still haven't found a Linux distro that really behaves 100% reliably on a laptop either. Suspend/wake issues and flaky wireless behavior eventually became dealbreakers for me, even if the blame lies with the hardware vendors.
For what it's worth, I haven't had any issues with Ubuntu in the last two years or so (HP Envy 15, Alienware M11x). Even things like docks with multiple monitors work smoothly.
I like ImageMagick, GIMP, et al. for 99% of my needs, but having anything Ableton-like would be great. Given the state of multimedia timing and the myriad combinations of audio subsystems, though, I hope someone can persevere.
> flaky wireless behavior eventually became dealbreakers for me
Speaking of which, I'm fairly sure that NetworkManager fails to select the strongest access point when many routers are broadcasting the same ESSID. It's a nightmare to try to use Ubuntu on my MBP at my university. It seemed better in 10.10; I didn't get a chance to try it in 11.04, but that was my biggest frustration.
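For reference, the selection logic itself is trivial. A sketch in Python with a made-up scan-result format (real code would pull these entries from the wireless scan, e.g. via NetworkManager's D-Bus interface):

```python
def pick_access_point(scan_results, essid):
    """Among APs broadcasting the same ESSID, pick the strongest signal (in dBm)."""
    candidates = [ap for ap in scan_results if ap["essid"] == essid]
    return max(candidates, key=lambda ap: ap["signal"], default=None)

scan = [
    {"essid": "eduroam", "bssid": "aa:bb:cc:00:00:01", "signal": -71},
    {"essid": "eduroam", "bssid": "aa:bb:cc:00:00:02", "signal": -48},
    {"essid": "guest",   "bssid": "aa:bb:cc:00:00:03", "signal": -60},
]
print(pick_access_point(scan, "eduroam")["bssid"])  # aa:bb:cc:00:00:02, the -48 dBm AP
```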
I don't know about others, but the reason I love desktop Linux is its modularity. I can use Gnome, KDE or whatever other applications with whatever window manager I choose, and it all works. In an era where everything is being dumbed down for the lowest common denominator, I'm just glad that there is one environment where power users can flourish.
I've gone back to the standard desktop on Ubuntu 11.04, because the Unity UI is everything I despised (and disabled) in Windows 7.
I don't want a damned cell phone UI on my laptop or desktop, and I sure as hell don't want a damned web browser as my desktop UI. My computers aren't "appliances".
I should take another look at Linux, then. It's been my experience that Linux desktop development has largely been a game of trying to imitate Windows and/or OS X to make new users more comfortable. But if that's changing, then I'm thrilled.
All those innovations are in the GUI area, and they have little to do with Linux (the kernel). For example, one could easily run KDE on Windows. So the source should be "FOSS", not "Linux".
Mobile is where innovation happens today, and Linux has almost no presence on mobile devices (except that the major mobile OSes use Linux/FreeBSD kernels). It's stagnation, not innovation.
What do you think Android is? And it's not just the kernel. I have a working userspace on my phone. In fact, when I have spotty connectivity, the first thing I do is fire up the terminal and do a ping/traceroute.
Linux isn't just its kernel; by Linux we usually mean some distribution with most utilities readily available as packages. I can't buy a mobile device in a shop and install a working Linux distribution without hacks, and that feels wrong.
Android is a Java userspace on top of a Linux kernel.
It doesn't help that most people say "Linux" to actually mean "GNU/Linux", but Android is certainly not GNU/Linux and it can't be "Linux" either, because that's just the kernel.
Shouldn't we start calling it Android/Linux? Oh, and, BTW, Java is just the language most people use to write (most of the) software for it. Dalvik is not the JVM and, in fact, the compiler people use to generate Dalvik bytecode doesn't read Java sources - it reads Java bytecode.
We can, but I don't know how meaningful that would be. When I say Linux I think of a combination of things, including the kernel, the GNU tools, a WM - i.e. a distro.
The differences between Android/Dalvik and Oracle/JVM are few and far between (that's why they're getting sued :P). The important thing is that you write your apps in Java and you run them under a VM.
> The differences between Android/Dalvik and Oracle/JVM are few and far between (that's why they're getting sued :P)
Not really. They are being sued because it's a threat to Oracle's control of Java. Dalvik may very well look a lot like JVM, but reason alone doesn't have any influence on Oracle's legal department.
And that's one of the good parts of Android. But C and C++ are not first class citizens, everyone is expected to use Java (Google's words) and assumptions made about a normal distro don't hold for Android.
This is a poorly written article. Linux on the desktop is infamous for relentlessly cloning Windows and OS X. When citing the changes in OS X Lion (while leaving out major changes like the removal of manual document saving) that he believes to be trivial, he repeats "that's not a joke," as if that's a valid enough rationalization for his position that he doesn't need to explain it further.
The source of desktop innovation today is mobile operating systems. He believes that features from mobile OSes are being "forced" onto desktops, without explaining why it's bad to be adopting mobile features. Full-screen display, automatic document saving, and removing the need to manually quit applications are major innovations that simplify desktop computers even more, moving them closer to the long-sought idea of appliance computing.
I'm tempted to think the article was intentionally written as flamebait. The writing is poor, and there are no examples given to explain why exactly Unity and Gnome are so much more innovative. The absolutist claim that Microsoft and Apple have "completely dropped the ball," as if their operating systems haven't changed in 10 years, is just false.