I've had it installed on my main machine since late December. There were some rough patches as things were landing during the beta period, but right now this is easily the best desktop I've ever used.
My workflow adapted to some of the additions in Unity so quickly that it was absurd. At this point I could never go back to something without Super+#, Super+w, Super+s and the other keybindings found here: http://askubuntu.com/questions/28086/unity-keyboard-mouse-sh...
Just be sure to install compizconfig-settings-manager (CompizConfig Settings Manager) from the Software Center and you can tweak Unity to some degree.
Just an FYI, what I tend to dev on is Ruby/Rails, Python, Javascript (Node/etc), some Erlang (not as much anymore) and Android dev. This system is so freakin' fantastic for all of those...really quite happy.
> My workflow adapted to some of the additions in Unity so quickly that it was absurd. At this point I could never go back to something without Super+#, Super+w, Super+s and the other keybindings found here: http://askubuntu.com/questions/28086/unity-keyboard-mouse-sh...
There go my hopes that someone would enshrine a sensible scheme for keyboard shortcuts. On new GNOME installs, I remap shortcut modifiers to
Super for actions that operate on the window. This is actually pretty straightforward since most of the default shortcuts can just be adjusted to replace Alt with Super, e.g., Super+Space to invoke the window menu, Super+F4 to close the window, Super+click-and-drag to move the window from anywhere on it, Super+Tab to tab through windows. It also helps mentally to realize that Super maps to the Windows key on most keyboards.
To pull the perspective back, Super+Alt is used for actions whose scope is wider than the current window, like desktop stuff: Super+Alt+Left/Right for switching desktops, Super+Alt+F2 for bringing up the Run dialog, Super+Alt+T for opening a terminal, Super+Alt+F1 for the GNOME menu, etc.
This has the added benefit of freeing up any combination of Ctrl and Alt to be used by applications. This can actually be a big problem, otherwise; I've run across many applications that assume Alt+Function key shortcuts, which cannot be used since they're bound to some global action in default Ubuntu (GNOME?) installs.
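For anyone who wants to script this remapping rather than click through the keyboard-shortcuts dialog, on a GNOME 2 / Metacity install it can be done with gconftool-2. A rough sketch; the key paths below are my best recollection, so verify them in gconf-editor first:

```shell
# Window-level actions move from Alt to Super (Metacity keys).
gconftool-2 --type string --set \
  /apps/metacity/window_keybindings/activate_window_menu '<Super>space'
gconftool-2 --type string --set \
  /apps/metacity/window_keybindings/close '<Super>F4'
# Super+drag moves the window from anywhere on it.
gconftool-2 --type string --set \
  /apps/metacity/general/mouse_button_modifier '<Super>'

# Desktop-wide actions get Super+Alt.
gconftool-2 --type string --set \
  /apps/metacity/global_keybindings/switch_to_workspace_left '<Super><Alt>Left'
gconftool-2 --type string --set \
  /apps/metacity/global_keybindings/switch_to_workspace_right '<Super><Alt>Right'
gconftool-2 --type string --set \
  /apps/metacity/global_keybindings/panel_run_dialog '<Super><Alt>F2'
```

Changes take effect immediately, so you can test each binding as you go.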
> Just be sure to install compizconfig-settings-manager from the software center
That's great to hear, I had just assumed Compiz wouldn't be available on Unity. I don't think I could live without Compiz Grid and Compiz Negative.
Edit: I see Unity is actually built with Compiz. Somehow I missed that on the Ubuntu Unity website way back when I started reading about it to evaluate it.
Unity is built using Compiz and requires 3D acceleration; a lot of people have issues running it in a VM. There's a Unity2D project built using Qt, but I don't know if it was mature enough at the time of release to make it in as a fallback.
I've just tried it within the latest VirtualBox today and it works wonderfully well. The install was simple and efficient, and it integrated nicely from the get-go (mouse and keyboard work seamlessly).
I only had to install the guest extensions to get 3D acceleration and Unity to work.
You can install Unity alongside Gnome, and choose between them using the menu at the bottom of the login screen.
To do this, open Synaptic, go to Settings -> Repositories, and in the Other Software tab click Add and paste this line:
ppa:canonical-dx-team/une
Save your changes, then click the Reload button in Synaptic. When the new package lists are loaded, install the package called Unity. When you log out, you should be able to choose "Ubuntu Unity Netbook Edition" from the menu on the login screen.
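If you prefer a terminal to Synaptic, the same steps look roughly like this (this is the standard PPA workflow; the PPA and package names are the ones from the instructions above):

```shell
# Add the Unity PPA, refresh the package lists, and install Unity.
sudo add-apt-repository ppa:canonical-dx-team/une
sudo apt-get update
sudo apt-get install unity
```

Then log out and pick the session from the menu on the login screen as described above.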
Sure, it might be if GnomeDo was installed by default and tweaked to do the same thing. But since it isn't, I would hardly call it "pointless", but rather, awesome.
It seems like that's a big theme with the latest Unity changes. Huge improvements over the default desktop, but nobody I know was using the default desktop; they all heavily modified it with devilspie, xmonad, and friends, which are far beyond what Unity offers.
So while it seems like a good development in the abstract, it's hard for me to get excited about it personally.
One reason I used wmii instead of the default is because the default stunk. Unity is raising the bar on the default desktop now; I'll definitely give it another look.
I second this. I read some information on Unity and it all sounded so familiar that I thought wmii and the like had actually had an impact on how the people who make window managers think about them.
I'll stick with WMII though as it never gets in my way.
Soooooo cool that they realized that every keyboard has a Windows key!!! Instead of adopting the MS shortcuts, they found different shortcuts that do the same thing. How great is that?!
When last I used the Unity shell, it was clearly not ready for prime time. I'll spend some time later today, cross my fingers, and discover if that has changed or not. There's much more in a new release than just that, but the supposedly-cleaned up Unity will probably get most of the press.
Some concerns aside, I respect a lot of the chutzpah that Canonical is showing: Unity, Wayland, trying to force KDE/GNOME to work together on a notification API. They want to see the Linux desktop improve, and even if people fight against them and they lose, to me it feels better than the inertia of the status quo. Stasis gets no one anywhere.
Canonical have nothing to do with Wayland. Mark Shuttleworth wrote a blog post saying they wanted to use it (in a ridiculously unrealistic timeframe), but they've contributed nothing to meeting that goal.
They haven't tried to "force KDE/GNOME to work together on a notification API". They did their own thing, then later on adopted and modified a KDE protocol (which in the first place didn't even do the primary thing they were aiming for: menus), then complained about GNOME not adopting their code (which is a separate issue from the protocol).
They're basically doing their own thing now, not paying a lot of attention (let alone contributing) to upstream projects.
I'm not going to join in, but I have trouble reconciling the arguments "GNOME not adopting their code/protocol" and "not contributing to upstream projects".
GNOME rejected the libappindicator code and approach (largely because it was irrelevant and badly timed), but not the StatusNotifier protocol (as defined mostly by KDE, then once modified, the basis for libappindicator).
The "not contributing to upstream" comment has a much broader context than that particular issue though. One late attempt to push something fairly irrelevant upstream does not define the entire context for their engagement. :-)
Man, Ubuntu's changed a lot since I used it last (8.0 days). They've taken a lot of UI cues from the mobile space (app stores, launchers) and adapted it to the desktop.
It might take a couple more years but somehow I have the feeling that Ubuntu might be heading for the mainstream. Rightfully so.
I'd like to think so, and they're doing a great job, but I can't help but feel they're a few years too late. I suspect general email & browsing is going to happen more and more on tablets and phones and people will only want full-blown computers to run specialized native apps, most of which will probably never be ported to Linux.
Of course, thanks to Android, Linux will be more relevant than ever in this scenario.
For consumers who can afford it, tablets may replace a lot of what a desktop computer does, but there are many lower-income people who will still depend on very inexpensive desktop computers, either ones they own individually or (more likely) ones they can access at their local library. I think Ubuntu is heading in a direction where it will replace Windows for users who only need an office suite and a web browser but need a cheaper desktop machine.
I think it's much more likely lower income people will use a smartphone as their primary computer than a desktop machine. Cell phones already have incredible penetration, and the cell companies (not to mention Google) are going to push smartphones into that population in short order.
I agree. Everybody will have a smart phone because they're so useful and for a lot of people that will be all they need, particularly if money is tight. I'm in Vietnam now and most people here make next to nothing but they still have smart phones and I even see a lot of iPhones, even though they cost a fortune by Vietnamese standards.
I see a fair number of cheap desktop computers too but they're all running bootleg copies of Windows. People aren't paying for their software anyway so the price advantage of Linux is negated.
A second thought: Linux might be able to compete on the basis of security. Every single Windows machine I've used here has been riddled with trojans and viruses. I have a feeling this is common in a lot of the developing world.
I hope so... I could easily have a dozen family members switched over in a heartbeat - if only iTunes were readily available (and not having to use vmware or such...).
I hate to be 'that guy', but have you tried wine? Wine typically has very good compatibility with popular applications like iTunes.
Okay, I just checked appdb and it kinda works. (May be worth trying a prerelease of wine). Otherwise, ubuntu should have built in iPod support (I am assuming that's what you need iTunes for), and if not, I guarantee there's a way to get it working on the internet/wiki somewhere.
It is never a good idea to install a brand new release on a production system unless you have to (i.e. if it is the only easy way to fix a show-stopping usability/security/compatibility issue) - this goes for any software. Play safe and let the pioneers get scalped, and/or try it out on a less critical machine first.
I recently bought a new netbook and rather than waiting a couple of days or installing the beta to upgrade later I put an older release on. 10.04 in fact, rather than 10.10, as that is an LTS release and I was in a cautious mood.
For this reason, I rather conservatively remain on my LTS: no system crashes since the .1 release, and I'm extremely happy with it. I do miss the excitement of these new releases, though, and it's clear there's a lot to look forward to in this one, especially if Unity is as good as some of the comments here suggest.
As a fellow day-job Ubuntu user, I'm with you. In fact, it's not just this release; I always wait a bit to upgrade, since the general consensus seems to be that fresh Ubuntu releases are often a bit touchy :)
In the past I've found that if I installed packages that weren't in the main release I would sometimes run into gigantic dependency issues that I couldn't resolve without a clean reinstall.
I've never had a problem with a Debian in-place upgrade; I've got servers that have been upgraded from woody through to squeeze without a hitch. Where I've had problems with Ubuntu, it's been around the desktop, and udev/hotplug integration.
That being said, I've not given it a fair try for a few releases, so it's possible that in-place upgrades are better tested now, but I don't see the point of taking the risk. If it goes wrong, I'll end up reimaging anyway, so why waste the time?
I'm not the OP, but on my in-place upgrade to Natty today I had the nspluginwrapper installer die with a segfault. As a result, the update "failed" (only nspluginwrapper failed, really). I had to remove google-talkplugin to fix it (so really, the bug was in Google Talk, but apparently nobody at Canonical has Google Talk 64-bit installed to encounter this). So yes, seemingly-inconsequential system changes can interfere with upgrades.
Yea that too. I've heard bad things, but I've never even attempted it myself. Years of Windows has built up some serious negative reinforcement here :)
It's no big deal if you don't mind having a few partitions on your system. Personally I have a large partition for my /home directory, and smaller partitions for testing out new releases, distributions and such. If one of the releases is/becomes unstable, you can just select your old reliable release in the boot loader and get work done on that. It has worked out really nicely for me.
The only issue would be if you run a newer version of a program that updates its config file format (or something similar), which ruins things when you boot back into an older partition running a prior version of the program.
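To make the layout concrete, here's the kind of scheme I mean, as an fstab sketch for the stable install (device names, sizes, and filesystems are just an example):

```shell
# /etc/fstab on the stable root; the test install on /dev/sda2
# has its own fstab pointing at the same shared /home.
/dev/sda1  /      ext4  errors=remount-ro  0  1   # stable release root
/dev/sda3  /home  ext4  defaults           0  2   # shared between installs
/dev/sda4  none   swap  sw                 0  0
```

You pick between the two roots from the boot loader menu; both see the same /home.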
This sounds like you do not have a working backup plan. Instead of installing the new Ubuntu, make sure you have a backup plan, test it and automate it to secure you against a random massive computer failure that would make you unable to get work done.
Arch tends to keep configuration formats and file locations between major versions. I've found that with Ubuntu, simple things like mouse button config change drastically between versions. I love Arch on the server but Ubuntu is just too easy and has so much software available for the desktop.
For the last few Ubuntu versions I start with Ubuntu Minimal (30MB instead of 690MB) to avoid all the extra noobuser bloat that comes with Ubuntu default. This way I have a clean installation similar to Arch but without having to hand-configure everything.
I'd like to point out that this is a good time to really test your connection. I've never downloaded anything faster than a torrent of a fresh ubuntu release.
Though unfortunately update files are still distributed through centralized repositories.
I wonder whether there'd be major security concerns if a torrent layer were put over the main centralized backbone? It could probably reduce bandwidth costs for hosting and provide better speeds for people, though keeping the "swarm" updated might be kind of tricky.
Security concerns can be dealt with via the pre-existing public/private key crypto that's going on in apt already.
The problem with torrents is they don't work well for lots and lots of (relatively) small "file groups" where each user has a lot of the files, but each user has completely different files. Something based on tiger tree hashing or some other mechanism like that might be more suitable, perhaps?
True, in order to provide consistent benefit to users you'd need some sort of mechanism to determine whether it'd be faster to "grab from the swarm" or "grab from the repo".
Perhaps even setting up each already-existing mirror as its own seed-center would help - there are plenty of them around, and by definition they all have the same data. Worst case you end up getting only one connection, and you have the same performance as now, but in the best case you can download from multiple locations at once and get some rudimentary load balancing.
+1 for this idea, I think "eventually updated" should be largely sufficient for a generally one-way change like repository updates.
I'm "live updating" right now, and the time estimate jumps between two hours and four days depending (I believe, given the stability of my school's internet connection) on the load on the Ubuntu repositories.
BitTorrent download speed has a lot less to do with bandwidth than with how many peers you can connect to, and downloading these things fresh there is often a lack of them.
I BitTorrented the 64-bit Desktop edition as soon as OMG! Ubuntu! broke the news on Twitter, and it took me around half an hour (for 700 MiB).
Yep. I usually find that most reasonably popular Linux distros will max out my down connection over torrents. Pretty nice way of telling whether my ISP is throttling or lying to me about my bandwidth.
Of course, this is the release that changes the desktop interface around quite a lot. I'm a bit hesitant, although this answer in the FAQ was soothing:
No problem at all. You can choose to launch the classic desktop experience when you log in to your computer.
Not sure if this really means that the choice has to be re-made on every login, or if it is remembered. Anyone?
IIRC it's just using the standard GDM "Choose an X Session" functionality, the same thing that allows you to have KDE and GNOME installed at once and switch between them. Last time I was using it, choosing a different session than your last login prompted "Do you want this choice to be one-time or remembered?"
Though Unity works as expected (awesome!), there still seem to be some other bugs with VirtualBox integration. A big one for me is that seamless mode doesn't work when running 11.04.
You can still run Unity on any VM (or lower-powered machine) by using Unity 2D. It doesn't ship with 11.04 (it will in 11.10, I believe), but you can get it from the Unity 2D daily PPA.
It's in 11.04; you can just install the "unity-2d" package. You should only use the dailies if you're looking for trunk builds that might not be stable.
In the time I've been using Ubuntu (a year and a half now), upgrades have always been easy, and have always resulted in a month of little annoyances afterwards. But I look forward to them nonetheless.
Upgrading now, and I feel like a kid on Christmas morning :)
OK... this seems pretty much out of place, but I don't suppose I can get a better answer anywhere else, so here goes.
I'm a student in India, in a dilemma about buying a Mac or a Windows machine (dual-booting with Ubuntu 11.04). At about 3/4 the price of a Mac I can get a more powerful Windows laptop and boot into Ubuntu (and thus avoid Windows altogether).
I need a workstation for Ruby on Rails/Node development. Since my parents will be the ones paying, I want to be sure a Mac is worth it. I've never worked on a Mac before, but since I've read that most startups that are hiring offer Macs to developers, I'm guessing owning one would really make development more enjoyable. Would love to hear your thoughts on this.
There is no easy way to answer this question. You can definitely get a more powerful PC for the price. With the Mac, you are buying the brand, somewhat better construction, and much better customer service. You are also buying OS X, which may or may not be attractive.
I used Linux for many years as my primary development machine, and it was great. Ubuntu is doing a good job solving the "there is no good desktop for Linux" problem that was pretty true five years ago.
That said, I prefer my Mac because I get almost the same ability to play at the command line (I spend most of my day in Emacs right now, including running my shells there), but I also get a refined experience when I'm interacting with applications outside of development (e.g. Garage Band, Pages, Keynote, etc).
In the end, I didn't switch for development, I switched for all the time I use my laptop when I'm not developing. For me, it was worth it, but YMMV.
If you use Emacs all day on a Mac, what do you use for the Meta key? I find I'm not really happy no matter what I do. If I make Option into Meta, I have to bend my thumb way under my palm to hit Meta. If I make Command into Meta, then I lose access to lots of global Mac UI shortcuts. I guess I could experiment with swapping Option and Command and then using Option as Meta, although I'll have to retrain a lot of muscle memory.
And then there's the fact that one Emacs window in a terminal is still not as productive as five Emacs windows under X11, and X11.app still sucks, and the various native Mac Emacs ports still don't feel right to me.
Overall I find using Emacs on Linux X11 still provides the best Emacs experience. This alone will probably keep me on Linux as my primary work environment for the foreseeable future.
* Meta is mapped to the Command key (and the option key).
* Control is mapped to control.
* I suppose I should try remapping the Fn key to Control to really make it work right, instead of the weird hand curve that currently has to happen.
When I really am interested in getting typing done I hook up a MS Natural 4K. It's a much more pleasant experience. My keybindings are customized for maximum ease there (and believe me, my hands/wrists do not hurt after a long-term coding session there). Laptops are fine for messing around for a while, but after a period of typing my hands hurt.
Unless you intend to do development for iOS, buy a PC and dual-boot into Ubuntu. Even if you really need Photoshop or Office, you can run them on the Windows side. You'll feel dirty, but you'll be fine.
With a Linux machine, you'll have an environment closer to whatever server your application will run on (nobody deploys on OS X, and nobody sane deploys on Windows), which is a bonus. Plus, package management will save you lots of time. If you want to partition your machine into several environments, you can use Linux containers: many little Linux "servers" running without interfering with your desktop and taking up very few resources (much less than full-blown VMs would).
Disclaimer: I am writing this on a Mac. I love Macs, but I work on Linux.
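To make the container idea concrete, a rough sketch with the lxc userspace tools (package and template names have varied between releases, so treat this as approximate):

```shell
# Create, boot, and attach to a minimal Ubuntu container.
sudo apt-get install lxc
sudo lxc-create -n dev1 -t ubuntu   # build a container named "dev1"
sudo lxc-start -n dev1              # boot it
sudo lxc-console -n dev1            # attach to its console
```

Each container gets its own filesystem and process space but shares the host kernel, which is why the overhead is so much lower than a full VM's.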
Having a Linux desktop and a Mac laptop, I prefer my Linux box, and I probably won't get a Mac again. I do Java and Ruby dev, and Apple basically won't commit to Java in the future. Apple updates cost money (the non-incremental ones), as does most of their software. I'm not saying there aren't free ones, but I want utility, not eye candy, in most respects (and I understand people will probably not agree with that point), and in the repositories I can pretty much find what I want. To use a lot of the tools I want on a Mac, I need MacPorts or Fink. Hardware upgrade paths are going to be somewhat easier on non-Mac hardware, as is the price of that hardware.

I mainly use cloud/web utilities for things like Google Apps, calendar, and email. I don't care for a lot of the native Mac apps; I really can't think of a single Mac app I use that isn't available on Ubuntu, though the good counterexamples are things like iMovie and GarageBand. I also don't understand the point about development being more enjoyable: you're going to be living in a text editor, and hopefully you understand how to run things without a GUI (a code repository, for example).

It's also partly idealism for me. I like being a supporter and advocate, and if I have a chance to educate people and get them interested in alternatives, I'm all for it.
As others have stated, I think it's what you do outside of development. You can also look at the OSx86 project if you're adventurous.
Get a Macbook Pro, dual install Ubuntu via Bootcamp, then run Windows in a VM in either OS (fine for testing, at least). That lets you learn OSX and build iPhone apps while still being able to use Linux as your primary OS if you desire.
I've been researching this setup for the past few weeks, and it appears the only issue was that the Linux graphics drivers for the HD 3000 GPU (embedded in the MacBook Pro's Sandy Bridge i5/i7 CPUs) were still alpha/beta quality, but apparently that's fixed in Ubuntu 11.04.
I'm still see-sawing, though. My other option is a tricked-out custom Sager with the latest Nvidia GTX 485 GPU, to use for mining bitcoins (and perhaps learning CUDA). The embedded HD 3000 gets very good performance for an on-chip GPU (google it), but it's not in the same league as the GTX 485.
I own a Mac(Book Pro), and I dual boot Mac OSX and Ubuntu 11.04 on it.
I spend 90% of my time in Ubuntu (mainly developing).
I honestly think that if you're tight on cash and you just need a machine for development, get as many CPU cycles and as much stability for your money as you can, and run a Linux distro.
Almost all laptops are equal for development, at least as far as processing speed goes, as long as the CPU and video card are not budget models and there is enough RAM.
What makes a big difference to me is display quality and the battery. Apple tends to have very good batteries. That's why I personally would buy a used unibody MacBook or MacBook Pro, one of the recent types with the long-life, high-capacity batteries.
I do mainly Linux development work, on Ubuntu, using an older Macbook Pro, and everything just works great and is super smooth to use - in fact this is the best Linux machine I have used!
I use a MacBook Pro and it truly is a better laptop than anything else, but if you're looking at a desktop and not doing iOS development, the argument becomes a lot murkier. If cost is an issue, I'd build a Windows desktop from parts that are OSx86 (Hackintosh) compliant, so if you need the OS X experience you can get it. That gives you the best of most worlds, I'd say, without having to pay the Mac premium. Though if you're getting a laptop, the Mac premium is almost certainly worth it, IMNSHO.
It's not that it's not worth it; it's that if he's constrained by price, he could get a perfectly functional self-built PC for a fraction of the price of a Mac.
This is not the right place for your question. Anyway, you shouldn't buy a Mac. Ubuntu with gvim can be the most enjoyable environment for rails/node development.
Your question is fine. If a thread raises a question in your mind, then that can be the entirely appropriate place to ask your question. Keep asking questions.
I'm not going back to linux on the desktop until it can sleep my laptop reliably. OSX is so nice for that. Just close the lid and go... I can't imagine working any other way now.
It isn't that sleep works reliably on OS X; it's that sleep works reliably on a Mac.
You see, when you buy a Mac you know every feature (including sleep) has been tested thoroughly. When you decide to install an OS on a computer you already own, you put the onus to test on yourself. I don't understand why people continue to make this unfair comparison. If you want a Linux computer where sleep just works, buy a preinstalled Linux computer. System76 makes fantastic ones. They thoroughly test every feature, and they test when new versions of Ubuntu are released. You'll have the same experience you have when buying a Mac or an HP.
Are there any major PC manufacturers that ship with Ubuntu? I imagine the day you can go to Sam's and buy an Ubuntu system will be a big milestone for Linux.
I see 4 laptops and 1 netbook at http://www.dell.com/us/business/p/laptops#facets=80770~0~179... (use OS filter if link doesn't apply) different models running 9.10, 10.04, 10.10. For some reason the link on dell.com/ubuntu only shows the two Latitudes, not the three Inspirons.
Not major manufacturers, but also see Zareason and System76.
Even dedicated Ubuntu system providers like System76 run into hardware problems.
Ubuntu 10.04 removed (previously working) support for System76 Starling Netbook wireless cards. It was several months before they had a workaround, allowing their customers to finally upgrade to 10.04 just in time for 10.10. System76 then said they felt 10.10 was not ready for their Netbooks, and said their users should remain on 10.04 until further notice. That post was stickied until yesterday.
It may be an unfair comparison, but it's still very relevant (although on the other hand you do have a point- there are preinstalled systems you can buy the same way you would with OS X or Windows).
That's a good point. I actually had sleep working really well on 9.04 then upgraded to 10.04 and it broke. It's that type of pattern and the frustration that is worth the premium Apple tax I guess.
It's a wildly varying experience that depends on your hardware and the distro. What I've noticed with Ubuntu, though, is that hardware support keeps getting better with each new version. Before I buy any new piece of hardware, I check for Ubuntu support.
I've been able to sleep my laptop (mostly) reliably since 2006. It all depends on what hardware you have. The "mostly" is because in 2007, ATi released an update to their catalyst binary blob driver that broke sleep and hibernation. I try to steer clear of non-open source drivers now.
The fix is available in kernel 2.6.38 (Ubuntu 11.04 is 2.6.37).
I run Ubuntu 10.10 with a PPA 2.6.38 kernel to get around the problem. Please note this problem is at the kernel level - going Fedora will not help.
Ubuntu is more or less the only OS I use currently, and sleep/suspend/resume issues still exist. I do hope they integrate projects like TuxOnIce into the mainline; I would gladly pay a couple of hundred dollars to have working sleep/resume on Ubuntu laptops.
Considering that Dell, HP, Lenovo, Acer, and Asus have maybe 15-20 models at any one time (counting motherboard + wireless combinations), I'm surprised there is no third-party company providing supported Linux. For a company like Ksplice, this could be another line of business.
true - my mistake.
Although you should also know that 2.6.38 and 11.04 have a mysterious power-management bug that increases laptop battery usage significantly.
I can't even say that my MacBook sleeps reliably. There are times when closing the lid fails to put the computer to sleep, and it kills its battery while it's in my bag. Occasionally, I'll open the lid and my MacBook will not come out of sleep mode.
I've been running Ubuntu on three systems for quite some time: a unibody Macbook Pro, a Lenovo T61p (t60p before that) and a System 76 netbook. Sleep works great on all three (be sure to install the mactel PPA for the MBP).
That being said... I know for sure there is a configuration out there that doesn't work. That is the beauty and benefit of a locked-down ecosystem like Apple's: they control everything and absolutely can guarantee the experience. And to be honest, given the compromise of choice, if Apple can't deliver on that promise, they shouldn't be in the game.
So, I don't really think it is an issue, or nearly as big a one as people make it out to be. I find that if you stick to the bigger Linux-friendly manufacturers (Lenovo) or the machines many people buy (MBP), you'll have good luck.
This, along with automatically using an external monitor when one is plugged in, is the reason that I switched to the Mac for laptops.
As others are saying, it sounds like the suspend/resume feature works well now. I can add that at least on my multi-monitor desktop Ubuntu 10.10 will reliably automatically adjust when I plug in or unplug one of my monitors. I found this out accidentally and it was really a pleasant surprise.
Of course, as always with Linux, both these features probably depend on how well your hardware is supported.
I recently started digging into Arch and getting sleep to work was surprisingly easy. (pm-suspend and modifying handler.sh for power button/lid events).
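Concretely, the pieces involved (paths and argument positions follow the pm-utils/acpid defaults on Arch at the time; yours may differ):

```shell
# Suspend to RAM right now (pm-utils; needs root).
sudo pm-suspend

# Excerpt for /etc/acpi/handler.sh (acpid): suspend on lid close.
case "$1" in
    button/lid)
        case "$3" in
            close) pm-suspend ;;
        esac
        ;;
esac
```

Power-button events get a similar case branch; acpid passes the event name and state as positional arguments to the handler.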
Are there any up-to-date statistics on how many people use Ubuntu? Ideally also compared to other Linux distributions, Windows, OS X... Google gives me only numbers and estimates that are a few years old. I guess these kinds of stats aren't easy to do?
Been using Unity and natty for a few weeks now. The only feature I really miss from the gnome interface right now is the ability to add / move / remove toolbar widgets. Has anyone figured out if it's possible / how to do this in Unity?
Overall I feel a lot more productive with Unity. The numerous Super+ functions are really helpful and nifty. I especially like Super+1-9 for positioning/resizing windows on the current monitor.
I find the description of which image to choose confusing. You need a 64-bit image to handle processes over 4GB, right? So why is the x86 version recommended for "most machines with Intel/AMD/etc type processors and almost all computers that run Microsoft Windows"? (That's from the server description, too.)
I think it's almost entirely due to flash. Most users won't notice the difference between 32 and 64 bit machines, and it's still much simpler to use flash in a 32 bit browser. 32 bit chrome has flash bundled, and 64 bit chrome does not, which will cause confusion as well. At work, I use a 64bit OS with the pre-release of flash "square", but at home it's just easier to use 32 bit on my laptop.
> Flash hasn't been an issue for quite some releases in Ubuntu
Your personal experience doesn't make it true for everyone.
The official flash release has to be run through nspluginwrapper on 64-bit systems, which can be a pain when it doesn't work. Suggesting that the average user stick with the 32-bit release removes any support issues related to 32 vs 64-bit plugins (java included too) and nspluginwrapper, with no noticeable drawback for most users.
That's simply not true any longer. It just isn't. It's not "my personal experience". If you install ubuntu-restricted-extras, all of this is done transparently without ANY work on the users' part.
I would know, because I've never installed Flash manually on this system.
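For reference, the one package that pulls all of this in is ubuntu-restricted-extras; everything else is handled behind the scenes:

```shell
# Installs Flash, the Java plugin, media codecs, and MS core fonts in
# one go; on 64-bit installs the 32-bit plugins are wrapped
# (via nspluginwrapper) automatically.
sudo apt-get update
sudo apt-get install ubuntu-restricted-extras
```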
Yes, this is done transparently, and most people won't have a problem. As with most things Ubuntu, as long as you go with the defaults, everything will work smoothly.
What I'm saying is that Flash, the Java plugin, and nspluginwrapper can break. It happens, and when it does, it's not obvious what went wrong or how to fix it. Going 32-bit simply removes this support issue altogether.
I find that in order to run Eclipse properly with lots of plug-ins, a fairly big project and Tomcat running at the same time you NEED 4gb. Yes, Java uses a lot of memory but I have 9 gb in my desktop so I'm just fine thanks. 4gb dimms are available for about $54, according to Pricewatch, so quit complaining.
PAE kernel is really only useful in the special case that you have some ancient 32-bit server hardware that supports > 4GB. Anyone else over 4GB should run 64-bit.
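For anyone unsure which camp their machine is in, /proc/cpuinfo tells you: the `pae` flag means the CPU can address more than 4GB from a 32-bit kernel, and `lm` (long mode) means it can run a 64-bit OS. A minimal sketch of the check, with a canned flags line so the logic runs anywhere:

```shell
# A real run would check the kernel's view of the CPU:
#   grep -w pae /proc/cpuinfo    (>4GB addressable from a 32-bit kernel)
#   grep -w lm  /proc/cpuinfo    (long mode: can run a 64-bit OS)
# Toy version taking the flags line as an argument:
check_flags() {
    echo "$1" | grep -qw pae && echo "PAE: yes" || echo "PAE: no"
    echo "$1" | grep -qw lm  && echo "64-bit: yes" || echo "64-bit: no"
}
check_flags "flags : fpu vme de pse pae msr lm"   # prints "PAE: yes" then "64-bit: yes"
```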
For almost every processor architecture going to 64 bits slows things down due to increased memory usage since very few programs need 64 bit integer addition. However, in x86 the transition to 64 bit doubled the number of architectural registers to 16, and so actually resulted in a speedup.
Hypothetically a 64-bit application could be designed to save space by hacking together a sort of "near pointer/far pointer" system by using native 64-bit reference pointers and 16-bit or 32-bit offsets from them. This would save on a lot of the space used for storing pointers to nearby data.
It may improve cache hit rates, but my Dell laptop running the 64-bit version never hits the swap partition unless I open a couple of Eclipses or do something equally stupid.
Most older machines and many new machines aren't 64 bit. All 64 bit x86 machines will support a 32 bit x86 OS. Many people aren't sure about what's inside of their computer. That's why it's the default.
On Sandy Bridge graphics, the X server freezes or flickers every time anything interesting happens, like the screensaver, suspend, or console switching. New graphics bugs are reported daily. My X has frozen hard 6 times in 2 days.
Nvidia drivers have some trouble also.
In short, 2011 hardware is not compatible with Unity or Compiz at all. Legacy Metacity is slightly better, and disabling all power management gets you a mostly stable system.
FYI for other users: installing Gnome3 via UGR (http://ugr.teampr0xy.net/) will break your unity install.
Ubuntu (and Unity) is still Gnome2-based (I haven't heard anything yet concerning a move to Gnome3 for a future release).
So upgrading to Gnome3 is a one-way upgrade and will break your Gnome2-dependent environments. That said, I did a ppa-purge to uninstall the UGR packages and things seem to have gone back to normal, pretty much.
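For anyone in the same spot: ppa-purge downgrades every package from a given PPA back to the official Ubuntu archive versions and then disables that PPA. (The PPA name below is illustrative, not necessarily the one UGR uses.)

```shell
sudo apt-get install ppa-purge
# Downgrade all packages from the PPA to Ubuntu's own versions,
# then disable the PPA's sources.list entry:
sudo ppa-purge ppa:gnome3-team/gnome3
```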
Networking is extremely flaky on my home LAN. Since I set up a new box with Natty, it has been crashing my Actiontec DSL modem. There are some workarounds online for disabling IPv6 and for restarting wifi when the module crashes and the indicator applet loses its connection to the network-manager service.
Ubuntu's website shows off its Dash ("Spotlight"), Launcher ("Dock"), status icons ("menu bar with status"), workspaces ("Spaces"), and Ubuntu Store ("Mac App Store").
It's almost like they're trying to directly compete with Mac OS X, with "killer features" that match exactly with what apple regularly shows off with Mac OS X.
Probably a good strategy. I would like to have a pretty good OSX clone for PCs. There's no good reason to clone Windows -- if you have a PC you can just use the real thing. You cannot (easily) run OSX on a PC.
Besides that, Windows is insanely complicated. There are zillions of little panels, little things that make other things happen, you have that crazy google-to-find-a-program-that-does-what-you-need and then download it and entrust your machine to a foreign executable (you run it as an administrator in order to install it) and begin that ridiculous next-next-next-finish dance. And do that for every updated version.
The first pitch I ever heard for Beagle was "it's like Spotlight", so I always assumed it came later. But I looked it up, and it's hard to tell: inotify 0.8 was announced about a month after Steve Jobs started advertising Spotlight, and Beagle grew out of Dashboard, which certainly has design documents that indicate they wanted to do that. So... I think you might be right.
Funny. Apple takes features from other OSes like Linux. Linux in turn takes features prominent in Apple's marketing strategy for OS X and replicates them.
Docks of one sort or another have been around for years. RISC OS had a dock-like icon bar in 1987, GNUStep nicked one from NeXTStep in 1994, and I'm sure there are earlier examples.
That's an interesting point. If so, would it not be a little misguided?
I'd love to see some data one way or the other, but in my experience and among my friends and associates, people who buy Apple computers are incredibly likely to value the polished, "just works" factors of OS X and if they do care about being UNIX-like, it has them covered too. Then among those who run Linux as a primary OS, they do so without paying for the shiny Apple hardware. Again I stress this is a personal subset and I welcome it being corrected or confirmed by hard numbers.
At the same time, perhaps I am looking at this the wrong way:
If you have a goal of being the best desktop experience, then better to pit yourself against the best?
Don't forget moving the minimize/maximize/close buttons to the left side of the window toolbar. At least that's easy to change.
I really like Ubuntu because it comes out with an update every 6 months, so users keep getting the newest software, but I can't stand this flagrant Apple copycat-ism. Is there any Debian-based Linux that updates reliably and has a sane UI?
Linux has had those things for longer than the Mac has, with the exception of the global menu.
Docks were around before the Finder adopted one. Beagle predates Spotlight. Workspaces have existed on Linux for as long as its GUIs have, and APT has been around for over a decade, with the Ubuntu Software Center in its present form predating the Mac App Store.
Pro-tip for people with laptops that have switchable graphics - Go into your BIOS and set the default to "Discrete graphics" and not "Switchable". Otherwise Ubuntu will default to integrated and you'll miss out on all the fancy animations.
Aargh! I just installed it in VMware Fusion on Snow Leopard, only to realize Fusion does not support OpenGL for Linux guests. Ubuntu went into a fallback mode and disabled Unity. Seems like I need to switch to different virtual machine software; any suggestions?
I don't know about VirtualBox and OpenGL, but I just installed Parallels and everything works swimmingly! Better native partition support and what seems like very good OpenGL acceleration.
I also got native partition booting working very quickly in Parallels - I can boot both my native Windows 7 installation and my native Ubuntu installation - that are On Partitions Of The Actual Hard Drive. Took a tiny little bit of nudgework, but now it just goes.
I am wondering: if Unity isn't supported due to lack of hardware acceleration, does it fall back to Gnome2? (Does that mean it still bundles Gnome along with Unity?) This is essential for choosing between Xubuntu and Ubuntu when running in a VM.
I've had the 11.04 beta installed for a few weeks now, and though there have definitely been some bugs that have come up, even the beta of natty has been really stable overall, much more so than some of the previous Ubuntu releases.
I was always a little skeptical about Unity and thought I'd rather go with Gnome3, but call it peer pressure or whatever: I am upgrading to Unity as I type this :) (and yeah, I'm every bit excited). Cheers.
- better AppArmor
- "PowerNap" (reduces power consumption by 14% by selectively disabling cores on certain Intel hardware)
- "private clouds" (don't know what that means)
- Latest Linux kernel (2.6.38 vs 2.6.35)
I'd like to know this too. Ubuntu Server is headless. I tend to stick with the LTS versions even for my desktop unless something really amazing is introduced in a subsequent non-LTS version.
I was very impressed by Unity, I'd been using a tiling window manager for a while but after rebinding some miscellaneous shortcuts to Super+(right hand key) I'm totally happy with it.
You should install ubuntu-11.04-desktop-amd64. There was never a notebook version of Ubuntu. For a while, they had a separate "netbook remix" that you could download and install, but that has been discontinued.
I'm shopping for a new laptop to put this on. I'm looking for something maybe like the Samsung Series 9; ideally, I'd like it to be as slim as that, but with a 15" screen. Anyway, looking for laptop hardware recommendations from others here. I've run Ubuntu on a number of ThinkPads before, but I want to buy a new laptop to replace my MBP.
Thanks. Right you are: right after a reboot the prompt appeared. Maybe that's why my question was gathering (-1, Moron) downvotes. Remember, people: there are no stupid questions, only stupid users.
I did dare. I'm in, but it involved getting very scared staring at a black command prompt with no GUI. Doing "sudo apt-get remove fglrx" (to reset the graphics settings) did the trick. After a couple of restarts I was in.
I've had it for a little bit, but I went back to Ubuntu Classic when I saw I couldn't add a "Show Desktop" to the panel... Between gnome-do and the window snapping feature a-la Windows 7, "Ubuntu Classic" works well for me.
Besides, I'm experimenting with moving all my servers to Debian Stable. It's interesting how much Ubuntu does for you automatically...
So, I'm running Ubuntu 10.10, but I use awesome-wm instead of GNOME. Is it worth upgrading to 11.04? After reading the comments here it seems like the big thing about 11.04 is the new shell, Unity.