On a somewhat related (non-mobile) note, I really regret buying my AMD 5870 three years ago. Things have changed quite a bit in that nvidia has done some serious work on their Linux binary drivers, while AMD's has been utterly unimpressive, if not embarrassingly bad.
I understand that Linux has been a lesser priority for both companies, which is understandable. It's not going to make them the truckloads of money that the rest of their market segments will. But to do silly things like release official Linux binary drivers without so much as a changelog just reeks. They got enough bad blood from that to start releasing drivers with changelogs again, but support for kernels/xorg server versions still typically lags far behind nvidia's. And I can't do something as simple as run multiple monitors without seeing all kinds of weird graphical artifacts with high frequency.
My next card will be an nvidia if things continue on their current trajectory. I know that their binary drivers aren't perfect, but compared to AMD...
Edit: Also, I can't use the OSS drivers, so they don't weigh into my buying decision at all. Good on AMD for doing better there, but the binary drivers are the only ones that can run my setup.
As someone who has bought both brands recently-ish, I have to say I would VASTLY prefer AMD over Nvidia for Linux-compatible graphics cards. I found that the community-provided AMD drivers were leagues ahead of the Nvidia proprietary and community drivers (in terms of sheer usability).
Here's a thread where I test my Nvidia 670 and my AMD 6870 in Linux. I explain why I like one over the other:
The community drivers are pretty limited if you need to drive more than two monitors or need to work with non-trivial OpenGL applications or games. Neither open source driver has very good power management, either. They both leave a lot to desire.
I found the open source driver situation to be quite good with AMD. I was able to play non-trivial games (TF2 and Dota 2) and use two monitors quite well. In fact, the open source AMD drivers had better multi-monitor support than either the proprietary or the community Nvidia drivers.
What kind of AMD chipset are you running? The HD 7000 series for instance was released early in 2012 and only started having _experimental_ and incomplete OpenGL support a year later. The performance was atrocious even 18 months after release and it likely still is. In the end I bought an Nvidia card which worked fine out of the box.
AMD has been completely rewriting the drivers as free software. The 2D performance of the in-kernel drivers is now better than the proprietary driver for Southern Islands (HD 7000). The 3D performance of the free driver is at about 50-70% of the proprietary for SI with Linux 3.13, according to phoronix. The latest cards (C. Islands) use the same drivers as SI, so the performance should be similar.
I run 3 monitors off a 5850 without issues. The power management situation is pretty much solved from kernel 3.11 or something like that. Enabled by default for 3.13+ I believe.
From what I see, the future is bright on AMD's side for those features. I don't follow the news closely, but phoronix reports on the subject have changed from a long string of steadily bad support and performance to seemingly constant progress since last year.
I switched back from AMD proprietary to the open-source drivers just recently. The open-source drivers just seem to work better for desktop Linux; the proprietary drivers were okay, but since neither is driving gaming for me yet, they felt a bit too clunky.
The big change is definitely dynamic power management in the OSS drivers - now my PC doesn't sound like a turbine while I'm in Linux all the time.
Just to be clear, you are talking about binary drivers for desktops/laptops, and this news is entirely about mobile (Tegra). It is commonly agreed upon that on the PC platform ATI's binary drivers suck compared to Nvidia's.
However, in my experience, ATI's GPU docs project plus the resources they invested in getting the OSS driver to properly support ATI hardware mean that normal users shouldn't really need to install binary drivers if they have a recent ATI GPU. It works well enough for 3D, and power management is supported, so laptops benefit too.
Nvidia's patches referenced here are for Tegra which is strictly a mobile chip - nothing to do with desktop/laptop. In desktop/laptop department NVidia's OSS support is still non existent and Nouveau doesn't work as well as OSS ATI driver due to lack of documentation.
I only have limited experience with AMD chips: helping friends and family get Ubuntu running on their desktops and laptops. I have been sticking to Nvidia for the past 10 years whenever a Linux box is involved. I have to say that while the AMD/ATI OSS drivers do work, getting them working can be an exercise in the mystical arts.
The most recent issue was getting my brother's HP laptop to run Ubuntu. The OS installed but would not launch the X server. The fact that there are multiple different drivers possible for his card made matters much more confusing and neither the official docs nor the Ubuntu forums provided any ready-made answers past "install package X" which we did. I was able to get it to work but it took hours.
On the other hand the Nvidia binary blobs just work. You install it and start the X server. That's it. They don't seem to be bothered by the kernel upgrades when running Ubuntu either.
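For anyone following along, "just works" here really is about this simple; as a sketch of the Ubuntu flow of that era (package names like nvidia-current vary between releases, so treat this as an example, not gospel):

```shell
# Install the distro-packaged Nvidia binary driver (Ubuntu of that era)
sudo apt-get update
sudo apt-get install nvidia-current
# Optionally generate a basic xorg.conf; modern X usually autodetects
sudo nvidia-xconfig
# Reboot (or restart X) and the blob is in use
```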
As far as I'm concerned, just buy Nvidia. Welcome your binary blob overlords and don't waste time on getting things working. I do of course wish that Nvidia just open sourced their current drivers, but for me usability of the system out of the box is worth much more than having or not having yet another binary blob.
That's odd. What laptop was that? There really shouldn't be any reason for all but the newest ATI GPUs not to work out of the box.
With nvidia, the terrible stability, the kernel-upgrade hassles, and the need to install the drivers externally mean it should be a much uglier experience than the out-of-the-box ATI OSS drivers. Anyway, as long as it works...
Never had any stability issues with nVidia drivers personally. As far as installing, I just use the akmod packages from rpmfusion, they take the binaries and package it up for every kernel. When you pull down a kernel update, you also pull down the associated binary drivers. Open source ATI linux drivers might be less install hassle, the problem is they just don't work for anything beyond the most popular cards running 2D graphics on a single monitor.
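To illustrate how little hassle the akmod route is (assuming the RPM Fusion nonfree repository is already enabled on a Fedora-style system):

```shell
# Pull the akmod package plus kernel headers from RPM Fusion
sudo yum install akmod-nvidia kernel-devel
# From here on, akmods rebuilds the kernel module automatically
# whenever a new kernel is installed, so kernel updates and the
# binary driver stay in sync without manual reinstalls.
```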
>Open source ATI linux drivers might be less install hassle, the problem is they just don't work for anything beyond the most popular cards running 2D graphics on a single monitor.
I have been using nVidia cards on Linux since the GeForce 2. I have never had any problems with their driver (apart from misconfiguration by me before X.org started autodetecting screens/GPUs and xorg.conf etc. was deprecated).
It's a small thing, but their binary drivers do not support kernel modesetting, so if I want to switch between X and the console it takes several seconds with a visible screen blank. Using nouveau (or anything that supports KMS) is very fast by comparison.
I don't personally use the OSS drivers, so their status doesn't weigh in on the buying decision at all for me. My 5870 is meant to drive up to six monitors, something the OSS drivers can't do right now (I have three).
It's easy to assume that there's a cause-and-effect relationship between Linus's words and Nvidia's actions, but we have no evidence this is the case, and shouldn't jump to such conclusions.
I'm not disagreeing with you but in this case I can share with you how I reasoned to that particular statement.
I've been around enough to know that there is always an effect of being called out in public, by a credible critic, inside of a company. What that effect is can vary based on how the corporate culture views itself.
In this particular case I have had over the years a bit more visibility inside [1] the company than some others. And I had been working very hard to get access to their acceleration into third party OSes. The developer program was verbally very supportive of that, but actually not so much. After a lot of work and investigation the proximate cause was the extreme patent mess around graphics and Nvidia's aggressive use of same to deal with competition.
I also discovered that at the executive level there was a belief that Nvidia was very supportive of Linux and third party use of their drivers. And even though I suggested reality was quite different than that, it didn't really affect their thinking.
When Linus called them out, and the coverage of that went viral, I thought to myself "This is going to be hard to explain in the context of an impression of support." In some ways having Nvidia not threaten or attack the Nouveau work was a big step forward. Actually contributing patches suggests to me complete surrender (or acceptance) of making "real" the notion of supporting third party use of their gear.
[1] My history was that a friend of mine from Sun was one of the founders; he introduced me to the development program when the NV1 came out. I interviewed with them but did not take the job, but while I was at NetApp they were one of NetApp's "Top 20" customers and I often spent time with their engineering teams working on ways their design flow would work better on network-attached storage. A couple of former Intel colleagues ended up there, and a number of folks came over from there to Google when I was at Google as well. So while my visibility was certainly incomplete, it wasn't strictly what was reported by the press.
Don't forget Intel. While their GMA chips don't have the performance of an Nvidia/AMD desktop GPU, they have a much better track record of publishing code in the open. In fact, I don't think there even are proprietary binary drivers for those.
Yeah, I just switched from a 2007 system I built with an nVidia card running Debian lenny to an Ivy Bridge system running Debian wheezy, with a CPU that has their "HD Graphics", the integrated successor to GMA. I only have 2D graphics requirements and it's quite nice; the motherboard has integrated sound and HDMI video output.
(The board is SuperMicro's X9SAE, the first of a new line of workstation boards they're doing, they have a Haswell X10SAE, but it's a little early to try to make one of those work.)
The sad part is that, being on console hardware for a long time with chip derived from their main lines (and being in all 3 "main" consoles of this new generation), AMD obviously already has some great drivers for *nix systems, so it's not like they can't do it or would have to start from scratch.
How much of that is legal impossibility, secrets protection or simply not caring enough, well ...
The PS4 is the only console ever to have an AMD GPU that runs a conventional Unix (FreeBSD). Obviously Microsoft's consoles wouldn't, and Nintendo uses their clunky "IOS."
In any case, the interface exposed by most consoles to game developers is significantly lower level than anything resembling a desktop graphics driver, especially since each platform only has to support a single GPU configuration, so I highly doubt there's any useful (for our purposes) code tied up in there.
> I understand that Linux has been a lesser priority for both companies, which is understandable. It's not going to make them the truckloads of money that the rest of their market segments will.
Actually, they are both theoretically very interested in the Linux market because of their GPGPUs. Windows is pretty much unheard of in scientific computing (in November it powered a whopping 2 of the 500 computers in the Top500). AMD does offer a good range of development tools for their CPUs; I haven't worked with their GPU tooling, but I've heard nothing but complaints about it.
> NVidia is probably not targeting top500, so is focusing on the 99% of market.
Nvidia definitely targets the Top 500 and is currently (Nov 2013) providing the GPUs for #2 and #6 in the Top 10 of the list and dozens more lower down the list. There's a lot of money in supercomputers, in the Top 500 scale and smaller installations too.
NVIDIA is focusing on the movie industry, where SGI workstations and rendering servers have largely been replaced by Linux boxen with NVIDIA graphics kit inside. That's probably been the biggest driver of NVIDIA's Linux support not completely sucking these past few years.
There was a time when AMD was hyped to heaven for their efforts. Wrongfully so in my opinion. Nvidia cards just always worked on linux, you had to install the blob but then it was a breeze. Fglrx on the other hand...
I tried to setup a dual-gpu notebook for a friend last week. It was impossible. fglrx doesn't support the chips anymore and ubuntu doesn't support the legacy blobs. Complete fail. I had to tell her ubuntu won't work on her computer. (The gpus will overheat without a proper driver)
So thank you AMD for basically not providing any drivers at all, and if you do, them crashing randomly taking the kernel with them. Thank you Nvidia for just working.
Imho in that regard Linus didn't know what the f*ck he was talking about.
AMD has been paying developers for years to develop free drivers. It has taken this long because they essentially had to rewrite everything and modern display drivers are complicated beasts.
The in-kernel, fancy KMS, AMD drivers should just work right now for many people. 2D is faster than the proprietary driver and the 3D speed is a significant fraction of the proprietary driver.
Have you tried enabling dpm? When enabled with a kernel parameter it dynamically changes the card frequency. I think you need a kernel version > 3.11. Check out:
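As a concrete sketch of what enabling it involves on a GRUB-based distro (the radeon.dpm parameter and the sysfs path below are the documented ones for the in-kernel radeon driver; your bootloader config may differ):

```shell
# In /etc/default/grub, add radeon.dpm=1 to the kernel command line:
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.dpm=1"
# then regenerate the config and reboot:
sudo update-grub
# After rebooting, verify dpm is active:
cat /sys/class/drm/card0/device/power_dpm_state
# prints one of: battery / balanced / performance
```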
I haven't yet. The best results I could achieve were with dynpm - about 65 degrees Celsius, which is completely unacceptable anyway. I've moved to a regular desktop PC, and I've been happy ever since. Good to know, though, that some improvement seems to have been made. Thanks for that link!
I'm curious to see what kind of impact the use of GPUs for cryptocurrency mining has on Linux. There are a lot more people using them in Linux than previously so it'll be interesting to see if it has an effect on drivers and support.
After getting a laptop with an Nvidia card built in, I'm pretty much turned off from buying an Nvidia card again. Nvidia's treatment of Optimus on Linux has been... lackluster at best.
Note that this is Tegra related, which might explain their new-found eagerness to help (unlike desktop, they don't fully "own" the mobile high performance chips market yet, and it moves very fast)
As far as I understand it, this set of patches allows you to (or at least aims in the longer term to) run Nouveau user space on top of Nvidia Tegra Kernel drivers (did anyone read the code, does this use nvidia kernel drivers or nouveau kernel drivers?), which are already mostly open source.
This would mean that you could run a full open source graphics stack on an ARM System on Chip (SoC). I don't know if this is possible on any current SoC out there.
Linus:
> Hey, this time I'm raising a thumb for nvidia.
Commentor in G+ thread:
> They aren't there yet, more middle finger recommended.
A little chuckle to start the day ;-)
Nvidia's Linux support is generally pretty decent these days; however, the one issue that gets the middle digit moving in their direction is power management in a multi-head laptop setup. Ouch, max power = heat = fan noise.
I had to flash the VBIOS in order to force the chip into low power mode (not a gamer, just need to drive multiple displays).
Doesn't help that they remove PM features from anything but their high end chips; have a K2000M here, nvidia-smi shows a bunch of N/A, can't do diddly.
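For anyone who wants to see this for themselves, the query is (nvidia-smi's -q/-d flags are standard; what you actually get back depends entirely on the chip):

```shell
# Dump the power-management section of the driver's status report.
# On chips with PM features fenced off (like the K2000M above),
# most of these fields come back as "N/A".
nvidia-smi -q -d POWER
```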
"Let me also stress that although very exciting, this effort is still experimental, so I would like to make sure that nobody makes excessive expectations based on these few patches. The scope of this work is strictly limited to Tegra (although given the similarities desktop GPU support will certainly benefit from it indirectly), and we do not have any plan to work on user-space support. So do not uninstall that proprietary driver just yet. ;)"
This is just the same basic enablement work they used to do with the "nv" driver. This will just make it possible to use an unaccelerated framebuffer and they go out of their way to mention they aren't going to do any open source userspace (aka GL) work.