Nvidia Removed Linux Driver Feature Due to Windows (tomshardware.com)
227 points by simula67 on Oct 4, 2013 | 89 comments



Can someone please explain why "feature parity" even matters here? If Nvidia can do something with GNU/Linux that is technically hard to do with Windows, why shouldn't they do it? In what way does it make sense to force technical limitations of Windows on GNU/Linux users?


"feature parity" is a poor choice of words, but nVidia builds their Windows and Linux drivers from a single code base.

Keeping support for four monitors in the Linux code probably meant adding a bunch of #ifdefs in various files.

They just decided it wasn't worth the effort to maintain.
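
For a sense of what that means in practice, here is a minimal hypothetical sketch (names invented, not actual Nvidia code) of a per-OS gate in a shared C code base:

    /* Hypothetical illustration only -- not actual Nvidia code. */
    #if defined(NV_LINUX)
    #  define NV_MAX_HEADS 4   /* the extra case that had to be maintained */
    #else
    #  define NV_MAX_HEADS 3
    #endif

    static int nv_validate_head(int head)
    {
        /* reject display indices beyond the per-OS limit */
        return (head >= 0 && head < NV_MAX_HEADS) ? 0 : -1;
    }

Dropping the Linux-only branch collapses both platforms onto a single code path, which is presumably the maintenance win they were after.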


That may be true, but they are presumably already maintaining a bunch of #ifdefs for the Windows-only features.


Given the relative popularity of Windows vs Linux, it probably is worth the effort to maintain Windows-only features.


True, but Windows has the much larger market share so it becomes worth it.


> That may be true

We'll never know. That's a binary blob on your machine.


This is true, but I fail to see the relevance here. This post just comes across as an "any chance to mention Free Software" type post. This thread is discussing whether this decision by Nvidia makes sense. Trying to inject "well, we could all just ignore Nvidia if you were using Free Software" doesn't really contribute much to that conversation.

> That's a binary blob on your machine.

On my machine? It's less likely than you might think.


Being able to assume that feature #ifdefs go in only one direction can have some benefit in reasoning about the code they relate to. In other words, there may be benefits to keeping the Windows features a superset of the Linux features, most likely in preventing stupid programmer errors.
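
As a hypothetical sketch (feature names invented), that invariant could even be enforced at build time, so a stray Linux-only feature bit breaks the compile:

    /* Hypothetical feature bits -- invented for illustration. */
    #define NV_FEAT_3_HEADS (1 << 0)
    #define NV_FEAT_4_HEADS (1 << 1)
    #define NV_FEAT_SLI     (1 << 2)

    #define NV_FEATS_WINDOWS (NV_FEAT_3_HEADS | NV_FEAT_4_HEADS | NV_FEAT_SLI)
    #define NV_FEATS_LINUX   (NV_FEAT_3_HEADS | NV_FEAT_SLI)

    /* Break the build if Linux ever gains a bit that Windows lacks. */
    #if (NV_FEATS_LINUX & ~NV_FEATS_WINDOWS) != 0
    #error "Linux feature set must stay a subset of the Windows feature set"
    #endif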


It goes (or at least at one time went) well beyond that. Last I heard, nvidia.o was shared across Linux, FreeBSD, Solaris, and Windows.


Don't these systems use different object-file formats?


At the very least I can confirm the object is identical (same md5sum) across Linux, FreeBSD, and Solaris. Past sources have indicated to me that Windows shares this object.


How is this even a limitation of Windows? I'm sure Windows can drive more than 3 displays, so why does Nvidia have this restriction on either platform?


Because they sell "professional" Quadro/Tesla cards with this feature enabled at enormous prices.


Aaaah... so this probably is not about Linux vs. Windows, but rather about 'GeForce GTX 560 Ti' vs. Quadro. They aren't worried about driving people away from Windows; they are worried about cannibalizing Quadro sales.

There are probably a lot of use cases for many monitors that don't really require Windows (perhaps pro power-users, like traders, who need lots of browser windows open at once), so they need to ensure that their premium cards retain their edge even on Linux.

That makes much more sense.


But how many of those users would need to use NVIDIA's drivers? As I understand it, the benefit of NVIDIA's drivers over Nouveau is performance, not basic hardware support. If all you need is many monitors for web browsers, then you should not have a problem using Nouveau.


> But how many of those users would need to use NVIDIA's drivers?

That's a good point; I cannot say. Perhaps they think they are dealing with people who think 3D performance is necessary, or with people (IT organizations?) that prefer first-party drivers to hardware savings.

I don't think Microsoft is really involved in this though.


I have some experience here. With Nouveau, XRandr works out of the box -- great if you have one card. With the proprietary driver, no such luck (last I checked). On the other hand, my workstation has two cards, and XRandr is not up to the task; it was back to Xinerama and playing with xorg.conf.
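
For anyone who hasn't had the pleasure, "playing with xorg.conf" for a two-card Xinerama setup looks roughly like this (a sketch only; identifiers are arbitrary and the BusIDs vary per machine -- check lspci):

    Section "ServerLayout"
        Identifier "TwoCards"
        Screen     0 "Screen0"
        Screen     1 "Screen1" RightOf "Screen0"
        Option     "Xinerama" "1"    # stitch the screens into one desktop
    EndSection

    Section "Device"
        Identifier "Card0"
        Driver     "nvidia"
        BusID      "PCI:1:0:0"       # adjust to your hardware
    EndSection

    Section "Device"
        Identifier "Card1"
        Driver     "nvidia"
        BusID      "PCI:2:0:0"
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Device     "Card0"
    EndSection

    Section "Screen"
        Identifier "Screen1"
        Device     "Card1"
    EndSection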


I'm using the proprietary driver; it has had XRandr support for a while now.


Ah, my information is a bit out of date (at least a year, maybe more?). Thanks.


This sort of logic completely baffles me.

People buy hardware to use its advertised features. If you are the type of person who only needs 2D rendering and video, then you should use an Intel GPU. I would assume people who buy a state-of-the-art dedicated GPU want to use it for actual 3D acceleration.


Or possibly the advertised feature they're interested in is 'ridiculous number of attached monitors'. Intel GPUs generally only have a couple of display connectors; if you want a card with a lot of physical ports, you'll want to buy a discrete GPU whether or not it supports 3D acceleration.


Even some of the Quadro cards (Q2000M in my personal experience) won't do more than 3 displays, and to do that requires changing BIOS settings which make games unplayable for the most part.


Yeah, even Windows 98 could run 9 displays. From the sound of it, though, this was about presenting a single display to the OS across multiple GPUs. That's significantly different from Windows' built-in multi-display support.

Think full-screen apps that call APIs that aren't going to work across screen boundaries.
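
As a rough sketch of that distinction: with Windows' built-in multi-display support, applications can see each head individually, whereas a driver-level mosaic presents one big virtual screen. A minimal Win32 illustration of what an app actually queries:

    /* Minimal Win32 sketch: a naive full-screen app sizes itself to the
       primary display; a driver-level mosaic would make that one giant
       "display" instead of several separate ones. */
    #include <windows.h>
    #include <stdio.h>

    static BOOL CALLBACK on_monitor(HMONITOR m, HDC dc, LPRECT r, LPARAM lp)
    {
        /* each OS-visible monitor reports its own bounding rectangle */
        printf("monitor: %ldx%ld\n", r->right - r->left, r->bottom - r->top);
        return TRUE;   /* keep enumerating */
    }

    int main(void)
    {
        /* primary display only -- what naive full-screen code tends to use */
        printf("primary: %dx%d\n",
               GetSystemMetrics(SM_CXSCREEN), GetSystemMetrics(SM_CYSCREEN));

        /* every head the OS knows about */
        EnumDisplayMonitors(NULL, NULL, on_monitor, 0);
        return 0;
    }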


It doesn't. That's just their excuse. If it did, they'd have to start removing Windows driver features too, since those can't be found in the Linux driver, and therefore there's no feature parity.


I bet it has more to do with them pushing new cards and wanting users to replace old ones.

I have the 560 Ti and it's capable of playing most current games comfortably. The newer cards seem to be able to do 4 monitors, but the 500 series is being left behind as Nvidia pushes the 600/700 series with their Shield and ShadowPlay features.


It's possible they have a licensing deal with MS that forces them to do this, the same way Amazon dictates that they must have the best price on a book.

Then again, I find the simplicity argument somewhat convincing as well. NVidia's unified driver model must be hellishly complicated anyway; anything they can do to keep control of the complexity is probably something they'd greet with open arms.


Maybe Nvidia doesn't want end users to feel like they're missing out on anything on Windows, and therefore feels the need to cripple the Linux driver?


But what is Nvidia's motivation for that? How does Nvidia win if people stay on Windows vs. moving to Linux? As far as graphics cards are concerned, they would just start putting more effort into their Linux support (to match the swing in customer OS usage).


Pressure from Microsoft in some fashion?


Valid point. We know how MS has managed to maintain its point-of-sale monopoly for decades (forced Windows sales via mandatory pre-installations). We know how they wanted to offer exclusively their own browser on Windows (the EU disagreed with that idea).


NVidia has a history of not playing nice with Linux. Allow me to point to a small comment Linus Torvalds made about Nvidia a while ago (NSFW):

http://www.youtube.com/watch?v=IVpOyKCNZYw&t=1m41s


I just use the nouveau drivers. They perform well enough, cause me far less xorg.conf headaches, and are free.


With a GTX 670 on a motherboard with a UEFI BIOS, running Ubuntu 12.04 LTS, the nouveau driver crashes whenever you move a window too quickly.

Verified on three machines with identical hardware.

AMD doesn't have decent drivers even for Windows. (By decent, I mean less than 1 graphics-related crash on average per year.)


Did you submit a bug? You went to the trouble of verifying it, so I'm hoping so.


I'm always amazed when I see stuff like this. Nvidia's Windows drivers crash all the time too. Neither company is capable of producing quality drivers, yet fanboys still want to insist one of the turds is made of gold.


I think it's like hard drives. You have a bad experience with one manufacturer and jump ship, and subconsciously justify that decision to yourself as rational because of how bad the experience was, not because the manufacturer you're jumping to is any better.


> Nvidia's Windows drivers crash all the time too

Counter anecdote mode: engage!

As an owner of many Nvidia cards, I find they crash infrequently, and when they do it's because:

* I was running Windows 8 Preview and pre-release Nvidia drivers

* There was some sort of hardware problem with my systems

I have used four ATI/AMD graphics cards over the years. Not all of them were negative experiences, but none was more positive than my experience with Nvidia hardware.


Counterpoint: I've found AMD drivers to be pretty good on Linux, and they support multiple rotated monitors and color matching across multiple monitors much better than Nvidia's.


I've been running a GTX 680 on Windows 7 for the past year and count 0 crashes, neither from the desktop nor from games, though I've been playing fewer and fewer games these days.

Before that, I had an ATI HD 5870 for over a year. I had about 3 crashes on the desktop, some that lost me work. The anger from losing work is what I remember most strongly, and made me vow not to buy ATI next time.

Before that, an 8800 GTX. I had 0 crashes from the desktop over several years.

I was running Nouveau drivers on the 670 for no more than 60 seconds before it crashed.


I've had both, non-consecutively over 6 different cards over the years. I've not had any large problems at all, and my current GTX 670 has been performing like a champ for about 15 months now, no video crashes.


This is still cause for upset, though. A small minority of Linux users actually care about GPU programming and 3D acceleration and rely on the proprietary drivers. Nouveau is improving but is nowhere near commercial grade (and this is largely Nvidia's fault as well).


I agree. It's yet another instance of Nvidia not playing nice with the Linux community.


I don't know. ATI has been "playing nice with the Linux community" for several years now, and their free drivers still don't approach the performance of the proprietary drivers for most cards (cards 3-4 generations old are starting to perform comparably on both drivers; newer cards are still much slower with the free drivers).

I think it's more about resources -- NVIDIA and ATI have lots of money that they put into driver improvements and maintenance, and the radeon and nouveau maintainers primarily have free time after work, occasionally with one or two full-time contributors. If we are serious about open-source driver platforms, we need to figure out how to allocate comparable resources. Some easily accessible tutorials on 3D driver development would probably be helpful.

Neither company ever plans to see an open-source driver rival its proprietary one within a few years of a card's release, despite their "commitments" and "assistance". ATI's officially stated strategy is to provide documentation so that the OSS drivers can offer basic compatibility and functionality, while anyone with actual performance needs uses the proprietary driver. They're basically offloading the onerous maintenance of previous-generation hardware onto the community.

The point is that nvidia is not especially bad. They've graciously given us the only consistently decent video driver on Linux for 10+ years now, despite its closed nature. Let's try not to be too hard on them.


AMD has played nice enough for people to release a stable free driver for their cards, but they haven't released enough information for a fast 3D driver.


Right, and people also have released a stable free driver for nvidia cards. ATI made it slightly easier, but the point was that neither one has really embraced the OSS driver scene, and neither one has "played nice" to an extent that deserves special commendation.

I don't want to give the impression that I'm ungrateful for the help that ATI offered the implementers of drivers for its cards, but I think when we take all the facts in balance it comes out in a wash or even with nvidia slightly ahead: nvidia has been consistently producing a high-quality, near-performance-parity driver for Linux for a very, very long time now, and it's almost always compatible with new kernel and Xorg versions prior to their release. ATI has given docs for basic functionality in the free drivers, but they've utterly failed to produce good, performant drivers for Linux machines, closed or free, and in fact they are now on the verge of being two Xorg releases behind.

nvidia may not have "committed to help the open-source community" until recently, but they've done a great service for Linux by being the only vendor to provide consistent, solid, high-performance drivers for Xorg and the kernel. Intel has shaped up recently and gone full-bore OSS, but nvidia has a much longer track record of good [closed] *nix support.


What information have they not released?


Performance suffers with nouveau for anything needing high fps.


OpenCL, CUDA


Do people really do much heavy lifting with OpenCL/CUDA using cards that they are also using to actually drive browsers? Seems to me like if you are doing serious work with that sort of thing, you probably have dedicated cards for that (if not straight up using Nvidia Tesla cards or similar).


I don't use OpenCL often enough to justify the expense of a dedicated card, but, when I do use it, it's a life saver to be able to simply run the code on the CAD workstation.


As a developer whose primary role is GPGPU programming, I can't give you an accurate answer.

I can tell you, however, that the specialist cards are typically found only in supercomputers. Consumer GPUs are typically faster, and significantly cheaper, than their HPC counterparts.


Probably, but that's just one more reason for them not to care about 4-monitor support in non-Windows drivers. From what I've gathered from mailing-list chatter, the very reason Nvidia maintains Linux and FreeBSD drivers is OpenCL/CUDA-related work; apparently some big customers demand this (but since I don't really have a source for that, take it with a grain of salt).


I both play video games and experiment with new technologies at home. I don't think it's strange to use the same card for play and research.


Is it fair to say that this is "due to Windows", though? I don't see anything in this article saying that this directive came at the urging of Microsoft or anyone on the Windows team there. It's entirely possible that this is just some internal nVidia political nonsense.


I think it is time to reverse-engineer the Nvidia v295 driver to make way for an open-source Nvidia driver which supports up to n screens.


Who is "sandipt" and what is their connection to Nvidia? I don't see any flair or profile information (on the forum) to indicate that this is an actual staff member, or one with authority to comment on the motivations behind this change. I see no basis for all this speculation.


The guy's post history is clearly that of an NVidia employee, and his avatar is an NVidia logo, the same as used by the forum's moderators: https://devtalk.nvidia.com/default/topic/573252/evo-push-buf...

There's no reason at all to doubt his status as an NVidia employee.


I don't routinely visit their forums, so I had no way to know that. I don't even recognize that as the Nvidia logo. I assumed it was a default avatar.

So, no, it's not clearly an Nvidia employee. That's why I said

> I see no basis

We still don't know whether that's an official position or the speculation of a volunteer/low level employee.


I don't routinely visit their forums, either. In fact, this was my first visit, and less than five minutes of browsing was sufficient.

Why would you assume that the NVidia logo was a default avatar when most posts on the forum appear at first glance to have a question mark as the avatar?

How are you still unwilling to conclude that he's an NVidia employee when a post of his about the status of NVidia's internal bug-tracking system is corroborated and followed up by a post from a moderator whose signature identifies him as another NVidia employee? Yes, Sandip might be only a low-level employee who has nothing to do with NVidia's business strategies with respect to Linux, but he's clearly not just some random poster. (And for you to suggest that NVidia might let a volunteer have access to an internal bug tracker suggests you know nothing about how NVidia does business.)


Be less eager to bicker. All I did was try to cool all the Microsoft/Nvidia conspiracy talk.


... by introducing equally unreasonable speculation in the opposite direction, without even putting forth a few minutes' effort to gather real information. Being an apologist isn't really a good way to steer the conversation back toward more reasonable explanations.


Find someone else to play ego tennis with.

edit: basically, you read 5-6 things into my words that simply aren't there. I avoid aggressive bickerers like you because I end up spending too much time clearing things up. You're primed to take the least charitable interpretation possible. I have better things to do than to deal with someone like you.

Anything is a better thing to do. I've wasted too much time in the past arguing with perpetually perturbed people to let it happen again.


"Sandpit" sounds like the name given to an account that multiple people may be using in a test environment, rather than something meant as an actual ID.


Actually, it is a pretty common Indian name (Sandip or Sandeep). The 't' at the end is probably the first letter of his surname.


"sandipt", not "sandpit". I assume it's a first-name-last-initial style username of someone named Sandip.


The article currently says "Sandpit"


The name on the forum thread is sandipt. This is what happens when lazy non-journalists fail to do the most basic detail checking. Or seek a more credible source. Or seek an official comment.


Yeah, I read through the forum to see whether this was a mistake in the article. My point was that the lazy author was the source of the confusion over the name.


Maybe a late response to Linus :)

https://www.youtube.com/watch?v=_36yNWw_07g


Obviously, the feelings are mutual.


What does this mean exactly?

I'm currently running an MSI 660 PE Ti with 3 screens in TwinView mode with driver version 319.49. This gives a total desktop of 4960x1920 in landscape-portrait-landscape layout.

I would like to get another 660 Ti and add another 3 screens in TwinView. Is that even possible?


This might seem unrelated, but I have a question for the crowd about a 3-monitor setup with Ubuntu.

Does anyone know how to use Intel + Nvidia on Ubuntu 12.04 (or later)? I was able to do this in the past with Windows but I can't get it working with Ubuntu.


Something similar happened with AMD. My Flex graphics card supports 4 screens. At some point the proprietary drivers dropped this, and now they only support 3 screens. The open-source driver works just fine with 4.


An important rule for software is to not take features away. People will complain even if it's an obscure feature they don't even use.


It's probably time to vote with our wallets again.


It's something a little like this:

"We are altering the deal, pray we do not alter it any further".

Closed source, closed negotiating position.

When you buy one of these cards, you are already assuming the position to get kicked, you are just hoping not to get kicked too hard or too many times in a row. Doesn't seem like a sensible thing to do really.


Oh, you bought Nvidia? Well, that was your first problem, and your own fault.

The Intel graphics are not so fast, but at least a big chunk of the driver stack is open source.


I'm unaware of the ability to drive a 4-monitor xorg desktop off of Intel graphics. The user in question here was attempting to do exactly that, which was supposed to be possible with the Nvidia cards. If his/her 'problem' was in purchasing Nvidia cards, then what would your recommendation for creating such a setup be?


Not the OP, but I think the open source AMD drivers support both DisplayPort and Dual Link DVI, so there are probably several cards from them that would work.


So, you're saying that anyone who wants reasonable 3D performance, 2+ displays, and GPGPU support can go fuck themselves? :/

Don't get me wrong, open source is great. We need it now more than ever. But this is exactly the wrong sort of attitude to take.


There's always AMD. The open source AMD drivers are great for the 5000 and 6000 series, and support for the 7000 (and by extension 9000) series is getting better.

If you're interested in compute, just pick up a 5850 or 5770 from a friend; they're cheap (if you can find them), they run well with the open source drivers, and they support at least 3 monitors. My 6850 has 2 DVIs, an HDMI, and 2 miniDPs; I can drive 4 monitors at once with it.


I would counter that the AMD drivers are not great. I got a whitebox last year with a 5770, and the experience was terrible. The open source drivers couldn't do 3D acceleration properly, and the proprietary drivers refused to do dual-screen - the maximum available resolution across desktops was a square 1920x1920.

In my experience, nvidia has been providing a superior Linux experience in the trenches for the past couple of years. The 'AMD is good for Linux' idea seems to be a sentiment from yesteryear.


I've been using AMD with three displays on Debian for a couple of years now; never had any problems.


Yes, I bought a machine with Intel graphics for that reason, but I never had as many kernel panics before. i915 leads http://oops.kernel.org/ :-/


The 915 chipset specifically is ancient and terrible. In fact, you could make the argument that that chipset was partially responsible for Vista's launch failure (background here: http://seattletimes.com/html/microsoft/2008403670_microsoft1...)


Ancient? My machine (thinkpad x220) was brand new in 2011.


Ah, gotcha. Yes, the 915 was originally released in 2004. The GPU in an x220 is an HD 3000, which apparently uses the same driver. Confusing.


You could just as easily have blamed him for using the proprietary vs. free (open-source) drivers, with the same effect (and he still gets better performance).

Seems like your distaste for a company/policy is driving this comment.


Does this issue affect FreeBSD drivers as well?



