It's things like these that sometimes make me think of all those businesses that rely on some ancient Windows XP machine because it's the only thing that supports the proprietary drivers for the engineering equipment they bought nearly two decades ago for hundreds of thousands of pounds. If the drivers (and the relevant controller software for that machine) were instead open source, they could more easily move to a more recent operating system that doesn't run the risk of being infected by some virus (think WannaCry).
I imagine also how more difficult it would be for someone to reverse engineer the drivers for something a lot more complex than a VGA-to-webcam box.
Well yes, open source firmware and drivers would be cool, but for most companies that's part of their proprietary IP, and they think that making it open would make their product easier for competitors to copy or exploit. Since it cost them lots of time and money to develop, there's no way you're gonna convince management to just make it open unless you're some stinking rich FAANG co. and those HW product sales aren't your main source of income.
Just look how locked down Broadcom documentation and even datasheets are. Everything is under NDA.
HW businesses make money by selling more HW, so making you buy their latest product just to get the latest updates, even though the 10-year-old one still does the job, is a viable business model for them.
Source: I worked in the hw industry and it's how they think.
I think there is a lot of ... cross pollination ... in the hardware industry, and companies are pursuing every angle to make sure that it's hard to figure out what IP they may have "independently discovered" that their competitors patented. I believe this is the reasoning NVidia gave for not open-sourcing their Linux drivers; everyone is copying everything and looking the other way. (It all started with the MOS/Motorola thing. Engineers took documents from Motorola to MOS, Motorola noticed, Motorola sued. Now people are more careful.)
As for datasheets, I feel like from certain vendors, they are "living documents" and every customer gets a different datasheet. You might mention in a meeting with the sales engineers "oh, we need a 100MHz SPI bus" and then they will edit the datasheet to say that the SPI bus works up to 100MHz. This is why there is an NDA, every customer gets a different datasheet exactly tailored to their needs.
That's an interesting explanation that I've never heard before, for why the hardware industry is reluctant to publish this stuff. It certainly seems credible.
You say:
>I believe this is the reasoning NVidia gave for not open-sourcing their Linux drivers
Do you happen to have a source for that? Or at least an explanation of why you think this is the case?
AMD is a company that has largely gone from proprietary to open source and without looking at their financials I think it has been going really well for them. AMD GPUs just work out of the box on Linux without having to install anything, so I, like almost every Linux gamer, buy an AMD card now even if it isn't the best performance per $.
I do all my gaming on Windows and I still only buy AMD GPUs, so that when I switch back to Linux to start working, everything's rock solid and stable. I don't know if maybe I've just had bad luck, but nVidia cards have always been a bit off, even using the proprietary drivers. It was particularly noticeable with screen captures, I'd get this odd flickering that I could never completely solve. Switched to AMD and it cleared right up.
Even if the performance isn't the best, the openness is worth that price to me. I don't feel like the stability of the rest of my system is hampered by a third party that doesn't seem to care.
Didn't that strategy start before ATI was bought by AMD? How much of that strategy is just following through on what they were doing at ATI before, and how much of it is in line with what AMD does in general?
I'm really curious how you came to believe that, given that AMD started supporting the development of open source GPU drivers in 2007 by collaborating with Novell/SuSE and releasing low-level documentation. By 2013 the open-source AMD GPU drivers were far better than any reverse-engineered NVidia driver has ever been, and only two years later AMD started the process of deprecating part of their proprietary AMD driver in favor of a hybrid approach (proprietary userspace module, open-source kernel driver).
I just looked into this, and it looks like what I am thinking of is the AMDGPU driver, which Wikipedia says was released in 2015. I'm not sure what driver you are talking about, or what the difference is between it and AMDGPU. I just remember people in 2013 saying nvidia was best for Linux use, and then, with the AMDGPU driver, people saying AMD now is.
The AMDGPU driver was the first one where the "main" consumer driver was the FOSS one. Prior to this, "fglrx" was for many years the fully featured proprietary driver, intended both for professional purposes where certified OpenGL drivers are needed and for consumers. The "radeon" open source drivers tended to lag behind in hardware support, features, and performance. Initially they were largely a community effort, but then AMD started publishing some GPU documentation, and later also put more of its own resources into developing the FOSS driver.
The UX of the "fglrx" drivers tended to be pretty awful, all told; typically worse than nvidia's proprietary drivers, so they were wildly unpopular among users. The performance gap eventually started closing, to the extent that some games/apps would be faster on the "radeon" drivers, and some on "fglrx".
Eventually, AMD switched to the AMDGPU model, where the kernel driver is always FOSS (and part of the upstream kernel), and there is the FOSS userland driver as part of Mesa for consumers, and a proprietary certified OpenGL driver for workstation users.
(I believe Valve also had a hand in this somewhere along the way - they were interested in shipping Steam Machines/SteamOS and understandably were keen on having a stable and fast graphics stack and so contributed to Mesa.)
>The UX of the "fglrx" drivers tended to be pretty awful, all told; typically worse than nvidia's proprietary drivers, so they were wildly unpopular among users.
This is my memory of the time. Both options being undesirable and proprietary but nvidia's at least working better.
> AMD is a company that has largely gone from proprietary to open source and without looking at their financials
"without looking at their financials" does not make sense. Do they open anything in the Windows space? If a HW company wants to sell their stuff to the Linux desktop users they are generally forced to open the source, not because of good will. In terms of openness Intel does it a far away more. nVidia seems to be only exception.
Freesync comes to mind and I think they had something related to physics/hair rendering.
There is no requirement to open source on linux.
Nvidia does not. It just means you have to constantly update your driver for the latest kernel version and spend time keeping up to date with all of the latest developments like wayland.
> There is no requirement to open source on linux.
This is wrong. If you publish a Linux kernel module, you might be obligated to release the source. Fundamentally, this is how Linux differs from *BSD: Linux is intended to be generally hostile to binary blobs.
But it's not universal. Linus thinks nVidia's module may be exempt from the GPL's 'virality' because it wasn't originally written specifically for the Linux kernel.
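To make the mechanism concrete, here is a minimal sketch of a loadable kernel module (a hypothetical "hello" module, not taken from any real driver). The MODULE_LICENSE() declaration is the hook: declaring anything other than a GPL-compatible license taints the kernel and locks the module out of every symbol exported with EXPORT_SYMBOL_GPL, which covers a lot of core infrastructure.

    // Hypothetical hello-world module illustrating the license declaration.
    // Declaring a non-GPL-compatible license (e.g. "Proprietary") taints the
    // kernel and blocks access to symbols exported via EXPORT_SYMBOL_GPL.
    #include <linux/init.h>
    #include <linux/module.h>

    MODULE_LICENSE("GPL");
    MODULE_DESCRIPTION("Minimal module showing the MODULE_LICENSE hook");

    static int __init hello_init(void)
    {
            pr_info("hello: loaded\n");
            return 0;
    }

    static void __exit hello_exit(void)
    {
            pr_info("hello: unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);

That's only the technical side of the enforcement; whether shipping such a module as a binary blob also violates the GPL is the legal question Linus is hedging on above.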
Isn't that just planned obsolescence? Isn't the EU very much opposed to that kind of business strategy? Shouldn't there be some law in the EU prohibiting that?
I mean, I wouldn't force the manufacturers to keep supporting the hardware after a decade, but shouldn't there be a law compelling them to open source the drivers and firmware upon end-of-life? If a product reaches end-of-life, surely the manufacturer already has a demonstrably better, upgraded version of the product to attract new sales?
You'd think so, but what we are wrestling with now is that vendors can create products (microelectronics) that are so difficult to reverse engineer that their operation is effectively a secret to all consumers.
We're probably going to see more of this in the coming years around SecureBoot/TrustZone on ARM.
Is it just how they think, or is it the reality of the business?
It would be cool if manufacturers made old versions of their source code public after a couple of years, and the final version when a product goes out of production.
I think for a lot of software businesses there is a truism that "your competition doesn't want your code, they think you're morons and you think they're morons."
But in the hardware game there is a lot of counterfeiting. There is rarely any shame or consequence when it comes to outright IP theft. Making it easier to access the firmware makes it easier for someone to counterfeit your widget.
You think the public facing representatives (management etc.) are morons, because
you think their strategy for working in the same space sucks unless
you think your company is totally blowing it and you are on your way out
but you don't think their programmers are morons so much (except when you examine their site and it performs worse than yours for obvious reasons that you fixed on yours) but
if they roll out a really cool feature you can implement in your product you will write code to implement that feature and not wish you had their code because anyway your stuff is incompatible codebases unless
you get bought by them or they get bought by you or you both merge because you are in fact compatible and this way something beautiful will emerge
Firmware and drivers are different though. If the firmware was totally locked down, then the drivers would be of no use for a clone; and if the firmware was taken but the drivers were proprietary, the clone could just use the exact same driver binary.
Revealed preferences show that when given the choice among economically sustainable options, customers prefer to pay value-based pricing and to rebuy things they still use, instead of paying more upfront for forever products.
> when a product goes out of production.
By the time this happens, the company has lost their throwaway source code.
In terms of taking care of the environment and fighting planned obsolescence, this should be mandatory. If you stop supporting your product, you must enable your customers to support themselves.
It is sadly the reality in many cases, especially when their business is small compared to FAANG. When a piece of HW becomes popular, it is natural that Chinese manufacturers begin to make clones (google "Saleae clone"). In that case their proprietary HW driver is what filters out these clones. Making it open source means they give their business away to the clone makers.
Less maintenance and testing when OSes like Linux distros bundle the drivers in and keep them working on future versions of the OS. This is also huge for mobile hardware, since OEMs won't have to put in any effort to keep up with the latest upstream Android.
Of course they may not want any of this since it means they can sell a new phone every 2 years.
I used to work for a Power Company, that had a hydroelectric dam. There were sensors in the dam that needed Windows 98 to read data from. I remember we ended up having a fleet of old Asus Eee PCs specifically for that, that obviously never ever ever connected to the Internet, or really any network.
ReactOS sounds really promising for these specific areas, and I hope it gets better and better with time. It's just a shame that its development cycle is slow right now, though I understand it's hard to reverse engineer Windows.
I'm surprised a manufacturer of such engineering machines hasn't differentiated itself by offering an SLA on driver (and general software) availability for future operating system versions. It would be reasonable to require users to pay a subscription fee for extended support. I'd think the resale value of machines with drivers still available would be much higher than for those without, and that this would eventually allow the manufacturer to charge a premium price for the machine at initial sale.
I own a Fujitsu Scansnap s1500. It's out of support, and the existing drivers are incompatible with Catalina. I now must pay a 3rd party $100 for drivers or fiddle around with a Linux scanner server or similar. Never again will I pay $400 for a Fujitsu scanner, that's for sure.
It's probably because when faced with a choice between paying X for continuous support (where X is a big number) and "we will keep around a few computers for when it's a problem in 20 years", most companies are going to go with the latter.
Even if it would make financial sense long term, how likely is it that the person making the decision will be there in 20 years to deal with the mess?
On the other hand, it will look much better on their next quarter results.
It happens all the time, but it's generally not "visible" to the typical HN developer.
PC and peripheral manufacturers in the embedded space often provide availability guarantees e.g., "this version of this hardware will be available for sale at least until YYYY."
Sometimes it matters, usually it doesn't. But it does at least make ordering spare parts easier.
If the drivers were open source, they would need good programmers to maintain them. And if they had good programmers, they could have reverse-engineered proprietary drivers. But I guess that buying an old PC might be cheaper than hiring a good programmer.
It's more that buying old PCs is safer than trying to reverse engineer drivers. Buying enough extra hardware to have spare parts for the life of the product is a pretty well known and cheap way to ensure no downtime.
In my experience, even when the original driver writer (me) is still on staff, it can be less expensive to just keep buying old-spec PCs than to rewrite the code for a new OS. I make the company more money writing new code than supporting code that's a decade old.