Arm co-founder: Sale to Nvidia would be a disaster (bbc.com)
478 points by mepian on Aug 15, 2020 | 303 comments



I'm in agreement. nVidia's history of not playing nicely with others does not make this sound like a good deal, especially for entrenched ARM vendors. (Apple won't be phased; they basically design everything themselves, so they don't really even care about the direction of ARM or the architecture. They could hard fork tomorrow, or two years ago, and nobody will notice since they're entirely vertically integrated.)

But, on the plus side, it's the biggest opportunity RISC-V will ever get. Almost overnight, anyone who was building ARM cores will see investments in RISC-V as a prudent hedge against nVidia's future core designs not being licensable. SiFive will very quickly become a multi-billion dollar organization...


Apple's ARM chips contain an awful lot of different components, some of which are highly customised. The neural engine stuff they use for image processing, for example. The CPU cores are custom tuned, but I believe they are still 90% based on ARM CPU core designs. For example, the Secure Enclave is actually an ARM-developed CPU extension. Apple focuses their efforts very specifically on areas that give them features that grant a competitive advantage. They don't reimplement stuff for the sake of it. One of their big performance wins over other ARM licensees isn't even really in the CPU; it's the very well designed and generous cache memory systems.

Having said that, looking to the future I think Apple is clearly willing to pour more resources into their CPU programme than any of their competitors. This means the risk to them is probably a lot less than to say Qualcomm or Samsung as they have a much better capability to go their own way. As time goes on their designs will probably diverge more and more from main line ARM anyway.


Apple's designs are already far better than the equivalent ARM core. I doubt they're using anything besides the ISA.


Apple licenses the ISA only. They implement their own microarchitecture.


Apple has an architecture license from ARM, which allows them to design their own cores, yes, but it also includes all the benefits of all the lower tiers, including rights to all the ARM designs.

This would have to be the case, because Apple still uses many of their earlier ARM designs in products, which we know used ARM designed cores and would definitely require core design licenses.


All this seems fitting since they are an ARM founder.


I think nVidia could run into some serious anti-trust issues if they tried to interfere with the ARM ecosystem. There's far too much at stake here. Is there a phone on the planet that doesn't run ARM?


The Asus Zenfone 2, released 5 years ago, ran on x86 (Intel Atom) [1]. I know someone who was still using it this year. They eventually switched, but the phone still works. So, to the question:

> Is there a phone on the planet that doesn't run ARM?

Well, technically, yes ;-) This doesn't change your point, though.

[1] https://en.wikipedia.org/wiki/Asus_ZenFone#Second_generation...


I bought the Zenfone 2, which had an Intel processor, and had a similar experience to the sister comment. The only super annoying thing was when Pokemon Go came out: all my friends had it, but it didn't support Intel. By the time they made it available for the Intel phones (which took a few weeks) the game wasn't cool anymore. :/


I used the ZenFone 2 for about a year ~3.5 years ago; it worked surprisingly well. It was impressive that it really "just worked" like a normal ARM device.

There were some annoying software aspects (the Play Store only allowed me to download x86 apps even though it could run ARM apps), and the device was 100% a battery hog - but cool tech.

Surprised Intel gave up on that effort.


Intel gave up because Atom for smartphones was sold at a negative price after marketing incentives. They couldn't even get volume from manufacturers when it cost less than zero, because no one wanted to invest the upfront R&D costs.


Intel should have made their own phones.


There were a few MIPS Android phones too.


MIPS was pretty popular among early Shenzhen tablets.

Then they just disappeared. Either MIPS screwed up or ARM did something really smart. My money is on the former...


> MIPS was pretty popular among early Shenzhen tablets.

> Then they just disappeared. Either MIPS screwed up or ARM did something really smart. My money is on the former...

MIPS SoCs simply fell too far behind. They kept a presence in STB and router boxes for some time, but even there they lost due to the overall "deadness" of the entire ecosystem:

Want latest compiler work, and mainline GCC support? Go for ARM

Want not to have to support your own Linux fork? Go for ARM

Want more IP from a single vendor than just a core, interrupt controller, and DMA? Go for ARM

Want to work with an IP vendor whose sales are not part time lawyers? Go for ARM

When MIPS understood that they were losing, they simply gave up on competing and improving.

And in the end, the lunch went to MediaTek because it was the only SoC vendor that was ready to ship competitive SoCs with a 4G baseband without rapacious licensing deals like Qualcomm does.


Actually at that time MIPS suddenly came back to life and had a big push with improved designs and a few deals, resulting in pretty decent MIPS SoCs. Performance was okay, battery life was good, and they were first to market with the latest Android release (I think it was 4.0 or 4.4).

But then everything suddenly stopped and they left the market altogether.

Edit: Regarding this:

> Want to work with an IP vendor whose sales are not part time lawyers? Go for ARM

One of the founders of OpenCores (RIP) told me that ARM lawyers contact him _within hours_ of someone pushing a design they don't like. While MIPS has a bad rep, ARM is probably more aggressive.


> Want to work with an IP vendor whose sales are not part time lawyers? Go for ARM

Er... not convinced about this one.

ARM has been quite lawyerish sometimes.

For a while, parts of QEMU could not be published, at ARM's request, because they implemented instructions ARM did not permit a software emulator to emulate.

At least one non-commercial design on opencores.org was removed at ARM's request.

By comparison, for a while the MIPS ISA was completely open for anyone to implement, software or hardware. That's been retracted now, but it was nice while it lasted. ARM has never done that as far as I know.

Even before that, I used devices based around MIPS without a licence a long time ago; as long as they avoided implementing one or two "special sauce" instructions, that was fine. The instructions weren't necessary, just a neat, patented optimisation, and I have a GCC patch somewhere for building the target without those instructions.

So I don't know about today, but until recently I would have consistently regarded MIPS as much less lawyerish than ARM.


> Want to work with an IP vendor whose sales are not part time lawyers? Go for ARM

Interested in this comment: were MIPS/Imagination known for being particularly litigious around licensing?


There are tons of stories about MIPS going after people for trivial designs.

http://probell.com/Lexra/index.html


> Want latest compiler work, and mainline GCC support? Go for ARM

> Want not to have to support your own Linux fork? Go for ARM

But that's also true for MIPS.


I have not seen any activity from MIPS on GCC for years.

Making a Linux port for a given MIPS based SoC is a known pain. They should've worked towards not requiring any hardcoded configuration for each SoC at all, like Linux on ARM does these days.


Do you mean the device tree?

I thought all major kernel archs were on device trees now...


Yes now, but last time I talked with people who dealt with them, people spoke of their new cores coming with "prehistoric kernels" on usb sticks.

So every vendor had to upstream their own SoC support on their own.


Ubiquiti will still sell you a shiny, expensive MIPS box, as will GL.inet.


Interestingly, they've gone ARM in their latest generation of hardware, e.g. the UDM-Pro uses a Quad ARM Cortex-A57 Core at 1.7 GHz: https://store.ui.com/collections/unifi-network-routing-switc...

So their MIPS stuff might be consigned to the past at this point.


>I think nVidia could run into some serious anti-trust issues if they tried to interfere with the ARM ecosystem.

I seriously doubt it. The Broadcom-Qualcomm sale was blocked because there were concerns that the former would inhibit the latter's development of telecom technologies, resulting in a strategic disadvantage to the U.S. But in the case of the supposed Nvidia-ARM sale I see quite the opposite, i.e. a huge strategic advantage to the U.S., and so I feel the anti-trust action might not be state-initiated, at least.

But whether other U.S. Big Tech companies who have invested heavily in the ARM ecosystem would oppose this remains to be seen. Even then, I wonder whether their fear could be substantiated. What if Nvidia just signs a good agreement with them to alleviate their fears? After all, Apple and Qualcomm were at each other's throats; see where they are now.

In the end, I think the deal will just go through, and it's those of us worried about the future of our Raspberry Pis, among other things, who will feel lost (and actually lose).


Wouldn't the EU antitrust look into that too or is this going to be out of their jurisdiction after Brexit?


Even if the EU did not have jurisdiction, British authorities are not very keen on large mergers either.

British resistance caused American Pfizer to walk away from a potential merger with British AstraZeneca.


I'm not sure how much sway British authorities would have, since ARM is now wholly owned by SoftBank?


In a merger of this scale all the big powers (US, EU, China, and I assume also the UK now) have basically a veto right.

There are countless examples where country 1 has stopped a merger between a company from country 2 and one from country 3.


> In a merger of this scale all the big powers (US, EU, China, and I assume also the UK now) have basically a veto right.

> There are countless examples where country 1 has stopped a merger between a company from country 2 and one from country 3.

From what I've seen, the only method of enforcement for a third-party country is to ban the sale of goods in their respective markets following the merger (like GE/Honeywell in the early 2000s). Would that even be possible in this case?


Not sure how this is enforced. Generally the merger falls apart if there are any indications of this happening.

Don't forget that these are public companies and a stopped merger can affect their stock. Besides, major investors would eat the board alive if they put the company in such a situation.


Antitrust regulators' power is determined by the size of the market, not the location of the company HQ.

Antitrust regulators for big markets like the US and EU can effectively block mergers from any internationally operating company. Both AMD and Nvidia need access to the EU - their customers certainly do.


> Antitrust regulators' power is determined by the size of the market, not the location of the company HQ.

> Antitrust regulators for big markets like the US and EU can effectively block mergers from any internationally operating company. Both AMD and Nvidia need access to the EU - their customers certainly do.

They could block the merger and if the companies decided to go ahead with it anyway, the EU would ban imports. But, wouldn't that further consolidate the market?


> and if the companies decided to go ahead with it anyway

This has never happened, because the profit loss for both parties from losing access to a market with 400M+ people has always made investors blanch.


What I mean is, in what practical way could the EU ban ARM/Nvidia from their market, considering these manufacturers' chips are in many devices? Would that mean devices with those components would also be banned?

What I'm trying to get at is, with a sufficiently large merger, the emerging company may produce products so ubiquitous that it would be impractical to bar them from your economy, due to the widespread use of their products and their integration into other products.


I believe as long as ARM sells their hardware in the EU, they are able to look into antitrust issues.


Not that I disagree with you, but ARM doesn’t sell hardware—they sell licenses to designs of the aforesaid. Nonetheless, don’t forget that nVidia, insofar as it sells hardware in the EU, would definitely be within the scope of local anti-trust authorities.

I don't see this being much liked by those authorities, either. nVidia is clearly interested in doing this deal because it will give them competitive advantages, and it's pretty clear from their behaviour towards other firms (including Apple!) that they have so much of that already that they don't have to "play nice". Giving them further purview to do as they please is basically what anti-trust is designed to prevent, whether you focus on industry relations or the effect on consumers.


I think there are some MIPS and SH-3/SH-4 smartphones from the early 2000s still around.

I had an SH-3 PDA in 2001, but Microsoft dropped support for SH-3 and MIPS in Windows CE by the time “Windows Smartphone 2002” came out (my 2002 HTC Canary runs ARM, IIRC) so it’d have to be a non-CE platform. What did the Palm Handspring use?


Internally we supported SH4 through CE7!

Source: I maintained the compiler test infra for it.


Not only internally. Infotainment systems with the CE7 and SH4 combination were built well beyond 2012. Only around that time did Arm get traction.

At some point it was renamed to Embedded Compact.

Source: worked in automotive together with a couple of suppliers which built those systems.


SH4 is out of patent now - do you know if the J-Core can run CE7?


Unlikely, since the SoC is different.


I salute you sir!

What’s the current roadmap for CE? I saw there’s a new CE environment in Win10 IoT, but it feels like too-little-too-late to save the myriad of embedded enterprise CE systems.



> What’s the current roadmap for CE?

No clue sorry! I left many many years ago.

CE7 was incredibly good, the minimum image size we had it running on was ~20MB on disk.


I saw the WPF-esque XAML demos for CE7 - I was really impressed and also hopeful that it would lead to smooth fluid UIs in embedded devices like car infotainment systems and home thermostats.

Oh how naive I was!

(I assume the push for WPF-like XAML in CE7 was because at the time (2011-2012?) there was still hope of a CE7-based Windows Mobile competing with iOS and Android, and Microsoft realized the mistake of not modernizing CE's UI system at all between 2002 and 2012? Windows Mobile 6.5 was a hideous eyesore, and all the pretty home-screen functionality was literally just skin-deep.)


Palm and Handspring used Motorola 68k before PalmOS 5.0, and ARM since then.


Are you sure?

Intel makes both CPUs and GPUs.

AMD makes both CPUs and GPUs.

nVidia would make both CPUs and GPUs.

What would be the difference?

Actually, as long as there are a number of ARM design licensees, the situation would be way better than it currently is for x86: beyond Intel and AMD there is basically no other player in that field (VIA? Transmeta? Are they even noticeable?).


The difference is that Nvidia would be a customer and the owner of ARM at the same time. An Nvidia controlled ARM will primarily exist to make Nvidia happy at the expense of all other ARM customers.


nVidia already makes CPUs and GPUs (they make the CPU that goes into the Nintendo Switch, for example). Buying ARM doesn’t change much in that regard, except that they can save on licensing fees and it puts them in a position to disrupt competitors who also produce ARM processors.


I would think not licensing new IP would be the main concern: basically hogging ARM IP completely for themselves and their products. It would take a couple of years for companies and devs to switch to a different architecture, but you can be sure that there are competitors willing to fill that gap.


Why would nVidia disrupt competitors?

They need to pay $30bn to buy ARM, which means they want to maximize the return on that purchase. That means selling more, not disrupting things so that competitors look elsewhere.


Why give them the opportunity to do so?


What's at stake? Why is it too much?


It's not that they will go out of their way to harm it. The ecosystem will change from being the top focus of ARM to becoming more of an afterthought. An Nvidia-owned ARM could end up taking more of a take-it-or-leave-it stance because those customers are no longer the top priority.


> Apple won't be phased; they basically design everything themselves

Their new notebooks are still moving to an ARM-based CPU... Even if they enhanced the ARM design they still have an ARM licence.


Apple licenses the architecture (instruction set), but the chip design is theirs, I think.


I would assume they had protected themselves before the SoftBank purchase. Also, it's odd that they refer to the new stuff as Apple Silicon.


>Also, it's odd that they refer to the new stuff as Apple Silicon.

Apple supposedly pays Intel more for their CPUs than other OEMs to avoid having their sticker on Macs; so I don't see this as odd. It's a usual Apple branding tactic, and from the looks of it (ARM being under threat of sale) it seems to be a wise decision, as an average Apple customer wouldn't mind if Apple switches the architecture for its silicon.


>Apple supposedly pays Intel more for their CPUs than other OEMs to avoid having their sticker on Macs;

It would be more accurate to say Apple does not receive the Marketing Discount from Intel.

In reality I seriously doubt those discounts don't materialise. Intel could have given the TB3 controller for free as part of the deal. Likely less than the sticker discount, but still a decent cost saving.


I don't think Intel stickers are mandatory. Some other OEMs don't have them either.


Intel gives you a discount for putting stickers on there though.


Apple did make a big thing of it when they switched to Intel.


> "Apple ... could hard fork tomorrow, or two years ago, and nobody will notice"

Could they really do this? Start making incompatible designs? I would have thought that Apple's ARM architecture license would include terms to preclude incompatible forks.


>I would have thought that Apple's ARM architecture license would include terms to preclude incompatible forks.

For an ordinary licensee you might think so, but Apple may well have special terms, given that they co-founded ARM


> given that they co-founded ARM

?????


> The company was founded in November 1990 as Advanced RISC Machines Ltd and structured as a joint venture between Acorn Computers, Apple Computer (now Apple Inc.) and VLSI Technology.[17][18][19] The new company intended to further the development of the Acorn RISC Machine processor, which was originally used in the Acorn Archimedes and had been selected by Apple for its Newton project. […] The company was first listed on the London Stock Exchange and NASDAQ in 1998[23] and by February 1999, Apple's shareholding had fallen to 14.8%.[24]

* https://en.wikipedia.org/wiki/Arm_Holdings#Founding


Holy moly, how did I never know this? I've worked on ARM based systems for years and never run across this tidbit. Thanks for sharing!


From the LA Times in 1990

https://www.latimes.com/archives/la-xpm-1990-11-28-fi-4993-s...

Apple went on to use ARM in the Newton and other devices....


> Apple won't be phased

Nit: fazed, not phased.


I mean, they presumably also won't be phased either


Unless we're in the Star Trek universe...


That doesn’t jive with how I’ve been spelling it....


It's jibe :)


I know. That was the idea. It’s one of those mistakes that is like nails on a chalkboard for me.


I was expecting you to make another one, but "nails on a chalkboard" seems to be the correct usage. Am just ever so slightly disappointed.


Meh, 'I could care less'.

...I can barely bring myself to type and submit that...


I’m (half-)British and I consider “could care less” a hideous American corruption. Recently I successfully argued my case with an American friend, not coincidentally a physicist, by analogy to zero-point energy.


Yes so am I (person you're replying to) - fully - and also consider it both hideous and American.

I wouldn't call it a corruption though, because I've never actually heard it in Britain; just American film and television, and even on HN.


Very related David Mitchell rant:

"Dear America..." https://www.youtube.com/watch?v=om7O0MFkmpw


When I hear someone say it, I often respond that "I could care less, but only a little bit" to make the point about that phrase.


Both "I could care less" and "I couldn't care less" make sense. If you emphasize the second word ("I could care less") and phrase it in a tentative tone of voice it conveys the meaning - i.e. it's possible that I could somehow care less, but it's a theoretical possibility only.


FWIW Professional Linguists think it's perfectly fine to say "I could care less" and that people who peeve about it are just pointless peevers who don't understand how language works.


That's not the point: "I could care less" is grammatically correct, but the meaning is not what was intended. If I had a cup and wanted to tell you there's no more beer in the cup, I wouldn't say "there's beer in the cup", even though the sentence "there's beer in the cup" would be accepted by professional linguists.


> That's not the point: "I could care less" is grammatically correct, but the meaning is not what was intended.

The intended meaning is obvious. There's plenty of colloquial phrases which say one thing but mean another: "I could murder him", for example, doesn't literally mean that. That is how language works.

And, if you saying "there's beer in the cup" was understood by everyone as meaning "there's no more beer in the cup", professional linguists and dictionaries would be A-OK with that interpretation.


By analogy to vectors though, 'I could murder him' is directionally correct but magnitudinally wrong. Whereas 'I could care less' (as used) is directionally wrong - no you couldn't, your point is how little/not at all you care, so stop talking about the existence of an amount that you do care!


> And, if you saying "there's beer in the cup" was understood by everyone as meaning "there's no more beer in the cup", professional linguists and dictionaries would be A-OK with that interpretation.

The fact that "I could care less" is now clearly understood is not intentional. It's an adaptation to a mistake. It's similar to how people say "irregardless", which is clearly an error. It's unambiguous, and everyone knows what you mean. It's still a mistake, and using "regardless" is strictly better.


> is not intentional. It's an adaptation to a mistake.

Yes, that's what language is. Will you be complaining about "apron" as an adaptation to a mistake?

https://www.quickanddirtytips.com/education/grammar/how-a-na...

> It's similar to how people say "irregardless", which is clearly an error.

It's an error that's been being made since 1795, though, and people have been understanding it which pretty much qualifies it as being incorporated into the language.

(I should note that I am strongly descriptivist and think that prescriptivists are silly people.)

> using "regardless" is strictly better.

I think "regardless" is better, yes, because it's a lot easier to say but that's just my opinion, not a Gilt Edged Rule Of English.


that's kind of a stretch; I would imagine "I couldn't care less" is vastly preferred, and the people that use "I could care less" are most likely doing so out of ignorance and not an intent to convey the subtle "theoretical possibility"


Actually, thanks for doing this. As a non-native, frequent misspellings are incredibly annoying as it’s difficult to see which one’s correct.

Now I learned not one but two important correct spellings. Thanks!


You got me


"nVidia's history of not playing nicely with others": can you elaborate or mention an example? That aspect has escaped me thus far.


There was the Apple GPU fiasco. Apple laptop GPUs were failing at an extraordinary rate in 2010-2011, and Apple were getting harangued in the media about it (which was justified). Apple eventually announced a program to cover replacement of the GPUs, but it was very delayed. I'm 100% convinced the delay was caused by Nvidia refusing to accept responsibility. When the program was released, in order to have the repair completed at no cost, your computer had to fail a very specific graphics diagnostic, and the failure would be linked to your serial number in Apple's repair CRM. At the time, no other Apple repair program had this requirement — Don't get me wrong, it makes sense to use diagnostics to verify hardware issues, but they aren't the only tool you can use. I'm convinced Nvidia would only reimburse Apple for computers that had a recorded failure of this specific test.

Since 2014, Apple has almost exclusively used AMD graphics cards in computers that do have discrete GPUs, and I don't think it's unreasonable to suggest this was motivated by their terrible experience with Nvidia.

(I'm aware I'm not really backing this up with evidence, but I think the publicly available facts about the whole situation support this narrative)


That situation was also following Nvidia spoiling one of Steve Jobs' announcements even further back, so there was already a bit of bad blood.


Good point, I had forgotten about that.


Which one was that?


Apple also refuses to sign the kernel extension necessary for NVidia GPU drivers to run. NVidia had made a GPU driver for High Sierra, but literally a version later Apple locked down the kernel so that only signed drivers could be loaded. Anyone who was using Nvidia GPUs through, say, an external GPU box got screwed.


I had really hoped that the external GPU thing would take off, and give us a good option for gaming on Macs. I bought the dev kit, and it was nothing but a headache, which stopped working at High Sierra too, because they stopped supporting TB2. I'm bitter about it. It really seems like we ought to be "there" by now, but it's starting to look like it will just never be a thing.


It was reported that the problem was NVidia switching to lead free solder for their chip to motherboard connections but not also switching the potting material to one that had the same thermal expansion coefficient as the new solder.


That was a general problem when the industry switched to lead free solder, see also the Red Ring of Death on the Xbox 360.


> Don't get me wrong, it makes sense to use diagnostics to verify hardware issues, but they aren't the only tool you can use. I'm convinced Nvidia would only reimburse Apple for computers that had a recorded failure of this specific test.

In 2010, Apple wasn’t exactly a beleaguered company. It wasn’t like NVidia not reimbursing Apple would break the bank. Apple should have repaired it just to protect its reputation.

Jobs himself was famous for saying to high level employees that at a certain level of authority, you don’t get to make excuses.


I do think Apple should have just undertaken the repairs much, much earlier. I also think that it was an incredibly complex situation, and my guess is Apple was flabbergasted that Nvidia initially wouldn't reimburse them for the failed parts. So, internally at Apple, my guess is that it was about the 'principle' of the matter. They also probably recognized that if they started replacing the boards of their own volition without an agreement in place, they would never be able to get any compensation from Nvidia at all.

Again, I agree with you, apple should have launched the program much, much earlier.

But, I mean, that's beside the point, because this isn't a post about how Apple handled these failing GPUs; it's about Nvidia's inability to work with others.


> I'm 100% convinced the delay was caused by Nvidia refusing to accept responsibility.

To be fair, the issue was mainly caused by Apple's form-over-function thermal design.

To the best of my knowledge, Nvidia GPUs work just fine in all other laptops.


100% not true, my Dell Latitude at the time also died. There were major class action lawsuits at the time and basically all the manufacturers had to accept liability and fix them.


No. I had a laptop in 2010 that also suffered from this issue, and there were also PC GPUs affected.

Laptops were only affected en masse because they have far more heat-cool thermal cycles than the average desktop computer.


Discrete GPUs break all the time; customers are just used to it. Maybe it's improving since the GTX1xxx/RX4xx era, but they were extremely unreliable when Apple was using them and the industry was switching to RoHS-compliant Pb-free solders.


This issue was industry wide. RIP my XPS M1330.


> To the best of my knowledge, Nvidia GPUs work just fine in all other laptops.

Nvidia bumpgate says hello.


While it was probably a factor, I believe it's also because NVIDIA refuses to let Apple modify their drivers to push their Metal API.


I recall the GeForce Partner Program raised some controversy, although I couldn't say whether it was bad or good, or why.

There's also the issue with them refusing to open source Linux drivers. Supposedly this is because they throttle workstation GPUs to create a market for higher-end versions while reducing the manufacturing diversity, but I haven't heard a definitive source for this.


> they throttle workstation GPUs

Not sure about this, but they definitely prevent the loading of drivers for GeForce cards if they detect a hypervisor (Quadro cards work). Nvidia claims that it's a bug, and that they're not fixing it only because GeForce cards are not sold to be run in virtual machines. At the very least it's a dubious claim since for a short time there was an arms race as workarounds were figured out and the next version of the driver would detect them...

Of course it's market segmentation by obscurity, so it only lasted a few weeks and it's trivial to work around, but it's still quite shady since there's absolutely no ill effect from the workaround.


It is absolutely not a bug. Disassembly of NVIDIA drivers shows that it's looking for specific KVM signatures, and if you modify those signatures, magically it starts working again...
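
For anyone wondering what a "KVM signature" actually is here: a guest can read the hypervisor's vendor string straight out of CPUID. A minimal sketch of that mechanism (x86, GCC/Clang; my own illustration, not anything lifted from the actual driver, which does considerably more than this):

    #include <stdio.h>
    #include <string.h>
    #include <cpuid.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;
        char sig[13] = {0};

        /* CPUID leaf 1: ECX bit 31 is the "hypervisor present" flag. */
        __cpuid(1, eax, ebx, ecx, edx);
        int hv_present = (ecx >> 31) & 1;

        /* CPUID leaf 0x40000000: EBX/ECX/EDX hold the hypervisor vendor string. */
        __cpuid(0x40000000, eax, ebx, ecx, edx);
        memcpy(sig + 0, &ebx, 4);
        memcpy(sig + 4, &ecx, 4);
        memcpy(sig + 8, &edx, 4);

        printf("hypervisor bit: %d, signature: \"%s\"\n", hv_present, sig);
        return 0;
    }

Under stock KVM the signature reads "KVMKVMKVM"; hiding or changing that string in the hypervisor configuration is exactly the workaround people settled on.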


Removing the KVM signature is now the standard solution. But there's still a working driver patch to remove the virtualization check; it's available here:

* sk1080/nvidia-kvm-patcher: Fixes "Bug" in Nvidia Driver preventing "Unsupported Configurations" from being used on KVM

https://github.com/sk1080/nvidia-kvm-patcher


Of course it's not a bug. :-)


Not doubting that this is the case, but what's the reason for nVidia wanting to restrict use in virtual machines?


"Fuck you, pay me". Nvidia explicitly forbids "datacenter usage" of their drivers outside of specific licensed-for-datacenter product stacks. (The only exception is for crypto mining, for some reason.) This isn't the only product segmentation in their lineup, either. CAD software optimizations are only available for Quadro cards and up, virtual GPUs are only available for GRID cards, etc. The same GPU they'll charge $600 to gamers for will often go for $3000 or more for businesses.


> The only exception is for crypto mining, for some reason.

I think this is because they were outcompeted by AMD in the GPU crypto days, so they didn't want to get in the way of that (nor were they likely to succeed in blocking this use).


Why does Intel cripple ECC and I/O virtualization [0] in high-end consumer desktop systems? Force people to buy Xeon, even if the performance is otherwise equivalent or higher for the use case. Why does Nvidia restrict virtualization? Force people to buy Quadro and Tesla.

> nVidia

FYI, a few years ago nVidia officially changed their name to "Nvidia"...

[0] IOMMU is not only a virtualization feature, it's also an important security feature to protect the host from DMA attacks of malicious peripherals (e.g. 1394, ExpressCard, Thunderbolt, USB 4). Fortunately Intel no longer cripples IOMMU (VT-D) since Skylake, but ECC is another story.



I stand corrected. I also found a clarification from Wikipedia on the situation.

> "From the mid 90s to early-mid 2000s, stylized as nVIDIA with a large italicized lowercase "n" on products. Now officially written as NVIDIA, and stylized in the logo as nVIDIA with the lowercase "n" the same height as the uppercase "VIDIA".


True, but if it isn't an acronym (correct me if I'm wrong) I am not going to use all caps, because it feels like attention-grabbing marketing BS to me.


They don't want people using geforce cards in compute servers/clusters. It's artificial market segmentation since the workstation cards are higher margin. They want customers to pay the big bucks since hypervisor setups are basically only used in business and big budget settings.


> I recall the GeForce Partner Program raised some controversy, although I couldn't say whether it was bad or good, or why.

It was about NVidia wanting to make it so their 3rd-party GPU partners (EVGA, Asus, etc.) could only sell their popular GPU brands with NVIDIA in them, so for AMD they'd have to come up with something customers are entirely unfamiliar with.

> There's also the issue with them refusing to open source Linux drivers.

It's not just that. AMD didn't open-source their original Linux driver, presumably there could have been some 3rd party licensing issues, but they wrote a new one that is open-source.

NVIDIA doesn't even let others write an open-source driver for them, they make it purposely difficult to reverse-engineer, sign their firmware that they only release with a massive delay and generally refuse to cooperate.


> It's not just that. AMD didn't open-source their original Linux driver, presumably there could have been some 3rd party licensing issues, but they wrote a new one that is open-source.

Even further than this, they created such a healthy environment for it that there are now two competing open source Vulkan drivers, and the third party one (RADV) is usually winning by a bit, and is now directly supported (with staff) by Valve.


NVIDIA can compete based on the superior performance of their products, but that's not enough. They attempt lock-in by encouraging software writers to use CUDA and not use OpenCL.

In many projects that support both, OpenCL and CUDA often give similar performance, and in some, OpenCL is faster than CUDA. So CUDA isn't about getting more performance - it's about excluding everything non-NVIDIA.

I suspect that NVIDIA's gameplaying is what led to Apple deciding to not use NVIDIA in any Macs for the last several years.


They don't need to do any lock-in when the choice is between plain old C with printf-style debugging, and a set of programming languages able to target PTX, with out-of-the-box support for C, C++, Fortran and several managed languages, and nice graphical debugging tools.

C++ support only came to OpenCL after it was clear to Khronos that it was time to move beyond a C-only vision of the world, yet almost no one bothered to implement OpenCL 2.x, which is why OpenCL 3.0 is basically OpenCL 1.2 to make all those OEMs compliant, and SYCL is now moving to a backend-agnostic C++ stack.

So the competition did not lift a finger to make the OpenCL ecosystem beat CUDA tooling, Google did not care one second about making OpenCL available on Android and instead pushed their own RenderScript C99 dialect, but it is easy to blame NVidia when the competition isn't up to their stuff.

Just like Apple, after creating OpenCL and offering it to Khronos, moved away from their own creation after disputes on how the OpenCL roadmap should be managed. Ironically, just like with CUDA, Metal Compute happens to be multiple-language aware and have a modern API.


The problem with OpenCL, just like OpenGL, is that it relies on vendor support for essentially the full stack. That means if a single vendor doesn't support the feature, it's basically impossible to ship any code that uses that feature to arbitrary users. Nvidia lagging behind in its support for OpenCL 2.x was probably not explicitly stated as "let's sabotage this competing open standard by dragging our feet" but that was the result. AMD and Intel both put together OpenCL 2.x support, so clearly Nvidia could have finished their incomplete support but didn't.

The fundamental failing Khronos made was repeating the GLSL mistake and not specifying a compiled IR that would decouple OpenCL kernel code from program execution. This was clearly known to be a mistake at the time, given ample evidence from CUDA and HLSL that an IR was possible and could greatly improve the developer experience.
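
To make the IR point concrete: on the CUDA side the host program is plain C that loads PTX text at runtime and lets the driver JIT it for whatever GPU happens to be installed. A rough sketch only - "kernel.ptx" and the parameterless kernel name "noop" are made up for the example, and error checking is omitted:

    #include <cuda.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        /* Read PTX that was compiled offline (e.g. with `nvcc --ptx`). */
        FILE *f = fopen("kernel.ptx", "rb");
        if (!f) { perror("kernel.ptx"); return 1; }
        fseek(f, 0, SEEK_END);
        long n = ftell(f);
        rewind(f);
        char *ptx = calloc(1, (size_t)n + 1);
        if (fread(ptx, 1, (size_t)n, f) != (size_t)n) { fclose(f); return 1; }
        fclose(f);

        CUdevice dev; CUcontext ctx; CUmodule mod; CUfunction fn;
        cuInit(0);
        cuDeviceGet(&dev, 0);
        cuCtxCreate(&ctx, 0, dev);

        /* The driver JIT-compiles the PTX for the GPU that is actually present. */
        cuModuleLoadData(&mod, ptx);
        cuModuleGetFunction(&fn, mod, "noop");  /* hypothetical no-argument kernel */
        cuLaunchKernel(fn, 1, 1, 1, 32, 1, 1, 0, NULL, NULL, NULL);
        cuCtxSynchronize();

        cuModuleUnload(mod);
        cuCtxDestroy(ctx);
        free(ptx);
        return 0;
    }

OpenCL did eventually get an IR of its own (SPIR, later SPIR-V), but much later and with uneven vendor support, which is the parent's point.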


CUDA is probably the biggest thing here, also I don't think they open source much.


>... also I don't think they open source much.

You could say that: https://drewdevault.com/2017/10/26/Fuck-you-nvidia.html


This is increasingly becoming not true. See CUTLASS and the new release of cuFFTDx.


These are both things that are unlikely to ever be able to run on anything other than nvidia hardware. It's hardly a mark in their favour over abusive lock-in practices.


Well, of course. They're written in CUDA. Why would Nvidia try to write it in a language like OpenCL, which even AMD is moving away from, instead of making FFT/BLAS library calls? At least the source code is there if you wanted to port it.


I think it's very clear that I wasn't commenting on what framework they choose to write their projects in, and you're rather deliberately missing my point.


> These are both things that are unlikely to ever be able to run on anything other than nvidia hardware.

I think it's very clear that the framework was important here. Because you were pointing out that they don't open source much, I pointed out that they do, then you nitpicked by saying it's cuda.


I had no idea, so I just searched and found there are 201 repositories in their github:

https://github.com/NVIDIA


that's a pretty meaningless number without further context


The person I replied to said nvidia doesn't contribute much to open source. I pointed at their github repositories. If you care about the context, it is there, and there is no risk of me summarizing it unfairly.


Having some of your code in github isn't contributing to open source.


nVidia has a very long history of not getting along with Linux.


And now, hilariously enough, they have ended up with a lot of their growth coming from Linux-based systems.

To be fair, I recently bought a new Thinkpad, and with Ubuntu 20.04 and Nvidia CUDA just worked, with absolutely no tuning (I didn't even need to set LD_LIBRARY_PATH, which I found pretty odd). So things seem to be improving, because there's money in it for Nvidia.


The Nouveau drivers have a throttled clock speed due to firmware signatures. And the proprietary drivers are a complete mess on Wayland, where most compositors and other GPU drivers use GBM, but Nvidia did their own thing and used EGLStreams. Maybe someday in the future the situation will improve, but historically, it hasn't been good.


This sentiment comes entirely from their lack of open source drivers for Linux. It became pretty much a meme after a rant that Linus had about them in 2012.


They have made it even more difficult for open-source developers to reverse engineer their drivers with signed firmware blobs, pushed their own solution on Wayland (EGLStreams) over everyone else adopting GBM, etc. - in short, they refuse to cooperate in an official capacity.

Both Intel and AMD do have an official FLOSS Linux driver and regularly work with the wider community.


Google it maybe? Here's one example thread to pull at: why did Apple stop buying nVidia GPUs across their entire hardware line? EGLStreams is a recent pain in the ass. nVidia's infanticide of OpenCL with CUDA being another painful reminder that nVidia doesn't care about your software needs, just their own.

It's really not hard to find a company that nVidia's pissed off with its dealings.


If there was an infanticide of OpenCL, Apple (the original creator of OpenCL) was at least as responsible as Nvidia.


Interestingly NVIDIA uses a RISC-V core in their GPUs.


Why don’t Apple buy ARM?


They don't need to. They were a co-founder, and have a perpetual arch license.


ah, this answers a lot of questions for me :)


This would be ten times worse (for everyone except Apple).


Wouldn't there be IP that Apple needs if they go their own way? There was something about how Qualcomm bundles IP licensing with hardware purchases.


Apple just spent a billion dollars buying Intel's wireless modem business. If there's still a hardware reliance on Qualcomm, it's certainly coming to a quick end.


I was commenting on the parent comment, that Apple doesn't need ARM, with an example of another company that pushes their products with their IP licenses.


The Observer proposed an interesting solution last weekend:

"The government could offer a foundational investment of, say, £3bn-£5bn and invite other investors — some industrial, some sovereign wealth funds, some commercial asset managers — to join it in a coalition to buy Arm and run it as an independent quoted company, serving the worldwide tech industry..."

https://news.slashdot.org/story/20/08/10/0358216/should-the-...


I feel like with the current political climate in the UK this solution is fairly unlikely to happen.


Actually the UK government has started taking stakes in technology firms where it deems it has a strategic interest. [1]

I can easily see the man now running the UK (Dominic Cummings) deciding to take a stake in ARM. Maybe other ARM customers would also be willing to take (non controlling) stakes to keep it away from Nvidia (Apple of course used to be a big shareholder).

[1] https://www.bbc.co.uk/news/science-environment-53279783


So basically an ARMs race against Nvidia?


No, no, no. Government should not be in the business of picking winners and losers, and this is more about denying someone else than benefiting the whole. Oh, it's wrapped up in pretty paper with a ribbon on it, but the true reason is they don't want "those people" to have it (being Nvidia, but I am certain there are a few more suitors they don't like either).


I'm not generally in favour of government picking winners, but this is an exceptional case where keeping ARM independent really would 'benefit the whole'. The argument against Nvidia buying is that they are likely to boost their own business at the expense of ARM and the wider ecosystem.

It seems perfectly valid for the UK government to step in to protect its most successful technology firm.


Valid for what reason? There is no evidence ARM would be any more successful under UK control than nVidia's; in fact, the opposite may be true. The history of government-owned businesses, especially in the UK, is desultory.


According to the article, at least Hermann Hauser believes that ARM will likely do poorly under nVidia's control. I think the idea is that Nvidia is a particularly bad owner considering ARM's business model (for a similar reason that having Nokia as the main Symbian participant was a bad idea), rather than that the UK would exert beneficial influence. Indeed, the suggestion is that the UK government invests as part of a group rather than takes a controlling interest.


How would the UK government's influence be beneficial in any way? The government's influence will be the agenda of individual politicians, who know nothing of chip design or care about it.

In the US that's the type of thinking that's given us the SLS, a $20B launch system that will cost more than twenty times as much as commercial alternatives, and is required by law to use obsolete 50-year-old components from key congressional districts.


I think you're agreeing with me. The suggestion is not that the UK government will exert any influence over them at all, just as the Norwegian sovereign wealth fund doesn't exert management control over the 1.3% of global equities it owns.

The point made in the article is that that absence of influence will be better than the negative influence that nvidia would apply.


There is no such thing as absence of influence. Government funding always comes with political influence, and almost always negative.

The Norwegian sovereign fund is an investment fund that holds a tiny fraction of each public company's stock. For the UK to engineer a purchase of ARM would require raising $32B; offering to kick in 1.3% would go nowhere. Anyone with 99% of $32B doesn't need the UK.

An alternate deal is only possible if the UK offered to kick in a large part, probably at least $10B, and taking a large minority stake. At that level they’d have to have significant influence, or the politicians who engineer it will get run right out of office. What if the new owner moves R&D from the UK?


Because Nvidia will find a way to use ARM to strongly favour its own SoCs at the expense of everyone else’s - that’s why it would pay $32bn. That’s not good for ARM or the UK (or for everyone else for that matter).


NVidia isn't going to intentionally destroy a $32B acquisition to throw that money down the drain. They are likely to fully support every customer, to protect their investment, and fund it to do new things strategically important to NVidia.

If the UK owns ARM, over time every investment decision they make will be required to benefit the party in charge, its supporters and their districts. That will certainly destroy ARM.


So they invest in Tegra, say, turn Tegra into a hugely profitable success story (which would be great), and then the next moment turn to Tegra's biggest competitor, who is also using ARM. Will Nvidia-owned ARM provide 'full support' to that customer or will they tip the playing field to favour their own products? We would just have to trust Jensen and his successors. I'd rather not have to rely on their sense of fairness.

I don't like state ownership either. No one is suggesting that UK govt should buy ARM outright - just that it takes a stake alongside other long term investors so that ARM can remain independent.


ARM isn’t independent if the UK takes a stake. It’s beholden to those politicians, and those politicians are beholden to their key supporters.

NVidia will be contractually obligated to support ARM customers. If over time they do a poor job, they just flushed $32B down the drain. This is an imaginary problem that requires zero preemptive government action. It's like trying to stop someone from buying an expensive car they can afford on the unsupported theory they won't take care of it.


On reflection Nvidia actually have a fiduciary duty to their shareholders to favour their own products in this scenario.


> NVidia isn’t going to intentionally destroy a $32B acquisition to throw that money down the drain.

It's not just an assumption that they will manage their investment poorly (although that wouldn't surprise me); it's about what the likely behaviours of other market participants would be if a key supplier were controlled by a competitor.


Why does the business need to be successful once it’s government owned?


> I'm not generally in favour of government picking winners

tHe MaRkEt WiLl ReGuLaTe ItSeLf!!!


Effective regulation != Taking large equity stakes


He's absolutely right. Unless there is a 100% firewall between ARM and Nvidia, Nvidia will influence ARM to move in directions that help Nvidia's wider business and not the ARM ecosystem as a whole - and that will adversely affect everyone who uses an ARM CPU - and that's basically everyone.

I don't believe that such a firewall is possible in any event. Who will enforce it?

For those who are saying that it's great as it will give RISC-V a boost. Fine in principle, but how long will it take before the RISC-V ecosystem is anywhere near comparable to ARM in mobile?

One final point: the UK government is now considerably more interventionist than at any time in the recent past. Not completely impossible that they intervene in this in some way.


I agree that this deal is good for Nvidia only and will hurt ARM CPU usage.

Nvidia is right now not in the same league as Intel/Apple, but having the most successful consumer and corporate GPU business along with control of Arm architecture CPUs puts them in the same league in my opinion. That kind of dominance is never good for consumers.

Having worked for a company that was acquired and became a daughter company, such a setup is not enough of a firewall.

Slowly, the acquiring company will modify internal systems and internal processes until there is not much left of the original company. And many key employees will leave as they don't identify with the way things are being done.

I would like to think that acquisitions can be a good thing, but that's not the case in my experience.


Nvidia are worth more than Intel.


It is, although it's just in the last month: https://www.tomshardware.com/news/Nvidia-stock-beats-intel

In October 2019 the market cap was 226B vs 108B in favor of Intel. https://www.forbes.com/sites/greatspeculations/2019/10/10/ho...


OneWeb cost $500M, SoftBank wants $32B+. Not gonna happen.


Seems like alarmist FUD to me. While I agree that having Arm owned by any one of its major customers creates some potential conflicts of interest, I think the notion that NVIDIA or any other company would start to drive licensees away with draconian terms/costs through some poor business planning is pretty silly.

The more interesting question to me is what would any single (major customer of Arm) company gain from owning Arm? For instance, if NVIDIA wants to be competitive in data center CPUs (which would put them at equal footing with AMD and Intel, not more monopolistic), why couldn’t they go the Apple approach with a perpetual license and design it entirely custom (or not) as they see fit? (AFAIK they already design custom mobile CPUs.)

What do you gain by buying out an expensive, large licensing company that essentially makes no money and that leaves you in a weird position with various rivals? Doesn't quite click for me.


If they are buying the perpetual license and hiring the ARM expertise anyway, why not just get the company, which also includes giving nVidia majority control of the ARM roadmap as well.

You get the people who literally wrote the book on modern ARM, you get all the IP which builds up the patent chest (improving chances of counter-suit and cross-licensing in the event of patent infringement issues), you don't need to pay for the license, and you know you aren't going to get fucked by roadmap or licensing changes in future. And your competitors literally end up paying you for the privilege of competing against you.

Not saying it's right, but there's certainly multiple arguments for it.


> If they are buying the perpetual license and hiring the ARM expertise anyway, why not just get the company, which also includes giving nVidia majority control of the ARM roadmap as well.

Because the perpetual license is much much cheaper than $32B.


The most cynical view would be that they could not reup licenses for competitors.



Afaict that seems to be an agreement that a patent holder enters into with a third party to promise to license their patents to implementors of the standard. In this case there is no standard, just the patents and associated technology, so I'm not sure that applies. I would be shocked though if big players didn't have similar long-term agreements with Arm.


No one is going to pay $32B for a business, then cut off its key customers and revenues.


nVidia already owns Mellanox, which today seems very monopolistic for any HPC interconnect. Them buying ARM means they could just stop supporting everything else someday.


I for one would welcome this. Firstly, as mentioned, it gives people incentive to look into RISC-V.

That aside, for far too long QCOM has had a monopoly on the phone chip market. Patents aside (which increasingly became irrelevant due to the death of Sprint), their major advantage has always been in Adreno, which as far as I remember AMD basically did for them. Mali is terrible. If Nvidia could revamp ARM graphics at the spec level, we'd have real competition again from Samsung, Mediatek, and potentially others.


Samsung recently entered into a licensing deal with AMD to use their graphics technology in a mobile design. Early performance leaks/rumors are very promising (if true).

I don't see this as much of a problem, competition is already on the way.


After what, 8 years? I welcome competition, but that's ALWAYS been the exynos curse. What if Nvidia makes a better GPU spec?


The best results on these graphs are not from Adreno

https://www.anandtech.com/show/15963/oppos-reno3-vs-reno3-pr...


> Mali is terrible

care to elaborate?

the Panfrost project has reverse engineered a bunch of popular Mali ISAs and now there's hardware support on the kernel for VA-API on these chips.

https://www.phoronix.com/scan.php?page=search&q=Panfrost


> has reverse engineered a bunch of popular Mali ISAs

As a FOSS user and developer, this is already a big problem to me. The situation with Nvidia's mobile GPUs (e.g. Tegra) is a bit better than with their desktop GPUs, but not much better.


Why is it a problem for there to be additional hardware usable with open source drivers? Or is it that it was reverse engineered?


The problem is not reverse engineering, but having to reverse engineer.


i should have clarified that i was asking why Mali is more terrible than Adreno, MediaTek, etc.

AFAIK, all embedded GPU drivers have to be reverse engineered and none are open source.

as you mentioned, it looks like Tegra might have some open-source contrib from the mfg.


The VC4 (RPi GPU) drivers were written by an open source contributor (Eric Anholt) who was employed by Broadcom to do that.


After many many years of petition from the community to even release documentation on the VC4.


Mali has never come close to Adreno. Simple as that.


It will be the end of Linux on ARM. They will choke the platform with blobs until its inevitable death. NVIDIA and open source are like matter and antimatter - they simply cannot coexist in the same place and time.


ARM is all about licensing and selling hardware IP, they will use Linux to the extent that it helps to sell their IP, just like most companies that actually pay for engineers to contribute some parts of their crown jewels to Linux, while leaving the rest for their own in-house distributions.

Also in case you haven't noticed, the IoT domain where ARM thrives is getting full of BSD/MIT POSIX clones, guess why.

ARM mbed, Zephyr, RTOS, NuttX, Azure RTOS ThreadX, ...


Having a liberally licensed code-base will make it very easy for entities to publish a benign version on GitHub, and then deploy an "enhanced" version on the real hardware, cryptographically signed and not dumpable. They'll find something that can support the argument that not all functionality can be open-sourced, and that is why the published binary will never have the same checksum as something you have compiled from the public sources.


From my comment history you will see I am pretty much into commercial and dual licensing, so I do agree with your point; however, it kind of feels like the way copyleft licenses have been managed has brought us back to the commercial licenses with source code, just under a different name to make them more appealing to younger generations.


Hi, thanks for replying!

> the way copyleft licenses have been managed has brought us back to the commercial licenses with source code, just under a different name to make them more appealing to younger generations.

I think, I'm following along (&agree), but to be sure, could you perhaps elaborate a bit on that? (Also for other? readers)


Basically the Linux kernel, the GNU utilities and GCC are the only projects left with a copyleft license; almost everyone else migrated to non-copyleft licenses in the context of making money with some form of open source.

And not everything gets upstreamed, in the name of keeping the main business at the company's soul. For example, those optimizations used by clang on the PS4? Sure, all of those that don't reveal PS4 architecture features that could eventually be "misused".

The large majority of those businesses have moved to clang, thus reducing GCC usage to copyleft hardliners. Linux is visible in ChromeOS and Android, but not in a context that it can fully profit from, and then there is Fuchsia waiting in the wings.

So in a couple of generations, when copyleft software is just a memory in digital books, we will have gone full circle back to the days when buying developer software would entitle you to an extra floppy with a partial copy of the source code.

The only difference is how that partial copy gets delivered, which will be just the upstream of non-copyleft software.


O.K. so we see the same things.

I can't discern from your writing tone whether you think the above is desirable or a train_wreck_in_slow_motion coming.

I think it's the latter; not necessarily by direct effect, but through all the bad behavior this will enable in corporations for generations to come.


ARM has little use except for open source. They have zero incentive to kill it, as they don't compete with it.


> ARM has little use except for open source.

Huh?? ARM is probably the most popular CPU architecture in use today -- unlike x86, which is largely limited to desktop/laptop and server computing, ARM is used in a lot of embedded applications.


Do you have the counts of systems using ARM on Linux (Android) vs. ARM in everything else? It's not even close.


You're underestimating the number of ARM microcontrollers in deeply embedded devices. For example, most hard disks and SSDs have at least one ARM core -- and often several -- running the device. They're also quite common in peripherals, like keyboards and mice. The firmware for these devices is often either a realtime embedded OS (like FreeRTOS), or is entirely bespoke. It's virtually never open-source.

Even if you're only looking at mobile phones, the cellular baseband is often implemented on ARM as well, and it definitely isn't a Linux system either. (Nor is it open source.)


Indeed. https://www.cambridgenetwork.co.uk/news/record-numbers-arm-b...: “Arm silicon partners shipped a record 6.4 billion Arm-based chips in the fourth quarter of 2019”

Compare that with https://www.slashgear.com/the-2019-global-smartphone-sales-t..., which says about 1.5 billion smartphones were sold in the four quarters of 2019.

Also, Android, I think, by now is an edge case of open source. The kernel is, but there’s a thick proprietary layer on top of it without which it wouldn’t be Android.


I'll just reply once here. Thanks for the source. Keep in mind each cell phone can/does have multiple ARM chips, which I still attribute to 'Linux'; maybe that's not fair.

I tried to get real stats and didn't get far. The most I could find was that ARM was actually declining somewhat in embedded, and that embedded was less than half of revenues. Also, that ARM had 90 percent of the 'phone/tablet/laptop' market, but nothing close to a majority anywhere else.

I think I was incorrect saying that it isn't close, it likely is. I still think more go to supporting a Linux device than not, but am happy to be proven wrong if such data exists.


I don’t have those numbers, but would guess that, the smaller the ARM CPU, the more of them are sold.

Also, the more are sold, the more hardware costs become important relative to development costs, making Linux a poorer choice than smaller OSes or even bare metal.

(And the smallest really are small. https://developer.arm.com/ip-products/processors/cortex-m/co... tells me you can get a 0.0066 mm² 32-bit CPU. That one doesn't have an MMU.)


Last I heard, hard disk and SSD manufacturers were moving away from ARM to RISC-V to save money on ARM license payments.

Source:

https://hn.algolia.com/?q=swerv


Heck, every AMD processor has an ARM core on it to run some firmware. So even if you're running x86, you may also be running ARM.


You seem ignorant of exactly how many embedded systems out there are not running Linux but a smaller OS or just bare-metal custom kernels. You should research it. Looks like someone below is less lazy than me and gave some sources, so do some reading. ARM cores are everywhere, buried deep where you can't see them, waiting for Skynet to come along.


You nailed it. Nvidia's refusal to release sufficient technical information and its non-cooperation with the FOSS community are already self-evident on multiple platforms.

Need to play the "Blob" song by OpenBSD.

https://www.openbsd.org/lyrics.html#39

> Blob was popular at school he was helpful too / He could get your motor runnin' / with a drop of goo / He was givin' it away never charged a dime / But by the time he graduated / Blob was business slime!

> He was a blah blah blah blah blah blah / blah blah blah blah blah blah blah blah / blah blah

> He's givin' you the Evil Eye!

> Now everybody had it / they was drivin' around / They was givin' up their freedoms / for convenience now / Blobbin' up the freeway, water black as pitch And somehow little Blobby was a growin' rich!

> He was a blah blah blah blah blah blah / blah blah blah blah blah blah blah blah / blah blah blah blah blah blah blah blah / blah blah

> It's linkin' time!

> Now it was out of control / n' fishy's came to depend / on Blobby's Blob Blah, seemed to be no end / Then his empire spread and to their surprise / Blobby been a growin' to incredible size!

> He's a blah blah blah blah blah blah / blah blah blah blah blah blah / blah blah blah blah blah blah / blah blah blah blah blah blah / blah blah blah blah blah blah / blah blah B-b-b-b-b-b-b-b-b

> Then along came a genius Doctor Puffystein / And he battled the Blob / who had crossed the line / He was 50 feet tall — Doctor said "No fear" / I got a sample of Blob I can reverse engineer!

> But it was too late! / Blob was takin' over the world! / He wants your video! / Ya he wants your net! / He wants your drive! / He wants it all!!

> Somebody help us! / Noooooooo!

> NVIDIA! / Intel! / Atheros! / 3-Ware! / VIA! / ATI! / Broadcom! / TI! / Myricom! / HighPoint! / Adaptec! / Mylex! / ICP Vortex! / and IBM!

> Takin' over the world!


did you really spam the whole song here ?


No. This is not the whole song, the whole song is longer than that.


It would have been funny if you replied "that's just a tribute" (semi-quoting a Tenacious D song).


QCOM dominance has absolutely nothing to do with Adreno.

It has to do with their superior modem.


Even with the potential for some positive outcomes I'm not sure it's worth the potential disruption.


Could someone explain to me what the huge deal about ARM is, since they only license but don't actually fab the chips, whereas Intel, Apple and Nvidia do?

Also, what's the difference between ARM and RISC-V? Why choose one over the other?


> since they only license but don't actually fab the chips

From the article: "It's one of the fundamental assumptions of the ARM business model that it can sell to everybody. The one saving grace about Softbank was that it wasn't a chip company, and retained ARM's neutrality. If it becomes part of Nvidia, most of the licensees are competitors of Nvidia, and will of course then look for an alternative to ARM."

Implying NVidia's competitors who use ARM would fear losing their license.

> Also, what's the difference between ARM and RISC-V? Why choose one over the other?

They have technical differences, but TBH I don't know them and I'm not sure they matter. However, conceptually they have two big differences IMO: RISC-V is open (you don't need to buy a license to use it), but ARM is way more mature and better supported (RISC-V-based computers are still very rare).


RISC-V sounds like a good idea, but (IMO) could lead to a situation where all/many device manufacturers make custom processors based on the liberally licensed code-base. At first, this sounds great: "Yah, no more reliance on Intel/ARM, no more ME/TrustZone". But then you realize that although you, the user/buyer, can build/run some custom code on there, you'll never be able to tell for sure whether you got a version/series with a gimped HWRND or other (security-reducing) goodies. Targeted attacks will become very easy for anybody who can pressure companies/engineers into adding something for (a subset of) the customers, or who can reroute/intercept device shipments to you.

ARM is a "known" situation. They have TrustZone, we know it exists, but buyers/end-users do not get full control of it (outside of some niche products or dev boards). That situation will not change. With ARM, we have already got used to the idea that you'll never "own" the machine; you are merely allowed to execute some code in a walled-off area of your Telescreen.

So, with RISC-V, you have even less assurance that the processor IC really contains what you expect, and nothing more.


> But then you realize that although you, the user/buyer, can build/run some custom code on there, you'll never be able to tell for sure whether you got a version/series with a gimped HWRND or other (security-reducing) goodies. Targeted attacks will become very easy for anybody who can pressure companies/engineers into adding something for (a subset of) the customers, or who can reroute/intercept device shipments to you.

I fail to see how this is different from today's situation. What's stopping targeted attacks via custom modifications of ARM or x86 chips?


> What's stopping targeted attacks via custom modifications of ARM or x86 chips?

Nothing, if you have power over the manufacturers. (Usually governments)

But now, with RISC-V, anybody with deep enough pockets to spare a few million can create fake silicon, where you'd need an electron microscope to prove that you got an "enhanced by TAO" chip.


> anybody with deep enough pockets to spare a few million

Which for all intents is ... governments.

So no real difference.


RISC-V allows companies to make closed systems, but it also allows them to make open systems, which wasn't viable before.

Most people will keep using closed systems because they don't care about computer architecture, but us hackers will finally have a chance at having a (small) portion of the market. A small industry of open hardware.


> RISC-V allows companies to make closed systems, but it also allows them to make open systems

True, but can you recall any moment in history, where market forces did not actually make producing "open" devices unprofitable? So, (IMO) in theory you are right, but in practice (successful) companies will use the technology to create even more closed systems, and worse: make systems where the user can't even tell whether he got an "open" system.

> A small industry of open hardware

That you will only be able to use in a lab/isolated_env, or illegally, because mandatory signing of FWs/BLs/OSs/APPs is just around the corner for consumer devices capable of speaking to the internet.


> True, but can you recall any moment in history, where market forces did not actually make producing "open" devices unprofitable?

I'd say that the early era of personal computers proved it. The Apple II and IBM PC were very open platforms: thoroughly documented, developer friendly, and made with very few parts you couldn't buy out of your favourite electronics distributor catalogue.

They were riotous successes in large part because the open platform meant they didn't have to guess or corral the entire market. Remember that most of the killer apps of the era (VisiCalc, WordPerfect, dBase, 1-2-3, etc.) were third party, and it was largely aftermarket firms selling "six-in-one" cards, Sound Blasters, and Hercules graphics cards that patched the gaping holes in the PC's hardware design.

Conversely, look at a machine like the TI-99/4 series: while on paper a decent machine for the time, it was as "closed" as a machine of the era could be: an unusually limited built-in development platform, and many of the desirable upgrades involved buying a very expensive expansion chassis.

It's interesting to consider what the business tradeoffs of today's walled garden model are. Apple could choose to do a "pure hardware" play, or emulate the early 1980s IBM, where they offered some popular software via their own product line as a trusted source. The result would probably be a marginally larger total addressable market, as it would bring some customers and developers to the platform who needed something that could never clear the App Store before. But would the growth in market size compensate for the reduction in the average cut they can command from the market?


> IBM PC were very open platforms

What about the whole debacle around the IBM BIOS?


I think they made a commented disassembly available officially, and there were definitely aftermarket books with blow-by-blow analysis.

You could argue it was imperfect (not exposing every feature it could, and having performance costs, which sent a lot of software to bypass it and pound hardware registers directly, creating that weird world of early clones that would run MS-DOS but not 1-2-3 or Flight Simulator), but it doesn't seem like a hostile gesture to external developers.

Now it was definitely a legal hurdle if you wanted to make an outright clone, but by 1984 or so, Compaq and Phoenix had solved that problem.


mandatory for whom?


I don't know where you live, but in most of the world, certification/signing/encryption of RF-capable devices/firmware has been the established rule for decades, with a few exceptions for dev boards. TPM, ME, SB, the recent GRUB fixes, etc. are all steps putting in place a pervasive infrastructure that will enable truly draconian control over our computing devices for the key-holders.

And I have no reason not to believe that somebody somewhere is waiting for a pretext to implement full vertical-stack control in law for his region/country/domain. Perhaps "so that we can ensure the contact-tracing app gets installed everywhere, otherwise we'll all die!!11!". Or protecting us from terrorism; that usually does the trick. This argument also works surprisingly well against backdoor-less encryption protocols. See e.g. Australia (IIRC).

Do you not perceive a similar evolution?

Note: TPM, SB and ME could in theory be used to build "secure" systems, but currently they are (also) used to secure the PC against you, the buyer/user.


If your device isn't hackable, then it's not because of ARM-imposed limitations; it's because of decisions made by the device manufacturer, and there is no reason why that would change if they were using RISC-V instead.


This is completely backwards.

See India's Shakti

https://www.economist.com/asia/2019/10/03/india-is-trying-to...

... where they will fab their own CPU chips for, e.g., military use, starting from inspectable, verifiable open-source RISC-V HDL.


If you re-read my post, you'll see that we are not contradicting each other.

If you have the time/money/infrastructure/knowledge/trusted personnel to design/make/test/etc. entire processors, then with RISC-V you can now, in theory, make secure systems.

Also remember that your effort needs to start at the level of the toolchain and other tooling, all the way down to the transistors; otherwise you'll have a trust problem with regard to your compilers/synthesizers/etc.

But that is quite a high barrier to entry for anybody smaller than a decent-sized country.


> So, with RISC-V, you have even less assurance that the processor IC really contains what you expect, and nothing more.

What you wrote, quoted above, has it completely backwards.


If you haven't produced the processor, the tooling, and the base components all by yourself?


shrug someone is wrong on the internet... you have a choice to learn something, or keep trying to pretend to have been right... nobody cares...


I enjoy learning every day. And, generally, I'd like to know where my knowledge has its limits.

You seem to care, and claim to be wise, so please do enlighten me where I'm wrong. We may be misinterpreting what the other one is trying to say.

I'll try to re-word what I mean: if your "system" uses standard off-the-shelf parts (CPU/SoC/memory/...), then the fact that you can go out to a shop and buy standard replacements means you can more easily thwart or detect a targeted attack on your supply chain than if your device contains, e.g., a custom CPU specific to that device, which could contain god knows what extensions to the instruction set, and for which you can only get a replacement from that one specific vendor.

Or where is the dent in my logic?


Neither Apple nor nvidia fabs their own chips.


I find it baffling that SoftBank isn’t simply taking ARM public. If ever there was a company that should be independent, it’s ARM. Is there a good explanation out there for why this isn’t the obvious choice?


Most likely, money. Specifically, they probably think they can get a better price from nVidia than on the public market.


Son wants $32B+ so he doesn’t take a loss on ARM. ARM is worth no more than $10B, so he needs to find a strategic buyer.


There would be a certain symmetry if Apple took a (non-controlling) stake in ARM to keep it independent, given that sales of ARM shares were a significant factor in giving Apple financial stability in the 1990s. [1]

[1] https://appleinsider.com/articles/20/06/09/how-arm-has-alrea...


NVIDIA's stock is so massively pumped up. They will simply make an all-stock offer that's pretty much impossible to refuse and that will be the end of it (most likely).


NVIDIA is the only hardware company that takes software seriously.

In every hardware thread, someone complains about hardware people being behind on software technology. And then there is NVIDIA, the company that provides the best development environment for GPUs, the company that used chip simulation for their first processor.

AMD was kept alive by oil money and console contracts. NVIDIA made it on their own. They were innovative in the past; they can be innovative in the future. To me, that's an opportunity for much more future growth.

How much is too much if NVIDIA can take Intel's market? And we are only at the start of machine learning and product simulations.


This is why I think it is likely to happen too. NVDA needs to replace the pumped value with real value and this is a great way to do it.


Sounds like a protectionist who doesn't want ARM to be owned by any foreign entity.

"most of the licensees are competitors of Nvidia"

Is this true? It seems like quite a stretch. Nvidia doesn't offer a chipset for the mobile, battery-powered market (it tried, and gave up because of competition from Qualcomm).


> Sounds like a protectionist who doesn't want ARM to be owned by any foreign entity.

Should've thought of that before they sold it to a Japanese fund.


Well, he has according to the article.


Tegra (as found in Nintendo Switch). Have they given up on that line of SoCs?


They still make them and they're expanding them, but mostly for the mobile/robotics market. AFAIK they're not venturing into mobile again.


And the Shield, as a set-top box.


>Sounds like a protectionist who doesn't want ARM to be owned by any foreign entity.

Yeah, so? If the 21st century taught us anything, it's that protectionism makes sense for many products/services (not that big countries really forgot it; they only "forgot" it for the time when their products were the only game in town).


The 21st century has taught us that free trade makes the most sense.


Not really. Except if one confuses free trade vs. protectionism with planned vs. market economies (which is a different distinction).

The big powers that profited most in the 20th century weren't following free trade; they were colonial and post-colonial overlords, enforcing their trade deals and terms on lesser countries.

Free trade was OK for big countries like the US, UK, France, Germany, etc.: they were calling the shots, and were top net exporters themselves...

And even then, they themselves could not care less about "free trade" doctrines. Sponsoring your own country's industries, using your political force to get global deals, subsidies, etc. were all fine through the mid-20th and early 21st century, when they were the ones doing it.

When that changed and their trade partners got increasingly bigger, "free trade" got to be too much for them and they pressured their partner countries (as with the Plaza Accord).

Now that the US can't do the same to China (due to its sheer size and power), old globalization and free trade flag-bearers discovered suddenly that free trade might not be the "be all end all".


>Now that the US can't do the same to China (due to its sheer size and power), old globalization and free trade flag-bearers discovered suddenly that free trade might not be the "be all end all".

This is one of the main reasons why the US is fighting China.


Isn't Tegra their mobile chip?


Tegra started out as their mobile chip, but has evolved into their robotics/automotive/industrial-SoC platform through the Jetson series of modules which are alive and well.

[Sidenote: this has resulted in the hilariousness of taking an Android kernel and development infrastructure and them slowly trying to make it back into something that looks like a normal Linux system. Their board support package is a hot mess, but gets used in real products because the hardware actually works well. A software company NVidia is not.]


It's "mobile" as in cars and robots. It's not mobile as in cell phones.


It's also in the Nintendo Switch :)


Only the older chip series, the ones they could not sell to tablet makers. The newer chips are too power-hungry for mobile and are targeted at automotive, where Nvidia has gotten traction. Nintendo had better hope Nvidia has something new coming in time for a next generation.


It might happen with a process shrink but that remains to be seen.


AMD had to license their CPU to China in order to get a lifeline, back when AMD stock was in the toilet.

SoftBank doesn't want ARM anymore. What does Hermann Hauser propose? Let ARM Holdings die? He offers no solution.


If I understand correctly, his solution is to make ARM a publicly traded independent company, rather than selling to another company like nVidia.


> Let ARM Holdings die

Why would ARM die? It's very healthy and financially stable right now. It's SoftBank who's bleeding money.


Competition breeds innovation; they never should have sold in the first place. If an investment firm won't hold them, they should do an IPO instead of seeking value validation from contending clients and players in the hardware industry.

I'm tired of the tech industry's standard duopolies. ARM breaks that trend by licensing to Qualcomm, Samsung, Apple, MediaTek, & others; surrendering independence today could hurt the future of tomorrow.


I would dearly love the industry sh1tstorm that would happen if Huawei bought ARM (please!). I'd like to see Donny try to ban a Huawei-owned ARM, or order Google to only supply hot, slow Intel Atom mobiles, with every one of the Asian makers walking away from the US market while the world switches to an OSS Android fork.

Although I still for the life of me can't understand why Masayoshi Son is holding on to WeWork and selling ARM.


Because ARM is one of the few things he invested in that actually makes money.

It's a cash-crunch driven deal, nothing else.


Yep - I was always told, keep your dividend-paying winners, sell your losers.

WeWork is definitely a loser and ARM is a winner; it's completely upside down. If he wants cash, he should sell his trash, and avoid future losses by not buying trash in the first place.


Rather, Alibaba and Yahoo Japan are SoftBank's dividend-payers. ARM has proven to be nothing more than a bargaining chip to kickstart the Vision Fund, and now a source of cash to keep the group liquid. ARM's valuation has barely moved despite the continued dominance of the platform.

Selling ARM lets SoftBank keep WeWork alive for a couple more years, enough time to bring the non-Japan markets to cash-flow-positive status. Japan by some miracle was cash-flow positive from the start; he must think the model can work overseas.


That would never happen. Huawei has made their own bed by consorting with and being inextricably tied to the Chinese Communist Party; they deserve everything they're getting. While on a personal level I despise Trump, I don't mind that he's sticking it to China on privacy and the industrial imbalances between the USA and China, while also stating the obvious: it's bad to use Chinese state-controlled hardware at the heart of your communications systems.


Surely a sale would be allowed if regulatory compliance is met? I was part of a private semiconductor company bought by a public American company, and the buying company had to convince the US government they would not turn the product line into a captive one. I expect it's the same here?


Btw. I'm still curious about your fiction writing side gig. Post an update about that again some time?


So what happened?


They had to get clearance and satisfy the USG the products would still be available to third parties. Had to do this in Korea and China too. Once the governments were satisfied, the acquisition was completed.


Conversely, it may be just what RISC-V needs to go mainstream sooner.


I think he's right. A sale to Nvidia would be detrimental to ARM's current business model. Now, Nvidia may have other plans, but the thought of that would certainly scare existing licensees even more.

Should the UK government intervene? Yes. Will they? Probably not.

The detail here is more nuanced than protectionism vs. free markets, but I don't think most people realise that; they see it as just protectionism. It isn't, I don't think.


It’s just protectionism with different justifications.

There is no evidence the UK government could run ARM better than NVidia. In fact the opposite is true.


Could any other real non-US company benefit from it, except Chinese companies?


Some companies' business models are simply better when they are independent rather than taking 'strategic' guidance from a parent company.

If Nvidia does buy ARM, it's time to buy stock in anyone who does RISC-V.


There are all kinds of Euro-based electronics firms such as Alcatel or Philips or Siemens.


I'm aware of them, but do they really develop any mainstream devices? I might be wrong, but aren't those companies working in their niches, e.g. cell towers etc.? Isn't that way too narrow an application of such great tech?


None of them can afford ARM, though.


NXP Semiconductors.


Infineon?


I'm in agreement too. NVIDIA got interested in ARM after the launch news of Japan's supercomputer; they want to suppress it, I think, as it's bad for their business. (It's twice as fast as the previous number-one supercomputer.)


This IS good news. If Nvidia's history is something we can go by, then the ARM licensing etc. will be so miserable that people will flock to RISC-V.


--


I think you're confused. He didn't own ARM, it wasn't his choice to sell. Hauser was vehemently against the SoftBank sale.


You're right. I retract my criticism of his position. It's completely consistent. Thank you.


It's been said before: they need to just IPO.


Not for RISC-V ;)


The British government should not bother to clean up the mess capitalists make. They will have a hard enough time living with the consequences of Brexit and corona without buying up companies.


How could it not be?


I suspect Apple would buy ARM before NVIDIA would ever get a chance to.

That is, of course, to protect their own interests, as opposed to making it into a successful business in its own right.


Apple walked away from the opportunity. Anyone know why?


Apple co-founded ARM and walked away with a perpetual license. They have zero need to own ARM; purchasing it would be a huge distraction and conflict.


Yeah, Hermann? Where was this grave concern when you sold to SoftBank? Why don’t you man up and just admit you play favorites?

Edit: His concern was far more muted when it came to SoftBank in 2016 https://www.express.co.uk/news/politics/690588/ARM-Holdings-...


So you're arguing about the level of concern? I can think of lots of reasons to be more concerned about a chip manufacturer buying the design/licensing business literally at the core of the majority of mobile phones than of a technology investment firm.

Or maybe you should just admit that your original comment was not well founded?



