The Case for IBM Buying Nvidia, Mellanox, and Xilinx (nextplatform.com)
76 points by Katydid on Feb 7, 2017 | 47 comments



Please, no. I work with both IBM and Nvidia and I can say such an acquisition would be a cultural mismatch that would make Daimler/Chrysler look like a boardroom tea party.

IBM is hollowed out and tries to cover that up with headline-making acquisitions and effective marketing. It's aimless and sclerotic: it chases growth markets but is largely incapable of making good products, constantly over-promising and under-delivering.

NVidia has its best days ahead of it. I think the direction it's going with ARM has more legs than what they're doing with POWER. They make innovative products that work, and have fantastic partnerships across the industry.

Perhaps more importantly, the target markets of these two companies are diverging, not converging. IBM is desperately trying to move up the value chain, away from hardware, to remain competitive. You could hardly say the same about Nvidia in how it's pursuing the automotive industry.


While IBM has made some good acquisitions*, the executives are either clueless or too busy with infighting. Their wild flailing around has caused several good acquisitions to be squandered, and if the "blue-washing" (a.k.a. onboarding) doesn't alienate the staff, then neglecting the product and asking them to work on/with horrible IBM technologies will.

NVidia would be doomed instantly, but it might take a few years until the effects were felt.

* I believe that when this happens, it's because distinguished engineers or architects were consulted and their advice was followed.


What is the verdict on compose.io? Will it be blue-washed?


Well, like e.g. Ustream, and unlike the infamous storage mergers (SoftLayer, Cleversafe, Cloudant, XIV, Storwize, TMS, etc.), it seems to have remained a separate company. Maybe because it doesn't fit an existing IBM division? So it might be safe, as long as it isn't incorporated into the shitshow that is Bluemix.


It's typical for them to remain a separate company until blue-washing occurs.


Most compose.io people are still there. Integration is/was pretty thoughtful, so lots of what crushes tiny companies within IBM was delayed.

I think these acquisitions go wrong for customers _after_ the original team leaves, for the most part.


I knew lots of good people at IBM, but they're almost all gone now. Nvidia can be a pain in the butt (read: how they cram their software stack onto Linux), but I can only see things getting bad quickly if they were to be acquired by today's IBM.


Nobody ever got fired for choosing IBM... 'cept for the IBMers.


You're going a little easy on IBM.


Wow! My thoughts exactly! Having worked at both companies, I can confirm.


>> NVidia has its best days ahead of it. I think the direction it's going with ARM has more legs than what they're doing with POWER.

I'm thinking NVidia doesn't really like ARM; they are putting a RISC-V core on future GPUs, after all.


I think the RISC-V cores they're putting on their chips are more for management than as application processors. If they didn't use RISC-V there, they'd probably run one of their own ISAs rather than ARM.


It is for management, but it is also replacing the management core they had that ran their own ISA. One of the slides also indicated a desire for "Rich OS support" - see the third slide:

https://riscv.org/wp-content/uploads/2016/07/Tue1100_Nvidia_...

I keep thinking their future may eventually be Tegra with RISC-V replacing ARM.


> I keep thinking their future may eventually be Tegra with RISC-V replacing ARM.

Why? RISC-V doesn't have nearly the acceptance that ARM does, and they already get ARM for minimal cost (i.e. the licensing fees are small).

No one is forcing NVidia to design their own cores that conform to the ARM ISA (Project Denver [0] was rumoured to be NVidia's take on x86, re-purposed for ARM after they were refused an x86_64 license).

So if they just want to slap their GPU onto a reference ARM design and sell it, they'll be like a bunch of other ARM players (HiSilicon, Amlogic, <insert not-Apple/Samsung/Qualcomm chip maker here>).

The only reason I can see for including RISC-V in their GPU is that they'll get it for free, and they can modify it to their heart's content without having to tell anyone about it, thanks to RISC-V's BSD-style license.

This is about NVidia locking down their platform through proprietary modifications to RISC-V, more than it is about anything else.

[0] https://en.wikipedia.org/wiki/Project_Denver


Sometimes I live in fantasy land. Imagine the Android Runtime being ported to RISC-V in the next year or two - IIRC it uses LLVM, which is already there. Go support is expected to land in 2017 - Google likes RISC-V a lot but hasn't said much about what they want to use it for. I would not be surprised, but neither would I hold my breath.


> IBM is hollowed out

But they have Watson, right? They could use NVidia for the hardware side.


The Watson technology and the Jeopardy win are nice, but it's not that great when applied to some real-world problems, as much as press releases try to convince people otherwise.


A horrifying proposal, based on the assumption that servers and networking are the be-all and end-all of everything.

"Commodification" is a thing, and Xilinx fits everywhere, not just in places destined to have no profit. In fact, their best markets are the new ones with no pre-existing specialized chips available to replace FPGAs.


Can you please explain what you were referring to by 'new ones with no pre-existing specialized chips available to replace FPGAs'? I've never really heard of anything new that can replace FPGAs.


Once the market for a particular chip is big enough, it's much more cost-effective to replace an FPGA with an ASIC (a custom silicon chip). The non-recurring engineering (NRE) cost to have an ASIC developed is often millions of dollars, so it only makes sense for high-volume applications.

FPGAs excel in low volume, and in products that are still being developed.
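
Back-of-the-envelope, the break-even looks something like this (a sketch with made-up NRE and unit prices, not real figures):

    # Hypothetical numbers: at what volume does the ASIC's NRE pay off?
    asic_nre = 3_000_000     # one-time ASIC development cost, USD (assumed)
    asic_unit = 5.0          # per-chip ASIC cost, USD (assumed)
    fpga_unit = 50.0         # per-chip cost of the FPGA it replaces, USD (assumed)

    break_even = asic_nre / (fpga_unit - asic_unit)
    print(f"ASIC wins above ~{break_even:,.0f} units")   # ~66,667 units

Below that volume the FPGA is cheaper overall, plus you keep the option of fixing the design after it ships.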


Ew, no. IBM would likely kill GeForce, which is the whole reason all y'all are talking about everything but that right now...

Long before Titan or Pascal or any of this built-for-ML stuff, a high-end PCIe gaming card (or a Quadro/FireGL, which were built for sci/eng visualization) was the only way to dabble in OpenCL et al.
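
For reference, that kind of dabbling on a consumer card looked roughly like this; a minimal sketch using pyopencl, assuming a working OpenCL driver (the kernel and sizes are just illustrative):

    import numpy as np
    import pyopencl as cl

    # Grab whatever OpenCL device is around - a consumer gaming GPU is fine.
    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)

    a = np.arange(1 << 20, dtype=np.float32)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # Trivial kernel: square every element on the device.
    prg = cl.Program(ctx, """
    __kernel void square(__global const float *a, __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] * a[gid];
    }
    """).build()

    prg.square(queue, a.shape, None, a_buf, out_buf)

    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)   # blocking copy back to the host
    assert np.allclose(out, a * a)

CUDA would be the more common route on NVidia hardware, but the point stands: the consumer cards were the on-ramp.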


You only need to see what IBM did to SoftLayer to foresee what would happen to Nvidia.


They are doing the same to the Weather Company.


On top of the other problems mentioned by commenters: Intel and Altera are an arguably better fit (Intel had already worked closely with Altera as its foundry), and despite the clear strategic alignment of FPGAs and high-performance servers, that combination seems to be struggling to bear fruit.


IBM market cap: $170B

Nvidia market cap: $65B

That's a tall order...
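
Rough math, assuming a typical ~30% takeover premium (the premium is an assumption; the market caps are the ones above):

    # Ballpark cost of a buyout, before even adding Mellanox and Xilinx
    ibm_cap = 170e9            # IBM market cap, USD
    nvda_cap = 65e9            # Nvidia market cap, USD
    premium = 0.30             # assumed takeover premium

    deal = nvda_cap * (1 + premium)
    print(f"~${deal / 1e9:.1f}B, roughly {deal / ibm_cap:.0%} of IBM's own market cap")
    # -> ~$84.5B, roughly 50% of IBM's own market cap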


People try the weirdest things... take a look at "Something lighter."

https://www.bloomberg.com/view/articles/2017-01-30/immigrati...


Oh man, I knew that'd be Matt Levine. The only email newsletter I subscribe to - and I hate email newsletters. That entire page is worth reading, for anyone with a few extra minutes.


Please stop thinking of IBM as a vertically integrated technology company. They are a service company, with a veneer of technology and intellectual property. These kinds of hypotheticals should be formulated the other way around, specifically: "What if IBM sold off the Power architecture and IPO'ed it big enough to go acquire a sustainable hardware technology stack?" Power will eventually wither within IBM due to the impedance mismatch in R&D investment.


It might be good, as the ensuing obliteration of value might wake people up to the dark side of the era of cheap debt and crazy financial engineering that runs today's world.


If IBM tried to do this I can see other players stepping up to outbid them.

Didn't IBM divest most of these business lines to focus on services?


IBM is in the process of selling off its assets and returning capital to its executives and shareholders. The service thing is probably smoke and mirrors to prevent a run on the stock.


They divested their x86 systems business (ThinkPads & servers) but still have a POWER servers division.


But they don't actually manufacture the chips any more...


Nobody besides Intel and Samsung manufactures their own chips any more; it's not relevant.


Yeah, but IBM doesn't have a good track record of buying out companies and making them profitable.

Anyone remember when IBM bought out Lotus? Anyone remember when IBM was making OS/2 with Microsoft?

IBM got out of the PC desktop, laptop, and printer markets. They got out of the Intel server markets as well.

IBM should just provide services for Linux and make their own FOSS apps for Linux that they can sell support for. Maybe invest in or submit code to ReactOS, HaikuOS, AROS, or OSFree?

The least that IBM can do is document every OS/2 API call so someone can write a FOSS alternative to it, or a WINE-like environment for Linux.


IBM acqui-kills firms and products. It is the great Sargasso Sea of IT.


I am sure that a move like this would make perfect sense for IBM, but it would hollow out large parts of the HPC vendor ecosystem. That ecosystem is already quite fragile with the ascendancy of Intel x86 chips everywhere.


It may create a little cavity in the 10G and FC HBA market, but a unified product vision spanning the three is dizzying by comparison, and that's hardly a reason not to do it.

It's easy for me to imagine how SoCs with the best of each could be a catapult for HPC.


IBM should instead buy Intel and Microsoft, that would at least be funny.


Xilinx makes sense. Let Xilinx operate on its own as long as they deliver an FPGA+mainframe system for Bluemix, and an FPGA supercomputer for the Dept. of Energy.


It would be easier for Intel to buy Nvidia...


I'd rather see Intel buy rights to AMD's upcoming Vega GPUs and kill off the entire Xeon Phi line as the pointless misadventure it has been so far.


I thought the Xeon Phi line was better now that it is all one processor instead of a separate coprocessor. I saw a post here on HN a few weeks ago about an actual use case for its MCDRAM and for spreading the load over its separate clusters of cores.


The problem isn't the fact that it's a coprocessor, I think. NVIDIA GPUs are all coprocessors, and they are wildly successful.


Could you point me in its direction?


Seeing as IBM just sold their chip fabs to GlobalFoundries, this seems unlikely.


We have one good GPU manufacturer and you want to destroy it by having it swallowed up by the shambling corpse of Big Blue?

No, just no.



