
I think SGI failed to understand that there was a point where desktop PCs would be good enough to replace dedicated workstations. Continuing to make hardware that's much better than the best PCs wasn't going to save them after PCs crossed the good-enough line - whatever they had would be relegated to increasingly rarefied niches - the same way IBM now only makes POWER and mainframes - there is no point in making PCs, or even POWER workstations, anymore for them, as the margins would be too narrow.

SGI could double down on their servers and supercomputers, which they did for a while, but without entry-level options, their product lines become the domain of legacy clients who are too afraid (or too smart) to port to cheaper platforms. And being legacy in a highly dynamic segment like HPC is a recipe for disaster. IBM survived because their IBMi (the descendant of the AS/400) and mainframe lines are very well defended: systems that are too risky to move, tied to hardware that's not that much more expensive than a similarly capable cluster of generic and less capable machines. As the market was being disrupted from under them, they retreated upward and still defend their hill very effectively.

The other move they could have made was to shift downward, toward the PC, and pull the rug from under their own workstation line. By the time Microsoft acquired Softimage and had it ported to NT, it was already too late for SGI to even try that move, as NT had solidified into a viable competitor in the visual computing segment, running on good-enough machines much, much cheaper than anything SGI had.




IBM's market was also just far larger. SGI's sales were around $3 billion, Sun's were around 3x that, and IBM's were even more. SGI and Sun were Unix-based, so Linux could disrupt them far more easily than it could IBM's systems.

IBM also came into the game more vertically integrated. Having your own fab is expensive, but if you read up on what Sun and SGI had to do in order to get chips, that route wasn't great either.

In the beginning there was a chance that Sun and SGI could have merged; the reason it didn't happen was mostly that leadership couldn't agree on who would lead the combined company. Between them they duplicated a lot of technology while sitting right next to each other: both doing their own RISC chips, Sun at times doing its own graphics cards, competing in low- and mid-priced workstations, incompatible Unix developments, and competing against each other in the large SMP market. Had they been together, things could have been different - a larger install base and more investment into the architecture might have given them a better chance.


> I think SGI failed to understand that there was a point where desktop PCs would be good enough to replace dedicated workstations.

I think the real problem here is that PC workstations, with Windows NT (later Linux) and an Intel chip, could do 90% of what SGI workstations could for a fraction of the price. By workstation I mean an honest workstation with a high-end graphics card, 10k RPM SCSI drives, and hundreds of megs of RAM.

The price of SGI workstations was "call us", which translates to tens of thousands of dollars per workstation. PC workstations didn't and couldn't replace all uses of SGI workstations. What SGI was not able to handle was their customers suddenly having a viable option besides paying them tens of thousands of dollars for their workstations. Even their Visual Workstations weren't a real improvement cost-wise, as those were still overpriced compared to competitors' PC workstations.


This is my recollection of the era as well. A quality PC like a Compaq was already a good alternative during the Motorola era. In a lot of ways the whole RISC thing was a dead end, as all that happened was SGI, IBM, HP and Sun cannibalising each other's sales.

ARM is the only one left standing from that era, which in hindsight seems so unlikely.


Keep in mind several of the others survived long past their public visibility. There were MIPS smartphones for a while. Both PPC and MIPS long outsold x86 in number of CPUs - just as low-margin licenses, with the bulk going into embedded uses, like ARM.

ARM had the advantage in that space of starting from the very low end, and being able to squeeze margins instead of being squeezed.


Don't forget IBM is still selling (and improving) their POWER family, running AIX (which had its low end eaten away by Linux) and IBMi (which is the only minicomputer family still standing). IBMi is extremely well defended as it's not even fully documented (unlike mainframes) and, therefore, doesn't even have a way to be emulated outside IBM. And I would not be surprised if that "secret sauce" is kept in an underground facility under a mountain in the middle of a desert, in a completely offline system behind multiple biometric locks.

ARM survived this long because it had a niche others couldn't break into (and still can't): the highest performance per watt anywhere.


I think it would be more accurate to say the RISC workstation built entirely of custom low-volume components was a dead end. Even if SGI had decided to cut their margins to compete on cost, their lowest prices would still have been way above a PC workstation's. Compaq benefited from economies of scale that SGI could not match.

RISC architectures live on today. Your house likely has dozens of MIPS chips in various components. You've got more ARM chips for sure but no shortage of other RISC chips in various components.


“Custom low volume components” is a marketing strategy. They could sell them to competitors and make money off the components. Late in its history SGI owned MIPS and sold its hardware to others. I have a Windows CE laptop running on a MIPS R4000 processor. It runs NetBSD with effort, but with dignity.


I think your analysis of the shifting technology landscape is largely on target. However, I'm not convinced that the true root of SGI's failure was the technology. Clearly their tech did need to evolve significantly for them to remain competitive, but that's a transition many companies successfully make. Even if SGI had evolved the tech soon enough, fast enough and far enough, I suspect they still would have failed to survive that period, due to an even more fundamental root cause: their entire corporate structure wasn't suited to the new competitive environment. While the "desktop transition" was most obviously apparent in the technology, I think the worst part for SGI was that the desktop shift changed the fundamental economics to higher volumes at lower costs.

SGI had invested in building significant strength and competency in its sales and distribution structure. This was one of their key competitive moats. Unfortunately, not only did the shift in economics make this strength irrelevant, it turned it into a fundamental weakness. All that workstation-centric sales, distribution, service and support infrastructure dramatically weighed down their payroll and opex. That was fine as long as they could count on the higher margins of their existing business. While it's easy to say they should "just lay off all those people and relaunch as a desktop company", that can't be done in one quarter or even one year. It requires fundamentally different structures, processes, systems and skill sets. Hiring, training and integrating all that while paying for massive layoffs and shutting down offices, warehouses, etc. takes time and costs a lot of money.

Worse, once their existing workstation customers saw the SGI they had bought workstations and service contracts from shutting that business down to become a different kind of company entirely, sales revenue would have taken an overnight nosedive. SGI's stock would also have tanked far more immediately than it did, as fickle stock market investors sold stock they'd bought because SGI offered a specific risk/return expectation which had just become much more "risk" and much less "return" (at least in the near term). In making such a dramatic move, SGI would have effectively dumped much of their current quarterly revenue and the value of one of their core strengths at the same moment - turning themselves into one of their emerging startup competitors with all of a startup's disadvantages (no big ongoing revenue streams, no big cash pile (or high stock valuation to leverage for cash)) yet none of a startup's strengths (nimble, lower-paid staff and more patient venture investors).

The point of my earlier post was mainly that a truly disruptive market shift is nearly impossible for a large, established incumbent to survive, because they basically have to turn into someone else almost overnight. How can a champion sumo wrestler survive a shift so dramatic that their sport quickly turns into a track meet? Even seeing it coming doesn't help. How does one even prepare for such a shift, since losing mass turns you into a bad sumo wrestler long before you start being a viable sprinter? As Christensen observed, such disruptions are often enabled by technology, but the actual cause of incumbent death is often the shift turning an incumbent's own strengths into weaknesses almost overnight.


I don't think you need to shut down that distribution. Fundamentally, existing customers often continue to buy the existing product at existing prices, as long as they get a faster product. This was something Gordon Bell pointed out. Existing customers are set up to operate at that level of expense. Expensive graphics workstations and rendering servers and so on still exist. Sure, it would go down eventually, as all business does in the long run.

The real failure is not picking up new business along the way. With the N64 they showed they could design a consumer product, but outside of that they were in no future consoles. 3dfx and ArtX were both founded by former SGI people. You don't need to stop selling workstations just because you make chips for consoles and other such devices. Nvidia barely survived and might not have if not for consoles. There are other markets where SGI's expertise could have been applied.

Surviving changes like that often requires finding other markets. And when it comes to making hard choices, you need to cut the part of the business that is unprofitable. But this is really hard to do, and in some ways it goes against the 90s US corporate philosophy of focusing only on the 'core' business. DEC, for example, sold profitable business units that might have been helpful to keep. DEC had a printer business and had the potential for a database business - Oracle calls Rdb one of its best acquisitions.


> the N64 they showed they could design a consumer product. But outside of that they were in no future consoles. 3dfx and ArtX both came from former SGI. You don't need to stop selling workstations just because you make chips for consoles and other such devices.

Commodore tried for a play where their newest-generation chipset - still much lower end - would have scaled (with the same chips) from a full low-end computer, console, or set-top box (it had a PA-RISC core on chip, so it could run standalone), to a high-end graphics card for PCs, to the chipset for a higher-end Amiga, all at the same time.

They ran out of money - short maybe a few million - before being able to test if that would have worked as a way to widen their market.

I wish we'd have gotten to see how that would have unfolded, though Commodore's deep management dysfunction probably still would have made it fail.


During the Amiga era Commodore's board was controlled by old-school big business / Wall Street suits who had literally zero clue about computers or even technology in general. For a while toward the end the inside rumors were that the powers controlling the board were primarily benefiting from international currency and tax games and were cutting costs to harvest whatever cash they could into offshore tax havens while refusing to invest a cent back into saving the company.


Legend says Commodore was at some point approached by Sun, who were willing to OEM the Amiga 3000/UX machines as entry-level Unix workstations and sell them through their distribution channels. The same accounts say Commodore management refused (maybe because the deal had a non-compete clause).

They also missed an (earlier) boat with the Commodore 900 workstation, which would run Mark Williams' Coherent OS (a Unix-like system).


Commodore is one long history of management failures... As a big fan at the time, without insight into how dysfunctional it was, it was thoroughly depressing both to see from the outside, and then reading up on the internal chaos years later. It's a miracle they survived as long as they did, and at the same time there were so many fascinating lost opportunities.

ChuckMcM was at Sun at the time, and mentioned a while back he tried to get Sun to buy Commodore outright:

https://news.ycombinator.com/item?id=39585430 (his later replies in that sub-thread are also worth reading)


> They also missed an (earlier) boat with the Commodore 900 workstation

They didn't really miss the boat on that. The C900 wasn't really an attractive machine. They would have sold it at a pretty high price, and at the same time you could buy a PC with Unix on it that was arguably better. It would have just been another market where they got clobbered by the PC. Not like Zilog was the right horse to bet on anyway.


Many Amiga engineers were UNIX heads, and even AmigaDOS/Workbench started as what might have been another take on UNIX-like systems with multimedia hardware, when they pivoted away from a games machine.

There are some vintage computer club talks where they dive into this.


Commodore could have used Coherent as the Amiga OS.


On the other hand, thankfully they didn't take that approach, as otherwise the Amiga wouldn't have been the multimedia machine it turned out to be, or had OS design decisions like its use of Libraries, Datatypes and everything being scriptable, which have yet to become mainstream.

With the exception of a few UNIX systems like Irix, Solaris with NeWS, NeXTSTEP and Apollo, everything else tends to be the same deck of cards reshuffled.


Your analysis, except for your first sentence, is good.

But it is not true that SGI failed to understand there was a point where desktop PCs would be good enough to replace dedicated workstations. They had built a $3500 PC graphics card (IrisVision) early on, did the Nintendo deal before PC 3D really became a thing, partnered with Compaq in 1991 on an investment and a $50-million joint effort to develop a workstation (https://www.encyclopedia.com/books/politics-and-business-mag...), and were themselves taking advantage of the ability to squeeze more and more 3D into a single chip; the tech trends were obvious.

SGI was a client of a tech research firm I joined at the time, and it was heartbreaking to see them lose and very hard to figure out what they could/should do. It wasn't my explicit role to solve their problem, but I spent a lot of time thinking about it.

You do capture some of the dynamics well but you don't capture the heart of it. The heart of it is point #1. The rest below are also inhibitors.

1) SGI had revenues of $2 billion/year and the PC 3D market's revenue was, say, $50 million/year. (OK, maybe a bit more than that; Matrox, the 2D king, had what, $125 million in revenue?) How do you trade the former for the latter? And on top of that, trade high margins for low margins? When 95% of your stakeholders don't care about the new (PC games) market?

2) Engineering pride/sophistication/control. The company started out focused on graphics but, being in Silicon Valley, had grown/acquired huge engineering strengths in other areas besides graphics: CPU design (MIPS), NUMA cache-coherent SMP hardware plus SMP UNIX design, etc., and that's before you get to the Cray acquisition and its bits. They were the "Apple" of graphics workstation vendors, but there was no "iPhone"-style vertically integrated play for them downmarket (except maybe consoles, and they half-tried that with the N64; even Nvidia, which took that route, has minimized it due to its low margins and low opportunity for upside). It was hard, technically, to give up or deprioritize all those levels of engineering sophistication in favor of competing on graphics performance and price-performance when the PCI (later AGP) bus was fixed, the software API layer was fixed, the CPU layer was fixed, and you had to compete with 80+ fledgling companies to win on both performance and price/performance in low-margin PC 3D games graphics, which is just a much lower value-added play for engineering.

3) Compensation. Employees who knew how to make a 3D graphics chip had a bigger upside outside the company than inside it. They left. The 3dfx founders left. The ArtX guys left. Later, others left for Nvidia.

4) Slow/different engineering cycle times/culture. SGI's cycle times for new products were 3-4 years; sure, they'd do a speed bump at the halfway point. Some volume 2D chip companies would have tweaks to their designs coming out every 2 weeks. PC graphics vendors needed a new product every 6-12 months. Nvidia's most critical life-saving measure in their product history was to cut their cycle time radically by using simulation, because there was no other option, and it left them with two-thirds of their target blend modes not working. https://www.acquired.fm/episodes/nvidia-the-gpu-company-1993...

5) Pride around owning/controlling the 3D software layer. Having developed OpenGL, they didn't/wouldn't figure out how to let Microsoft control/change/partner with it for the gaming market at the API level. Yes, they licensed OpenGL, and eventually they caved and cross-licensed patents, but Microsoft was never going to let them own/control the API, nor was Microsoft ever going to fully support an open API. And there was no love lost between Silicon Valley engineers and Microsoft in the late 90s. Hard to partner.

6) Executive leadership (I don't know any of this firsthand). The founder who cared about graphics, Jim Clark, by some accounts saw the workstation/PC threat a ways away. When the critical timeframe came to deal with it, though (92-94), he also saw the web and how big that wave was; PC 3D graphics was a small wave in comparison, so he switched focus and left. Left behind was the CEO, an ex-HP executive who had for a decade grown SGI from $5m revenue to billions by remaking SGI in the image of HP rather than focusing on graphics, graphics, graphics, and who was not equipped to bet the company on rebirthing/regrowing it out of a nascent PC 3D games market.

As an industry analyst serving SGI as a client (and its competitors), and seeing the forces facing them in the mid/late 90s, what was known at the time was:

- 3D games were going to fuel consumer usage, on both consoles and PCs, thanks to one-chip texture mapping (plus, over time, geometry processing)
- Wintel controlled the PC space but did allow third-party graphics card manufacturers and was fine with that
- Linux was good enough, but for a transition period was not nearly as good as conventional UNIX
- There was huge room for improvement in 3D graphics on the PC at first, but at some point you would get good-enough bang for your buck and then the market for your graphics processors would stagnate. Screen resolutions grow more slowly than Moore's law, and once you can do enough shading for every pixel on the screen 4x over, and enough polygon matrix operations for one polygon on every pixel on the screen, how much more compute do you really need? (Rough numbers on that below.)
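
To make that last point concrete, here is a minimal back-of-envelope sketch (in Python); the resolution, refresh rate and overdraw figures are illustrative assumptions, not numbers from the period:

    # Rough fill-rate saturation estimate; all numbers are illustrative assumptions
    width, height = 1024, 768   # a common desktop resolution of the era (assumed)
    refresh_hz = 75             # target frames per second (assumed)
    overdraw = 4                # shade every on-screen pixel "4x over"

    shaded_pixels_per_sec = width * height * refresh_hz * overdraw
    print(f"~{shaded_pixels_per_sec / 1e6:.0f} Mpixels/s of fill rate")  # ~236 Mpixels/s

Once a single chip comfortably exceeds that sort of figure, extra fill rate buys little unless resolutions, overdraw or per-pixel work also grow, and those grew far more slowly than Moore's law.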

But in 92-95 it was hard to advocate for SGI downsizing/betting-the-company based on this alone.

In mid-1993, Windows NT 3.1 marked a "good enough" Windows to compete with UNIX (process isolation), and Windows NT 3.5 in 1994 solidified that. In Nov 1995, with the Pentium Pro's SPECint figures coming out, it was clear RISC was going to die - it was just cognitively hard to recognize.

SGI clearly saw the problem and tried to diversify out of it with Alias/Wavefront (1995) and by going upmarket (buying Cray, whom they were cannibalizing), but neither of those "saved it".

I remember at some point charting out with a colleague how the industry consolidation might happen, and thinking that they really should join up with Apple (both really rode the 'halo' effect of high-end sexy products to sell bread-and-butter products effectively, and both were very vertical-integration oriented); I don't know if that was ever considered. Apple was 100x weaker then, and I don't recall if the actual market-cap economics would have worked.

What wasn't fully recognized (at least by me or the others I read voraciously at the time), but is clearer in hindsight, was this: while SMP parallelism was part of the wave of the future (and required work on both the CPU and OS layers of the stack), and GPUs contained a mixture of ASIC-based parallelism for both shading and vertex operations, one could construct a general-purpose coprocessing engine that would have real market uses beyond graphics (first in HPC, then in AI) and a longer-term value proposition that would outlast a gamer-centric niche market and become a general compute engine.

The term GPU, which Nvidia used early on in marketing, now implies and delivers on that vision, but it didn't imply it (to my eyes) at the time; SGI and others had used the term GPU informally even before Nvidia's marketing did, to refer to chips with geometry/matrix computations on them, and Nvidia was entering that game (and of course talking up their book). But the true vision of a general-purpose parallel compute coprocessor, and an API to go with it, was really fleshed out in Ian Buck's Stanford PhD work a decade later, in 2004; he then graduated, took it to Nvidia, and it became CUDA. https://graphics.stanford.edu/papers/brookgpu/brookgpu.pdf At least as far as I can tell.

It was always possible as a graphics chip company in the 1990s to see the next 1-2 generations of possible technical improvements, but it was very hard to see a multi-billion dollar business. Tens of millions absolutely. Hundreds of millions, sure. But billions? And this goes back to my point #1.

There was a very real possibility they would enter the 3D PC gaming fray, not be able to compete, and lose - and then all that Silicon Valley graphics marketing halo they benefited from so much would have faded. But it faded anyway.

I suppose they could have cross-licensed their texture mapping patents to any startup in exchange for 5-10% equity, and then tried to buy them out as they grew. They fought with Nvidia over that issue later. But they would have hemorrhaged engineers with that approach, and I'm less sure it would have worked.

In my view, they should have just bitten the bullet, rolled the dice, and played the PC 3D graphics gaming game, staying true to their name: "Silicon" "Graphics". Not "Silicon Software" (Alias/Wavefront) or even "Silicon Parallelism" (Cray). It can be true that if you stay in a niche (3D graphics for the masses), you end up in a ditch. But they lacked the courage of their potential and went the wrong way. In hindsight, someone probably should have kicked out McCracken in 1992, not 1997, and they could have gotten a more visionary leader less tied to the HP executive way of looking at problems/opportunities. But I don't know how they could have transitioned to a leader with better vision, or where they could have found one.

I'd be interested if there was ever an HBS case study on this based on internal SGI documents, or if others have pointers to internal SGI discussions of this dilemma. It still bothers me 30 years later, as you can see by the length of this post. Too bad I posted this on HN a day late.


Your analysis is amazing. Thank you.


Thank you! It encourages me that someone got to see and appreciate it.

What is hard to convey to people outside the 3D hardware space is that the chief problem, once you have the 3D pipeline down, is really a market-development problem.

How do you sell an ever increasing amount of coprocessor compute?

Because the moment you hit the inevitable S-curve flattening of your 3D value proposition, your coprocessor gets integrated into the (Intel) CPU die, just like the Intel 387SX/DX floating-point unit or earlier generations of I/O coprocessors. Hence the frantic strategic push into raytracing, HPC, AI, etc.

In hindsight it looks visionary, but the paranoia is at least as much a driving factor.

It's now, for now, incredibly lucrative, and the mastery of hardware parallelism may last as a moat for Nvidia, but I can sympathize with SGI not wanting to squeeze itself back into a coprocessor-only business model. It's a scary prospect. We can see only with hindsight that it could have worked. Both business and technical leadership had had such huge success diversifying their graphics hardware expertise into full-system skills that they couldn't refocus. They would have had to die in order to live again. Or, perhaps slightly less exaggerated: to survive they would have had to forsake/kill off a huge part of what they had become.



