
> Similarly, it feels like Silicon Graphics is a case where they really should have become more standard. Now, unlike Amiga, they were too expensive to catch on with regular consumers, but I feel like they should have become and stayed the "standard" for workstation computers.

I think you highlighted exactly why SGI lost, though. It turned out there were cheaper options which, while not on par with SGI workstations initially, improved at a faster rate than SGI and eventually ended up with a much better cost/functionality profile. I feel like SGI just bet wrong. The article talks about how they acquired Cray, which originally made these awesome supercomputers. But it turned out supercomputers essentially got replaced by giant networks of much lower-cost PCs.




Hypothesis:

What smaller businesses are using will tend to be what takes over in the future, just due to natural processes. When smaller businesses grow, they generally prefer to fund the concurrent growth of the vendors they already like using rather than switch to the established "industrial-grade" vendor.

At the same time, larger organizations that can afford to start with the industrial-grade vendors are only as loyal as they are locked in.


I see the same trend in programming languages. Say a really solid career lasts from about 20 to 60, 40 years long. Say that halfway through your career, 20 years in, you're considered a respectable senior dev who gets to influence what languages companies hire for and build on.

So 20 years from now, the current batch of senior devs will be retiring, and today's noobies will have become senior devs.

*Whatever language is easy to learn today will be a big deal in 20 years*

That's how PHP, Python, and JavaScript won. Since JavaScript has had so much money poured into it to make it fast, secure, and easy, with a big ecosystem, I'd say JS (or at least TS) will still be a big deal in 20 years.

The designers of the latest batch of languages know this, and that's why there are no big minimal languages. Rust ships with a good package manager, unit tester, linter, self-updater, etc., because a language with friction for noobies will simply die off.
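
Purely as an illustration of that batteries-included point, here's a minimal sketch (the crate and function names below are made up, not from anything above):

    // src/lib.rs of a throwaway crate created with `cargo new demo --lib`.
    // No third-party test framework is needed; the unit tester ships with
    // the standard toolchain.

    /// Deliberately trivial function; the point is the built-in tooling around it.
    pub fn add(a: i32, b: i32) -> i32 {
        a + b
    }

    #[cfg(test)]
    mod tests {
        use super::add;

        // `cargo test` finds and runs this with zero extra setup.
        #[test]
        fn adds_small_numbers() {
            assert_eq!(add(2, 3), 5);
        }
    }

And `cargo clippy` (the linter), `cargo fmt`, and `rustup update` (the self-updater) come with the default install as well, which is exactly the kind of zero-friction on-ramp for noobies being described.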

One might ask how we got stuck with the languages of script kiddies and custom animated mouse cursors for websites. There's no other way it could have turned out; that's just how people learn languages.


Back in the old days there was a glut of crappy bloated slow software written in BASIC. JS is the BASIC of the 21st century: you can write good software in it, but the low bar to entry means sifting through a lot of dross too.

My take: that’s just fine. Tightly crafted code is not a lost art, and is in fact getting easier to write these days. You’re just not forced into scrabbling for every last byte and CPU cycle anymore just to get acceptable results.


I mean, there are companies that only sell to very large corporations and have had plenty of success doing so. Computational fluid dynamics software, for example, has a fairly small number of potential clients, and I don't think I could afford a license to ANSYS even if I wanted one [1], since it goes into the tens of thousands of dollars. I don't think there are a ton of startups using it.

But I think you're broadly right.

[1] Yes I know about OpenFOAM, I know I could use that if I really wanted.


I still dream of having a Beowulf Cluster of Crays.

One day ...



This betting wrong on specialization happened over and over again in the late 70s and 80s. The wave of improvements and price reductions in commodity PC hardware was insane, especially from the late 80s onwards. From Lisp machines to specialized graphics/CAD workstations to "home computer" microcomputer systems, they were all buried because they mistakenly bet against Moore's law and economies of scale.

In 91 I was a dedicated Atari ST user convinced of the superiority of the 68k architecture, running a UUCP node off my hacked up ST. By the end of 92 I had a grey-box 486 running early releases of Linux and that was that. I used to fantasize over the photos and screenshots of workstations in the pages of UnixWorld and similar magazines... But then I could just dress my cheap 486 up to act like one and it was great.


The Atari ST and the Intel PC are not distant categories. Both are "home computer" microcomputer systems. Not all home computer systems can win, just like not all browsers can win, not all spreadsheets can win, not all ways of hooking up keyboards and mice to computers can win, ...


They were distant in market tier, but most importantly in economies of scale. The Intel PC market grew exponentially.


Sure, but the economy of scale came from the success. The first IBM PC was a prototype wire-wrapped by hand on a large perf board.

When you switched to Intel in 1992, PCs had already existed since 1981. PCs didn't wipe out most other home computers overnight.


Yeah, I'm more annoyed about Amiga than SGI. They were priced competitively with Apple and IBM offerings.

I guess it's just kind of impossible to predict the future. I don't think it's an incompetent decision to try to focus entirely on the workstation world; there are lots of businesses that make no attempt to market to consumers and only market to large companies/organizations, since the way budgeting works at big companies is categorically different from consumer budgets.

But you're absolutely right. Apple and Windows computers just kept getting better and better, faster and faster, and cheaper and cheaper, as did 3D modeling and video editing software for them. I mean, hell, as a 12 year old kid in 2003, I had both Lightwave 3D (student license) and Screenblast Movie Studio (now Vegas) running on my cheap, low-spec desktop computer, and it was running fast enough to be useful (at least for standard definition).


Of course, the reason they got better so fast is volume. There was just way more investment into those platforms. Which means this explanation is somewhat circular: they were successful because they were successful.

I think a more useful explanation is that people rate the value of avoiding vendor lock-in extraordinarily high, to the extent that people will happily pick worse technology if there are at least two competing vendors to choose from. The IBM PCs were not good, but for convoluted legal reasons related to screwups by IBM, their tech became a competitive ecosystem. Bad for IBM, good for everyone else. Their competitors did not make that "mistake" and so became less preferred.

Microsoft won for a while despite being a single vendor, because the alternative was UNIX, which was at least sorta multi-vendor at the OS level, except that portability between UNIXen was ropey at best in the 90s, and of course you traded software lock-in for hardware lock-in; not really an improvement. Combine that with the much more expensive hardware, the lack of gaming, the terrible UI toolkits (an area where Microsoft was the undisputed master in the 90s), and then later Linux, and that was goodbye to them.

Of course, after a decade of the Windows monopoly everyone was looking for a way out and settled on abusing an interactive document format, as it was the nearest thing lying around that was a non-Microsoft-specific way to display a UI. And browsers were also a competitive ecosystem, so a double win. HTML-based UIs totally sucked for end users, but... multi-vendor is worth more than a nice UI, so it wins.

See also how Android wiped out every other mobile OS except iOS (nobody cares much about lock-in for mobile apps; their value is just not high enough).



