Hacker News

In the late 90s I was in my last year of high school. Silicon Graphics came to do a demo of their hardware for students who were interested in taking a computer science course at university the following year.

The graphics demos looked like trash, basically just untextured, badly shaded, plain-colored objects rotating on the screen. For reference, I was playing Quake III at the time, which had detailed textures and dynamic lighting.

I asked the SGI presenter what one of his Indigo workstations cost. He said $40,000, not including the graphics card! That’s extra.

I laughed in his face and walked out.




In the late 90s, SGI demos were much more impressive than what you describe. The hardware was used by technical folks to do real work, with stringent requirements.

More importantly, the things that made Quake III so great were state-of-the-art for gaming. But those things couldn't render lines quickly and well (a mainstay of CAD at the time), or render at very high resolution (which IIRC was 1280x1024 in that era).

Here's what Carmack said about the SGIs a few years before: "SGI Infinite Reality: ($100,000+) Fill rate from hell. Polygons from hell. If you don't trip up on state changes, nothing will come within shouting distance of this system. You would expect that." SGI was also key for map builds before PCs were capable.

But yes, 1999-2000 was just around the cusp of when SGI went from "amazing" to "meh".


If I remember correctly, their cards had a high compute rate even at double precision, but had tiny memory buffers and basically couldn’t do texturing at all in practice.

It turned out that double precision was a mistake that was sold as a “professional” feature. By sharing edges correctly and using the correct rounding modes, single precision provides pixel-perfect rendering. Efficiencies like this allowed the consumer GPUs to run circles around SGI hardware.
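The shared-edge point is worth spelling out. Here is a minimal sketch (my own illustration, not SGI's or any GPU vendor's actual pipeline) of the standard trick consumer rasterizers use: evaluate integer edge functions at fixed-point pixel centers, with a consistent tie-break rule for samples that land exactly on an edge. Two triangles sharing an edge then cover every pixel exactly once, with no gaps or double-drawn pixels, and no double precision anywhere. All names and coordinates below are made up for the demo.

```python
# Watertight rasterization via exact integer edge functions plus a
# consistent tie-break rule (a "top-left"-style fill convention).

def edge(ax, ay, bx, by, px, py):
    """Signed area test: > 0 if point p lies to the left of edge a->b
    (i.e. inside, for counter-clockwise triangles). Exact in integers."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def owns_edge(ax, ay, bx, by):
    """Tie-break rule: an edge claims samples lying exactly on it iff it
    points upward, or is horizontal and points left. Any consistent rule
    that picks exactly one side of a shared edge works."""
    dy, dx = by - ay, bx - ax
    return dy > 0 or (dy == 0 and dx < 0)

def covers(tri, px, py):
    """True if sample (px, py) belongs to counter-clockwise triangle tri."""
    for i in range(3):
        ax, ay = tri[i]
        bx, by = tri[(i + 1) % 3]
        e = edge(ax, ay, bx, by, px, py)
        if e < 0 or (e == 0 and not owns_edge(ax, ay, bx, by)):
            return False
    return True

# An 8x8-pixel quad split along its diagonal into two triangles.
# Coordinates are pre-scaled by 2 so pixel centers land on odd integers
# and every edge test stays exact.
t1 = [(0, 0), (16, 0), (16, 16)]
t2 = [(0, 0), (16, 16), (0, 16)]

hits = {}
for y in range(8):
    for x in range(8):
        cx, cy = 2 * x + 1, 2 * y + 1   # pixel-center sample point
        hits[(x, y)] = covers(t1, cx, cy) + covers(t2, cx, cy)

# Every pixel is claimed by exactly one triangle, including the pixels
# whose centers fall exactly on the shared diagonal edge.
assert all(h == 1 for h in hits.values())
```

The key is that both triangles evaluate the shared edge with the same exact arithmetic (the two results are exact negations of each other), so the tie-break assigns each boundary sample to exactly one side. Precision only has to be enough for exact integer products at the snapped coordinate grid, which single precision or modest fixed point easily provides.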


They did texturing just fine.


Most SGI graphics cards had no texture RAM at all, and had zero texturing capability. At the time there was one card that had 4 MB of texture RAM, but in the same year a typical PC GPU had 16 or 32 MB of shared memory, most of which was used for textures.

A “budget” SGI card cost more than my parents’ car. I had bought three different GPUs by that point with my pocket money.


I honestly don't know what you're talking about - I wrote many programs that did high-performance texture mapping on SGIs, and they had both texturing capability and RAM. When you say "SGI card", it sounds like you're talking about something other than an Octane2 or InfiniteReality.


The Indy/Indigo2 graphics - the Elan/Extreme/XZ - never had texture hardware, but would fall back to software. It wasn't until the High IMPACT series was released for the Indigo2 in 1995 that they had hardware texture mapping on their "low end" - i.e. desktop-sized systems.


1995 is when I started using SGIs in earnest (on Indigo2 and RealityEngine). I do recall a lab next to ours that had $15K Indys that were absolutely useless for doing anything with graphics.


I always wanted an Indy. Preferably the WebForce one with all the extra software.


You’re very confused. In 1992 you could get an SGI system equipped with 16MB of texture RAM alone, long before 3D accelerators for PCs. Later iterations such as InfiniteReality of course had much more texture memory. The SGI VPro could be configured with 108MB of texture memory.


The curve that maps fucking around to finding out is not linear. By the time you start finding out, it's very hard to stop finding out much more than you would like to.



