Steam: Half-Life 2 Hardware Survey (2004) (archive.org)
159 points by nazgulsenpai on April 6, 2022 | 183 comments



Fun fact: for years, upgrading GPUs led to a decrease rather than an increase in framerate.

Basically, with a solid GeForce 4 series card you could already hit the 100 fps cap in Half-Life, but upgrade to a much more powerful 9600 Pro and the framerate would actually drop. The Quake 1-derived engine was so old that better GPUs made it run worse.

Also, another fun fact about HL is that it scaled massively with RAM frequency. You would get a bigger boost in framerate by overclocking your RAM than your CPU or GPU.

It was only in the late 2000s, with the much more powerful GeForce 6 series, that a solid 100 fps in Counter-Strike 1.6 was a given.

I miss that era of the internet, when socializing on IRC and public servers was the norm and players, rather than companies, controlled the servers. Today we have replaced all of that infrastructure with ease of use and queue buttons, and playing online rarely leads to any meaningful interaction beyond short-lived encounters.


Reading through your comment gave me a blast from the past: I remembered browsing "The All-Seeing Eye" [1] to look for Quake 3 servers and then sharing them with friends via IRC.

Looking back at those times, I remember the Internet as a much more social place (but that might just be me). I think it's because people had to actively seek out services for social interaction, as opposed to getting hit in the face by notifications.

[1] https://en.wikipedia.org/wiki/The_All-Seeing_Eye


ASE was great, I think it was possible to filter servers by country, and I think I used to find new servers and players from my country to play with.


>Fun fact: for years, upgrading GPUs led to a decrease rather than an increase in framerate.

Sounds more like a specific technical issue with a specific engine and drivers that lacked optimization, rather than a general rule about games and GPUs holding for several years. If you have a source for this, please share it.

Also, IIRC, back then the game physics were tied to the framerate, so too high a framerate made your gameplay completely wonky as it messed up character jumps and weapon ballistics, so I assume those framerate limits with more powerful GPUs could have been there on purpose to keep the game playable.


In the modern era we still have problems of this sort - higher core counts can cause games to perform worse due to problems like a game suddenly having 96+ threadpool threads competing for small amounts of work, often belonging to the driver. It's still the case that a high core count Ryzen will have worse performance in some games (or problems like microstuttering) that you won't experience on a slower, lower core count Intel chip.


That's because game devs optimize for the lowest common denominator hardware gamers have at home, which until very recently was mostly quad-core chips with 8 threads.

A 96+ thread pool is way outside the norm for the average gamer or consumer, so you shouldn't be surprised that game devs ignore the 0.01% of gamers running server/workstation CPUs with that insane thread count.


In that case you might want to mask off some of the cores/hyperthreads, or even do core pinning. That's very common in applications where latency is important.
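For what it's worth, here's a minimal sketch of that kind of affinity masking on Linux via sched_setaffinity (the choice of the first 8 logical CPUs is purely illustrative; from a shell the rough equivalent would be something like `taskset -c 0-7 ./game`):

    /* Minimal sketch: restrict the current process (and the threads it
       spawns afterwards) to the first 8 logical CPUs. */
    #define _GNU_SOURCE
    #include <sched.h>
    #include <stdio.h>

    int main(void) {
        cpu_set_t mask;
        CPU_ZERO(&mask);
        for (int cpu = 0; cpu < 8; cpu++)
            CPU_SET(cpu, &mask);      /* allow CPUs 0-7 only */

        /* pid 0 means "the calling process" */
        if (sched_setaffinity(0, sizeof(mask), &mask) != 0) {
            perror("sched_setaffinity");
            return 1;
        }
        printf("pinned to CPUs 0-7\n");
        return 0;
    }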


Masking off threads and using pinning does help, but you still end up with 96 threadpool threads (they're just competing for fewer cores now). The real solution would be a way to actually tell an app or game that you have a set number of cores (like 8). Sadly, AFAIK neither Windows nor Linux can do this, so the only option is a VM.


specifically, it sounds to me like ATI/AMD's rather-notoriously-poor OpenGL performance in action... another way to rephrase OP to take the emphasis off hardware generation (which makes no sense) would be "when I switched from NVIDIA to ATI I lost 40% of my performance"... which is still an experience many people have today with OpenGL on the AMD cards.


I could be wrong (I never really got into gaming as GMA950+Linux was an exceptionally bad combo for it) but I thought source was exclusively DirectX back then.


HL2 had both OpenGL and DirectX on paper, but the DirectX implementation was notoriously poor such that everyone used and recommended the OpenGL renderer.

https://arstechnica.com/civis/viewtopic.php?f=22&t=1007962

https://arstechnica.com/civis/viewtopic.php?f=6&t=931690

https://arstechnica.com/civis/viewtopic.php?f=22&t=987307

Ironically now with Valve's push for Linux gaming, I'd expect that Linux is probably a great platform for HL2, probably even better than Windows. Intel GMA drivers might still be bad though, not sure if that is too far back for the open-source driver that showed up in the Core era.


I believe Source had both OpenGL and DirectX renderers, alongside a software renderer.

Edit: I'm thinking of the GoldSrc engine used in Half-Life


> players controlled the servers rather than companies

There are still some games like this! Admittedly fewer and fewer, but I've been enjoying Squad for example — a tactical FPS that encourages teamwork. Lots of great public servers with good communities.


I could never really get into Squad; enemy visibility is just too poor, so you spend 20 minutes lugging yourself across the map only to get shot by some invisible guy. Like, literally, there was not a single pixel on the screen representing him.


LOVE Squad! I could not imagine that game with OWI hosted servers. Especially with the mod scene that game has.


I spend _way_ too much time playing Squad.


On that note, I still regularly get extremely bad FPS on Quake-based games today with my RX 580. Only renaming the executable seems to solve the issue.

https://techreport.com/review/3089/how-atis-drivers-optimize...


It's amazing how many workarounds never get removed from big codebases. For example, look at Chrome/Chromium's about:gpu page. Some driver workarounds are still applied for bug reports that haven't been looked at for a decade.


That sounds.. interesting? Any examples/read on that?


Your last few sentences really hit. This is what happened to World of Warcraft. I've been playing the re-release of Classic WoW, Season of Mastery, and I find the experience much more compelling than any present-day massively multiplayer online experience.


Yes, WoW is the prime example of what I call "arcadeization", and of why companies sometimes should not listen to their vocal playerbase.

WoW went from a game where player interaction was the norm (both negative and positive): going to a dungeon was a massive adventure just in getting there, and you would have meaningful interactions with other players, coordinating to reach a place, giving directions, traversing what felt like a living world with living players. The opposite was also true: you would dislike some people for how they acted or played or ganked, but all of that had human interaction at its heart.

But then going to dungeons became a chore for acquiring tokens, and it all stopped being fun. We got automatic queues, and people barely speak to each other anymore, and if they do, it's to insult a player for a bad pull or for not waiting for the healer.

It's sad but I believe that modern online gaming exchanged meaningful and durable human interactions for ease of use and quick consumption.

I remember talking to thousands of people and knowing pretty much everyone across a hundred Counter-Strike clans in Italy 20 years ago.

This image [1] will be 20 years old in a few months; it's the SMAU Italian LAN Party 2002. 1200 PC gamers brought their own computers and set up a giant LAN, to meet and interact and fight each other and form friendships in the same place! 1200! 20 years ago!

Today it's a miracle if you can get a few dozen people to play in the same place; there's barely any human interaction among gamers, either in person or online.

It's sad, maybe I'm just old and it's nostalgia, but I liked it more as it was before.

[1] https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcRm_T4I...


And yet at the time, we thought those systems were dumb. 'Why do I have to run around looking for a group? Can't you just match me with one?'

I think more generally, what we want at the time and what we remember years later are often different and sometimes even opposed.


I miss the prevalence of IRC. I understand why platforms like Discord are popular today, but I’ve always felt that it missed the mark…


> Today we have replaced all of that infrastructure with ease of use and queue buttons, and playing online rarely leads to any meaningful interaction beyond short-lived encounters.

Yup. I slowly stopped playing TF2 once the Valve server and queue crap kicked in. I liked community servers and was a paying No Heroes member. I liked popping in and seeing familiar faces and saying hello. I liked community maps (less common as time went on). I loved community quirks, e.g. servers using their own sound packs, creating in-jokes, and seeing people outside of the community. It was fun until the community was instantly paved over by shiny hip new instant play and queues.


These were the years of the GeForce 4 and the brand new amd64 processor. The DirectX version supported by the GeForce 4 couldn't even display the real pretty water textures at that time, or at least mine couldn't on my GeForce 4 + amd64 laptop. Little guy was a beast back then. Compaq, I think?

That game was so good and around that time, people were figuring out how to get HL2 to be multiplayer in story mode, and gmod was in its infancy right around then.

Good times.


I recall the Radeon 9800 Pro was being sold for about US$300 and came with Half-Life 2, which was designed for it. Doom 3 was out at the same time and was optimised for NVidia cards. It was a sort of "battle of the graphics platforms", both hardware-wise and software-wise.

And now two short decades later we're paying 10x the amount for top-tier cards, and HL3 still isn't out.


Don't forget that Valve later rendered HDR on ATI 9600 series cards with no perceptible performance penalty, and showed that elbow grease and ingenuity can remove theoretical hardware limitations.

Also, I remember that quote (from Gabe himself, I think): "We may have polished graphics further, but that game won't be Half-Life anymore.", a direct stab at the linearity of the Doom of that time.


> elbow grease and ingenuity can remove theoretical hardware limitations

Of course, 'theoretical' hardware limitations aren't hard limits -- they're soft[ware] ones. However, if you find yourself battling concrete hardware limitations, what you need is to put your back into it and a little know-how.


The thing is, HDR was considered too expensive and impossible on those cards without specific hardware capabilities and the relevant API support.

Valve said "Ha, let's see!" and implemented great HDR visuals with integers only. So it's not just let's hack together and see how it goes. IIRC, it took three re-implementations to get it right and performant.

Similarly, I'm no stranger to hardware and its limits, but something being deemed impossible by many, even knowledgeable people, doesn't make it a cold, hard truth (e.g. the demoscene and PoC||GTFO).


My apologies, I read too deeply into your comment. I just enjoyed (too much) the use of the phrase 'theoretical hardware limitations'.


No worries, enjoy your stay. :)


> came with Half-Life 2

I think mine came with Vietcong, which was... underwhelming.


Is that the game that had this in its soundtrack? https://youtu.be/fuYL-keoGvU

I still listen to it on occasion. Don’t even remember how I found that song because I’ve never played that game, I don’t think.


Yes, the soundtrack and setting were objectively the best part of that game.


Nope, I got a 9600XT, and it came with a serial number for HL2. I added a Zalman cooler on top of that card, and it was both silent and stable.


I got the 9600XT for Half-Life 2, which was kinda silly in the end since HL2 took so long to come out that I could have gotten a newer card. However, I believe it also came with a copy of the Counter-Strike: Source beta, which I enjoyed greatly. My card had tiny LEDs on it that prevented me from sleeping with my computer on. Great times.


I guess it depends on who made the card? Probably Sapphire and MSI and others had different bundled stuff (maybe also different by country, and I'm in EU)


Just shows how big the markup was, to be able to bundle a $60 game with a $300 card.


The Radeon cards came with coupons for HL2 a while before the game was released. My brother bought one from someone in school, there was quite a bit of trading going on :)


I still have mine - it came with my 9800 XT. The last AGP card I ever owned.


The GeForce 4 MX (first in the list) was a rebranded GeForce 2, so just T&L and no shaders. The "real" GeForce 4 (second in the list) supported shaders just fine, as did the GeForce 3, so it got the pretty water.


Kid me got burned so hard by this. Upgraded my GeForce 2 MX to a GeForce 4 MX. It was slightly better and had I think 4 times the VRAM but I learned to pay a lot more attention after that.


And yet, my box with a 50 EUR GeForce4 MX440 with 64 MB of VRAM ran Half-Life 2 on launch.

AMD Duron 1800+ and 256 MB of 333 MHz DDR, and the damn thing ran HL2!

A machine with a 50 EUR CPU and a 50 EUR GPU ran it! They must have optimized it well.

And that machine is included in that 2004 Steam hardware survey. :)


There was a very different attitude to budget offerings from Nvidia back then, and despite a lot of complaints about them I'm not sure the new way is better.

Nvidia seemed to like to take their old hardware design, crank it up a bit, slap a new version number on it and sell it as a budget offering. No new features, but it did a respectable job of playing older games, which is what I think most people want out of a cheapo card. Later offerings usually featured super cut-down cards that had more of the new features but not really enough power to actually turn them on and get playable framerates. And that is how things have sort of gone since then.

Like you said, the GF4 MX440 was actually a decent-performing card. Sure, you were stuck with the DX7 rendering path, but it did a good job with it.


AMD Athlon XP, 256 MB DDR2, GF2 MX 200/400. HL2 ran.

Crazy. It was crazy.


I had a Geforce 2MX 200/400 back in the day. It was... interesting.


>The DirectX version supported by the GeForce 4 couldn't even display the real pretty water textures at that time, or at least mine couldn't on my GeForce 4 + amd64 laptop.

If I remember correctly, you needed Shader Model 3.0 support to get those fancy graphical effects, which required the GeForce FX series & 6 series (6600/6800 etc).


So today you would do it all in a programmable shader pipeline: create and update vertices according to some displacement from a 2D water simulation or just noise, and color them appropriately in another pass with the correct texture, etc. As far as I know, with my limited understanding from writing a few shaders.

But what did you do back then? I was pretty young, but I'm not even sure you had such a clean programmable separation in the render pipeline? :-)


For the geometry you'd do basically the same thing but on the CPU and just update a vertex buffer every frame.

You'd probably do some cute register combiner trick for the texturing. You can kinda think of register combiner stages as pixel shader instructions. And in hardware of the time they're the same thing. On GeForce 3, loading a pixel shader is just blitting the register combiner hardware config registers all at once. GeForce 2 could have had a shader model version 0.5 or something with only four instructions if they wanted to expose it in the API like that.
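To make the "on the CPU, update a vertex buffer every frame" part concrete, here's a toy sketch in C (all names are made up for illustration; a couple of sine waves stand in for a real water simulation):

    #include <math.h>

    #define GRID 64

    typedef struct { float x, y, z; } Vec3;

    static Vec3 water[GRID * GRID];

    /* Fill a GRID x GRID patch of water vertices for this frame. */
    void update_water(float time) {
        for (int j = 0; j < GRID; j++) {
            for (int i = 0; i < GRID; i++) {
                Vec3 *v = &water[j * GRID + i];
                v->x = (float)i;
                v->z = (float)j;
                /* cheap "simulation": two sine waves instead of real physics */
                v->y = 0.2f * sinf(0.5f * (float)i + time)
                     + 0.1f * sinf(0.8f * (float)j + 1.3f * time);
            }
        }
        /* ...then copy `water` into the vertex buffer the card draws from */
    }

The texturing pass on top of that would then be whatever fixed-function or register combiner setup the hardware of the day offered, as described above.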


Was it an MX series? They sucked.


I have nostalgia for the games of that era far more than the hardware. While the hardware has come leaps and bounds, the gameplay has gone backwards. Take Battlefield as an example; back then it was a tactical team-based shooter, today it's a gallery shooter. Same thing with the Crysis series; every entry got dumbed down.


The Crysis 1 -> 2 experience felt so disappointing. I loved how in Crysis 1 there were no limits on the weapon attachments or switching suit functions. I remember that in Crysis 1, with the silenced pistols with laser pointers, you would 100% hit where the laser was pointing. I'd go full cheese and sneak up to the enemy, de-cloak, 1 shot dead, re-cloak.

All to push a story which started with an off screen killing of Nomad.


I remember pushing the limits of "fast" gameplay with really fast suit mode switching, and before that modifying the suit values in an .ini file in the game directory to have flash-level speed and infinite energy.

It's a shame that they dumbed it down in the sequels so much for the consoles. And that the game became so much worse _as a game_ after the midpoint (even if my first playthrough had my jaw hit the floor when I entered the alien ship). Punching through plywood shacks for an ambush, jumping high for a grenade, using speed mode to get a flank (which the AI would be unaware of - if they couldn't see you!).

Yeah it was a good time.


One of my fondest memories around Crysis is the whole "but can it run Crysis" unofficial benchmark. I remember everyone looking at the game going "yeah, my machine can't run this". It turned out the game ran pretty well even on my potato machine back then; I just had to keep all the graphics on low. I'm kind of tempted to buy the remastered version and see how much better it looks on my Xbox Series S like a decade later.


Unfortunately, Crysis 1 is seriously single-thread bottlenecked. It was designed in anticipation of the "10 GHz Pentium 4" school of CPUs that never ended up existing, so it runs like trash even today with high-end hardware.

The remaster does not fix this and actually is a step down in visuals in some respects, due to the ever popular "we remastered from a shitty port that never worked right in the first place". Halo also comes to mind, with the Halo PC version having completely broken shaders and much different weapon balance from the Xbox version. In the case of Crysis I believe it was the other way around: the game was remastered from the lower-fidelity Xbox version... so it actually is visually worse in some areas.

https://gamerant.com/crysis-remastered-review/

https://www.youtube.com/watch?v=l_Az4-2o9AI


Yes definitely! Being more of a sandbox game offering different play styles I always gave it a go when I upgraded my PC.


Crysis 1 was definitely more fun. Tranq darts, freeze rays, throwing frying pans with Maximum Strength, "hacking" your .ini file to make yourself absurdly overpowered, etc.

But I have to give Crysis 2 credit for its graphics. Compared to 1, it had a smoother, more cinematic quality. I distinctly remember the opening "cutscene" ending and seamlessly transferring control to the player, revealing that it wasn't a prerendered cutscene at all -- they were doing it live! I just sat there stunned for a minute in disbelief.


The multiplayer of Crysis 1 was unique. I wish "something" would happen to either bring some people back to the old one (there are ways to play it after the official servers shut down, but the community is dead), spawn a new mod, or get the multiplayer added to the official remastered version.


Yeah, absolutely. The dumbing down of the story almost killed me back then. I had so much hope for Crysis 2 but was disappointed that it was just an ordinary shooter with meh enemies. The tech is there but the design is way worse.


Not quite following the dumbed down story part. Crysis2 had Richard Morgan responsible for the broader story beats and it explored some quite interesting topics.


Maybe it's just my personal preference, but I feel the aliens of 1 are a lot cooler than the ones in 2. 1 also leaves things hanging a bit at the end, but I feel 2 concludes it too quickly. Again, just my preference. I don't know the right word for it, but I like when an idea is not presented clearly to the audience; for example, in 1 the aliens are essentially Lovecraftian, and we have a lot of questions left unanswered. I feel 2 answers these questions too clearly.


I'm not really sure which Battlefield you're referring to as a "tactical team-based shooter", but it was never any Rainbow Six (3), SWAT 3, ARMA or Operation Flashpoint, games I'd consider actual tactical games in the 2000s. Battlefield has always been a bit arcade-y, though not as much as Call of Duty.


Back then, to me a "tactical" Battlefield would have been either Battlefield 2 (especially Project Reality) or 2042. Maybe if you only compare it to modern games you could say 1942 was tactical.


Referring to BF1942, I mean tactical in the sense that you had to cooperate on tactics with your team, for example you might spend 10 minutes traversing a hill to flank the enemy. In the modern games, flanking is redundant as you respawn just a few meters from wherever you died.


Oh man, SWAT 4 was SO good. I miss that game.


A team of fans of that game is making a game named "Ready or Not". I haven't played it yet, but it has stupidly high fan reviews... I saw it reach 98% approval on Steam or something like that.


I guess some individual series have gotten worse. Debatably the monetization models have gotten worse too. But I think game design has generally improved. I loved Half-Life, but is the gameplay really better than games like Doom Eternal and Titanfall 2?


If we're talking Half-Life 2, then one could argue that it brings in a whole other dimension compared to the two games you mentioned: physics. Something that is almost absent in DE & T2 and has (AFAIK) no gameplay effect.


Doom Eternal, sure, but the original Doom did not have the sort of forced pace that the newer Doom games do.

Everyone says it's better, but I don't like it, as it forces me to play the game their way, not my way.


Yeah I think a lot of commenters here are seeing popular gameplay diverge from the gameplay they prefer. I like the gunplay of Half-Life, Quake, etc, and still fire up openarena with some friends every once in a while, but I spend most of my time playing games with different gun models. Right now I'm busy with Hell Let Loose for example, which is about as far as you can get from the arena shooter gunplay model. People looking for gunplay more like 2004 gaming might want to look into boomer shooters instead


I totally agree, seems most FPS today do not come close to the story and gunplay of HL2.


My dad bought me HL2 back in Nov 2004. At the time my PC spec was: P4, 512 MB RAM, GF FX5900XT, 80 GB HDD, 56k dial-up, BenQ 1280x1024 LCD, Win XP.

The game was in English and I didn't speak it at all at the time, so I had to download the translation (700 MB of audio files). It took me a couple of days, with constant disconnects, as dial-up wasn't really reliable. Luckily, Steam allowed you to resume the download.

Still using that Steam account.

The game ran well. Although my friend had a Radeon 9600 series card and it performed noticeably better on his PC (or maybe I'm confusing it with a different game - Far Cry).


Ahh I had a lowly fx5200 and dreamed of the day of being able to afford a better card. But that meant skipping 6 months of lunches and by then I’m sure the better card would not be compatible with my motherboard anymore anyway.


What's the difference between "Korean (Teen)" and "Korean (Adult)"? I did not realise Steam had that language distinction. Do they still have it?


The difference for teens is caused by Korean gaming laws; if I'm not wrong, games have to limit play time to 3 hours a day.


Is there perhaps some kind of legal thing where they turn off the red blood for the "teen" version?


That's Germany I think? I seem to remember playing Carmageddon with green blood and that was some kind of German mode?


Korea definitely has non-red blood in plenty of games. PUBG has blue blood in Korea for example.

https://img.gurugamer.com//resize/640x-//2021/03/29/the-red-...

https://www.reddit.com/r/PUBATTLEGROUNDS/comments/7z0aot/how...


Carmageddon 64 in Germany also replaced the zombies (which were already replacements for pedestrians) with dinosaurs. Truly bizarre.


Very cool, what a time machine!

I wonder what the single person with a 9-pixel wide game screen was doing, and how they enjoyed their gaming experience. :)


Also curious what happened with the 127bpp display or the € processor.


Probably a misidentified or misreported uncommon resolution.


The RAM! I've had 16GiB of RAM in my PC for almost 7 years at this point. If you'd asked me how much RAM my PC had back in the HL2 days, I would have said probably 4GiB, but that's unlikely given the stats here. Interestingly, I haven't felt the need to upgrade from 16GiB since, and that seems to be standard on "developer spec" laptops even today.


Until 64 bit CPUs and OSes became mainstream, it was not easy to use more than 4GB on a 32bit system (or even 2 in Windows XP, later raised to 3.1-3.5-ish with the /3GB switch in boot.ini).

You had to use PAE (Physical Address Extensions - https://en.wikipedia.org/wiki/Physical_Address_Extension ), basically a 3-level page table structure, provided the CPU and the OS offered it.

It could cause a few compatibility woes with binaries not expecting it, esp. stuff running in kernel mode, and of course in the Microsoft world it was used as an excuse to charge you more - the "Standard" versions of Windows server x32 were limited to 4GB, forcing you to buy "Advanced/Enterprise/Datacenter/Whatever" SKUs.

So anyway 4GB was the maximum it would make sense to install on Windows for a long time, pretty much until 64 bit CPUs and Windows 7 became common (Deliberately omitting Vista, which so many opted to skip).
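For reference, those switches were appended to the OS entry in boot.ini; a typical 32-bit XP line with them enabled would look roughly like this (the exact ARC path varies per machine):

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB /PAE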


64-bit Windows 7 Home is still artificially limited to 16 GB of RAM. I lost a day a couple of years ago trying to figure out why my extra RAM was showing up in the BIOS, Ubuntu and Memtest, but not in Windows!


I totally forgot about the 32/64 bit thing! I can't remember ever being concerned with it. I never used Windows beyond XP and switched to Linux full time around 2010 so I guess it didn't really affect me.


> how much RAM my PC had back in the HL2 days, I would have said probably 4GiB

Probably 512MB or 1024MB. It was only 3 years after the release of Windows XP. Vista came out in 2007, and even then people complained that the 512MB minimum requirement was way too much.


I'd still be fine with 8GB, aside from Electron shitware eating 25+% of that for one app, and virtual machines being memory-hungry.


I just looked through my emails, in 2008 I bought an "Ultra DIMM 2 GB DDR2-667 Kit" :D


I just looked through mine and I built a PC in 2006 with a 2GiB DDR2 kit. I later upgraded it with another of the same kit in 2008. If you go to late 2008 in the archive all of a sudden almost 40% of people are in the "2Gb[sic] and above" category.


Amazing how the CPU clock speeds don't seem out of place at all, while the rest seems so out of date (like 56k modems).


The biggest differentiator being that back then most CPUs were single-core - one of the first dual-core CPUs was the Pentium D, which only came out in 2005 IIRC.


Back then having a "Dual Core" or "Quad Core" processor was such a selling point. I remember getting my first MacBook (the white plastic one) thinking it was a beast for having a Core 2 Duo. Then like a year later, my Android phone had a quad-core processor inside.


May 25, 2005 https://en.wikipedia.org/wiki/Pentium_D

May 2005 https://en.wikipedia.org/wiki/Athlon_64_X2

What a race! Looks like the era of 90nm processing, and Intel just beating AMD to the punch, with both of them moving to 65nm in 2006!

Anandtech preview of Pentium D in April 2005.

https://www.anandtech.com/show/1657

Anandtech preview of Athlon 64 X2 in April 2005.

https://www.anandtech.com/show/1665

Review of Athlon 64 X2 May 2005!

https://www.anandtech.com/show/1676

> Both AMD and Intel appear to be playing release date games with their latest dual core processors.

> Intel's affordable dual core desktop solution, the new Pentium D, officially launched in the middle of last month, but has yet to be seen in the channel.


IPC and caches have grown a lot (and obviously core count, dramatically).

A 2000 to 2005 ish pentium might have 50 to 200 million transistors, an M1 Pro is more like 12 billion.


It always comes down to the intersection of the physical world and computers. After all, electrons can only move so fast, which means only so far in a single clock cycle... And I think more complex chips are internally "longer"...

We could probably make some older-generation chip now and run it much faster, but it is easier and more efficient to just add a lot more chip...


I don’t think it’s the electrons moving that matters. It’s the electric field, which in conducting materials propagates at close to the speed of light.

The latency you’re thinking of is more related to the speed with which a semiconductor can change from conducting to non-conducting states.

The speed of electrons in a wire with a potential difference across it is actually quite slow (cm/s or so).

Also, modern chips and electronics aren’t significantly larger in a physical sense, the wires/traces aren’t longer, but the density of transistors has increased.

Someone please correct me if I’m wrong.


The P4-era Intel NetBurst pipeline, in order to chase the highest clock speed, at some point (Prescott) degenerated from 20 stages into a 31-stage-deep abomination (which would suffer badly on any branch mispredict).

The pipeline actually has a couple stages called "Drive"... allegedly for allowing signals to propagate between key areas in the CPU.


Interesting, let's take a look at the instructions per clock. Oh, yikes.

https://cpu.userbenchmark.com/Compare/Intel-Pentium-4-380GHz...


36% better memory latency.

Four times the single core performance.

Two decades.


Ten times cheaper


It's not much compared to the two decades of 1980-2000, but it seems there's nothing interesting on the horizon for 2010-2030.


Aren't you comparing a fairly high-end processor of the time (I wasn't alive when the Pentium 4 came out, I think) with a bargain-bin one from today? The new Pentiums have the same microarchitecture, but I don't know if they have all the execution units turned on + less cache.


HL2, Portal, CS:S. Valve really hit it out of the park in those days. I wonder how much of Steam's success can be traced back to it. Just the Valve catalog alone carried it for me and my friends.


When HL2 came out we were forced to install Steam whether we liked it or not. I hated it at the time because I bought the game on DVD (as was usual at the time) and just wanted to play it. I was on 56k dial-up and didn't want to be forced to download updates. I remember thinking they must have been completely crazy to do this.

It was about a year before always-on broadband became widespread for everyone. Then it all made sense. It seemed to me Valve wasn't just riding the wave but actively driving it.


Regarding the first part, I'm pretty sure that Steam had an offline mode from the start. Perhaps it required an initial connection to set up an account, though.

I was lucky to be in an area that had very early access to DSL, so it wasn't a problem for me. The "always on" features was actually pretty awesome and novel if you had a permanent connection. In general I've been impressed by Steam from the start.

On the other hand, we can see what the success has caused down the line. Now every publisher wants to replicate it with their own proprietary launchers, undoing all of the cleanliness and centralization that made Steam awesome in the first place.

If anything it's now more confusing than ever to keep track of which launcher has which game and which friends had been added where.

I suppose the total Steam hegemony was always going to be a temporary thing.


> Regarding the first part, I'm pretty sure that Steam had an offline mode from the start. Perhaps it required an initial connection to set up an account, though.

It did, but it still had problems. You did indeed have to be online initially to "activate" the game, but the main problem was that whenever it detected there were updates available you were forced to wait for them to completely download before you could play. This led to me being very careful about when I would allow Steam to have internet access. If you weren't careful you could go online and then not be allowed to play HL2 until you spent hours downloading an update (which you couldn't always do until night time because of tying up the phone line). It was incredibly frustrating.


Steam wouldn't have happened if Windows had a proper package manager (and if Valve hadn't been screwed over by Sierra).

I'm wondering about what cleanliness you are talking about; after all, Steam (thankfully) never got a 100% market share (Games For Windows Live, Stardock's Impulse, GamersGate, Matrix/Slitherine, Blizzard games, Good Old Games, Ubisoft's horrible launchers, various boxes-only games, indies...) - if anything it has killed the indie gaming that was briefly freed from needing a full-blown publisher and physical distributor - these days smaller developers have a hard time selling from their own website, and sometimes even just forgoing Steamworks-only services (which make non-Steam versions second-class citizens) like multiplayer matchmaking and the mod Workshop.

(And some other programs like Xfire have tried to centralize game libraries too, also offering other services like chat, matchmaking, game hour counts, video recording and upload...)


I'm sure there are some significant downsides to Steam being a dominant platform, as is always the case. That being said, I think Steam has been pretty decent for Indie devs in other ways. The Greenlight program seems pretty successful, and the barrier for entry into the Steam catalog isn't too steep. However, you are right in that you are pretty much forcing indie devs to come in under the Steam umbrella to get exposure, and that's of course questionable.

As for pure practicality as a gamer, Steam was unbeatable in the 10's. There were a lot of other offerings like you mentioned, but none were really competing on a large scale. The push of every publisher to restrict their games to their own Steam clones is a thing that seems to really have ramped up during the last ten years or so. Some were early though, Blizzard being a prime example.


> it has killed indie gaming

That's quite the take. I'd say Valve did more good for indie gaming than anyone else with the introduction of early access. Almost all the indie gaming being done in my social circles is through Steam's early access program. Before that people just didn't play indie games.


Indie means independent. (Activision-)Blizzard (which owns its distribution platform) is more indie than the small developers that would have gone bankrupt if Steam kicked them out (or shadowbanned them), and who were forced to pay a ~30% tax.

And I'm pretty sure people did play (small developer) indie games before - just flash games alone...


It was pretty terrible. I had dial-up at the time and so used offline mode a lot. It was a regular thing where the game would get to 100% ready, I'd launch it online to see that it worked, switch to offline and disconnect, and it no longer worked. So you'd repeat this about 6 times and maybe it would stick. There was NO avoiding being online for the first install. I had to spend 2 nights connected to dial-up to get HL2: Episode One playable, and I bought that on DVD. I played HL2 on Xbox just to avoid Steam.


> If anything it's now more confusing than ever to keep track of which launcher has which game and which friends had been added where.

I've been using Playnite to consolidate my games list across launchers. It works pretty well once it's set up, except for titles where the launcher you bought it on launches another launcher


> except for titles where the launcher you bought it on launches another launcher

That's just comedy. I know this happens for a lot of Ubisoft games in my Steam library. I guess they want to have their cake and eat it too. Exposure on Steam, walled garden in Connect.

I'll definitely check out Playnite. Thanks for the tip!


It's even worse with Rockstar games on Epic


A lot of us predicted horrible things would come from Steam. Ads everywhere, forced updates, forced removal of games for all kinds of bullshit reasons, etc. Basically all the same arguments you get today about digital ownership and platforms. Fortunately for us, Valve has actually been a pretty good gaming-community citizen with Steam so we can happily say we were wrong.


Forced updates did happen (it was the only way to play the game if Steam was aware there was an update available - and God forbid you had a slow/flaky connection - no gaming for you today, not even through Offline Mode!).

Combined in at least one instance with the new version not being able to load the old multi-hour save, and no rollbacks, so hopefully you manually backed up the whole game! (At least the devs fixed it in a patch after some days...)


I'd say if I have one big complaint about Steam, it's that its update control needs work. I don't know if it's intentional or not, but rolling back to an old version is... impossible? I'm vaguely aware there is an option to keep the version you already have, but I believe it's quite buried.

I've had updates completely change game balance, making character load outs I'd grinded up not work the way they did before. I've also had new linux versions break compatibility. This is understandable in multiplayer games, but I was playing these games single player.


Some devs enable (an often limited form of) rollback through the "beta" functionality.


Probably more due to Half-Life, TFC, and CS. The enormous player bases were forced to adopt Steam. And then Valve continued to put out a few more good games.

Honestly CS:S was a bit of a miss—largely considered a failure by the 1.6 community—but it didn’t matter because TF2 came out and was a smash hit.


Also L4D seems largely forgotten compared to TF2 but was definitely a hit in its own right.


Loaded up the Ubisoft app recently. Pure trash, apps / syncing wouldn’t work till I accepted all cookies, things wouldn’t download in the background while playing single player, etc.


I tried playing an old Ubisoft title recently, Far Cry Blood Dragon. But it wants to connect to a long-dead Ubisoft server EVERY time I hit Esc to go to the menu, and takes 30 seconds to time out. Every time! I decided I just won't buy any more Ubisoft games after that.


Fwiw I played that for a few hours, I got it via their monthly subscription. I did not have the issue you describe.


I think my Steam account is my oldest account that I still actively use, as it has a registration date in 2003. Gmail is probably the runner-up, but that debuted in 2004. To that effect, my Steam username is an email address where the domain long expired and is no longer accessible by me.


Take a moment for the 32k people using Windows Me. We will remember them.


My parents got me a brand new computer in 2001. I think it was a PIII 700MHz with 64MB of RAM and onboard graphics. From day 1, Windows Me wouldn't boot past a blinking cursor. I had to reset it a random number of times and it would eventually boot successfully. Some time later, through who knows what (I didn't even have an internet connection, only CDs), I figured out that if I uninstalled the USB devices before shutdown it would boot fine in one shot.

I believe Windows Me is the reason I was way ahead in IT troubleshooting in high school, which got me a position at a pretty good company, which led to my career in IT. We didn't have money to pay a tech very often to fix our stuff, so I had to bang my head through keeping that shit running so I could play Driver, CS 1.6 with bots and Test Drive 5.

That computer kept running until 2008 somehow. I upped the RAM to 256MB by scavenging the recycling pile at work, and it would do OK opening Google and writing Word docs.


Some people don’t know you have to skip the odd ones. There are even people who like Star Trek VII.


I'm more concerned by the one person with a nine-pixel screen width.


What's funny is that I had friends who were running Windows Me very successfully, without any issues whatsoever, and they liked Windows Me. I guess they were lucky in that regard.

My experience was totally different.


Part of my job was tech support at the time.

ME was notably worse than everything else. IIRC they also decided to screw around with placement of stuff on a bunch of settings screens for no reason, adding insult to injury.


I think LGR did a pretty good video on this; it boiled down to what hardware you had and whether there were good drivers available for it. Most hardware configurations did not have them, which contributed to the well-deserved reputation.


I knew of one reliable Windows Me machine; after it needed a driver update (perhaps for a printer?) it became the most unstable operating system I'd ever seen. It would suddenly blue-screen halfway through a Quake 2 session.


The Geforce 2 MX and (to a lesser extent) Geforce 4 MX were spectacular little cards for the price when they came out, I'm not surprised they occupy first and third place in the Graphics card rankings.

Also I am curious who was running Steam on a computer with <24MB of RAM in 2004, that seems ... odd.


The Geforce 4 MX was a rip-off, trying to extract money from people who didn't know better. It lacked shaders entirely, so it was, feature-wise, on the same level as the first GF from 1999.


Yeah, while the GF2MX was a stripped down GF2 the GF4MX was NOT a stripped down GF4, rather a slightly improved GF2MX. My feeling at the time was that the original MX was powerful enough for most and ate into sales of the higher end GF 2 GTS cards.


I remember running Half-Life 2 on some card way below the recommended spec. There was some command-line switch to run the game in DX7 or DX8 mode; I turned everything to the lowest settings and the resolution to something like 640x480. It looked worse than Half-Life 1, but at least it was somewhat playable.


Some people managed to run Doom 3 on Voodoo 2/3 cards, applying some patches and resizing the game textures, etc... I managed to run Doom 3 on a Creative GeForce 2 MX DDR using the patches for Voodoo 2 and doing the same trick of resizing textures. The game was playable, except for the boss guarding the soul cube.


I was one of the poor bastards playing Half-Life 2 on a Voodoo 3 around that time... stuttered like crazy, required setting everything as low as possible and then some with console commands... ran at an admirable ~20fps, but I was used to worse on my n64, so I just sucked it up lol.

Looking at some screenshots, it didn't look all that bad... https://imgur.com/a/cwBHywJ


NVidia Riva TNT2 Series! I spent the better part of a summer doing odd jobs to earn enough to purchase one from the local Electronics Boutique. Great memories :)


Was my first dual GPU setup! An AGP TNT2 and another PCI TNT2 for my third monitor.


Steam still produces this software and hardware survey with interesting statistics: https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


I know my hardware stats of that time exactly:

Prescott-256 3.06GHz (Intel Pentium 4 based Celeron D), 1 GB RAM, ATI 9250, 300 kbit/s Internet

I was sad when I realized that we had a “budget” PC that couldn’t run the latest 3D games even though it was so expensive to buy.

We had bought the computer with monthly installments.


> Prescott-256 3.06GHz (Intel Pentium 4 based Celeron D)

    <~Flash> how does one explain the difference between celeron and non celeron to a non tech ??
    <+bomarrow1> one is for toasting bread
    <+bomarrow1> the other is for processing instructions
http://bash.org/?769729


I wonder how much of the 1.8% "Other" video cards is the 3dfx Voodoo Banshee. It came out at about the same time as the Voodoo 3 and was a more budget-friendly but still good card.


I feel like the words "3dfx Voodoo Banshee" just unlocked a part of my brain, full of IRQ conflicts and blue screens, that I had been keeping firmly closed.


Haha yeah.

Depending on what games you were into, the term "Voodoo" in general may also unlock this part of your brain: https://www.youtube.com/shorts/sHeuZ-m1tek


I'd love to try that 127 bpp display they mention.


Who were the 15 hustlers that rocked less than 24MB of RAM with Steam in 2004?


Probably the same person that also had a 9-pixel wide screen with 3 bits per pixel color depth.


Very curious what the rest of the setups looked like for the 0.02% of people with a 1920 wide screen display back then…


The monitor would have been something like this HP L2335 LCD monitor from 2004 (this [1] review is from 2005).

According to benchmarks [2], Half Life 2 was pretty vertex and/or cpu bound, so a mid-to-high range GPU ($300-$400) would be able to run at that resolution.

[1] http://www.anandtech.com/show/1729/2

[2] https://www.anandtech.com/show/1546/


CRTs were all 4:3, and had better specifications than the LCDs of the time. Back then LCDs also followed PC standard 4:3 resolution. It wasn’t until LCD tech was cheap enough for mass production in TVs that 1080p monitors became the norm.


Google tells me the Apple Cinema Displays came out around this time as well, so I'm surprised there aren't more 2560-wide users.


Apple and gaming did not coincide much at all then, and as I recall the cinema display couldn't be used with a PC without a weird adapter


Valve added macOS support somewhere around 2009 if I recall correctly.

I imagine the number of people with an Apple Cinema Display connected to their PC would be really limited.


I hadn't realized SSE2 is so old. Almost half of the machines had it back then already.


Amusing to see only 6 people having a quad core CPU.


Those were most likely people with a motherboard with 4 CPU slots and with 4 CPUs plonked in there.


I was going to suggest it could be dual-socket, with HyperThreading enabled, but looks like the survey is HyperThreading aware and only counts real cores.

So must be Quad Socket motherboards.


I don't think there were even any dual dual-core solutions for x86 available in 2004... So it must have been some quad-socket weirdness...


Unless I'm misremembering, the first dual-core CPU on the market was the Athlon 64 X2 in 2005, so I agree that these results are likely from servers running Steam.

Edit: a quick bit of google tells me both Intel and AMD released dual core CPUs in May 2005, with Intel beating AMD by a few weeks, so there you go.


I’m guessing it was game server software running on rack mounted servers.


The Pentium D hadn't come out yet in 2004. Which quad-core CPUs are you thinking of?


It's fun to think I'm some data in that survey, as it feels like a lifetime ago now. I was on team ATI and team Intel back then, now team Nvidia and AMD. Given that AMD bought ATI, I guess the only loser there was Intel.


CPU frequencies haven't gone up much in the last 20 years.


The survey looks like it picked up a couple of corrupted CPU Vendor strings like cAMD, ntel, and €. I wonder what happened?


Overclocking was pretty common (and I guess might still be in gaming communities) so I wouldn't be surprised if some machines were unstable and running with corrupted memory.


Good old S3 ProSavage VGA, sweet memories...


There were still some users with a Voodoo 3, and some (few) poor people with a non-AMD and non-Intel x86 CPU...


I'm incredibly surprised at how "mean" my 2004 computer was: a P4 at 2.66GHz, 512MB RAM, a 128MB GeForce4 MX 440, and a CRT display that ran 1024x768 at 85Hz. The number of hours of CS 1.6 and CS:S on that machine was really big :-)


Ahh... the time when installing Omega Drivers, and the placebo effect that followed, was the thing.


Good memories of a time when I was into PC hardware and had a good knowledge of the product landscape. These days I have absolutely no clue as this MacBook Air (2017/i5/8/256) _just works_ :-)


OpenGL was and is a nice thing.

Linux was missing. Fixed ten years later :)


Really, really disappointing to see that it's still just 1%, and Mac is only 2.5%.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


AMD 46% Intel 54%


That's odd - I'm sure I remember completing that survey - and I was playing on Mac, but there are no Mac stats.


Steam for Mac (and HL2 for Mac as well) only came out in 2010. In 2004, you're also still very much in the PowerPC era, so you wouldn't have been able to play the game on a Mac without some serious emulation effort.


Thanks. I must have misremembered when I completed the survey


HL2 is probably from one of the last generations of FPS games that were accessible to general modders. Good times.


Wow, I never realized how much CPU clocks have stagnated in the last two decades.


Well the clock speed of a Pentium 4 from back then isn’t exactly comparable to a Core i9 from today. Heck it wasn’t comparable to AMD’s offerings at the time which IIRC were uniformly 2.5GHz


This was back when the friends feature on steam just didn't work




