1991's PC technology was unbelievable (2011) (zdnet.com)
136 points by doener on Feb 27, 2021 | 235 comments



.... and the Amiga did all of that, and more, in 1985. The basic Amiga cost almost twice as much as a basic PC where I lived, which was why it wasn’t as popular.

But in 1985, a PC that had capabilities comparable to the Amiga cost over 10 times as much as the Amiga: you’d need a specialized graphics card to get 4096 colors at once, and specialized sound cards to get 4-channel 8-bit stereo sound (or 2-channel 13-bit) - hardly any were available, and even if you got them, you didn’t have a use for them because almost no software supported them.

The o/S did good cooperative multitasking with just 512K ram. Oh, and it ran faster than a real Mac while emulating a Mac (used the same CPU, but also had a graphics accelerator that the Mac didn’t).

In 1985.

Sound Blaster (8-bit sound card) and VGA (only 256 colors at once at low resolution, and no sprites) became common around 1991-1992. It took a few more years until hardware better than the Amiga’s 1985 debut became common - 16-bit Sound Blasters or the Gravis Ultrasound, and extended SuperVGA cards. Still no sprite or genlock features. Win 95 was finally comparable to AmigaOS, if somewhat more buggy.

Alas, Commodore was horribly mismanaged - and we are left with the PC legacy.


I bought an Amiga in the early days. I was pretty annoyed with it because you had to buy the peripherals (like monitor, keyboard, etc.) only from Amiga. They were standard parts, but would have an interface that was just different enough to prevent the use of standard parts.

I figured this sort of nonsense would kill the Amiga (like what happened to the Rainbow PC), and decided to not invest in making a compiler for it.

Edit: if I recall correctly, while it had standard 3.5" diskettes, it couldn't read/write DOS disks. C'mon, guys. I don't recall what I did with the machine, probably gave it away.


> and decided to not invest in making a compiler for it.

For those who are unaware of the significance of this statement, Walter Bright wrote the Zortech C compiler for DOS (eventually rebranded as Symantec C), which he later extended into the first compiler that translated C++ directly to machine code without first going through C as an intermediary language. He is currently the lead author of the D programming language and its DMD compiler. This man knows compilers!


“It didn’t use standard parts” is a silly criticism of a personal computer in 1985; to a first approximation there were no standard parts yet, as MS-DOS/x86 was still in the process of driving CP/M-80 out of the commodity system marketplace. Even keyboard layouts weren’t fully settled: Look at the layout of the Amiga 1000 keyboard vs Atari ST and the later Amiga models.

And monitors were always different—though you weren’t actually limited to Commodore, any 15kHz analog RGB display would work!—because resolutions were always advancing. Expecting to use (say) a CGA monitor with Amiga is just silly, it’s insufficient for the output. You could also always use composite out or an RF modulator if you really had to use a different display that wasn’t 15kHz analog RGB.

Finally, one of Commodore’s big selling points for Amiga was PC compatibility: One of the pieces of launch software was official Amiga-badged PC XT emulation software, and one of the pieces of launch hardware was a 5.25in PC-compatible floppy drive. Commodore explicitly promoted that you could exchange documents with and even run MS-DOS, unlike most other systems at the time.


> Finally, one of Commodore’s big selling points for Amiga was PC compatibility: One of the pieces of launch software was official Amiga-badged PC XT emulation software, and one of the pieces of launch hardware was a 5.25in PC-compatible floppy drive.

I had to go hunting for this one; I didn't remember the Amiga 1000 launching with a software PC emulator, but I found what you were referring to, "Amiga Transformer"! But I confess I definitely don't remember it being a "big selling point", perhaps because from the reports I can find, it was absolute shit, running at the equivalent of 0.3 MHz.

What I remember was the slightly later hardware "emulator" for it called the Sidecar; I put "emulator" in quotes because it was, well, basically a complete PC XT clone.


It was mostly a bandaid of hype to tide people over until the Amiga Sidecar was ready, which was essentially a full 8086 PC in an add-on box. IIRC it used some rather expensive dual-port RAM as the interface between the two systems, by which the PC side's display and basic I/O could be merged into the Amiga UI. This later became the rather more elegant, fully ISA-compatible Bridgeboard used in the 2000-series Amigas.

I am fairly sure a comparable x86 software emulation solution existed for the contemporary Mac of the day, called MacCharlie, which like the software-based Amiga Transformer was useless for anything other than a proof-of-concept demo.


While I didn't remember the name MacCharlie, I found the Wikipedia article and the description kind of rang a bell:

https://en.wikipedia.org/wiki/MacCharlie

As it turns out, it's a hardware device more like the Sidecar, except from the description much worse. :)


Yes. There were loads of home computers and they all had slightly odd things about them. One of the main selling points of one range of home computers (the MSX range) was standardisation.

https://en.wikipedia.org/wiki/MSX


PC compatibility is either "silly" or a "selling point," but it can't be both.


That was a problem, but that didn’t kill the Amiga. It had enough advantages for people to pay more - but Commodore mostly rested on their laurels. When the Amiga 3000 came out, it turned out that the CPU was faster than the blitter, for example - over 10 years, effective CPU speed had grown almost 10-fold, but the ECS (which included the graphics coprocessor) wasn’t even twice as fast as the original 1985 version, if I remember correctly.

And Commodore was really horribly mismanaged at the time, in every respect.

And yet, the Amiga lived a very, very long time - I know it was still used in TV stations into the early 2000s, despite being dead for a few years at that point, and the PC world having caught up to it and surpassed it in capabilities.

Amiga hardware and software is still coming out. But it’s the afterlife now.


I remember running a program on my Amiga 3000 that replaced calls to the blitter with calls to the CPU. Performance for common operations, like scrolling text, was greatly improved.

Except for faster CPUs, the Amiga barely changed from 1985 to 1993. The AGA machines (1200, 4000) were too little, too late.


> if I recall correctly, while it had standard 3.5" diskettes, it couldn't read/write DOS disks.

The Amiga floppy drives took standard 3.5" media, and they could read MS-DOS disks, but the software for this wasn't included with the OS until 1992.

Section 9.7 in http://www.faqs.org/faqs/amiga/introduction/part4/ says you could do this with Workbench 2.1, which was released at the end of 1992 (http://www.gregdonner.org/workbench/wb_21.html).


Amiga disks were pushing 880KB onto a floppy whereas the PC floppies only wrote 720KB. Amiga's disk controller was the more flexible one and could easily read various formats, including MS-DOS but PCs couldn't adapt to reading Amiga disks.

Of course, PC's later had double-density disks that could host 1.4MB but Amiga only ever had single-density disks in practice.
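
The extra capacity came down to sectors per track - the Amiga wrote whole tracks at once and skipped the usual inter-sector gaps, fitting 11 sectors where the PC's DD format fit 9 (if I have the geometry right):

    Amiga: 80 tracks x 2 sides x 11 sectors x 512 bytes = 901,120 bytes ~ 880 KB
    PC DD: 80 tracks x 2 sides x  9 sectors x 512 bytes = 737,280 bytes = 720 KB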


> Amiga disks were pushing 880KB onto a floppy whereas the PC floppies only wrote 720KB. Amiga's disk controller was the more flexible one and could easily read various formats, including MS-DOS but PCs couldn't adapt to reading Amiga disks.

Yes, they could; software to read/write alternative formats on the PC was, well, I’m not sure how common, but it was floating around. It wasn’t a controller issue, it was a DOS format issue. Similar software was even available for the 360KB (DOS formatted capacity) 5¼” floppies on PCs before 3½” drives were available.

> Of course, PC's later had double-density disks that could host 1.4MB

“high-density”, and 1.44MB DOS formatted capacity.

> Amiga only ever had single-density disks in practice.

“double-density”; single-density 3½” disks may have existed, but I don't think anyone used them in significant numbers.


I remember my dad had loads of double-density (800 KB) disks for his Mac LC in '92 -- we kept the cases with the disks well into the late 90s (long enough for me to find them and try inserting them into a Windows 95 machine at the age of 8).


I don't think it's fair to ding an early 80s platform for not caring about IBM compatibility. It wasn't anywhere near as dominant a standard then as it is now.

The Amiga couldn't talk to a PC screen with any amount of pin adapters anyway. It used a different frequency, which is seriously aggravating today if you want to fire up an old Amiga for nostalgia's sake.

It could read DOS discs with a driver IIRC, but the PC definitely couldn't read Amiga discs. It got extra capacity out of them (880k vs 720k), but at a definite cost in compatibility.


> I don't think it's fair to ding an early 80s platform for not caring about IBM compatibility.

The IBM was the dominant platform, and there were plenty of 3rd party keyboards, drives, and monitors available. The keyboards, for example, had the same keys but wouldn't work with the Amiga.

Running PC programs was not important, but being able to read/write the floppies in DOS format was very important.

Amiga was clearly trying to create a proprietary platform in order to charge premium prices for keyboards, etc. I thought this was a big mistake.

Whether that killed the Amiga or not, who knows, but as far as I was concerned it was a factor. It certainly drove me away from supporting it.


> Amiga was clearly trying to create a proprietary platform in order to charge premium prices for keyboards, etc. I thought this was a big mistake.

This is by no means a given, considering that Apple has succeeded fantastically by not only doing exactly this, but doing so in pursuit of margins Commodore-Amiga would have considered fatally aggressive.


An Amiga outputs either NTSC or PAL, depending on the region it was made for. So with the right cable, you can just plug it into a TV made before the mid-2000s. Later models also supported VGA.

https://en.wikipedia.org/wiki/Amiga_video_connector


Its outputs include analog RGB, so if you had a VGA monitor that could sync to NTSC or PAL vsync signals, you could get a usable image. IIRC most monitors could not though.


The DEC Rainbow was interesting because it highlighted a couple things in the early 1980s timeframe.

- If you were buying a PC right after the IBM PC came out, it still wasn't obvious whether you bought an IBM PC with DOS (or one of the clones that were starting to come out) or a Z80 system running CP/M (or an Apple III). I bought a PC clone in, I think, 1983, but I had been shopping for a good year--those things were pricey--and it wasn't obvious when I first started shopping what the right choice was.

- None of the minicomputer makers, including DEC, really understood compatibility in the context of PCs. They were so used to making completely proprietary systems so just selling something that could run an off-the-shelf operating system already seemed like a bold concession.


I bought my IBM PC shortly after it came out. It was obvious to me that it was going to do well.

DEC certainly misunderstood compatibility, but the DEC-heads certainly did. The DEC-heads were anxiously waiting for the DEC entry, and assumed it would be awesome like the PDP-11. Instead, DEC proudly introduced a crippled, incompatible PC. The DEC-heads literally laughed at it and walked away, crushed by what DEC had done. DEC never recovered after burning their godlike status.

DEC should have introduced a PDP-11 priced competitively with the IBM PC. That would have knocked it out of the park.


They did, it also went nowhere. The original DEC personal computer strategy was three-pronged:

DECmate, a PDP-8 variant (12-bit), for word processing and office automation at the low end; for the DECmate II you could get a CP/M card and a graphics card as well, and I think the DECmate III had graphics built in. It ran a variant of OS/8.

DEC Rainbow, a system with both Z-80 and 8088, for general CP/M and MS-DOS software at the midrange. This was their “consumer” system. MS-DOS software that stuck to the APIs would work, anything that expected PC hardware broke.

DEC Professional, a PDP-11 variant (16-bit), as a high end workstation. It was literally a PDP-11 Qbus system, and could run RT-11 or RSTS or even UNIX. This was essentially a PDP-11 development workstation or small office server.

The single biggest problem with this strategy is the confusion caused by incompatibility, even more so than not being PC-compatible. For example, if everything had been (one of) either PDP-8 or PDP-11 based, it would have been a strong product line: office appliance, home computer, workstation. If everything had been based on either CP/M or MS-DOS (or both, like the Rainbow), they could have potentially been a much bigger player in those markets, maybe even driving more software to target APIs rather than hardware (as NEC’s non-PC MS-DOS systems did in Japan).

Instead, well, that part of their business wound up being just another PC clone maker after a while. At least the VAXmate was innovative in being a PC-compatible that could boot over built-in Ethernet from a VAX…


They actually didn't, because none of these were miniaturized PDPs in the sense of being compatible. As you say, they were variants, and they didn't actually run the same software as the real PDP-8s or PDP-11s. Substantial modification was required, and some OSes and application software were never ported. The DEC Pros in particular suffered from poorer performance as well (the 380 finally bettered this, but was too little, too late).

If DEC had actually sold a compatible microcomputer PDP system, they might have done better commercially, but instead they used the PDP architecture to build related but incompatible systems so as to not cannibalize sales of their big iron. In that sense, they were successful, because they sold very poorly.

I like my 380 with Venix, but it's a dead end.


In a way, none of this mattered at the end of the day anyway. If one leaves aside the sui generis and complicated story of Apple, PCs became a commodity business in any case. It wasn't a sustainable business for any of the large companies that also wanted to fund research labs and otherwise invest in new technology areas.


Probably the closest we ever got in the US to a home PDP-11 was sold via Heathkit as the H-11 and H-11A. These had standard QBUS cards packaged in a Heathkit chassis.

At the time, in high school, I drooled over the thought of having one, especially since the school had its own 11/34. In retrospect, it would have been fun but literally nobody I knew would have had one.

It's ironic that the biggest single market for home PDP-11 clones was... the Soviet Union.


I did have an H-11. I loved that computer. See my background picture:

https://twitter.com/WalterBright


DEC had a significant business making desktop personal computers. The first models had to be hooked up to a real computer in order to do much, but later versions could be expanded into stand-alone machines.

Of course, the disruptive innovators were selling the whole computational environment for the same price as a VT100.


> I figured this sort of nonsense would kill the Amiga (like what happened to the Rainbow PC), and decided to not invest in making a compiler for it.

There is a lesson here for today's smartphone makers.

Today it's less peripherals (everything's integrated) and more drivers, but it's the same problem. Now the thing people can't use with your device is software. This happens at both the OS level and the app level.

If you're Samsung or Huawei and your devices can't run a vanilla Linux kernel then they can't run anything but Android. That, in turn, cements Android as the only OS anybody develops for your devices. Which makes you dependent on Google and allows them to extract concessions from you. Sort of the opposite of commoditize your complement; it's "let a third party monopolize your complement."

And this matters. PCs not working this way was what allowed Linux to take over the server market from Windows Server, proprietary Unix, and everything else on the defunct-systems list. Which in turn means the hardware vendor isn't reliant on Microsoft. That's important to PC hardware vendors, both because Windows Server may not have out-competed proprietary Unix and because it's a huge license fee that cuts into the hardware vendor's margins.

So that's what Samsung is handing Google by not making it easy to replace the OS on their devices. And that in turn gives Google an effective monopoly on Android apps. That's problematic in a number of ways, many relevant to the hardware vendors, but let's start with this one. Samsung has an app store.

It only runs on Samsung devices because it's impractical to install it on other vendors' devices when they can all only run a specific version of Android that enshrines Google Play.

Now suppose that Samsung made it easy to run other operating systems on all their devices. Then other operating systems would exist. They might not overtake Android, but they would have millions of users, like desktop Linux. Enough to cause competing smartphone vendors to enable their phones to run it. And in turn to run competing app stores, allowing them to actually compete because they run on more than one vendor's device and developers have more incentive to use them.

They're really shooting themselves in the foot by not enabling this.


I agree. I also suspect Amazon could sell a lot more Kindles if they made it able to run regular Linux. I bet the Kindle could become a valuable platform that has nothing to do with reading books.


The Japanese had the Sharp X68000. Released the same year as the Amiga 500, it had noticeably better graphics and audio. But the release price was also noticeably higher ($2600 vs $700). The Sharp X68000 was an arcade machine turned into a desktop computer.

It had an interesting mouse device that you could use as a gaming pad: https://youtu.be/7g7j0iv2QYg?t=132

https://en.wikipedia.org/wiki/X68000


Ok fine, but the PC had better demos. Future Crew 4 ever!!

https://www.youtube.com/watch?v=rFv7mHTf0nA

(hehe I am a bit delighted in a retro way to see that arguing about PC vs. Amiga is still a thing)


> Sound blaster (8 bit sound card) and VGA (only 256 colors at once at low resolution, and no sprites)

The color depths of the OCS and VGA are not directly comparable. OCS and ECS allowed 32 out of 4096 colors at once. To reach the full 4096 you needed to use HAM, which, while impressive, was not as flexible and required careful image design. VGA allowed 256 out of 262,144; to get something similar on the Amiga you needed AGA.
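
For reference, the numbers fall straight out of the palette and pixel depths:

    OCS/ECS: 4 bits per R/G/B gun -> 2^12 = 4,096 palette colors;   5 bitplanes   -> 2^5 = 32 on screen
    VGA:     6 bits per R/G/B gun -> 2^18 = 262,144 palette colors; 8 bits/pixel  -> 2^8 = 256 on screen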


While true, VGA only did that at 320x200, HAM6 was 360x576 with sprites, and iirc split screen with other modes was already possible.

It is not directly comparable, not being pixel addressable, but was incredibly useful for anything that wasn’t fast moving and had some time to optimize the displayed image.

It also took a while for anything to actually use 256-color VGA for more than mostly static pictures.

I wrote a side scroller in 1993 for the PC. I had to resort to a weird mode whose name escapes me now, where you could only address every 4th pixel without a bank switch - without this, scrolling was too slow and there was no double buffering on VGA. It became the norm around 1994-1995, but without it screen updates were severely limited by CPU speed.


"While true, VGA only did that at 320x200"

I remember being able to get VGA to run at 360x480 with 256 colors in mode-x.

"I had to resort to a weird mode whose name escapes me now, where you could only address every 4th pixel without a bank switch"

That's Mode X. You could also, for example, set all 4 pixels to the same color with a single write; the write mask was a 4-bit value. It was great for filling polygons.
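
For anyone curious, the plane selection went through the VGA Sequencer's Map Mask register. A rough sketch, assuming a DOS compiler with Borland-style outportb() (others spell it outp()):

    #include <dos.h>

    /* Select which of the 4 VGA planes the next video-memory writes hit:
       Sequencer index port 0x3C4, Map Mask register index 0x02.          */
    void set_plane_mask(unsigned char mask)   /* bits 0-3 = planes 0-3    */
    {
        outportb(0x3C4, 0x02);                /* select the Map Mask reg  */
        outportb(0x3C5, mask);                /* enable those planes      */
    }

    /* In unchained 256-color mode, pixel x lives in plane (x & 3), so with
       mask 0x0F one byte written into the A000h segment paints the same
       color into four adjacent pixels - handy for fills.                  */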


Mode X?

With mode X you could reach weird resolutions like 320x480 with 256 colors, but it was planar.


"OCS and ECS allowed 32 out of 4096 colors at once."

Per horizontal line! The Copper could swap out palettes between/during horizontal lines. There was also the 64-color Extra Half-Brite mode, where another 32 colors were shown at 50% brightness.
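
For those who never saw one: a copper list is just pairs of 16-bit words in chip RAM, alternating WAIT and MOVE instructions. A hypothetical fragment (UWORD as in the Amiga's <exec/types.h>; $0180 is COLOR00) might look like this:

    #include <exec/types.h>

    /* Must sit in chip RAM and be pointed to by COP1LC; the other 30
       palette entries and all error handling are omitted here.           */
    UWORD copperlist[] = {
        0x6401, 0xFFFE,   /* WAIT for the start of scanline $64           */
        0x0180, 0x0F00,   /* MOVE $0F00 (red)   -> COLOR00                */
        0x0182, 0x00F0,   /* MOVE $00F0 (green) -> COLOR01                */
        /* ...repeat per scanline for a full palette swap...              */
        0xFFFF, 0xFFFE    /* end of list: wait for an impossible position */
    };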


To be fair, these copper list tricks had their own performance tradeoffs and didn't see any use at all for years after the Amiga was introduced, and even then only really suited static image display.

Nonetheless, I still fondly recall the thrill of seeing the all-stops-out pinnacle of this technique in a demo that ran at super hires, extended-overscan PAL interlace, displaying a pin-sharp image of portraits and wildlife in full 4096-color depth at something like 1510 x 602, on 1989 (!) consumer hardware (Amiga 3000).


Was it fast enough to replace the palette during horizontal blanking? A quick Google search took me to this forum: http://eab.abime.net/showthread.php?t=50518

According to the information provided there, trying to do a 32-color full palette swap every scanline would starve the CPU.


Yeah, it would steal a lot of DMA time, and would starve a CPU not executing code & accessing data in the Fast RAM.

But at least you can display pretty pictures without HAM fringing. :-)


Fair enough. I'm always amazed at the kind of tricks expert programmers could pull off on those old 2D machines.

Have you seen the Sega MegaColor demo? A crazy guy managed to play full-color video at 30 fps on the Sega Megadrive. No graphics, no sprites, just changing the background color for every pixel through DMA!


A massive number of creative people and creative-content-related companies were born on the Amiga, I believe.


Indeed; both EA and Blizzard had some of their earliest successes on the Amiga.


The thing is, by the time VGA appeared, the new hotness was "chunky graphics", something that was IIRC very bad on pre-AGA Amigas and even AGA-based models didn't excel at it.

And chunky graphics became crucial for a growing amount of 3D software.


Can you explain how chunky graphics differs from non-chunky graphics? I'm not familiar with this term.


"Chunky" Graphics are the kind that most people are used to these days - where your graphical data tends to be addressed as pixels where each pixel has a single address and stores all data in one place.

For example, your typical modern RGBA buffer will be composed of a linear array of 32-bit values, each representing the 8-bit red, green, blue, and alpha values of an individual pixel.

Planar graphics works differently, with separately addressed planes for different parts of the colour: for the RGBA example, you'd end up with 4 different buffers, one each for the red, green, blue, and alpha values, requiring separate writes, etc. On bitplane hardware like the Amiga's, each plane instead holds one bit of every pixel's colour index, so one write within a single bitplane can affect multiple pixels at once.
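
Roughly, in C-ish terms (just the memory layout, not real VGA or Amiga code):

    unsigned char chunky[320 * 200];              /* chunky: 1 byte per pixel */

    void put_pixel_chunky(int x, int y, unsigned char color)
    {
        chunky[y * 320 + x] = color;              /* one write, done          */
    }

    /* Planar, Amiga-style: 32 colors = 5 bitplanes, each holding one bit
       of every pixel's color index.                                        */
    unsigned char plane[5][320 / 8 * 200];

    void put_pixel_planar(int x, int y, unsigned char color)
    {
        int offset = y * (320 / 8) + x / 8;
        unsigned char bit = 0x80 >> (x % 8);      /* leftmost pixel = MSB     */
        int p;
        for (p = 0; p < 5; p++) {                 /* one write per plane      */
            if (color & (1 << p)) plane[p][offset] |= bit;
            else                  plane[p][offset] &= ~bit;
        }
    }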

Both approaches have their pros and cons, but in this case we're going to focus on Amiga's downfall-related specifics.

When the Amiga's graphics hardware was designed, a lot of it was built with features relevant to the then-common "home computers", even if much more powerful. This included things like "playfield" and hw sprite support, things especially useful in 2D games. Multiple playfields allowed compositing two colour-limited screens on top of each other by dividing the available bitplanes between the two screens. Sprites allowed limited pasting of objects from memory onto the screen in real time (for example, the player character, enemies, and missiles in a typical platformer or SHMUP game). There was also hw collision detection between sprites.

All of that advanced hw, however, didn't exactly help with the software-rendered (even pseudo-) 3D that became popular around that time - some features could definitely be reused (the semi-programmable graphics pipeline through the copper, the blitter helping with bitmap ops, dual playfield allowing composition of, let's say, cockpit and exterior). Instead, you might want to compose variable-size bitmaps of a pre-rendered airplane at different angles (the method used by LucasArts' 1988-1991 trilogy of WW2 aircraft sims), which didn't match nicely with the sprite system, instead considerably taxing the blitter's capacity (especially since blitting might have to be done in multiple passes due to planar graphics). Chunky graphics is also generally simpler, but changing requirements from applications definitely had their impact.

More importantly, as someone mentioned, the Amiga chips were innovative and powerful... in 1985. In 1986, VGA debuted, and while not particularly powerful on its own or widely available, the increasing power of the CPUs in PCs quickly eclipsed the benefits of the dedicated hw chips in the Amiga, especially since the OCS (Original Chip Set) only received an incremental update in 1990 (ECS) and another in 1992 (AGA). Someone already mentioned writing faster blits on the CPU than the Amiga's hw blitter allowed, plus there were considerable limitations regarding DMA capabilities (IIRC, DMA for the graphics system was only allowed from "Chip" RAM, which was limited to 2MB at best? Someone correct me if I'm wrong).

There were also other areas where a dedicated enough PC owner could spend more and build a more powerful machine (wider availability of (E)ISA controllers, 1990's XGA with its higher resolution than AGA and hw blitter, etc. etc.).


The Amiga CD32 fixed that issue by adding hardware-accelerated chunky/planar conversion. It came out too late to make a difference, though.


> The o/S did good cooperative multitasking

I always thought the Amiga had preemptive multitasking.


You are correct, the Exec microkernel was fully preemptive, or as much as it could be without the benefit of an MMU. Message passing through pointer exchange, no protection context, etc. Not pretty, but it made for usable multitasking in a paltry 512K of memory at 7.16 MHz.
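
For the curious, Exec message passing really is just handing the other task a pointer. A sketch (CreateMsgPort() is the V36+ convenience call; error checking omitted):

    #include <exec/types.h>
    #include <exec/ports.h>
    #include <proto/exec.h>

    struct MyMsg {
        struct Message msg;        /* standard Exec message header        */
        LONG           payload;    /* application data rides along        */
    };

    void send_and_wait(struct MsgPort *dest, LONG value)
    {
        struct MsgPort *reply = CreateMsgPort();
        struct MyMsg m;            /* lives on *our* stack                */

        m.msg.mn_ReplyPort = reply;
        m.msg.mn_Length    = sizeof(m);
        m.payload          = value;

        PutMsg(dest, &m.msg);      /* receiver gets a raw pointer into
                                      our memory - no copy, no protection */
        WaitPort(reply);           /* sleep until the receiver ReplyMsg()s */
        GetMsg(reply);
        DeleteMsgPort(reply);
    }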


I still use Word 97 for most of my writing, because it never nags me for anything, it's always ready to go in a split-second, even though it's running inside a VM, and it always saves a file with no more than two keypresses, one if it's already been saved before.

(Clippy is an optional component I did not install, as much as I miss that face sometimes.)


This resonates with many things from the '80s, '90s, and even well into the 2000s.

There has been a severe erosion since then. From car dashboards to web design, software apps to online communities - it’s an erosion on all fronts.

Fuck everything after 2010. It sucks. Fight me.


The progression of hardware has stopped the progression of software. As a result, developing software has become so easy that we're running Electron in spacecraft right now, because who even bothers learning to properly use the native UI toolkits in $currentYear.

Windows XP was considered a memory hog because it needed a whopping 256MiB of RAM to run properly. I run text editors that use more than that these days, and they still feel slower than Notepad felt on XP.


This. Right here. We could perhaps forgive the memory bloat if it were only in proportion to the feature set, but in practically all cases stuff is actually SLOWER at its core functions than it was years ago. Only Chrome, for all its faults, actually seems worth its memory demands among the major apps I use, when you consider how many tabs loading crappy bloated pages you can have open simultaneously while it all pretty much runs. But in general, it's absolutely unacceptable how most modern apps run given their resource use.


This is why I have to laugh when people suggest browsing with JS enabled. Do you have any idea what it would do to my system performance to open 20 tabs of this crap? Yet I still see pages that say, well, I can see that you turned off JavaScript, visitor, but you're wrong, and my site is a special snowflake (which needs 5-20 JavaScript includes from third-party domains), so please enable it before I show you a blog post with a couple of images...

The good news is that almost without fail, the content on these sites is as good as the coding, and I've proven to myself time after time that I lose nothing by just closing the tab. In fact, I gain the time I would've wasted on reading low-quality content that rarely adds anything to my life.


In my experience, Firefox has better memory usage than Chromium-based browsers. Of course it still uses a lot, all things considered.


A change in perspective might be helpful. If you only look for the bad, that's all you'll see.

We have, among countless other things:

- mobile devices, networks, and services that are allowing developing countries to leapfrog their progress in the world

- development toolchains and frameworks that empower a single person to create an order of magnitude more output, enabling smaller teams or even solo enterprises to create major value

- reusable rockets bringing high quality and affordable internet access to the underserved

- mass produced electric vehicles for goodness sake!!

And that's only a few tech-related advancements, the world is largely moving in the right direction in so many areas. I'd never bet against the steady drumbeat of technological progress.


I am not trying to look for good things.

We need to look at what we can improve. Of course things have improved, but not uniformly across all domains and many things have gotten worse.

If we keep patting ourselves on the back, these too will degrade. Using hindsight to definitively say what has gotten worse is how we improve, including the things you are mentioning.

I kind of think of progress as navigation on a complex multidimensional domain space filled with lots of local optima. To not consult hindsight (gradient) is a bad strategy. Similarly, to not explore is to stay at the same spot. Sometimes, it is important to roll back and go to the old days and try to explore in another direction.


But all of this has its rise and its stagnation and bloat-fall.

What does a new flagship phone do better, compared to a 2-year-old one? A slightly better camera, maybe more storage space, and that's it. "Back in the day" we had phones with 9- or 10-digit 7-segment displays, then single-line LCD screens, multiline, graphical, color, touchscreens, and smartphones (each step taking only a couple of years), and since then (so, the last ~5 years) not a lot of "new stuff" that would warrant buying a new phone (except the failing battery on the old one).

Websites have got massively bloated and slow, even on modernish computers, and show even less information than they did in the past (fscking gallery pages). We've gone from text-only sites, to basic HTML (with images!), to JavaScript, to 'web 2.0' with dynamic loading... to bloatware.

As mentioned... Word has gone from five or ten-ish megabytes to hundreds of megabytes while not offering much more compared to older versions. Yes, it's a bit more polished, some things are easier now... but not a lot. Going from text to graphical mode was a "revolution", but again, not a lot new after that.

Even operating systems... what does Windows 10 do that Windows 7 didn't?

I mean... we do get a new feature here and there, but the push to "make a new version" is stronger than the actual need for a new version... not at first, but once the software gets mature enough, there's (almost) "nothing new to add".


There is plenty of good to appreciate in every moment of every day, and it's helpful to do that. Even in software, there are a few standout gems which were not around just a few years ago.

However, this particular thread seems to be about the quality of user-facing software, which, overall, has been on a continuous nosedive for about 10 years, and every time I think it can't get any worse, someone figures out a new way to make it suck.


One quiet improvement has been cloud backing & sync. Remember 12 years ago, when Dropbox was revolutionary? But today, I can fluidly migrate between a phone, tablet, PC, and laptop, changing as I move through the day, and (other than open tabs) everything is just there, accessible on every machine.

I think this is one nice thing that is very easy to take for granted.


Yet another perspective is looking for how things can be improved.

Mobile connectivity is a wonderful thing as long as it is a means rather than an end. Sometimes it takes a glance into the past to understand the former, to realize that networks were created to enable communication rather than to captivate audiences.

Modern development toolchains are great, yet they typically cater to professionals who have the time and incentive to learn them. We don't have to look very far into the past to realize that there was a time when accessibility was as important as productivity.

Creating a world that everyone wants to live in requires a great deal of reflection regarding what we need and how to accomplish it. Letting technological development run amok may achieve some of those goals but it can also be regressive.


I've heard of someone only using wifi, and disusing the cell networks, keeping the device in airplane mode when not in use. It saves battery, and the devices can stay powered up for days without charging.


Well, the PS4 is certainly better than the PS3 and I'd say the Nintendo Switch is certainly better than the Wii. Modern ARM chips are certainly better than what we had in 2009 in every respect. There are certainly things that did improve, though I agree a lot of things did regress-- I'm just saying that not EVERYTHING regressed. It's all about perspective.


> Well, the PS4 is certainly better than the PS3

I don't know about PS3, but the thing that bugs me about PS4 is that it requires installing games. It can't just run a game straight off the disc like PS1 and PS2 did, it has to copy it to its hard drive first.

> Modern ARM chips are certainly better than what we had in 2009 in every respect.

Modern processors are, generally, great and very fast, regardless of the instruction set. Their greatness is usually offset by mediocre software that is constantly rushed to meet imaginary deadlines that no user cares about. The result is that the end user never really realizes how fast and capable modern hardware actually is.


Well, when optical drives are no longer capable of performance fast enough to avoid load times, HD installation is going to be required. That was inevitable and I certainly could foresee it going back to the PS2 era; the creation of HDLoader made it pretty clear.

I'd still say modern ARM is a major improvement simply because performance/power ratio-wise it's already neck and neck or better than x86 in a lot of areas. x86, dating back to the P4 era and onward, has been a painful slide into power consumption/waste heat hell.


The PS3 had the same problem, I think. It’s miles behind the Switch, but I’ve found installing on the PS4 to be less of a hassle.


I really miss Kinect though. Yes I know everybody wants me to put on a VR headset but I just want to move in front of my screen and have fun without forcing on myself something I don't feel comfortable with. I recently tried to play Power Up Heroes online. At first, I thought something was wrong with my Live account, but no - I just couldn't find anybody to play online. A quick look at the leaderboard showed two records. In the whole world.


Hah, it was mostly a tongue in cheek comment with a broad stroke, of course you're 100% right.


You'd like this read: https://danluu.com/input-lag/

And it's only worse in modern systems that round-trip EVERYTHING to the "cloud" for reasons understood only by Google.


The reason is simple. The software doesn't run on your only computer, it runs on all your computers simultaneously. And it's updated every day, not once a year from a disk in a box.


And it's updated every day, not once a year from a disk in a box.

...and that's a problem, because it encourages half-baked non-solutions and a "we can always fix it later" attitude. As a result, nothing actually improves; it just erratically surges around a local maximum.


Yep, that's a great essay. Thanks for reminding me of its existence.


There used to be news groups that would talk about "the last good version" of software, because it was widely recognised that good software would bloat until it broke.

https://www.pcworld.com/article/137703/article.html


Great article. Ironically, one of those sites which doesn't show images without JS.


Fortunately, I think the pendulum has already passed its highest mark, and is already picking up momentum in the other direction.

And this time around, it will be even better!


> Fuck everything after 2010. It sucks. Fight me.

I think you are just getting old and not willing to discover new things


What is a new website with interesting, quality content, and which does not abuse its visitors, nor have abusive UI patterns?


SourceHut, maybe? https://sourcehut.org/


The fact that only SourceHut comes to mind and the author himself complains often about the sorry state of the software world says a lot.


Thanks! I have not been there in a while, good to check it out again.

Do you know of any sites where people just talk and share, without political nonsense, flamewars, and such? Perhaps focused primarily on humor?

Hacker News is indeed such an oasis, but has a rather narrow interest range. Few people are interested in discussing spirituality here, for example.


You just triggered a thought:

Reddit originally had that goal. And it was self-sustaining, basically funded by users. This has all changed, and that's all I want to make as a point here (why is another discussion).

HN isn't self-sustaining that I know of, though it also doesn't need to make its users into products.

So, is there a middle ground?

Take the HN code base, clone it, and enable these discussion norms?

We have amazing moderators here. Wise.

That's needed too, or the outcome is something we will see elsewhere, and that's by definition, not what we have here.

How to fund that?

Need hosting, moderator or two, and what else?

Maybe not too much. So maybe it won't take that much money either.

I would find any experiment along these lines both something I would want to participate in as well as very interesting to observe, depending on my role, and the nature of the discussion focus.


Certain subreddits with good moderation can approximate this experience, as long as you avoid/unsubscribe the firehose of everything else and focus only on reading those.


Yes, I agree: Office ‘97, 2000, and XP were the high points of that lineage as far as my experience goes. But I stopped using Office when I switched to Mac in 2003 and have used Word and Excel occasionally since then (preferring the iWork suite’s analogues Pages and Numbers respectively).


Agree on all counts. I chose ME for size and startup time, but 2000 is even better. Truly the peak of Windows, IMO. XP is more compatible, but it also started down the road of candy-coating the native widgets and rejiggling Control Panel. You could opt out of both, but it was the beginning of the decline.


Microsoft finally did something with the new version of Office that makes it worthwhile - when tied to a Microsoft 365 account, with some ungodly behind-the-scenes hack involving SharePoint, you get multiplayer PowerPoint - a game-changer.


Similar with me for old non-naggy tech.

I'm on an Apple Silicon MacBook Air but still often run WordPerfect 4.2 and DataEase 4.53 under DosBox (also Boxer is good but not tried it on M1 yet).

BTW, if you're interested in the nuts and bolts of running WordPerfect for DOS in emulation, Columbia University (NYC) has a great site at http://www.columbia.edu/~em36/wpdos/ which may even help with fixing other DOS software issues.


For Word 97, I find that a Windows ME install in VirtualBox works better than Wine.


Sometimes I use Office 2000 under Windows2000 under Virtualbox, and it's amazing how snappy the apps feel when the whole VM has only 64MB of RAM. Everyone should try it once in a while, just as a reminder of what is possible!

The only Office app that I strongly prefer in its modern form is Excel. Modern Excel has some really nice enhancements (sequences and spillover, let and lambda, flash-fill, table formatting, etc) that I miss when I use older versions.


I'm surprised Word 97 doesn't work in compatibility mode.


Office 97 in Windows XP works extremely well.

When XP was released, there was intended to be no support for Office other than Office XP.

Thousands of questions were raised whether Office 97 would work in Windows XP and it was totally discouraged by Microsoft every time.

After about 6 months somebody (like a big company) tried it and since it worked quite well, the news began to filter out.

Within another 6 months there was all kinds of support info for Office 97 under XP as if the disinformation campaign had never existed.


It probably does, but the native experience is better.

I also have the VM set up already for IE, which def does not work right in Wine.


>compatibility mode

Do you mean Wine? It's imperfect emulation for large apps, and it's slower than a VM.


When I boot an old box (Win95, NT5) I'm surprised at how few things I miss... and how some things feel better. Go figure...


I use it regularly for compatibility testing. Here are a few things I find myself missing in relatively modern computing, off the top of my head:

Accessibility. Everything, every single little nook and cranny is keyboard-accessible. And I mean accessible not in the sense that I can keep pressing the Tab key and eventually the barely-visible focus rectangle may arrive at the control I want to use. I mean, everything has a visible keyboard shortcut, and the Tab key also works, and the focus rectangle is visible, and it works every time, not just when the application feels like it.

Keypresses don't get dropped from the buffer even while the application is thinking.

Fast as heck startup time. Fast menus. Fast application startup. Fast, fast, fast. My current daily driver is a 10-year-old laptop with a spinny drive, and Windows 95 is running inside a VM, which is competing for resources with my entire dev environment, so it's not like it's getting hardware beyond what it was designed for.

Consistency in look and operation. Everything has the same widgets and dialogs, and they all work about the same way, at least between Microsoft apps. Netscape and other browsers are a different story.

Flat, dull, boring interface. Everything is a flat gray, nothing is shiny, nothing is animated. Everything is a frame around the work that I'm doing, and feels like it wants to just help me, assist me with what I'm doing, and then get out of the way.

Oh yeah, and the Save and Open File dialogs are fully-featured file managers with sorting, multi-select, copy, paste, rename, the works.

And IE6 is challenging to develop for, but I fucking love using it. It feels just like the Windows UI (Trident, after all), it's snappy, has focus rectangles on everything, and is modern enough to have a DOM with all the works, like createElement() and getElementById().

IE3 has the most beautiful toolbar of any browser I've seen to date, and also has focus rectangles and basic CSS support. I can still use it to post to my blog.


And IE6 is challenging to develop for, but I fucking love using it. It feels just like the Windows UI (Trident, after all), it's snappy, has focus rectangles on everything, and is modern enough to have a DOM with all the works, like createElement() and getElementById().

It also has a proper (native!) settings dialog with tons of configurability and security zoning. While modern browsers' rendering engines and JS have gotten more featureful, I think the UI has gotten much worse in contrast, with everything going in the same direction of dumbed-down Chrome clones. My ideal browser would have the configurability and UI of early IE/Firefox, with the JS and rendering engines of current browsers.


>My ideal browser would have the configurability and UI of early IE/Firefox, with the JS and rendering engines of current browsers.

If you don't mind a learning curve and a primarily keyboard-driven browser, e.g. if you already love Vim, consider giving qutebrowser a try. You can even run it straight from the source tree, since it uses Python for its wrapper.

It uses QtWebEngine and Chromium, which does mean some issues and drawbacks, but the frame around it is nearly perfect. It's an amazing piece of software, and truly next generation browsing for nerds, IMO.

It's a bit like Vimium, but the keyboard bindings are native and always work, as opposed to being at the mercy of JS injection into the page. All of the settings are accessible via :command mode, and permissions are as granular as you can imagine. There are no graphical settings, but other than that I think it matches your description.


> Keypresses don't get dropped from the buffer even while the application is thinking.

This annoys me to no end. Some applications (Firefox) even reorder keystrokes!

I'm beginning to think we've been using the wrong programming paradigm for user interfaces for the last 20 years. I never had these sorts of complaints about DOS programs, or even Windows up to version 2000. Then something changed. Computers have been feeling slower and slower ever since.


It's almost like they've gotten smart enough to play dumb ;)


yes, yes, and yes. mostly yes.

Another factor is that computers were limited in presentation and computation; they'd present abstracted, dull tables and operations on data to you. But they did serious jobs. Now we have extremely (extremely - as a CGI fan, I cannot deny that) fancy presentation systems, but the actual data displayed is a linear list of paragraphs, a few pictures, and a few data points. Well, the computer is hypermainstream now, so obviously it talks to the average user... but it says something about what computing was and is now.


It's amazingly frightening to me that companies were willing to pay MILLIONS for a Cray 2 supercomputer and I carry a phone MORE POWERFUL than that and I basically use it to browse the internet - and I don't know anything to do on it that would be worth millions to do.


Yes, I've been reading about this for a few years; you can find some people on Slashdot or similar discussing peak-power comparisons between Crays and a Core i5, and they were already ~comparable (the Cray still won for sustained workloads). Recent machines are even more powerful, but yeah, the utilitarian aspect is different, to say the least.

Got me thinking about why computers in the first place... in the 90s a computer was mostly sold as an office-like tool for the home. "Get access to big corp tools!" marketing feeling. You got to write documents, print them, organize your stuff (some people had movie-tape databases), do finances, etc. But in retrospect we didn't use much of that (it was more a gaming machine for me, and later a CGI tool), and with the web... printing went away... local databases went away. People probably still do finances. But now electronics have become a social gateway... see news, pics, chat. A billion-dollar device production requires either a nation, a big company, or 7 billion customers :)

Do you ever try to imagine the face of a 50s-80s scientist if you went back in time to show him an iPhone? He'd faint from laughter, I guess.


Or he'd figure out a way to use each of the 2 million pixels to display useful information! One thing about old "business" UIs designed for terminals or small screens - they certainly didn't waste space.


I believe these things come in waves: advanced businesses maximize efficiency as much as possible, and the mainstream variant that follows relaxes that constraint - that's why we have better tools, but underutilized ones.


I think today's pocket devices are comparable to a Cray only in raw flops, but do not hold a candle in other respects.


No product activation either. Those were the days.


Let's go farther back. I did/do code editing, not WYSIWYG word processing, and a good DOS editor let you do everything you needed without taking your hands off the keyboard. You could zoom through your code.


It's true: when there is no pointer to speak of, you have no choice but to make it keyboard-accessible. I think accommodating DOS users is one of the reasons Windows started off so accessible to the keyboard. Compare that to the Mac, which started off with a GUI and mouse, and to this day remains inaccessible to the keyboard by default.


>Word 97

Is there a FOSS program with a comparable user experience?


Not that I have seen. I'm way into FOSS, but I'm compromising on this because nothing even comes close.

It was made at a time when Microsoft cared a lot about user experience, poured millions of dollars into user studies, design, and ensuring that every pixel was perfect, every function was keyboard-accessible, and every workflow was smooth. And it shows.

Trying to use e.g. LibreOffice or Google Docs, after that, it's like comparing a science fair project made by a kid who just wanted to pass with a C versus an A+ student whose project was also made by their parents, who also happen to be scientists.


Even comparing recent versions of Word to the older ones is facepalm-inducing. I still don't understand what the designer was thinking with the ribbon.

Compare to the DOS days, where every major program (WordPerfect, Lotus 1-2-3, etc.) had a function key template. It was annoying to learn at first, but easily discoverable, and eventually you got muscle memory. Modern apps have lost that, in many ways.


Abiword.


No.

I've used Abiword a bit, but it has been a few years, and Abiword doesn't really come close to the Microsoft Word experience back in the day. People generally have their favorite MS Word version that they like before it "went downhill", and mine is Word 5.1a for Macintosh. This is the last version of Microsoft Word that was specifically designed as a Macintosh application, rather than a cross-platform application with a Mac port.

Microsoft's turned out some very high-quality Mac applications in the 1990s and 2000s. Another notable one is Internet Explorer 5, which is unrelated to the infamous Internet Explorer 5 for Windows (the one with poor standards compliance), other than the fact that the products are made by the same company, have the same name, and have the same version number.


I have a G4 Mac in my test pool, and IE5 for Mac OS X is one of the best browsers ever made.

IE5 is also the most ported IE ever, it was ported to close to 10 different platforms.


I suspect porting had something to do with the quality; software that is actually _ported_ (as opposed to Java or Electron write-once-run-everywhere) often ends up better for the exercise.


Internet Explorer wasn't ported to Mac, though. Instead, a new piece of software with the same name was created. This is not some kind of nitpick or technicality; they were really quite different.


It's true, and it's acknowledged up-thread. All the other "IE5" releases are Trident ports, however, and I believe there are up to nine of them, depending on how you count:

Windows 3.1, Windows NT 3.x, Windows 95, Windows 98, Windows ME, Windows NT 4.0, Windows 2000, Solaris, and HP-UX.

In addition, as you mention, an "IE5" of different lineage is also available for both Mac OS Classic and Mac OS X.

Here is another thing which may seem crazy in today's world: IE 5.01 was officially supported by Microsoft on Windows 2000 until 2010, more than 10 years after IE5.0's release!


That's a great point, I haven't thought about that much.

It's like you're re-raking the code with every port, essentially doing a partial rewrite and refactor, with all the benefits that come with that.

IE5 is one of the most solid browsers I know.


Abiword is decent. I open it, and I start typing right away, no dialogs. The startup time is rather fast.

However, saving does not provide me with a default filename, so there is friction there.

In the status bar, the indicator text runs right up to the border edge, nobody took the time to ensure vertical and horizontal spacing matches up.

The menubar does not provide keyboard accelerators until after I press the Alt key -- too late in terms of discoverability.

The cursor actually disappears when I use the arrow keys to move around. It's not even blinking, it's just GONE, until I stop using the arrow keys, then reappears after a delay.

The spell-checker finds a spelling mistake in both "Abiword" and "abiword".

Using only the keyboard, I cannot figure out how to change the font size, except by going through the Format->Font menu. The Font dialog has invisible or nearly-invisible focus rectangles on most controls.

I just installed it to write this comment, and I found all this stuff in the first five or so minutes of using it.

It's a nice little word processor, and I'd use it in a pinch, but it's no Word 97.


I think some of these things can be fixed in Abiword and other apps with GTK configuration. E.g.:

> The menubar does not provide keyboard accelerators until after I press the Alt key -- too late in terms of discoverability.

I think this is a GTK theme setting that you can override https://askubuntu.com/questions/329668/always-show-keyboard-...


Thank you. So far, I think that only addresses the menu accelerators issue, and none of the others I listed.

I think the fact that it is off by default in the first place illustrates my point quite well. And so does the fact that I have to seek out separate configuration for the toolkit.

Not only that, but on the very page you linked, people are complaining that even the setting does not work reliably. I know that when I tried to adjust it in the past for other apps it did not work for me.

Perhaps I just needed to reboot? :)


thanks


Not to mention the price - you can easily buy it for $10, and it's really blazing fast. I wouldn't use it for complex documents though.


WordPerfect for me!


I was doing Macintosh programming back then. Programming was much harder because there was no stack overflow. If you got stuck, you had to figure it out yourself, or ask someone who knew something at a user group meeting, or call a BBS and pay long distance. There was one book, Mac Programming Primer, which was the only beginner book that existed. The Inside Mac books were large and heavy, and the only official guides for programming Macs. There was no online help or search, so you had to open these heavy books and flip through them every time you needed anything. Think C had a pretty fast compiler, but there was no protected memory, so every time you dereferenced a null pointer or otherwise corrupted memory, the machine would crash and you would have to reboot. Version control and other programmer tools were very primitive. Working with other programmers was extremely clumsy.

My favorite memories of the era were the BBSs. These were dial up services with one or two lines that a hobbyist would run on his home machine. They were very local, because calling long distance was expensive. It was a tech nerd only local hangout club for the local area. Also, at this point in time, nerds were very split off from the rest of society. We were really our own strange order of weirdos and very very uncool, while nowadays a much larger percentage of people would say they're in tech and everyone uses tech and is a tech nerd now.


> Programming was much harder because there was no stack overflow

Oh, stack overflow definitely existed, but only the kind that makes programming harder.

EDIT: Corrected a typo.


Well, stack overflow existed, but Stack Overflow didn't.

(This is probably the one time in my writing career when the initial "Well," actually makes a difference in communicating the idea I intended to convey.)


a decentralized world where everybody had their own private stack overflow for their own malefit.


MacTutor was a great magazine - it got into the real nuts and bolts under the Mac hood.


BBSes were awesome. It was this whole other level of subculture within the already nerdy computing subculture. And the people you were connecting with were all in your own community!

I remember I would grab the local computer user newsletter every time I'd go to the computer store with my father just to pore over the BBS listings to see if there were any new ones that had popped up recently.

It was really a fun time to be part of that whole early (and very local) era of 'online computing'.


I was lucky to have access to THINK Pascal and later Metrowerks CodeWarrior, and adults who would answer programming questions or tell me how pointers work.

I don't remember the Inside Macintosh books being especially large and heavy; they were broken into volumes which were a half-inch or inch thick, except for volume VI. It wasn't terribly long before these books ended up online on Apple's developer website but I don't remember when that happened.


The original Inside Macintosh books were three-ring binders three or four inches thick. Apple would update them by mailing you new sections on paper. Definitely large and heavy.


> the machine would crash and you would have to reboot

I taught myself on Macs. And not knowing how to correctly validate my inputs, I would crash, reboot, try again until I got it right. So many reboots.


I remember optimizing my MS-DOS 5.0 config.sys and autoexec.bat to get back up and running as fast as I could; shaved the boot down to a few seconds if I remember correctly.

Crashing and rebooting wasn't that bad before multi-process became the norm; the crashing app was taking your data down with it anyway, so who cares if you needed to wait 5 seconds for the computer to boot again.


I did the same. My 25MHz 386SX boots to a usable screen faster than any Windows or Linux machine I own, despite promises from certain prominent programmers.

I could park my hard drive and shut my computer down every night and not think twice about it. Now I don't bother, because my coffee will be done before the computer lets me log in.


It's so strange how unevenly the future was distributed back then. You could get a Quadra 900 or a color NeXTstation. The SE/30 was two years old. You could run Mathematica. Matlab 3.5 was available on DOS and the Mac, and Matlab 4 was a year away. AutoCAD was pretty well established on PCs and Macs for 2D CAD. LabView was five years old. Microsoft Word for the Mac was six years old.

Gopher and the rudimentary gopher-with-pictures web already existed. Mosaic was two years away.

If you were a hobbyist, you might buy an Amiga 3000 instead of a mid-tier Mac or PC- the Amiga 4000 was a year away.

On the PC, Wolfenstein 3D was a year away. DOOM was two years away.

And yet well into the nineties, PC nerds were amazed to be able to spin up multiple Wordperfect 5 windows, and they may as well have been at the vanguard as far as most people were concerned! Quickbasic! Lotus 1-2-3!


More antidotes to the memory manager nostalgia:

- O'Reilly had been in business for thirteen years, had been publishing X11 manuals with animals on the cover for three years, and had just published the first edition of Programming Perl

- Python existed in some form (for two years already!), but no one had ever heard of it until the Grail browser four years later, which no one has heard of now

- the Video Toaster for the Amiga was a year old. So was Photoshop (on the Mac). NIH Image (also Mac) was even older. Frame grabbers were already well established in scientific imaging. Array processors were a thing (the one I saw was for the PC), as the kids say, if you had the budget for one

- IRC was three years old


> It's so strange how unevenly the future was distributed back then.

Totally! My best friend had an SGI Indy workstation (with a webcam!) at home when they came out... That was in 1993 (?) : )

Can you imagine getting to play on an SGI workstation and then coming back home to my 486 (which I'm pretty sure I got in 1991)?

Thankfully Linus releasing Linux wasn't far away.


SGI made such nice machines. The out-of-the-box demos were amazing; you'd get hornswoggled into thinking you were going to be doing all that kind of crazy stuff with the computer. Remember Iris Explorer? http://yohanan.org/steve/projects/iris-explorer


Yes, I can imagine that!

I was part of a mentoring program in 1993 where I shadowed the local hospital's IT department. They were able to purchase an Indigo for some sort of research and allowed me to explore it.

My home computer at the time was a 286, which could barely run Windows 3.0.


After dealing with single-IP-owner computer infrastructures, it was pretty clear that the IBM standard was going to win as long as the industry prevented IBM from taking control again (like they tried with the PS/2).

A lot of people didn't want a single point of failure after watching Atari and Commodore blow themselves up spectacularly.

Also, the early 90s were when the IBM market started to be commoditized enough to bring prices way down. The self-implosion of Amiga and, a bit later, a near-death experience on Apple's side combined with the relatively cheap hardware on the IBM side to kill off nearly every non-Wintel platform for average desktop users.

This was also the point where Computer Shopper turned into hundreds of pages of sellers offering hardware at all price points, allowing easy entry into the market for a new owner. The internet sped that up further.


We're talking about 1991, when Apple was pretty solid in their niche, and if you wanted to talk about other, more interesting computers, i.e. someone else's money:

  - the SGI Indigo was released
  - the Symbolics XL1200 was a year old, and the MacIvory III was released
  - the Sparcstation 2 (SunOS!) was a year old
University departments were already throwing away old LMI lisp machines.

The really big deal was when everyone bought a Windows 95 machine to get on the internet with AOL about five years later.


I assume you are writing about 1991? Quattro Pro had come out a few years prior and blew the doors off 1-2-3.

Visual BASIC 1.0 for DOS was a year away, a little sluggish and too late to change the world, but still amazing for those of us who could not afford the RAM to run Windows.


Yes - the article is about 1991. Of course there was more than one DOS spreadsheet in 1991. To give you some perspective: Excel for Mac was six years old in 1991, Informix Wingz was three years old, and Lotus Improv was just released.

They were already showing Visual Basic for Windows at trade shows and Hypercard on the mac was four years old.


> Heck, most people didn't even own mice. You actually had to go out and buy one separately, from Microsoft or IBM if you wanted to run Windows.

You actually could run Windows without a mouse (I did it for a while). The only thing that didn't work was Paintbrush. Everything else was fully keyboard accessible. Every or nearly every menu item and form field had a prominently underlined keyboard accelerator, and you could always rely on the tab order (for form fields) or the arrow keys (for menu items).


OTOH the only non-PnP DOS monitor I saved was one that had a plastic holder on the side where you kept the mouse most of the time.

So it wouldn't clutter up the physical top of your desk until you wanted to occasionally run a program like Windows 2 which supported a mouse.


Oh my, the boot disks. Funnily enough, I recall managing a lot of MS-DOS config.sys and autoexec.bat stuff (we ran this stuff for games until '96 or so) but I definitely didn't understand it. Now I wonder where I got the ideas to change arcane things like HMA or interrupts from. Game magazines, probably.


You were really running up against DOS memory limits in the 90s, so there were all sorts of tricks to make use of some of the memory range above 640K. There was also the expanded memory spec which allowed memory from an expansion card to be basically paged into the usual memory range. So people ended up having a bunch of config files that they booted into depending upon what they were doing.
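For readers who never had to do this, here is a minimal sketch of what a hand-tuned MS-DOS 6-era setup looked like; the paths and driver choices (everything under C:\DOS, a mouse driver, a disk cache) are illustrative rather than any particular machine's configuration:

  REM ---- CONFIG.SYS ----
  REM XMS driver first, then EMM386 to map upper memory blocks
  REM (use RAM instead of NOEMS if a game needed expanded memory)
  DEVICE=C:\DOS\HIMEM.SYS
  DEVICE=C:\DOS\EMM386.EXE NOEMS
  REM load DOS itself into the high memory area and link the UMBs
  DOS=HIGH,UMB
  REM DEVICEHIGH pushes drivers out of the lower 640K where they fit
  DEVICEHIGH=C:\DOS\ANSI.SYS
  FILES=30
  BUFFERS=20

  REM ---- AUTOEXEC.BAT ----
  REM LOADHIGH (LH) does the same for TSRs like the mouse driver and disk cache
  LH C:\DOS\MOUSE.COM
  LH C:\DOS\SMARTDRV.EXE

Order mattered: HIMEM.SYS had to load before EMM386.EXE, and which drivers actually fit into upper memory depended on the order you loaded them high.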


QEMM was a must-have to keep conventional memory as free as possible. Also digging around for low-RAM-usage mouse drivers, etc. helped a lot.

Even then you'd run into headaches with things like Ultima 7's JEMM memory manager that insisted you not have EMM386/QEMM running. To say that managing a DOS install was somewhat painful is an understatement.


That was right around the era where computers started having more memory than DOS could "use" and so things like LOADHIGH became so important; memory below 640K was precious especially as TSRs became common.


I remember learning most of this stuff from a neighbor; he took an "informatics" course that taught him all that wizard stuff.


While Linus was just starting Linux in 1991, there was also Net/2 which came out in 1991. Net/2 was 4.3BSD with the AT&T code removed, and is the precursor to all of today's BSDs.

In a lot of ways 1992 and 1993 were more interesting than 1991, as they saw the first releases of Slackware, FreeBSD, NetBSD, Windows NT, Solaris, etc.


This article is really about PC nerd nostalgia. From the "If you believe the Steve Jobs iPad snake oil" sentiment to mentions of Token Ring and IRQ conflicts - I loved it. Memory lane! :D


> That storm was the Microsoft and the GUI storm that would eventually bring us to the computing model we are using today

Huh? I'm wondering if the writer knew there were Macintoshes (and Amigas!) with GUIs over a half-dozen years before 1991. And Windows 3.0--I was there, and used it--was bloody awful.


He was a PC guy, and apparently still is. They really were like that back then.


Back in 1989 I hauled a Toshiba portable up to the top of San Jacinto via the Palm Springs aerial tramway to do some work on AutoCAD. I didn't get much work done; mostly I just had to explain what this computer thing was. Toshiba probably ended up with a few sales from that.

Computing was definitely a niche industry back then for most folks.

I kinda miss it.


Those old Toshibas with the gas plasma displays were so cool. Razor sharp monochrome graphics, easy on the eyes, and flat. I had the use of one for a while in 1993 or so when it was already quite obsolescent - it was like a laptop for a giant, with a full sized, full travel keyboard and a card cage in the back for ISA cards.


You'd have been my hero. In 1989, I was 12 or 13 years old, and was actually living in Palm Springs! I lived in Palm Springs or Desert Hot Springs for a decade. Never did get up there on the Tram though. A shame, looking back. Small world!


AutoCAD 9 users unite! (or was that still v2?)


And on the other hand, 1991 saw the release of a pen-based operating system and one of the early tablets: https://en.wikipedia.org/wiki/EO_Personal_Communicator


They even came with a wireless cellular network modem?! In 1991? Colour me impressed.


"I remember though, what impressed me about Rachel the most that day. She told me during my OS/2 demo that she knew how to tweak her WIN.INI file. I knew right then and there she was a keeper." I love this!!


The sociological/anthropological aspects of this are interesting.

1991: 100MB = serious work

2021: 100MB = so tiny you can't buy storage that small, can't be serious


I remember trying to buy a 2 GB SD card once for an old device that didn't support SDHC/SDXC. Was almost laughed out of the store.


While not as extreme, I recently hit almost the same barrier trying to get a 16GB SDHC card to revive an old Nook.


Should have looked for an industrial SLC SD card.


They probably don't sell these in retail and I didn't feel like waiting a month to get one shipped from China.


Yeah, my $4000 486-DX2 with 8MB RAM and a Weitek P9000 graphics card was amazing(ly expensive)!


It was also such a fast-moving era that your 486 would get crushed in a couple years by the arrival of Pentiums.

I still regularly use a 2012 laptop, but if you were using a 9-year-old PC back in 1991, it would be a 4.77 MHz Intel 8088, or something like a Commodore 64.


To quote Weird Al,

My new computer's got the clocks, it rocks/But it was obsolete before I opened the box/You say you've had your desktop for over a week? Throw that junk away, man, it's an antique/ Your laptop is a month old? Well that's great/If you could use a nice, heavy paperweight.

These days, my parents' 8-year-old computer with 8GB+ of memory, a high-end (for the time) CPU, and an SSD is good enough for everything they do.


In the 10 years between 2011 and 2021, the only computing development to make my jaw drop was the SSD (well, maybe LCD multi monitors, too). I just upgraded a 2009 vintage AMD desktop that was headed for recycling with an SSD, and it's once again useful.

Smartphones came further, obviously, starting from nothing in 2006, but I see 2014-2015 as the start of a plateau. When we started seeing glass-backed cases and folding screens, I knew we'd reached the land of diminishing returns.


I think that's the thing that is most amazing to me - once we got to the SSD era there really haven't been amazing "performance leaps" like we used to have in the 90s-early 2000s.

I remember selling my gaming computer at college every 3-6 months and building the new hotness and it being NOTICEABLY better each time.


Yeah for sure. My next one after this was a Dual CPU Pentium Pro.


Lucky you! I dreamt of owning a machine like this. I thought the DX2 CPUs had appeared later though ('92?).

I was stuck with a 386/16MHz/40MB HD, and a whopping megabyte of RAM.

At least it taught me that it was sometimes worth optimizing my code...


The 8087 chip I added on to my 8086 system sits on my monitor pedestal, forever. Right now, it's next to my ()&(^&^%%$%^ Pixel 2 XL USB C audio connector. I was a grad student in numerical analysis and holy smokes! Hardware IEEE 754 floating point!

Sometime aroundish 1988 I paid $600 for an 80MB hard drive, out of a $12K/yr stipend. Good times.


I still have my first CPU, a 12MHz AMD 80286--which I realize is pretty 'new' but I'm only 43.


Every respectable hacker remembers the spec of their first computer.

286 + 1MB RAM + 40MB hard drive.

This was 1992 and it was valued at about $1000.


4KB TRS-80 Color Computer, which would have been $400 in 1980. Wasn't my own computer as it was my father's and I was a bit too young to get deep into programming until 1983 or so.

First computer that was strictly mine was a CoCo 3 in 1986, and the first one I built entirely myself was a 486-66 back in 1993.

Yep, we definitely remember those first machines.


I was basically the same, although I think we upgraded it somehow to 16K. I remember typing in source code from magazines (Rainbow, in particular).

I thought MS Extended Basic on the CoCo was pretty cool.

Ended up moving to Apple ][ a few years after that, but still have very fond memories of the CoCo, even though everyone else I knew had something else and liked to refer to my computer as a Trash 80.


It had the better CPU by far. 6809 = 8 bit goodness!


I too spent a lot of formative years on a 286, envying my friends who all had 386s and could run Windows and play cooler games.


Windows ran fine on a 286. Even with just 1 MB of RAM. You just couldn't launch more than a couple of big applications at a time without severe swapping.


Yep, you could run 3.1 in "real mode" instead of "protected mode", but the 286 was kind of a dog of a CPU and it had some pretty nasty flaws that led to 286 machines not really being all that common compared to 8086 and 386 ones.


3.1 couldn’t run in real mode. It could run either in standard mode or in enhanced mode. The enhanced mode was 386-only. 3.0 on the other hand could run in real mode so it technically could be used with 8086.


You're right, I had Real and Standard mixed up. You definitely didn't want to run 3.0, though. That was a complete mess, and the beginning point of where Microsoft started to make a turn-around in stability.

3.0 would crash every 15 minutes on average. 3.1 would crash every 45 minutes on average. 95 would go a day at a time, 98 would go several days, XP was stable a month at a time when there weren't major exploits or bad driver behavior.

...and so on. I may be overstating the impact SLIGHTLY, but 3.0 really was crash prone to the point magazines heavily pushed people to 3.1 when that came out, and 95 was definitely significantly improved over that, and so forth (skipping ME, of course) until XP.


In my experience, Windows 98 crashed more often, mostly due to running out of "resources". Windows 98 GUI was more complex, thus wasting more of these "resources", running out and crashing.

XP, on the other hand, practically never crashed, unless you had bad drivers. Just like Windows NT4 & 2000 before it.


Almost identical to mine--with a SuperVGA 12" CRT to boot!


WYSE 286:

12.5 MHz

1MB ram

2x 5.25" floppy drives (one HD, one DD)

10MB hard drive

HGC (640x200 monochrome) with amber monitor.


I also had a 386SX-16MHz with 2MB of RAM, a 40 Megabyte hard-drive, a VGA adapter and a 14” screen.

That was in 1992 and it was an abysmally slow system.


Very close to my first "real" (well technically my dad's) computer - he had a custom workstation at home to do some work on but the 386 was the first PC we got. He went out of his way to get a DX and it had an 80 MB SCSI drive and 4 MB of RAM.

I spent literally HOURS trying every single driver on the SimCity 2000 CD until I found one that worked - it was an OAK videocard with the bare minimum of RAM needed.


Yep. A friend of mine convinced his mother to get him a $4000 Zeos machine in 1992 for his birthday. It was one hell of a beast for that era in time, being the next to highest end machine Zeos was selling and I remember playing The 7th Guest on that thing up until I got my own CD-ROM drive.

His machine also came with a Windows graphics accelerator video card, which was a pretty impressive piece of hardware. Back then, you could see the difference in redraw speeds between accelerated and non-accelerated very easily.

These days, 2D is so simple that nobody even thinks about whether your card can keep up or not.

Back then, even in MS-DOS you had a lot of questions about whether your card could do VESA modes, undocumented video modes, etc., and sometimes you'd need SciTech Display Doctor to get those modes. Linear framebuffers weren't commonly supported, and getting the best possible performance required that.


I had a similar system with a 17" Nanao monitor that by itself cost something like $1000. I also had a first generation Gravis Ultrasound, which I still own. I kept that machine for a while, it ran DOS, then Windows, then OS/2 and then Linux and FreeBSD and at one point had all of them installed at the same time. So much time wasted messing with partitions and file systems.


If I remember correctly, the final use of mine was running a home email server on Debian Linux with a dial-up modem.


I remember those CPUs being a lot more expensive than that. And (after looking it up) launching in 1992.


A lot of baked brain cells between then and now....I don't deny my memory is fuzzy.


Fair enough. I guess I was mostly retroactively jealous - at the time I was rocking an 8086 with CGA.


Hmmm, really? I had a DX-4 which was not expensive at all around 1996-ish.


The Pentium 200MHz (Socket 7) came out in '95, so a DX4 in '96 was definitely old and shouldn't have been expensive.


When the 486 launched in 1989/1990 a typical system cost like $10k+.

In 1996 it was very cheap.


And to think of $4000 in 1991 dollars...


I never spent $10k on a home system in 1991. What???


Yes, back in the day when things were expensive, but housing was cheap.


Oh man, all that high memory stuff brings back memories. In 1991 my dad and I custom built a 486-33 (for way cheaper than the $7,000 quoted in the article). After I got it all set up, I spent hours tuning my config.sys and autoexec.bat. Not only did you have to put stuff in high memory, but you had to do it in the right order. There were programs that would help, but you could tweak it manually for added efficiency.

I remember how proud I was when I tried every combination and got max efficiency.
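MEM /C was the scoreboard for that game - it listed each driver and TSR and whether it had landed in conventional or upper memory - and MS-DOS 6 later added MEMMAKER to automate the trial and error:

  C:\>MEM /C /P
  C:\>MEMMAKER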


I remember having a set of boot floppies with different memory configurations. Depending on whether you were going to be gaming or not, you'd use a different boot floppy.


Got to reboot to squeeze every ounce out for playing Dark Forces!


The 50MHz 486 in 1991 had about 9 years of use before total obsolescence. Not bad.


Multiple boot floppies? Never had to do that. I was very excited by the new version of DR-DOS that let you build a menu system in CONFIG.SYS to load a different set of drivers and memory management at start up - something like the sketch below.

Also the weird IRQ issues. Simpler when you had to set them with header pins on the card in question. When early plug and play came in it was awful: automatic IRQ conflicts and sometimes no way to change them. It was a real pain when the knock-off Sound Blaster clone I had would auto-assign itself anything other than IRQ 5 and some games would stop making sounds!
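For anyone who never saw one of those menus: DR-DOS had its own syntax, but MS-DOS 6 later picked up the same idea with [menu] blocks in CONFIG.SYS and a %CONFIG% variable you could branch on in AUTOEXEC.BAT. A rough sketch of the MS-DOS 6 style, with made-up section names and illustrative driver lines:

  REM ---- CONFIG.SYS ----
  [menu]
  menuitem=GAMES, Games (EMS, mouse)
  menuitem=WIN, Windows (no EMS)

  [common]
  DEVICE=C:\DOS\HIMEM.SYS
  DOS=HIGH,UMB

  [GAMES]
  DEVICE=C:\DOS\EMM386.EXE RAM

  [WIN]
  DEVICE=C:\DOS\EMM386.EXE NOEMS

  REM ---- AUTOEXEC.BAT ----
  REM DOS puts the chosen menu item name in %CONFIG%, so the batch file can branch on it
  GOTO %CONFIG%
  :GAMES
  LH C:\DOS\MOUSE.COM
  GOTO END
  :WIN
  C:\WINDOWS\WIN.COM
  :END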


> early plug and play

Also known as plug and pray. Yes, I remember that time.


As far as I recall that was not a hardware problem. ISA PnP cards describe what they can do and then the driver picks one of the possibilities and programs the card.

It got fun when real Sound Blasters had a broken config, so a generic driver (for a unix system) would fail.

In my experience (playing with unix), IRQ conflicts were much more a DOS/Windows problem than a hardware one. Though the hardware was nasty too.


I had problems with that all the way into Windows 98. I had a motherboard that would not stop sharing IRQs between a WinTV PCI card and the onboard audio. Attempting to use the TV capture card would crash the system, and there was literally no way to stop the motherboard from doing it.

XP, and the hardware of that new era, for all its faults, was a HUGE step forward on plug and play.


Yeah, that irked me too when I read the article. I had only one boot disk, with a very stuffed CONFIG.SYS/AUTOEXEC.BAT that ran like 12 different configurations for what I wanted to run. Sure, I had to Ctrl+Alt+Del to restart, but the diskette never left the A: drive. Actually I still have that diskette virtualized inside one of my DosBox VMs.


Ah, ViewMax memories in DR-DOS 5.0.


dwcfgmg.sys had my heart broke


The DOS & Windows Configuration Manager... if I remember correctly, it bridged the gap between DOS and Plug ‘n’ Play (or Pray) functionality.

If one were not touched by good fortune whilst specifying, assembling, and configuring one’s system, then one might reasonably expect to interact much too often with this obscure system component.

(Kind of like mDNSResponder on OS X.)


You remember well.

Here’s the thing though. In that transitory phase between DOS and Windows one often had to “drop to DOS” to run some software, and even assemble “boot disks” to provide a customised, stripped-down environment, especially for games.

The joy in this particular piece of software was that not only was it obscure and poorly documented, but even once you figured out it was required, it still broke some games that required access to “all” of the lower 640K, which often left you with a choice between not having sound and not playing the game at all ...


> Very few PCs had CD-ROM drives and multimedia software was nearly non-existent on the PC platform.

Transferring data meant either a stack of floppies, or something like LapLink which used a special cable for the parallel port, or a null-modem cable for the serial port. LapLink could transfer at 115200 baud. And it cost over $100 in 1990s dollars.

https://books.google.co.uk/books?id=kggOZ4-YEKUC&pg=PA92&red...


> Transferring data meant either a stack of floppies,

Not often. Usually one floppy (maybe two) would suffice: programs and data were proportionally smaller. And if you had two floppy drives, as many computers did, this was no big deal.

The only time you needed "a stack of floppies" or LapLink was when you wanted to back up or transfer an entire hard disk.


> LapLink which used a special cable for the parallel port

One summer, I used a similar product called Brooklyn Bridge to hook our old 286 with 40 MB HDD to our 386 tower because I ran out of disk space. The filesystem mounted with no problems, and I played the VGA remake of Quest for Glory over it. It was unbearably slow, but I still thought it was awesome.


Remarks about IBM's OS/2 are very true, but there was one more aspect: Win 3.1 came on 2 floppy discs, while OS/2 came on 10 or more - I can't remember exactly, but it was something huge, bigger than anything else at that time. But, indeed, technically it was a great achievement.


MS-DOS up to 4.1 came on 2 floppies, but Windows 3.1 came on a whole bunch of them (probably no fewer than 10). Windows 95 and OS/2 came on an enormous stack of floppies, more than 20. I did this install more than once back then.


Office was just as bad - 97 Professional came on _55_[1]. I remember having to install a version that came on 33 (one of which didn't work, but if you told the installer to ignore it, the final install still worked - I assume some component I didn't use would fail), and that was a "fun" way to spend an afternoon.

[1] "Microsoft Office 97 Professional edition is provided on a total of 55 diskettes" https://docs.microsoft.com/en-us/previous-versions/tn-archiv...


I had installed Windows 3.0 on the 20MB HDD of my Hyundai 16v (8088, 10MHz!) via 20 360KB floppies :)

(edit, yes Windows 3.0 could use CGA 640x200 with 1 bit colour)


Windows 3.1 was on 7 disks, with the last disk being printer drivers.

This is a beta version, but still should demonstrate nicely: https://ia801800.us.archive.org/view_archive.php?archive=/30...


I thought I was big time when I bought a 9600 baud modem.


To clarify, 1991 was 30 years ago, so is it ZDNet's 30th anniversary, or were they founded in 2001 and this is just a bigger reflection beyond that?


Ten years ago, this article was published. For their 20th anniversary.


Thank you for pointing that out. Totally missed the 2011 at the end of the link


Yes, even the M1 doesn't come close to 1991's PC technology in terms of surprise-factor.


prince bypass megahit


stopped reading at "If you believe the Steve Jobs iPad snake oil"

eyeroll


The article is from 2011. Keep in mind the iPhone is from 2007 and the iPad from 2010. At the time, it wasn't that odd to be skeptical about this (I know I was!).


I kept reading, because Windows seems to have survived and though I wouldn't call Jobs' prediction pure snake oil, I think the idea that tablets would wipe out the keyboard computer never came to pass.


> And oh look at the column on the left, Apple is going to license Mac OS to expand market share. How open!

That would have been nice had it ever happened.



The licensing did happen. There were licensed Mac clones for a while.


IIRC, the biggest clone maker (I think that was Power Computing?) had great ads. They seemed more enthusiastic about selling Mac clones than Apple did about selling Macs.



