It is amazing how much we regress as software gets bloated with feature creep.
Modern web apps are markedly inferior where speed, immediacy, and efficiency are required. They do have their upsides, but more often than not it's just a mess. I recently logged into my medical insurance website and the fucking thing was ajaxing all day, with no indication of which part of the page was still waiting to load. I don't have a speedy internet connection. I'd actually rather the whole page refresh than have some small segment of it reload without any indication. Sometimes spinners are fixed in place to indicate "loading", but that's just a frustrating thing to stare at.
I've used IBM AS/400 and a bunch of factory management software - it is an absolute pleasure to use. Yet most operators on the factory floor complain, "We're just stuck with this old archaic system." Every single day. I feel like one of these days some asshole consulting firm is going to convince our upper management to switch to a web-based MRP.
A large percentage of people do not appreciate old pieces of software. They look at the aesthetics and write it off. I really think the average Joe off the street can be completely bedazzled by a modern React app with animations and frivolous bells and whistles that takes 5 seconds to load. Joe does not know the difference between aesthetics and functionalism. Jill, on the other hand, is a Linux aficionado who goes out of her way to make things hard. She spends ages perfecting her vim config and looks down on anyone who doesn't use Arch Linux and some form of window manager.
Perhaps there is a sweet spot between Joe's and Jill's philosophies: useful, easy-to-learn, efficient software that doesn't require an obsessive Linux nerd to use it. Like WordPerfect 6.2.
If anything, CLIs are thriving on Macs and Linux boxes (in the cloud). Few people of my generation (70s/80s) would ever have predicted a whole new generation of hipster programmers and web designers today who think nothing of dropping to bash, ssh'ing around, and running npm. Or using vim and REPLs. We thought the whole world would go full GUI by the late 90s (and it largely has, yet the console has endured).
Kids these days are more comfortable with the command-line than a whole generation of Windows programmers before them.
As for AS/400, I can't say I've interacted with it too much apart from a few brief occasions interfacing with DB2 on AS/400 and it was less than elegant (involved lots of emulation). We had a hard time building anything on top of it. The interoperability of AS/400 with non-IBM systems didn't seem to be that great, even though the system itself was rock-solid and ran mission-critical workloads extremely reliably.
I do miss the warm glow of a Turbo Vision IDE (I wrote my first real TUI application on Borland Paradox 4.0) and how quickly one could zip around and get stuff done.
You're right about AS/400 - it is a giant pain for developers in our company. But for the user, it is amazing.
I agree with the rest of the take, but you seem to be coming from a developer's standpoint and I am talking about the user side. Developers back then also had access to the terminal and knew how to use it. It's just that the entire software industry has grown.
When I was 10 years old, we had to write a turtle program in DOS. Tell me which 10-year-olds today know the CLI? They're all plastered to their iPads.
> Tell me which 10-year-olds today know the CLI? They're all plastered to their iPads.
Prime example of juvenoia and biased, stereotypical thinking right there. "Back in my day ..." ™
I know more kids up to age 13 who are interested in actually exploring and working with technical stuff (programming and electronics, for example) than kids who spend most of their time just "consuming" and using these things. The main reason is that these things are much more accessible these days. YMMV, depending on your social and demographic environment, I guess...
Since I don't have concrete data to back it up, I usually view it this way: every generation has had children and youth who were interested in exploring technology and those who were simply interested in using it. While there is probably some shift in the numbers, I doubt it is a big one.
Granted, I also set the bar relatively high. (For example: are those skills transferable?)
My child is in awe when he sees me use the Minecraft CLI. He was excited to start kindergarten specifically to learn to read and write enough to use the CLI. I'm hoping that will eventually be his entree into programming.
It does give me some hope that future generations will still use computers and not just tablets.
Language is the way we have serialized our thoughts for communication and storage. A CLI is probably the simplest way to implement that in a computer, but you could probably go quite far by providing a TTS/STT interface to a CLI.
I'm not saying that's what you should do with Minecraft, but it could be interesting to see how usable such an interface would be.
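Just to sketch the idea (a browser-only toy in TypeScript; the command "bridge" here is a made-up stand-in, and SpeechRecognition is only exposed in some browsers, e.g. prefixed in Chromium):

    // Toy sketch: hear a spoken command, "run" it, and speak the output back.
    const Recognition =
      (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;

    function runCommand(cmd: string): string {
      // Stand-in for a real CLI bridge; handles exactly one command for the demo.
      return cmd.trim() === "date" ? new Date().toString() : `you said: ${cmd}`;
    }

    const recognizer = new Recognition();
    recognizer.onresult = (event: any) => {
      const spoken = event.results[0][0].transcript;
      speechSynthesis.speak(new SpeechSynthesisUtterance(runCommand(spoken)));
    };
    recognizer.start();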
Minecraft has taught our kids to write, just for the item search interface. I draw pictures with the words underneath for their favourite items and they copy them. A great carrot :)
I know it's anecdata, but our company (which I recently joined) has been using AS/400 (now named IBM i, I believe) for 20 years and has had less than 6 hours of downtime.
As I understand it, one would definitely need training/study to interface with it.
These days, IBM i refers specifically to the operating system (the OS/400 successor), which is one of the operating systems (besides AIX and Linux) that run on IBM Power hardware.
Can confirm. I'm one of those hipsters on a Mac who strongly prefers the command line.
Though, to be fair, I started using a computer in the early 90s which was a DOS machine and only had the command line. So I grew up using a computer this way.
I will say this: CLIs are much simpler to make and much cheaper to support.
Making one-off tools to automate tasks, or even building a set of internal tools for handling certain operations is much more feasible when you can churn it out in an hour and be done with it.
Just today I spent an hour modifying an old CLI tool to help process some customer data I could get as CSV. If I were to make a GUI tool for this job, it might take a week or two and the result would be far less flexible.
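For flavor, the whole tool is roughly this shape (a sketch in Node/TypeScript; the column layout and the filter are made up, and a real version should use a proper CSV parser to handle quoting):

    // One-off CSV filter: read customer rows from stdin, keep the ones we
    // care about, write CSV back to stdout.
    import * as readline from "node:readline";

    const rl = readline.createInterface({ input: process.stdin });
    rl.on("line", (line: string) => {
      const [id, name, country] = line.split(",");
      if (country === "DE") {          // whatever today's one-off filter is
        console.log([id, name.trim(), country].join(","));
      }
    });

An hour of that kind of thing beats two weeks of GUI plumbing.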
> It is amazing how much we regress as software gets bloated with feature creep.
Yes. This week I set up a new Windows 10 laptop for an elderly gentleman, who wanted to continue to play the classic Solitaire game. I eventually found the edition that shipped with Windows 3.0 and installed that. I then tried some of the standard Microsoft free games you get now and had a poor experience:
Horribly slow to load. Advertisements in your face constantly. Demands that you "sign in" to Xbox Live. Not having internet connectivity is treated as weird. Noisy user interface; multiple clicks to start a game.
Compare this with the old-fashioned games e.g. Solitaire, Minesweeper:
Loads instantly, ready to play. No advertisements, connectivity requirements, or user data harvesting. Even the simple graphics add an aesthetic all of their own, but that of course is a subjective opinion.
I live in a nearly perpetual state of outrage over the need to sign in and be tracked for things that have absolutely no reason to require an internet connection.
I'm even more outraged when I see popular video games that clearly target children, getting them addicted and buying in-game currency to progress (or worse, currency that does nothing at all, like Fortnite skins). I played fucking The Legend of Zelda on my Nintendo and it was great. Whatever happened to creating a product and selling that product once? If it doesn't suck, we might buy the next product(s) you make.
I am no fan of IAP-laden games, but there's a lot that skins with no obvious game effect do in a multiplayer game:
* status games - I have this old/hard to get/expensive skin and you do not.
* social games - me and my buddies all dress identically/similarly. Gang colors, the All-Shrek Squad, GTA's roving purple/green alien gangs...
* role play games - does dressing up like Superman make you want to play the game differently than dressing up like Little Bo Peep, or like Freddy Krueger?
This has very little effect on a single player game (though I know I sure do spend a while making my character look cool in games with character creation and visible equipment) but “choosing your visual style” is an important part of social signaling.
And of course selling this stuff to the captive audience in your popular game is a pretty nice way to keep a small army of artists busy cranking out stuff, so you don’t have to worry about firing them while the next big project is in the planning stages. Or to just have a giant cash cow of a gentle IAP funnel with a few bits of whale bait here and there.
Actually, let me amend "no obvious game effect" to "no effect on the game's mechanics or statistics", since these can be some pretty obvious game effects.
A few years ago, I made the mistake of installing the utility software that came with my new Razer keyboard - I thought it was some special driver thing - and I had to make a new account and sign in? It's a keyboard.
When you plug Razer stuff into a modern Windows 10 computer in the default configuration, Windows will actually silently download and install their crapware. Even if you are not admin. Even if the machine is part of a domain(!)
Oh no! That's awful. I'm still on Windows 7 and I moved to a Logitech keyboard that I prefer - but while I don't think I'm going to try any other Razer products in the future I'm aware many other hardware companies do this sort of thing as well.
None of the games I play on my Switch nags me to sign up or buy anything. Mostly Super Mario Odyssey, New Super Mario Bros and Zelda, but the same goes for most Nintendo games.
Even programs that are _not_ done well get replaced with infantile crap.
My wife really likes Scrabble, in-person ideally, but on-line as well. The iOS app was something you could sign in to, or play as guest, but with login you could play your friends. It was clearly buggy as hell, unstable, with all sorts of weird quirks. When it worked though, it was fine (enough).
They've basically started shoving users away, and now there's a new app. I think you might have to sign in with Facebook or some such nonsense, but anyway, it's like the expectation is that it's hung over a baby's crib. Flashy blinking cruft everywhere, and of course the board is shrunk to make room for the superfluous, distracting crap.
I'm sure the latter version is only populated by bots now, because it's well nigh unplayable for a human.
The beauty of the web as we know it has nothing to do with its efficiency. For applications, it is, as you wrote, inferior in speed, as well as many other attributes (memory usage, CPU usage, plenty of others). That is not the point. The point is that web apps are distributed instantly to users, on demand, anywhere, extremely cheaply, with no middlemen necessary. (The web is not the only technology with these traits, but in practice it is the only widespread one.)
Back before the web was prominent, software had to be distributed via physical media, usually floppies and later CDs. This inherently limited the reach and complexity of software - physical media is expensive, especially floppies - so distribution required either the creator(s) mailing software to users, who would have to pay a fee, or a substantial investment by a publisher, which again incurred a (larger) cost to the end user. Almost any widespread software required a middleman (publisher) and a trip to the store for the end user.
Allowing anyone to create and distribute software in a format simple enough to be used easily by anyone has democratized this system. With more programmers who might not practice the craft professionally, and a desire for speedy development, higher-level languages and frameworks sprang up to fill the knowledge gap and enable even easier development.
That's not to say we should not appreciate older software - as you wrote, the aesthetics ought not belie the challenge and art of software that had to fit into tight spaces, where the constraints usually forced developers to place function over form. It's a different software development paradigm, not necessarily better or worse, and older software can be as impressive as newer software, or even more so. But we shouldn't write off the new things either, even if they're worse in some areas.
(I don't think all of it looks completely ugly, either: Windows 3.1, for example, exhibits a number of traits of modern flat design - single colors, mostly a lack of shadows, many simple icons and interface elements. Windows 1.0 even used the hamburger menu icon. What's old is new again.)
> For applications, it is, as you wrote, inferior in speed, as well as many other attributes (memory usage, CPU usage, plenty of others). That is not the point. The point is that web apps are distributed instantly to users, on demand, anywhere, extremely cheaply, with no middlemen necessary. (The web is not the only technology with these traits, but in practice it is the only widespread one.)
Apart from the speed and convenience of clicking a link to visit a web app, those attributes all sound to me like tradeoffs that benefit the developer at the expense of the user.
The software I remember, from the '90s and earlier, put the user in control. Licenses were one-time costs, and they were perpetual. Software was much more thoroughly tested before being released to manufacturing (through the golden master process), since it was a major gaffe to release a broken app that had to be updated with fresh physical media.
Unlike today, there was never any sense that the developer could pull the rug out from under your feet, at any moment. The title of the article is a testament to the (now archaic) notion that software might be considered “complete”. That it seems so quaint is such a terrible shame.
Now we have so much software offered for rental only, like the Adobe stuff, or ad-driven free sites which are liable to change drastically (with no option to skip the “upgrade”) or be cancelled outright like so many Google products [1].
Somehow, in the past few years I’ve picked up a nostalgic fever for retro computing [2] which has only intensified since Covid began. I don’t know how to explain it, but here we are. Perhaps I’m just getting old and finding it increasingly difficult to relate to young people, who seem to jump from one social media fad to another, like locusts.
> The title of the article is a testament to the (now archaic) notion that software might be considered “complete”. That it seems so quaint is such a terrible shame.
Your comment reminds me of a pretty jarring experience moving from embedded systems into mobile (iOS) apps. I asked the lead how we know the software is “done” and can ship. He looked at me like I had horns growing out of my head. “We are never done. We just keep developing and releasing until they tell us to work on something else.” This idea that your program can be done and you release the final version is turning into a relic from a lost age.
It’s also sad from the user’s point of view: as updates get more automatic, you have to take deliberate action to stay on an old, working, familiar version. If you’re not careful, you can reboot and your software looks and behaves entirely different. And in the web world it’s impossible! You don’t even have a choice. You are running whatever version the developer decrees you should be running.
Oh man. So painfully true. As I write this message on the latest Firefox mobile, which upgraded automatically. And it is one of the worst software upgrades I have ever seen. And there is no easy way to revert to what worked before, of course. At least the most recent patch brought back the back button (no joke). What a treat!
That reminds me, I need to go dig an old version of Firefox out of one of those shady APK websites so I can get tab queue functionality back. I used that easily 20 times every day to remember URLs and such ("Share > with Firefox").
It'll reappear eventually. But until it does, Mozilla has decreed that I shall use an out of date web browser.
>Software was much more thoroughly tested before being released to manufacturing (through the golden master process)
>since it was a major gaffe to release a broken app that had to be updated with fresh physical media.
Well, maybe with the exception of Office 6.0, which I heard described as 'so full of bugs it practically walks itself'.
It came on about 9 floppy disks. I had to install it in an office with about 15 standalone PCs and so had to do the disk 1, disk 2,.. shuffle for each PC.
Microsoft is not exactly the company I would point to as an example of quality release management back in the floppy days. But lots of other companies got it right. That said, Office was super complex and an extremely large, rushed package (the competition was gaining ground).
That is true. But some companies got it very, very right. They made software that has stood the test of time.
The same cannot be said of present-day stuff, which seems to change on a weekly basis. I don't know how anyone will ever be able to use the old Facebook again once they shut it off.
Weekly? Several times per day. Mind you, I'm going with the flow and I'll be more than happy to do things that way too - it is the competitive arena we live in, and if you can't beat them, join them. But I do feel that the code I wrote in the past was more mature, higher quality, and had more longevity in it, simply because the pace wasn't so idiotically high.
Isn't that just survivorship bias? The old software that survived either had massive lock-in or was good enough to survive. In another 20 years, some software from today will survive, and that too will be the stuff that was big enough, caused enough lock-in, or was good enough to stand the test of time.
It is and it isn't. My point is that back then it wasn't possible to update software continuously. Thus, the development model was very much focused around putting out discrete, fully functional versions.
Today that isn't the case. It's all about continuous rollover. So in 20 years the software that "survived" will be the same old software that we're talking about now (from the 90s and earlier). All of the present-day software will be continually updated and may be unrecognizable in 20 years.
The oldest proprietary software I came across (approx. 1995) had an LPT hardware key and stopped working as soon as you disconnected it. You could connect your printer through the key (it was a pass-through), but it behaved weirdly.
Oh, and the keys were prone to hardware failure, and replacement took weeks to arrive.
While despicable, and typical for certain classes of professional software, dongles were never required for any of the software I’ve ever used. Even the Adobe tools like Photoshop and Illustrator did not require anything other than a serial number.
I can imagine a modern retro computer with modest specs running on an A8 processor, in a compact form factor, with a super minimal kernel and TCP stack, that could do 90% of the tasks we need to do today.
I went to a restaurant tonight. Because of COVID-19, we were seated outside. Instead of being given a menu, we were told to scan a laminated QR code on the table. That loaded a website with the menu.
From one point of view, this is a huge waste of compute. The phone camera is scanning for QR codes whenever it's open, which is a waste. We're calling up cell towers to do megabytes' worth of downloading to get a website, when the text of the menu is at most a couple of dozen items.
On the other hand, the restaurant knew we’d have phones and didn’t want to either sanitize laminated menus or print and toss plain paper menus. You could imagine a system in the 80s where you’d use a CueCat style scanner to get a simple barcode for a text menu… but nothing was ever as widespread then as smartphones are now. The sheer ubiquity of computing makes it easy to solve a novel problem with overwhelming resources.
Those are way too limited, but I could imagine something a little more powerful - a 1024x768 pixel buffer, 64 MB RAM, and 100 MB of storage. All apps running natively on this (or even under emulation) would load instantly and the latency would be unreal.
Anyone interested in building something like this in physical form? Have access to ME/EE expertise and supply chain that's second to none, contact info in the profile.
Adding to your point, even with the artificial limitations of pico-8, massive hits have been born in it, such as Celeste: https://www.lexaloffle.com/bbs/?tid=2145
That's the basic offering of Microsoft's Power Platform, except for the REPL part. Scripting is done graphically through Flow, the backend database is CDS, and reporting is through Power BI.
As a developer, I think the whole thing is so convoluted and asinine it should curl up and die in a fire. It will spawn an entirely new area of bespoke, non-maintainable yet critical business tooling.
Great software at great (big) prices. I mean, if Minecraft had been published in the '90s, it would probably have cost $80, which is $160 in today's money (making up numbers here), instead of $10. It is harder and more expensive to make software in constrained environments.
If a restaurant can't be bothered to clean their menus between guests can they be bothered to clean the dishes, glassware and prepare food without issue? It seems like a small thing but I'd expect that to happen anyway.
On the plus side, they chose the environmentally friendly option, the alternative would have been disposable paper menus.
They likely already have commercial dishwashers. Maybe there’s a machine that can do high-temperature cleaning of laminated menus with minimal labor, but I doubt it’s cheap.
It's more of a faff to clean laminated menus (disinfectant spray, cloth, manual work, each one cleaned individually) than to clean dishes (just stick them in the dishwasher and turn it on).
I don't know the innards, but I don't think QR decoding is that intensive. I seem to recall you can feed popular open-source decoders a frame at a time from a phone camera and it isn't really a bottleneck.
It's not that hard, all things considered, but from a 90s PoV it's a lot of work: you have to identify possible QR areas, deskew them, try the QR algorithm, see if it decodes to something, and then show a pop-up on screen.
Yeah, but... people write these libraries in garbage-collected bytecode languages. I have seen zxing (Java) run on phones from 10 years ago, much less powerful than today's, and it could still process the camera feed at a decent framerate without any special tricks.
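For what it's worth, in a browser today the whole locate/deskew/decode pipeline is a single library call. A rough sketch with the jsQR library (the camera/canvas plumbing is assumed to be set up elsewhere):

    // Per-frame decoding: draw the current video frame onto a canvas, hand
    // the raw RGBA pixels to jsQR, and see if anything decodes.
    import jsQR from "jsqr";

    function scanFrame(video: HTMLVideoElement, canvas: HTMLCanvasElement): void {
      const ctx = canvas.getContext("2d")!;
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      const frame = ctx.getImageData(0, 0, canvas.width, canvas.height);
      const code = jsQR(frame.data, frame.width, frame.height);
      if (code) {
        console.log("decoded:", code.data);
      } else {
        requestAnimationFrame(() => scanFrame(video, canvas)); // try next frame
      }
    }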
> The point is that web apps are distributed instantly to users, on demand, anywhere, extremely cheaply, with no middlemen necessary. Back before the web was prominent, software had to be distributed via physical media, usually floppies and later CDs.
You're conflating "web apps" with "distributing through Internet" and forgetting that between floppies and web apps, for about 15 years the main way of getting applications was downloading regular pieces of software through FTP and HTTP ("download the installer" in Windows world, a package manager in Unix-like worlds, etc.).
That's not the same thing. Yes, you have distribution over the internet. For that grandpa needs to:
1. Find the site: not always easy. Still, same level of difficulty for webapps.
2. Find the HUGE download button: not easy, it's incredible how many people don't figure it out. Still, same difficulty as creating an account or logging in.
3. Download the file: might be blocked by security policies or corrupted by overzealous antiviruses.
4. Find the downloaded file: quite challenging for a good chunk of people, so challenging that mobile apps try to do everything possible to hide files to not scare users (ergo the app-centric OS models).
5. Launch the installer: see security policies, installer bugs, corrupted or out-of-date local OSes, plain old user error, memory or disk space issues.
6. Find the installed app afterwards: not an easy challenge for many people.
The desktop funnel was so long and fraught with peril that the vast majority of desktop users would get help from family or people they knew. There were (and are) thriving PC help businesses around. Also, despite my being a PC fan, PCs never reached ubiquity the way mobile devices have, despite being the only option for 30 years and despite the web helping them with usability.
These are good points but in practice it wasn't that hard to deal with floppies, CDs, etc. I made a living the first third of my career programming products that used physical media (from my own language distributed on floppies to Visual Basic 5&6 at Microsoft). It wasn't fun, but arguably the product introduction/update cadence was not worse than what we have now.
Back then you had to be super careful about creating a strong product that wouldn't require too many updates. Now you have CI churning out multiple new versions a day, but one could argue we've gotten lazy about quality and documentation. And for the power user it was good to have complete control over whether you installed an update (and equally bad for the company, because it was hard to know who had what version).
These days you have to look for release notes, newsgroups, Slack channels, Discord channels, tweets, FAQs, KBs, and so on to make sure you know what's going on. Gets a bit tiring. As a web developer I of course love the fact that I always know what version of the software my users are on, modulo browser differences.
Not arguing for a return to the past, not at all. I'm just saying that after 30+ years as a developer I don't feel that we've served customers any better in 2020 than in 1985.
Oh man, the US release, at least, of the original Final Fantasy had so many bugs in the combat system that you could just about throw the manual in the trash.
- Weapons' special effects would never trigger
- They failed to dereference the index for critical hits, so the higher a weapon's index, the better its critical rate (see the sketch after this list)
- A bunch of spells either don't work or have the opposite effect they are supposed to
- The code for calculating run chance is so badly messed up that it reads the wrong field, from two characters down (so the last two characters read from some coincidental memory location, making them unlikely to run)
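If it helps, the crit bug is a classic missing-lookup pattern. A hypothetical reconstruction in TypeScript (obviously not the actual 6502 code; all names invented):

    // The weapon's position in the weapon table is used directly as its
    // critical-hit rate, instead of being dereferenced into the table.
    interface Weapon { name: string; critRate: number }

    const weapons: Weapon[] = [
      { name: "Wooden Nunchucks", critRate: 1 },
      { name: "Small Knife", critRate: 5 },
      // ...dozens more; later entries get high "crit rates" regardless of stats
    ];

    function critChance(weaponIndex: number): number {
      // Intended: return weapons[weaponIndex].critRate;
      return weaponIndex; // bug: table position doubles as crit rate
    }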
[edit]
Found a fairly comprehensive list of bugs; totaling nearly 50:
Given my experience with modern software, I can say with confidence that had Final Fantasy been released 5 years ago, all of these bugs would have open issues but none of them would actually be fixed. Each new patch would only exist to arbitrarily change the interface, update the embedded adware, or add features no one asked for.
I guess that depends on the definition of "complete". Games back then tended to ship with whole features simply not working. The already-mentioned Final Fantasy series comes to mind; I think Diablo 2 shipped with a Holy Fire aura skill that was supposed to deal periodic damage in a radius but simply didn't.
Games also had their fair share of bugs that nobody would really call "easter eggs". For example, Elder Scrolls Daggerfall had a tendency to generate dungeons with no entrance/exit, so once you got in you were stuck. Ocarina of Time, a console release, had three different versions released to fix various crashes and possibly game-breaking bugs (https://www.zeldaspeedruns.com/oot/generalknowledge/version-...).
I guess it mostly comes down to the fact that players just didn't notice a lot of things (the "blind" status effect in Final Fantasy 6 did nothing, but that is really hard to notice, as the hit chance it affects is called a chance for a reason), because they didn't know how things were supposed to work, or they just worked around bugs where possible because patches often weren't an option.
[Edit]: I'm not trying to argue that the current way of things is in any way better than the old, both have their fair share of pros and cons.
> Back before the web was prominent, software had to be distributed via physical media, usually floppies and later CDs.
This is blatantly false. Home users could digitally obtain software via online services or BBSes, businesses with minis or mainframes could obtain software from their vendor or reseller via modem, and universities could trade software via ftp. All these sources could (and did) cross-pollinate.
The general public did not use any of these services. Yes, BBSes and the like were notable, but used by a tiny minority. Most computer users bought boxed software, and that was it.
Was anyone actually selling software over the internet in the 80's/early 90's at scale? I was under the impression that the answer is "no", but if there is a counterexample I'd love to see it.
That's true. A lot of people got the shareware versions online (whether by FTP or more likely from a BBS) and decided to buy them after trying, so in that sense it's selling software online, but I get what you're saying, it's still physical media.
Forgive me if I'm pointing out something obvious about shareware, but often no further digital download was needed, because the shareware or nagware download already contained the complete software package. (This does not apply when you're distributing one level of a ten-level video game.) A user could complete the purchase of the full software and receive a key to unlock it via phone, snail mail, or email, with no transfer of physical media.
And lest we forget, a lot of shareware was based on nothing more than an appeal in a readme.txt and the hope that users would send money if they felt the software was useful - nothing more formal than that.
> The general public did not use any of these services.
The users of Prodigy and CompuServe were very much part of the "general public."
I'm sure me and all my high school friends thought of ourselves as somewhat elite for dialing into BBSes, but that's just because we were jerks. It's not like we were a secret society or something, and there were plenty of us. (You might think of computer users as an elite of that era, since computers were middle-class tools and not for everybody, but users of online services or BBSes were not an elite subset of that group at all. Maybe that's the big lesson here.)
> Most computer users bought boxed software, and that was it.
You're obviously unfamiliar with shareware. Adorable. (The wiki page on shareware is inadequate enough that I guess I shouldn't wonder too much about it.)
It's too bad, though, because the very lightweight and high quality software that people used to make for each other and sell or give away serves as a counterexample to a number of the assertions about the web and software you made in the grandparent post. A bigger topic, honestly...
> Was anyone actually selling software over the internet in the 80's/early 90's at scale?
Okay, first of all, you need to abandon the idea that "the internet" was the entirety of digital software distribution in the late eighties or early nineties, since you're talking about the last few years when really interesting networked systems might not rely much, or at all, on TCP/IP. I wouldn't blame you for not including that era's CompuServe users as part of "the internet," since their connection to the internet involved nothing more than exchanging email with internet users at that time (I'm not even sure when they got that ability). Although they were definitely part of the "general public." Anyway.
> Was anyone actually selling software over the internet in the 80's/early 90's at scale?
The video game companies and individual developers of games or utilities were probably the first to do it at scale. Doom, in 1993, was probably the biggest example of the era but it was relying on a well established method of distribution.
Doom fulfills the "scale" part of your question, but it's a little deceptive - most software isn't like that. The great thing about early digital software distribution was that tools and games that weren't necessarily big enough to justify creating a company had a path into consumers' hands. Doom would have gotten out there regardless, but a tool like, just as one example you might have heard of, pkzip really was a product of that environment. It eventually was shrink-wrapped, but only after it had established itself a decade prior.
People were accessing applications and data on remote systems and using digital distribution for software. The big differences are the available bandwidth (which does not depend upon the web) and that HTML/CSS/JS are platform independent client side languages.
That said, much of the inefficiency of the web comes down to implementation. A site like HN is efficient because it has a well-defined purpose and the design reflects that purpose. Many websites have less clearly defined purposes, provide a much larger scope of functionality, or depend upon general-purpose frameworks. Those are all ways to diminish performance with little perceptible benefit to the end user, a.k.a. bloat.
As for modern development, I'm not sure it is much more accessible than it was in years past. Yes, a big part of the appeal of web development in the 1990s was its accessibility. On the other hand, that was comparing mature general-purpose development tools to one with limited scope. You certainly weren't developing standalone applications in HTML/CSS/JS in the early days, and most of what was developed for online use was geared towards accessing data and less so towards content creation.
Function over form, any day. Unfortunately, we are where we are: we have cycles to waste on useless eye candy, with the result that we are now cycles short of delivering a good user experience. Somewhere, somehow, user experience and design became conflated. The resulting mess is an incredible hodge-podge of technologies, protocols, and hardware, all trying to pretend it is something it is not: software running on your local machine.
Running older software on modern hardware is a great way of seeing what we've lost along the way.
Exactly. This is why I love the classic Mac OS ecosystem. Yes, the classic Mac OS lacked preemptive multitasking and protected memory, which led to system instability that hampered the user experience. However, most applications written for the classic Mac OS strictly adhered to the Apple Human Interface Guidelines, and the result was GUI applications that were easy to use, had consistent conventions, and did not look bad at all. The classic Mac OS might not look as fancy as modern desktops, but its usability is unparalleled even compared to modern macOS.
It's always pleasant booting up one of my classic Macs, whether I'm running System 7.5 or Mac OS 8.1 or Mac OS 9, and using programs like Microsoft Word 5.1 or ClarisWorks. These environments are quite simple compared to the desktops and applications of today, yet they are so complete. Microsoft Word 5.1 and Microsoft Excel 4 on my Power Macintosh 7200 are more comfortable for me to use than LibreOffice on my 2013 Mac Pro. There is a lot that software developers could learn from the classic Mac OS and Apple's UI guidelines of the era.
It is easy to make things fast if you cut out 90% of the UI trimmings. No drop shadows, no shaders. No fancy data synchronization or subtle animations on every click and tap. Everything bog-standard WinForms-style UI. And it will be blazing fast. But nobody wants ugly cross-platform apps, so here we are.
Some of the problem with web UIs is poor discoverability of shortcuts. With some modest effort, you can create a web UI that is as easy to use for an expert as a mainframe form. It won't be exactly the same as a mainframe form (you could do that, but we're out of "modest effort" territory here), but it would be as easy to use. But there's no good story on how to discover the expert interface. For one thing, nobody even expects there to be an expert mode, so there's a vicious cycle in which it is never developed -> never used -> never developed, etc.
But all the capabilities are there to turn a "down" keypress into going to the next form element down, or to have Ctrl shortcuts for popular fields, etc. A nontrivial part of the problem is just that people don't even think about expert-mode users, let alone start trying to program for them. People see a web page and they are in a novice-mode UI mindset, both developers and users.
A Web 1.0 interface with JavaScript expert-mode shortcuts, and some selective pre-loading of things like form screens, could probably just about match a mainframe UI on its own turf, then extend it with modern touches. And while applying All The CSS could slow it down again, the results could still be at least a bit prettier than a mainframe form.
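To make that concrete, here's a minimal sketch of such an expert mode in browser TypeScript (the field id is hypothetical):

    // Enter advances to the next form field instead of submitting, like a
    // mainframe form; Ctrl+Q jumps straight to a "popular" field.
    document.addEventListener("keydown", (e: KeyboardEvent) => {
      const fields = Array.from(
        document.querySelectorAll<HTMLElement>("input, select, textarea"));
      const i = fields.indexOf(document.activeElement as HTMLElement);

      if (e.key === "Enter" && i >= 0 && i < fields.length - 1) {
        e.preventDefault();
        fields[i + 1].focus();
      } else if (e.ctrlKey && e.key === "q") {
        e.preventDefault();
        document.getElementById("quantity")?.focus(); // hypothetical hot field
      }
    });

None of that is hard; it just never gets built because nobody expects it to exist.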
At my job I am currently trying to mitigate XSS issues, because the web application in place uses jQuery. It is a security risk by default. Rewriting the entire application would be the only fully secure solution, but it is too costly.
If it were an old-fashioned desktop application, we would not be in this mess...
Web applications are really hard to make secure and keep secure.
I moved from classic Vim, quick and snappy, to VSCode, which behaves exactly like a "bloated web app" and is often so slow I can feel it catching up to my typing. It's a trade-off.
With VSCode, I got an editor that mostly just works. That is actively maintained, with major bugs fixed, and that doesn't require me to maintain and version its configuration, experimenting with different options and plugin combinations. That has a much lower threshold for plugin development, and a much friendlier development environment - JS for both the editor and plugins is just so much more convenient than VimScript and C.
Yes, Vim is much faster and more minimalistic, and after learning it, I install vim-emulation plugins into every editor I use (including VSCode). But being fast is not the only thing an editor can be.
I'm not a personal trainer or motivational speaker, but brighten up.
As a mathematician, everything since the 1960s has to me been this cycle, and I think the key way to approach this is from the perspective of knowledge.
Don't look at software or data, but look at fundamental science, and you'll have less of this sort of burn out.
I read papers from 1950, and I pretend that I could have a coffee with the author later that day, and savour the fact that you can't take people's discoveries away from them, even if history or humanity has.
I believe the complaints should be interpreted as problems in the workplace.
At my last job, people complained about the printers being shitty... they were fine. Jams would happen, bugs at times, but a really low error rate and proper function overall. The complaints were mostly due to an extremely depressing work environment.
When work becomes too much of a drag, people seek a scapegoat (organic or digital).
> Joe does not know the difference between aesthetics and functionalism. Jill, on the other hand, is a Linux aficionado who goes out of her way to make things hard.
Though I'm not a big gamer, one thing I like about game software is the combination of interesting interfaces and speed.
>Modern web apps are so much more inferior in certain areas when speed, immediacy and efficiency is required.
This is a weird thing to suggest.
It's now possible to run a complete productivity suite in the browser. Webmail is WAY more usable now than it was in 2005. Dynamic web apps made this kind of thing possible. I absolutely do not want to regress back to the time when every web app change required a full-page submit; I mean, come ON.
At the same time, it's unassailably true that most apps - web, native, phone, or otherwise - are crap. But don't confuse bad developer choices with a bad platform, and don't let nostalgia convince you that everything was better in $some_prior_era.
Word 5.5 was incredibly powerful and amazingly stable. I used it to generate 1,000+ page manuals. Word has progressed in many ways but it is generally not better for huge documents than it was then.
I'm thinking this could still be useful as a distraction-free word processor for writers today.
It runs under DOSBox perfectly and saves Word .doc files to the underlying file system (which could well be a Dropbox folder if cloud syncing is needed). Or even a Google Drive folder, which makes it interoperable with Google Docs.
And because it's a completely text-based UI, there's no temptation to mess around with formatting (except maybe bold text).
Bah. I was a very serious and intense user of Word 4.x and 5.0, but 5.5 stripped out its native (and idiosyncratic) interface for a character-mode and hokey version of Windows' CUA menus. I hated it, and stayed on 5.0 until I eventually had to switch to Word for Windows (which, fortunately, eventually became usable).
Cool! This was the absolute acme of word-processing software -- it's been downhill ever since. If only it worked on Ubuntu 10.04; Linux has gone downhill since then.
Word 5.5, Harvard Project Manager, and Santa Cruz Operation Unix, those were the days. I got stuff done.
I like to install stuff like this in DOS 6.2 running in a VM on my IBM x3650 M3 with RAID 5, running Server 2012 R2, with an IBM LTO6 tape library backup system and Backblaze cloud backup...
I like the screenshot (the Twitter link). Looks like he is:
- Running WP 6.2
- Inside a dosemu2
- Inside a Linux VM (WSL, likely)
- In a Windows 10 terminal window
I'm not sure if dosemu2 includes actual DOS code, but if so, it's an MS OS running Linux running another MS OS running WordPerfect. Either way, amazing inception.
I'm assuming he's using that odd WSL+dosemu2 detour rather than a native Windows DOSBox build because, afaik, DOSBox makes its own fake terminal (it simulates the pixels, not the text). So with DOSBox he can't run WP 6.2 straight from the active terminal window.
It's also kind of a testament to how well WSL works in practice that one can go "I want $OBSCURE_THING. Damn, doesn't work on Windows, hmm I'll try the Linux version ok that works done". This is the same thing that finally got me, a lifetime Windows user, access to important productivity software such as gti, sl, cowsay and lolcat.
My favorite editor of all time was "Leading Edge Word Processor", a lightning-fast piece of word-processing software for DOS with all the trimmings that comfortably fit on a floppy disk with ample room for documents. The company that made it, Leading Edge, was primarily a PC clone vendor, and all their software disappeared overnight when they were acquired by another clone maker and their entire brand winked out.
For writing batch scripts and reading large text files I would use a utility called sled ("slick little editor"). That was a hand-coded assembly-language program of only a couple of kilobytes that could do most of what vi does and work with anything that fit into memory. It seems to have completely disappeared from history.
WordStar was big at school - the first half of class was always everyone passing around the DOS and WordStar disks to boot their computers. A great way to start your day or wind down; it really sucked at lunch, because you were hungry and there was nothing to do but watch the second hand spin while everyone was booting up.
OH MY GOD I remember LEWP, but my nostalgia here is fundamentally grounded in its instability. See, I kept myself in beer money by helping a few grad students recover LEWP documents that LEWP had obligingly munged badly.
Great tool -- for people who charge money to fix it when it breaks!
I remember there was a master index file which implemented a sort of virtual filesystem allowing long document names with spaces. If that got corrupted it was a bad thing - it only happened once, but I lost an evening of work. Otherwise it never gave me any problems. I remember that document recovery used to be a big thing, so I guess a lot of programs had issues - or maybe the technology at the time was just less reliable.
I remember one of my school teachers seemed to think that inserting a floppy disk required you to slam it in with as much force as possible, and she could never understand why the disks stopped working [point to crease in media, point to abrasion on media, point to tear in jacket] -- complete mystery, can't have been anything she was doing. ¯\_(ツ)_/¯
It was something about LEWP's goofy virtual filesystem, absolutely. Contemporary tech was really stable when crappy developers didn't get overcute.
I made literally thousands of dollars solving issues with LEWP among a cohort of grad students between 1990 and 1992.
I was never called upon to do similar magic for anyone who used a more serious and less idiosyncratic tool (at my university, Word was the mainstream option).
A close relative of mine swears by WP 5.1 and running as many programs as he can from the command line.
He also:
- Refuses to upgrade to Windows 10 (he's trying desperately to stay on XP for as long as he can, although I think either 98 or even 3.1 is his favorite. He has, however, made peace with Windows 7)
- Will buy out-of-date but refurbished laptops to achieve this goal.
- Prefers obscure browsers over Chrome, IE/Edge and even Firefox.
- Will actively block JS from loading in browser. It makes for a 'unique' browsing experience.
As a result, I get the impression that he's created some sort of high-end IT security policy in the sense that no nefarious hacker would bother even looking for hardware and software that obsolete and obscure.
He's in his mid 70s. I've tried to get him to migrate to a lighter-weight Linux distro running XFCE or MATE, but he seems adamant about sticking to his guns. I kind of respect that dedication, even with the mind-boggling frustration it comes with.
> I kind of respect that dedication, even with the mind-boggling frustration it comes with.
I dunno. I get the "respect the dedication" thing, but at some point this person is so afraid of change that they'd rather become a burden to others who have to deal with them than adapt to a changing world. At least for developers, part of the job description is to actually make an effort to learn and use new things.
I've worked with a number of these people, where whenever you introduce a concept that they haven't worked with for 25 years (e.g. a new VCS, a new/upgraded programming language, a new way of building, a new framework, even minor changes to code they wrote 15 years ago, really anything), they become incredibly resistant and intransigent, and they make change significantly harder than it needs to be.
People who are adaptable are forced to work with their ancient (and not always better) systems, because it's the path of least resistance. These kinds of people can be a real problem.
But it makes sense to be resistant - you're trying to change something that they perceive to have worked for 25 years (or whatever). I bet that if they perceive that something to not work they'd accept that fix, but despite what some will say, more often than not if something isn't broken then it is a bad idea to try and fix it.
Part of gaining experience over the years is also gaining the experience that people often want to mess with things that work (often with good intentions) and end up making things worse.
The rest of us have mostly given up trying to troubleshoot his IT issues. For his part, he recognizes that he's on his own and will take the time to (try to) fix whatever issues he has.
Interestingly, to your point, I've worked in banking for a number of years. The number of institutions that depend on legacy systems coded in otherwise extinct languages is concerning.
On the other hand, I've met some of the consultants banks use for IT maintenance. These guys (mostly in their 50s) can easily earn 40K a month (in Europe!) just for knowing COBOL and late 70s/early 80s era infrastructure.
I just got back from an acquaintance with an old Core 2-era Celeron running Windows 7. No GPU, no SSD. The thing flies. I was shocked - seriously shocked.
> Will actively block JS from loading in browser. It makes for a 'unique' browsing experience
It is unique in the sense that it has far fewer crappy paywalls, cookie consent windows, annoying "sign up to our mailing list" spam, tracking scripts, adverts, cryptocurrency miners, and pointless shit nobody cares about. Less RAM usage too.
With NoScript, turn off JS for most sites, then make exceptions for the sites that need it. Generally, it is a much better experience.
I was thinking that. And I guess, considering that his machines don't have a lot of RAM in them, blocking JS is a lot more resource-effective than running a bunch of browser plugins to achieve the same results.
DOS and pre-2000 Windows aren't supported very well in VMs, especially on a laptop, as they tend to hog an entire CPU core at all times. Not to mention the lack of drivers.
George RR Martin notoriously uses a DOS box with WordStar 4.0 to write his novels.
I get it, my mother stuck with WordPerfect for her entire career as an academic publishing editor. The reason was the "reveal codes" feature: without the ability to "pop the hood" and directly edit the representation of the document, fixing various weird edge cases was an exercise in frustration.
I used to love WordStar, it was an excellent Ctrl-Key editor.
I'm sure you could probably convert GRRM to something like Sublime Text and keep all his old WordStar shortcuts. Pick a nice font and size and just start typing.
My grandfather, who's in his 80s and has been using WP for the longest time, switched to WordGrinder about 8 years ago. He has co-written several books with WG as well. He likes it a lot, but it's definitely not as feature-complete as WP. He mentioned to me that he still starts up WordPerfect every now and then for some tasks.
When I bought a computer for him, an iMac, he asked me about WP, and I set up a DOSBox with it for him. But then I managed to compile WordGrinder and showed him how to use it, and he's been using it since. WordGrinder used to be a pain in the neck to compile on a Mac, but it's much easier now after the 0.7.x release. I still update it for him when I visit.
I find it strange that nobody seems to be directly commenting on the whole arms race we have with constantly upgrading operating systems, browsers, and software in the name of security or features, and how we end up abandoning perfectly good software, the sum of countless human-hours of work - and all for what?
It would help if the operating system were no more than a hypervisor and we ran every single application in its own VM that includes all its dependencies.
That's why I'm happy to use a tiling window manager that I set up years ago (bspwm) and have never touched since. Likewise, I learned how to use emacs over 10 years ago and, while it has gotten some nice improvements over time, it has never changed in any significant way. I now use it for coding, taking notes, and email.
I still use Office 2003 as it is the last version without the ribbon. The ribbon is still a really bad idea and I can't wait for it to go out of fashion again.
Wasn't aware of the dosemu2 method of converting text-mode DOS programs into calls to native text routines. Will try this with GrandView, a great old DOS outliner.
https://www.outlinersoftware.com/topics/viewt/6291 "I have put together a version of GrandView 2.0 for DOS that is completely portable, fully functional, and ready to run on any modern version of Windows, using vDos-lfn, a variant of the excellent DOS emulator vDos.*"
I have used this off and on for a few years with relative success (and pleasure).
Well this is awesome: dosemu2 seems to work well in 64-bit Linux. I just tried it with old DOS OrCAD (a schematic design tool).
32-bit Windows XP still works a little better in one way: OrCAD has a plug-in graphics driver architecture. Some sneaky person wrote a driver that uses DPMI to make Windows graphics calls. The result is that you can do things like resize the window or switch to full-screen mode, and the ancient DOS program thinks it's attached to a monitor of that size. So dosemu2 needs to merge with Wine for this to work properly...
I could be mistaken, but it looks like that feature is mainly for showing whitespace characters (space, tab, carriage returns), while reveal codes showed the formatting, like bold and italic. A bit like showing the HTML tags that make up a webpage.
I wrote a ton in WordPerfect. During college, they had WordPerfect on the UNIX and OpenVMS systems. Then, at a job, we did all our documentation using WordStar; WordPerfect was only for the legal team.
Because of that, I still tend to install joe and use jstar quite a bit on any box I use with any regularity.
WordPerfect was awesome. I constantly want to press "Reveal Codes" when editing in some crappy WYSIWYG markdown editor (looking at you, Evernote) that is causing me trouble.
For the youngsters: Reveal Codes would show, in reverse text, things like "[Boldface On]". As you pressed your cursor past it, that would be treated as a single character. You could copy, paste, or delete it. It would be as if <b> were a single character that you could manipulate.
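You could model the whole idea in a few lines - a toy sketch of that token stream in TypeScript (nothing to do with WP's actual internals):

    // The document is a flat sequence of tokens; a formatting code like
    // [Bold On] is a single token you can cursor past, copy, or delete,
    // exactly like a character.
    type Token =
      | { kind: "char"; value: string }
      | { kind: "code"; name: string };

    const doc: Token[] = [
      { kind: "code", name: "Bold On" },
      { kind: "char", value: "H" },
      { kind: "char", value: "i" },
      { kind: "code", name: "Bold Off" },
    ];

    doc.splice(0, 1); // delete the one [Bold On] token: "Hi" is plain again
    console.log(
      doc.map(t => (t.kind === "char" ? t.value : `[${t.name}]`)).join(""));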
I remember that the FAT filesystem had a limit of 8 characters for filenames, so you had to be creative with document names.
I do not remember ever having problems with WordPerfect or Quattro Pro. The only friction I experienced was with the dot matrix printer I used at that time.
I remember WordPerfect 6 having a mouse enabled GUI mode in DOS, as well as the text UI.
Not using WP, but I have gone back to using Eudora 7 for Windows after Gmail got even slower after the redesign. Eudora is lightning fast for handling mailing list cruft; alt-click on sender or subject groups matching emails.
I remember it being loved by lawyers. The Mac version for the old System OSes was pretty damn nice, especially compared to the crappy versions of Word.
I get the feeling that someone looking at the post at [1] might not do badly by cloning WP or, more probably, WordStar for the terminal.
I bought a washer/dryer at a regional appliance chain recently. Checkout was at a desk with a terminal. The implementation was modern, with high contrast colors and nice fonts, but unmistakably still “ANSI” [0]. The employee who checked me out absolutely flew through the invoice, contact, and delivery forms. It was really nice after dealing with so many companies that just use web apps with obviously overloaded backends.
More than a few years ago I worked for a legal newspaper. The editors had WP macros that converted to QuarkXPress codes so we could drop the files right into our layouts.
Wow, this really brings me back. I'm certain that my old WP floppies are gone at this point... is there a place to get a safe (from malware) and preferably legal copy of WP for DOS?
I am still using Paint Shop Pro 7.04 Anniversary Edition from 2000. It is fast and very intuitive and has all the functionality that I need from an image editor. I tried several newer versions and other programs (yes, this includes Gimp) but always came back to my beloved PSP 7.04.
I still use a program called "iPhoto Plus 4!" for basic image editing.
It came with an old Mustek parallel port scanner, bundled with an equally old 533MHz AMD K6-2 Compaq Presario running Windows ME (initially 98, later upgraded to XP) that I bought off a friend in 1999. The scanner is long since dead (I stopped using it in '02, I think).
It loads quickly, has basic functions but is better than MS Paint, I'm used to the interface, and have never needed to stop using it. For more sophisticated things I will use Gimp or other programs, but this program has always been there when I've needed it and it still works on the latest Windows 10 builds.
This feels performative and not like a well-reasoned choice that the poster would make in the absence of any observers.
I used WP for DOS back in the day. It sucked out loud, especially when compared to more modern word processors already available, like Word (for DOS! I don't even mean the Windows version). (The reasons are legion and really beyond the scope of this post.)
At the same time, yeah, it sucks that feature creep and whatnot often bloats good tools into something unwieldy and slow. But there are other, modern options available that wouldn't require having to shift your brain back to 1995 or whatever.
Shrug, I only mentioned it because there was a thread about using old word processors.
Regarding Word for DOS, I did try it out, but I think you're mistaken - it's capable but not as powerful or configurable as WordPerfect. I think you're calling it "modern" because by default it has CUA key bindings (Ctrl+C, Ctrl+V, Shift+Arrows to select, etc.). WordPerfect has those too, File->Setup->Keyboard Layout and then switch the default "Original" bindings to "CUAWP". This is better than Word, because you can edit and rebind the layout, even to macros. For example, I always use Ctrl+W to delete a word, so in WordPerfect I bound it to DeleteToBeginningOfWord. I don't believe that's possible in Word.
Another major drawback of Word for DOS is that it's fixed at standard VGA text resolutions (e.g. 80x25 characters), whereas WP supports arbitrary resolutions. I can resize my xterm to any size up to 255x128 and WP just works.
What are the "other, modern options available", with the requirement I've already stated - that I want it to run in a terminal? I would be very happy if there was a vim plugin that makes it a word processor, but I really think I've tried them all.
>I think you're calling it "modern" because by default it has CUA
No. I'm calling it modern because it was stylesheet-based, and followed the eventual dominant paradigm of working with ranges of text and attaching formatting, not inserting codes.
CUA menus and keybindings aren't part of my opinion of Word. (In fact, the eventual grafting of CUA menus to it really ruined it, IMO.)
Your "major drawback of Word for DOS" is really only a drawback for people trying to run it in an xterm in 2020, decades after its last release, so, uh...
I'm baffled at the insistence on a word processor in a terminal, honestly.
> I'm baffled at the insistence on a word processor in a terminal, honestly.
I don't know what to tell you, some people are more productive working in a terminal.
I like being able to ssh in, reattach to a screen/tmux session that has my email (mutt), development environment (gdb+vim+ycm), word processor (dosemu+wp), and so on. This is how I've worked for a long time.
I agree that everything I use a computer for could be achieved (far less efficiently, but without the steep learning curve) using GUI tools, but I value the efficiency gain and don't mind investing in learning new powerful tools.
I hear people insist they're more efficient in a terminal, and some tasks are friendly to this bias -- but not all of them. And when it comes to the ones not fundamentally text-mode in nature, I have yet to really SEE that happen from those same people when I've worked closely with them.
IME, these people are also the ones who insist that the tasks a terminal is bad at, and that they consequently have trouble doing efficiently (e.g., well-formatted text, or formatting text in line with a shared template) are somehow not important or legitimate. The super common pattern is a refusal to engage in HTML email, for example. Sure, mutt is fast for plaintext mail, but if you have to engage in some additional toolchain machinations to read (let alone create) an HTML mail, you lose a bunch of those efficiency points.
Maybe you're different! It's possible! But I haven't seen it.
As for myself, I absolutely still do composition in plain text 99% of the time, because I've been bitten by extinct file formats too many times. But I'm running native emacs under OSX, in a graphical environment, when I do it. When I need to do something for production to a client, I'm in Word, because it makes creating an attractive document far, far easier than with any prior tool.
(Including, it chagrins me to note, Word 5 for DOS.)
I think I'm pretty efficient, but maybe you're right and I'd be even more effective if I just used Microsoft Office and Outlook, though I doubt it. It's not like I don't know how to use a graphical email client, and I believe I can achieve tasks that would take you many minutes in a few keystrokes.
It seems like you have strong opinions on how a word processor should operate, but this seems like calling emacs more modern than vim, because it uses elisp (i.e. just your opinion, and not a universal truth).
I don't really mind what word processor you use, and think Microsoft Word is a perfectly valid choice if it works well for you. I think I'm happy with my workflow at the moment, and don't plan on changing it.
What is wrong with 1995? Nothing fundamental has happened in the last 25 years. Trains are the same (TGV, ICE, etc.), planes are the same (a little more efficient), TVs are worse, with ridiculous colors, house prices are way up, and the press is a farce now. And everything spies on the "user".
Only thing is that cat pictures can be transferred faster now.
Going back to ed would be a regression, but vim and WP are very good editors.
Anyone with a TheC64 Maxi and looking for an ascetic, hair-shirt level of simplicity and focus should check out "Kwik Write!" Basically notepad.exe. You can get about 5 or 6 thousand words of prose in memory. Has search and replace, soft line wrap, etc. It's my NaNoWriMo environment for 2020 (assuming I can keep away from Wizball.)
I have to, so I can eject the USB stick and plug it into my RPi, which will extract the files using the c1541 tool and git commit them somewhere. Delightfully pointless.
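For anyone who wants to replicate the pointlessness, the RPi side is roughly this (a hedged sketch; the mount point and repo path are made up, and I'm going from memory on c1541's "-attach ... -extract" usage from VICE, so check your build):

    #!/usr/bin/env python3
    # Sketch of the USB -> c1541 -> git pipeline described above.
    # Paths are hypothetical; verify c1541's flags on your system.
    import subprocess
    from pathlib import Path

    USB = Path("/media/usb")            # hypothetical mount point
    REPO = Path("/home/pi/nanowrimo")   # hypothetical git repo

    for image in USB.glob("*.d64"):
        # Pull every file out of the disk image into the working tree.
        subprocess.run(["c1541", "-attach", str(image), "-extract"],
                       cwd=REPO, check=True)

    subprocess.run(["git", "add", "-A"], cwd=REPO, check=True)
    # check=False: the commit is a no-op if nothing actually changed.
    subprocess.run(["git", "commit", "-m", "snapshot from the C64 stick"],
                   cwd=REPO, check=False)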
So, specifically on vim as a word processor: I've tried using vim as a word processor and it doesn't work that well, even with wordwrap configs enabled up the wazoo (and I've been a vim user since 1998).
I did use vim to write my dissertation but it was in LaTeX.
That being said, I feel like LaTeX has just too many features, making it more like Word. I like the .md format much better, particularly a blend of Markdown and LaTeX for the occasional equation.
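For example, pandoc-flavored Markdown allows exactly this blend: prose stays plain, and LaTeX appears only where an equation needs it (the heading and formula below are just made-up filler):

    ## Convergence check

    Halving the step size $h$ should cut the error by about a factor
    of four, since the scheme is $O(h^2)$:

    $$ e(h) = \frac{f(x+h) - f(x-h)}{2h} - f'(x) $$

Everything around the math is ordinary Markdown.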
I feel like eventually we'll return to the root of writing: to communicate. That process can take years, but I'm so, so happy that light formats like Markdown are making inroads into businesses.
Hmm, not really related to the story (or HN comment), but I thought HN comments became automatically dead when submitted... [0] I guess someone decided to vouch for this?
Is my guess right? Can someone shed some light on how this is not dead?
You guessed right: someone vouched for it, and it created such a good thread that I think we'll protect this one. Cases like this are one reason why HN's systems are porous: submissions of HN posts normally get autokilled, but someone vouched for this one and it came back.
Also, allowing weird exceptions keeps things unpredictable, which makes them more interesting.
I learned on WordPerfect for DOS because that was what my dad used at work. Then, when I was in high school, we moved to WordPerfect for Windows when we moved to Windows 95.
I always missed the simplicity of WordPerfect for DOS, but WYSIWYG always kept me in the Windows version. For some reason, I always had some kind of irrational fear of MS Word; but I switched in college and never looked back. Ever.
I've said as much multiple times: if you want to write it once and run it on nearly every platform with a screen, write it for DOS (or, more specifically, DOSBox).
...and if you want an actual GUI, then Win32 (and WINE for those not on Windows.) I have utilities I wrote in the Win95 era that I still use daily, the binaries continue to work fine in Win10.
Wow, the picture[1] he links to in the comment makes it look surprisingly capable. I remember using WordPerfect for DOS way back in the early 90's for writing elementary/middle school papers, but don't remember it being this capable. Either I wasn't aware of the features (likely, I was young and trying to do homework, not explore a word processor), or they were added afterwards.
PCem or 86Box with FreeDOS, for a somewhat accurate experience.
DOSBox-X or DOSBox Staging for playing games specifically.
These are both forks of DOSBox; upstream is dead and has been for a decade or so. It has had releases, but no real progress.
The upstream developers are actively hostile to outside collaborators (thus the forks), and basically censor the VOGONS forums they moderate to hide the existence of the forks[0]. Scummy.
I used to be a whiz at WP 5/6/7, often taking tech-support calls from remote family members at 3am. I could typically answer their questions without opening my eyes.
I hate the bloat in most of today's "office" software, but I go along with it out of convenience.
I still run Quicken 6 for Windows from around 1995. Runs fine using Windows 10 in VirtualBox on my Mac. Tried newer versions of Quicken and really don't like them. Can't find anything open source that is as simple and easy to use.
The world is really a sad place. :-(