This is the most embarrassing thing I've ever read. That thread goes on to extol the virtues of dogfooding, and it's absolutely true.
I personally believe the reason that chrome/edge/firefox and apps in general have become so resource hungry is because all the devs have been given beefy machines to allow them to compile on-device, and this has led to the devs basically ignoring the app's resource budget. The ultimate "it works on my machine" shrug. imo windows OS and browser devs should be given i3s w/4gb of ram, let them optimize their code for that experience.
> windows OS and browser devs should be given i3s w/4gb of ram
And slow HDDs. The gains of going from an HDD to an SSD have been squandered. The time it takes my computer to boot to a usable desktop feels the same as the bad old days with spinning HDDs.
I remember when SSDs were new and YouTubers were making videos of booting into Win7 with Office, etc. open in seconds. WTF happened?
My fairly high-end system with NVMe drives is absurdly slow at booting to my minimal Arch Linux installation's (tty) login screen. And I'm not some anti-systemd evangelist, but it certainly doesn't seem to be helping matters. Once the kernel starts loading, it shouldn't take more than a couple more seconds to get to login.
Yup, I got a relatively new AIO from work for free and set it up as an all-purpose media machine in the kitchen, but no one ever used it because it was so slow. After about a year of aggravation I decided to check the specs: it had an HDD. I replaced it with a cheap SATA SSD, and now it purrs.
When I first heard about this, everything that's ever bothered me about the Windows 8-10-11 evolution made so much more sense. They're turning it into macOS.
The Macification of Windows has been in progress for years -- and it makes no sense! When Microsoft designed the UI for Windows 95 they did extensive user studies, and everything from how the Start menu worked to how the taskbar behaved was carefully figured out.
They've spent the last few decades undoing all that work piece by piece, until nobody actually knows how to use it anymore.
Windows 95 is under-appreciated in GUI histories, I feel -- it had many things that became 'standard' pretty much immediately afterwards (I even have a shortcut to Applications in my Dock, which mimics a Start Menu).
But it's hard for a company to justify "paying to leave shit alone" so even if a maximum was reached, it will recede.
The difference is that Mac OS actually looks somewhat consistent and has kept much of its familiar paradigms (Spotlight, System Preferences, the menubar, context menus...) the same over the past decade. Microsoft seems to have lifted the aesthetics of Mac OS without any of the UX.
And more importantly, when Apple changes some design guidelines, provided you’re using the Cocoa API (which you likely are, if you’re developing for macOS natively), your application will look consistent with the rest of the OS.
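To make that concrete, here's a minimal sketch (a hypothetical throwaway demo, not any real shipping app): every control is a stock Cocoa class, so the system draws it, and when Apple revises the design language the app picks up the new look on the next OS release with zero code changes.

```swift
import Cocoa

// Hypothetical demo: nothing here specifies colors, fonts, corner radii,
// or dark/light appearance. The stock Cocoa classes render themselves,
// so the window automatically matches whatever the current macOS design
// guidelines say, including the user's accent color and dark mode.
let app = NSApplication.shared
app.setActivationPolicy(.regular)

let window = NSWindow(
    contentRect: NSRect(x: 0, y: 0, width: 320, height: 120),
    styleMask: [.titled, .closable],
    backing: .buffered,
    defer: false
)
window.title = "System-styled controls"

// A standard push button: its bezel is drawn by the OS, not the app.
let button = NSButton(title: "OK", target: nil, action: nil)
button.bezelStyle = .rounded
button.frame = NSRect(x: 110, y: 40, width: 100, height: 32)
window.contentView?.addSubview(button)

window.center()
window.makeKeyAndOrderFront(nil)
app.run()
```

Run that unchanged on two macOS versions with different system styling and the button looks native on both. That's the whole point: the consistency lives in the framework, not in each app.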
Yes, Apple doesn’t care about backwards compatibility, so you might have to update your application so that it runs on the latest macOS version, but there’s no way around it.
You either have a gazillion competing frontend APIs to maintain compatibility with shit that was written for Windows 3.1, or you don’t care about any of that and end up with almost entirely consistent visual styles across the entire OS.
Sadly we've lost that, as more and more apps everywhere are now developed in Electron or other "cross OS" systems. Even Java apps would try a bit harder to match the OS than these "let's create a new UI paradigm for everything" apps that won't even let you open a second window.
OS X has been great for the past decade. I won't say the same for MacBooks, where there have been some pretty horrible decisions (butterfly keyboard, touch bar), but OS X has kept humming along, very usable, very productive. A few issues with Spotlight and some easily removed nags to use crap like iCloud is the worst I can say.
They also spent years telling people how the butterfly keyboard was amazing, despite the feedback they got. It came off like they ignored their customers and said "we know what's best." You can do that in development, but once your customers actually get their hands on the product, it's time to listen to them.
Last year I went through every version of macOS from 10.1 to the current release and used each for a couple of weeks. The thing that surprised me the most is just how consistent the user interface has been. Anybody familiar with 10.1 would have no problem using Monterey.
The people who worked on the design use Macs. Seriously.
https://news.ycombinator.com/item?id=30019307