Couldn't agree more, modern macOS is just kind of... annoying. I also don't feel like it really does much, if anything, better than e.g. Snow Leopard, while simultaneously performing a lot of mostly invisible magic behind the scenes that I wish gave some feedback and offered some control - like all the photo analysis stuff.
Fastmail works pretty well and it's priced from $37.20/year/user. Still a bit pricey if you're not a heavy user and the mailbox is just for random website contacts.
I started migrating to Migadu last night after seeing this post. I have 5 domains and a few users and it should cost me $20 a year once their trial expires.
Let's not forget that Google's spam filtering racket has, over the years, made it next to impossible to host your own email in any practical manner, or even to use many hosted email services effectively.
This. I hosted my own email for over 15 years, and last year was the year I gave up. There was nothing wrong with my machine's IP and I was doing the right things on the email server side, but Gmail still wouldn't accept my mail. A racket is a good way of describing it.
Absolutely. I set up my own domain last year, hoping to run a mailserver directly off it. The whole thing was perfectly configured, but the 'some dude' IP address made it unusable. Eventually I moved the domain's hosting to Fastmail, and now the mail actually arrives.
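To give a sense of what "perfectly configured" means in practice, here is a rough sketch of checking the SPF and DMARC TXT records that big receivers like Gmail look for. The domain name is a placeholder and it leans on the third-party dnspython package, so treat it as an illustration rather than the exact setup I had:

    # Rough check of the SPF and DMARC TXT records a receiving server looks for.
    # Requires dnspython (pip install dnspython); example.com is a placeholder.
    import dns.resolver

    def txt_records(name):
        """Return all TXT record strings published for a DNS name."""
        try:
            answers = dns.resolver.resolve(name, "TXT")
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            return []
        return [b"".join(rdata.strings).decode() for rdata in answers]

    domain = "example.com"
    spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
    dmarc = [r for r in txt_records("_dmarc." + domain) if r.startswith("v=DMARC1")]

    print("SPF:  ", spf or "missing")
    print("DMARC:", dmarc or "missing")
    # A full check would also cover the DKIM selector record and reverse DNS,
    # but those depend on the particular mail server's configuration.

Even with all of that in place, Gmail still treats mail from an unknown "some dude" IP with suspicion, which was exactly my problem.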
Users may not switch (right away) but how we as a community talk about this problem can switch. This will ultimately help the problem get fixed when perceptions shift, either within Google or among new users evaluating other options given Google's reputation for poor deliverability/interoperability.
Also, I suspect a lot of Google's early adopters will be switching away due to this change. These are the tech-savvy evangelists who helped build Google Apps into what it is, and they can plausibly do the same for another service.
Yes. Between apps' increasing dependence on the network, the generally faster pace of updates, system-level updates that are required for security, app updates that are required by system updates and/or security, apps that simply cease to function at all without updates, and SaaS, we have in many ways lost the ability to say "no" to updates by remaining on older software.
I generally agree. macOS appears to be stagnating in its capabilities, even if there is a constant churn of UI and features. The Human Interface Guidelines-based approach of old seems to have been blurred by the iOS mashups we are now seeing, and native apps that used to feel substantially better no longer feel better enough to matter.
However, for me the big issue with Electron or other web-tech apps is the performance - they all feel incredibly slow and laggy, engorge themselves on my machine's resources, and generally slow down substantially the longer they have been open.
Considering my main gripe about macOS when compared to, say, Windows has been the general performance of the UI and apps, this isn't a particularly welcome phenomenon.
Rotten Tomatoes is the epitome of why trusting expert ratings on their own is a bad idea. The number of times the Tomatometer is wildly off from the general perception is insane, enough to make it largely useless for me at least.
It certainly does seem, though, that a company with Google's prowess in search, NLP, ML, algorithms, categorisation and the like should be able to do a lot better, even if the problem is incredibly difficult.
My only explanation is that it doesn't suit the business needs of Google to do so.
I find DDG is generally good, or at least good enough, when the data you need is relatively popular and relatively well distributed around the internet. DDG falls very, very flat when you need to scour the depths of the internet for something very specific.
Case in point: car parts. Take an obscure part from a Japanese Honda, for example, and attempt to find it on DDG. I'll wait. :P
First comment in this thread that's hit the nail on the head with regard to why software requires maintenance. After all, if I write an algorithm, how on earth does it just stop working after a while?
Fixing flaws is a separate and legitimate concern, but whether the flaw is in a feature or in security, that is solved by the idea of a warranty: the purchase price includes the product plus a time-limited obligation to ensure that it is fit for purpose. Once that's up, it's up to the developer to decide whether it's worth their while to keep supporting it - but at least the end user can keep running it on the platform they originally purchased it for. Theoretically.
Then comes the hiding-in-plain-sight pernicious moneymaker already mentioned: platform planned obsolescence. It comes in many forms: forced upgrades as mentioned, but also things like "turn on auto-updates because that's just what you do these days" and "oh, this is a 32-bit app so you can't run it anymore - pester the developer". Then there's all the developer-side stuff: "32-bit apps don't work anymore, update", "oh, neither does the older SDK, update that too", "oh, didn't we mention there's this new screen shape and you have to add support for that as well", and "oh, if you don't do all this your app will disappear from the store in X days". And what about the less obvious stuff, like "this new SDK/toolkit/platform library is the way to build new apps! There's not really a clear upgrade path, but it's the one true way. Until the next one."
It's next to impossible for developers to solve this problem because it is an externality imposed on them by platform vendors. It is the main reason why software, in the short term, needs to be maintained. Of course, in the long term you expect big developments and innovations to result in some level of obsolescence, but the device and software industries have an incredibly fast rate of churn even relative to their pace of innovation. It's very interesting using systems from 5, 10, or 20 years ago just to see how far we've come (or in many cases, how far we haven't).
Apple in particular has driven this by consistently eschewing backwards compatibility, constantly churning their SDKs, and continuously breaking older platforms. It's no surprise: they have strong financial incentives to do so. If new software requires new Apple hardware, that's new revenue for them, provided they can convince us to buy it. If new hardware requires new software, that's new revenue for them thanks to their app stores. If they can get developers to move to subscription models, that increases the recurring revenue via IAPs. Force them to use your payment and accounting services too, clipping the ticket along the way? More revenue!
We're the suckers though - we've been conditioned so hard to expect their stuff to be somewhat fragile, expensive to repair, and severely limited in lifespan that we almost like it. You dropped your phone? Oh, the glass on the back has all broken. That's integral to the chassis, so it's a really expensive repair - over half the cost of a new one. Might as well treat yourself. No, don't listen to that Rossmann guy, he's obviously <unsafe/counterfeit/untrustworthy>, eyes over here buddy, look at my pretty blue shirt. Oh hey, if you go to him Face ID won't work, because a nation-state actor might have installed an intercept on his dodgy counterfeit screen. You want to stay secure, right? Anyway, what were we saying again? Oh yeah, the new one. Oh, you also ran out of storage? It's only been 5 years, the base model still has the same amount, so why don't you pay a bit more and get the upgrade. After all, it only makes sense that you'd accumulate stuff over time, right? What did you say, SD card? RAM stick? NVMe SSD? No no no, they're way too slow/insecure/low-tech/dangerous to work with iPhone/Mac. No, no, your friend who put one in his computer has a PC - these are Macs, an entirely different type of thing. Of course software needs to be maintained, of course it does. How could it possibly work on your new machines, how on earth! I digress...
I've been on record here for as long as I can remember saying something similar in defense of Apple's laptops, and like you I've bought (usually cheap ex-lease) Windows machines every year or two to try them out, but I wouldn't go so far as to say Macs are superior in "every way".
Typically with an Apple device, the butterfly keyboards aside, you could be reasonably sure that all the interaction points were at least mostly decent - display, inputs, speakers, microphone, charger, system noise (fan and electrical). You could also be sure that the built-in feature set - WiFi, Bluetooth, Thunderbolt, USB (USB3 was a clusterf... on PCs in the early days), even charging - would work reasonably reliably.
The other nice thing about Apple's laptops has long been battery longevity - Apple were the first manufacturer to move to 500-cycle batteries with the 2008 Unibodies, and then to 1000-cycle batteries later on. Additionally, they have pretty good charging algorithms to preserve battery life over the long term - not something that is the case with many other brands.
However, I think Apple have regressed on some of these fronts since 2015 (maybe they'll redeem themselves with the M1). The Touch Bar and T2 have been notorious for strange issues. The USB-C-only setup has been both obnoxious for anyone who ever needs to use a Type A device and notorious for strange issues such as heat generation when charging from the "wrong" side - a clear step backwards from MagSafe. The quality of their machines has suffered with the butterfly keyboard, Flexgate, less robust internals (see Louis Rossmann for more details!), and anaemic cooling (one of the most impressive things about the original retina MBP was the improvement in the cooling system for sustained performance vs the thicker unibody). macOS itself has also rather obviously declined in quality since somewhere around Snow Leopard, with mere flashes of brilliance in between.
At the same time, Windows 10 has improved dramatically and is now solidly Microsoft's best OS since Windows 7, beating even that OS in many ways. And while I don't think PC laptops have caught up, many of them are now within striking distance of Apple devices - certainly close enough that, in many cases, the few that actually get all the touchpoints right offer an experience that isn't too far away from the best Mac experience I've had.
I'm seeing more and more this sentiment about "not being the target group", and using it to justify crappy business decisions.
Around the PPC change, Apple's grand strategy backed away from this idea of controlling what their users were able to do with their machines. Now they're moving back to their "we say what our machines are for, and if your use case doesn't match then screw you" attitude.
Many of Apple's recent business decisions have been nothing but hostile to the idea of letting users do what they want with their devices. As a very long time Mac OS user, I can say this whole idea of a "walled garden" was never central to the Mac, and while Apple have always suffered from NIH syndrome, it seemed in the mid-2000s that they had finally got over it and started playing nice with the rest of the industry. So much for that.