Hacker News
The End of OS X (stratechery.com)
376 points by Amorymeltzer on June 23, 2020 | 573 comments



I've looked at the trajectory of Apple these past few years with mounting bittersweetness. At its best, macOS really felt like the best of Unix combined with the consumer focus alluded to in the article. I still adore the craftsmanship of the latest MacBooks, but my 2015 MacBook felt like the apex of Mac design. It was the kind of device that was a pleasure to develop on. You could leverage the power of Unix in a UI designed with thoughtfulness and care. It reminded you that power and accessibility for end users can be balanced. And now that balance has tipped away from the power users, the developers, toward the end user, and in doing so it makes me question my commitment to this extraordinary ecosystem. Why develop on a platform that contends with me every step of the way? Because the users and the money are there? That was never what brought me to the MacBook.

EDIT: I suppose most users in the Apple ecosystem aren't on MacBooks, but on iPhones.


I keep reading these types of comments on HN but I just can't relate. I consider myself a power user, and precisely because MacOS feels like the best of Unix combined with customer focus is why I'm wholly invested in the Apple ecosystem. I simply can't relate to the statement that Mac is a "platform that contends with me every step of the way" and I'm curious to know more specifically why you feel this way?


I just "use linux" and I get all the benefits with much better hardware (more memory, disk storage, speed for compilation), better drivers (important especially for GPU support, for either graphics/rendering work or machine learning), and my downsides are... what exactly? iTunes, Mail, etc.: none of the software shipping with macOS is actually any good. I don't use Safari. From your perspective, what aspect of the "customer focus" materially improves anything? iCloud? The thing that lost a ton of my data over the years? Safari? The thing that doesn't even support WebGL 2 or other standards that are a decade old? (Seriously, Safari is the new Internet Explorer.)


> and my downsides are... what exactly?

I've spent zero days, zero hours, zero minutes, and zero seconds fiddling with setting anything up in macOS over the last 15 years.

I manage a bunch of Linux systems, from laptops to desktops to servers, and over the last 15 years I've probably spent months of my life setting up stuff that should have just worked.

I use a Mac at home and as my dev machine, because I charge quite a bit of money to do that kind of work on Linux, and I don't want to throw away my own money like that during my free time.

> Itunes, Mail, etc, none of the software shipping with MacOS is actually any good.

I use many Microsoft Office Apps, Adobe suite apps (mainly photoshop, lightroom and illustrator), and all of them just work.

VNC clients, VPN clients, Samba: they all just work. External devices, no issues. Do I want to stream my display over WiFi to my television, or my audio to the stereo? Just one click away. Do I want to restore the backup of one directory from 2 years ago? It's three clicks away.

On Linux, half the steps to each of these things are filled with hours and hours of research in the Arch wiki. Do you want to stream your monitor to your TV? Start by setting up a media server, and then 200 steps. Do you want to restore a partial backup of some folder on a Btrfs drive? A dozen steps that I cannot remember. No way. I don't want to remember how to do these things, and I don't want to have to look them up. If nobody is paying me to do these things, it should be completely obvious how to do them, and it should just work.
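For reference, the Btrfs partial restore being described comes down to something like this. The snapshot number and paths below are invented for illustration, and a snapper-style /.snapshots layout is an assumption; the guards make it a harmless no-op on systems without btrfs:

```shell
# Hypothetical partial restore of one directory from a btrfs snapshot.
# Snapshot 142 and the paths are made-up examples.
snap=/.snapshots/142/snapshot

if ! command -v btrfs >/dev/null; then
    result="btrfs-progs not installed"
elif [ -d "$snap" ]; then
    # Read-only snapshots are ordinary directory trees: just copy back out.
    cp -a "$snap/home/user/Documents" "$HOME/restored-Documents"
    result="restored"
else
    result="no such snapshot; see 'btrfs subvolume list /' for real ones"
fi
echo "$result"
```

Which, to be fair, rather proves the parent's point: on macOS the equivalent is a few clicks in Time Machine, while here you first have to know where your snapshots live.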

But hey, do convince your employers to use Linux everywhere, it's great for the business.


I've been a *nix guy since 1983. Worked for a major *nix player, wrote end-user application software on the platform, built chip design software on the platform, wrote device drivers for all kinds of hardware, was a sysadmin for years and years, etc. etc. I'm not some newb who just fell off the truck.

In many ways *nix is still stuck in 1983 because it's a craftsman's platform, which is why there are 10,000 distros, oops 10,001, no 10,002, oh 10,003, ugh! Linux is great on servers, because all servers require a craftsman for their care and feeding, and Linux is so much better than Windows in that regard.

I gave up my Linux machines because I'm no longer interested in being my own sys admin. Frankly I was done being my own sys admin back in the 90's but I felt the need to keep my propeller spinning, and stay true to my geek roots.

In 2008 I bought an 8-core Mac Pro, and it's still kicking: my son uses it for machine learning as part of his graduate program, and it's just as fast now as when I first bought it. It was left behind by the last OS upgrade, but it will work just fine for another 4 or 5 years on the old OS.

In 2009 I still had a Linux laptop, but it was painful back then. With Snow Leopard it became easier to create and maintain a Hackintosh than it was to manage a Linux laptop, and I was off Linux for good.

For a hobbyist (craftsman), Linux is great; for a server that needs a craftsman, Linux is great; for a consumer, buy a Mac.


And most server admin is moving to an “immutable infra” model where the Linux box is a container worker node that is disposable. Life’s too short to be sysadmining.


I went to a mix of Windows/macOS instead, but I can fully relate to your story, as I had a UNIX zealot phase during my university years and have been the UNIX/Windows porting guy on several projects.

My reason for giving up was reaching the conclusion that Linux distributions would never replicate the expectations of an Amiga/BeOS like experience for those that care about multimedia.


This is anecdotal. My own anecdote is that we've had far more trouble with my wife's MacBook than my NixOS ThinkPad which just works, even after upgrades.

The Linux experience depends on the distribution, software and hardware you choose. You need to choose carefully.


> you need to choose carefully.

So here's the interesting part: can you recommend a distro that is universally the right choice for dev work?

If yes, please tell - I’d love to know!

If the answer is "well, it depends a lot on your needs," that's kinda the point of macOS, right? The fact that making the right distro choice requires serious thought is already problematic (hypothetically).


You do have a point: with a Mac the choices have been made for you. I guess the most Mac-like distro is probably Ubuntu. If you pick the LTS, you might even have a smoother ride than with the macOS yearly updates.


> I've spent zero days, zero hours, zero minutes, and zero seconds fiddling with setting anything up in macOSX over the last 15 years.

I have to say that I 100% do not believe this statement.

You mean you've never used any wireless connections on your machine? You've never had to set up backups or change audio settings? You've never had to deal with setting up one thing in 15 years? Since when did OS X gain the ability to read minds?


>You mean you've never used any wireless connections on your machine? You've never had to set up backups or change audio settings? You've never had to deal with setting up one thing in 15 years? Since when did OS X gain the ability to read minds?

You seem to be confusing using the machine with setting it up.

Connecting to a wireless network (picking the SSID, entering the password, etc.) is not "setting up". It's merely regular use of the computer.

The setting up we'd rather avoid, and that the parent talks about, would be tinkering to get your WiFi chip working with your OS -- which you get the "pleasure" of doing on Linux...


His claim is that he did it 15 years ago and never had to touch anything since. Definitely a stretch, but quite believable. Just a few days ago I had to figure out how to reinstall the Nvidia driver after an Ubuntu 18.04 to 20.04 upgrade. And the damn thing still shows my primary screen panned whenever I log off and log back on.

I was glad to return to Windows when the Linux-specific issue was solved.


Isn't this YOUR problem for using infamously unsupported hardware with Linux? The problems would be the same if you ran a Hackintosh on unsupported hardware too.


I still need my GPGPU, and there are currently no good alternatives.


I've got a stock Ubuntu 18.04 install at work. Sometimes, but not always, it hangs for three minutes or so trying to run fsck on my UEFI partition. Other times, I'm at the login screen within a minute of turning it on.

When it hangs, it finally exits with an error message saying it couldn't mount /boot/efi. Once I log in I can see that /boot/efi is mounted fine.

A bunch of my applications hang all the time for reasons I couldn't figure out. Turns out they were trying to communicate with my Gnome keychain over D-Bus, but they were in a separate D-Bus session so they just sent their D-Bus message and then waited for however long until they gave up. No idea at what point this started happening or why, but I never did figure out how to fix it (other than disabling those apps from being able to store passwords securely).
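(Editor's aside: the hang described above is diagnosable. gnome-keyring exposes the Secret Service on the session bus, so clients block when their `DBUS_SESSION_BUS_ADDRESS` points at a bus where no keyring daemon is listening. A sketch, assuming Linux's /proc and the usual daemon name:)

```shell
# Compare the D-Bus session address this shell sees with the one
# gnome-keyring-daemon was launched under.
shell_bus="${DBUS_SESSION_BUS_ADDRESS:-<unset>}"
echo "shell bus:   $shell_bus"

# The daemon keeps the address it started with in its environment.
pid=$(pgrep -n -u "$(id -un)" gnome-keyring-daemon 2>/dev/null || true)
if [ -n "$pid" ]; then
    keyring_bus=$(tr '\0' '\n' < "/proc/$pid/environ" \
        | sed -n 's/^DBUS_SESSION_BUS_ADDRESS=//p')
    echo "keyring bus: ${keyring_bus:-<unset>}"
else
    echo "keyring bus: (gnome-keyring-daemon not running)"
fi
# If the two addresses differ, Secret Service calls go to a bus nobody
# answers on, and each client waits out its own timeout: the "hang".
```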

If I used the Intel drivers for my Intel GPU I got extreme stuttering and tearing, which only went away when I uninstalled the Intel drivers entirely so that it would use the "correct" drivers.

Then there's snaps. I install stuff and sometimes it's a package, and sometimes it's a snap. If it's a snap, everything turns into a giant mess. I had Firefox saving its downloads to Skype's workspace for some reason, multiple Firefox profiles on my system in various places, multiple versions of Firefox running when snapd downloaded a new version and symlinked it to be the default, two copies of Telegram for some reason and no obvious way to tell which was which, a snap and non-snap version of things because the Store installed the snap but another package depended on the apt package, and so on.

I installed my system on btrfs because "it's like a decade old now, it should be fine right?" and within a month my filesystem had been completely corrupted and I had to reinstall entirely (though I managed to salvage most of my home directory at least).

This is all stuff I'm willing to tolerate to some extent, but I can't think of a single thing that Linux offers me that makes it worth all this hassle. At my last job, I happily used a MacBook for nine straight years, and I never lost an entire day of productivity just because suddenly my system won't boot, or dealt with graphical glitches in every app because the driver for my video card isn't the right driver for my video card, or lost data because one of the default options for installation filesystem is unstable and unreliable.

I've never had to do much of anything to "tweak" MacOS unless I wanted to; on Ubuntu, I have to do it almost every day to get something working well (or at all).

So I'm gonna go ahead and say that yeah, MacOS isn't even remotely dead, and it's certainly not going to lose to Linux any time soon.


Thank you for providing all these perfect examples:

> You mean you you've never used any wireless connections to your machine?

Yes: I clicked on the network, typed the password, and it worked, always. If the machine has already seen the network, I don't even have to click; it just connects. If one of my devices (e.g. my iPhone) has seen the network, the network and its credentials are stored in iCloud, so my laptop and my tablet can just connect to it. I never had to google how this works, never had to install anything to make it work, never had to change any configuration file to make it work, etc. Same for VPNs, VNC, corporate networks, network drives, etc.

> You've never had to set up backups or change audio settings?

Yes: I clicked on enable backups, clicked the box that enables daily backups, clicked on the drive where I wanted to store them, and it worked, always.

I can restore, browse, or partially restore backups with one click using Time Machine. I never had to google how this works. It always worked.

> change audio settings?

Yes, I clicked on the audio button, and volume worked, muting worked, all 4 sound outputs worked, always worked, etc.

---

On Linux, over the last 15 years, I have googled for days at least to do _each_ of these things for customers:

- can't connect to a particular wlan, corporate wlan, etc. needed new drivers, upgrade kernel, change some etc config to enable something, etc. Same for network drives, cloud drives, SMB, ...

- Sound, damn sound: sound cards not being detected (hell, on my Lenovo Yoga an update to Ubuntu 19.04 pushed a kernel patch that broke my sound card's driver: no sound card detected, still not fixed on Ubuntu 20.04 five months later; no sound in the times of corona and web meetings, great!), alsamixer, /etc files, PulseAudio... I have had to google a lot about each of these things, and fiddle a lot over the last 15 years, on dozens of machines, for dozens of users, for hours.

- Backups: on Linux, have fun. I've set up backups for ext4, ZFS, and Btrfs. I ALWAYS had to google for hours. Every time I had to restore a backup, I had to google for hours. Sometimes restoring did not work. Sometimes backups did not work. Encrypted backups do not work out of the box (Apple's full-disk encryption is enabled by default; enabling it and having everything work with it is just one click, while on Linux full-disk encryption is a mess).

---

So no, I never had to set up anything on MacOS X in the last 15 years. The things were in the obvious place, and I just used them, and they just worked. No need to even google how to do the thing, or what to install to get it done, or configure the install, or read bug reports to work around issues, nothing. They just damn worked, as they should.

It's 2020, and setting things up for companies using Linux is my day job. These people would all save a ton of money by just using Macs or Windows machines. The amount of time their employees waste setting up and maintaining their machines instead of doing their actual work is astonishing.


What if you want to type without accidentally hitting your cartoonishly-large trackpad? Or if you want to type without a key accidentally registering twice because someone decided that thin keys were better?


Applications and hardware drivers on Mac just work. Even Homebrew works pretty nicely. The visuals of Mac are vastly superior to any flavor of Linux I've tried: smooth graphics, font rendering, HiDPI support, etc. Some aspects of Mac hardware (screen, touchpad) are very hard to come by with other manufacturers.

At this point I'll probably switch to Linux as soon as my 2015 Macbook stops working, but I would've really liked to stay with Apple.


> Even Homebrew works pretty nicely

It always surprises me when people cite Homebrew as a plus rather than a minus in these discussions. Compared to the experience I have with Linux package managers, I've found Homebrew to be extraordinarily slow even at the best of times. A simple "brew update" will generally take at least 30 seconds; in that time on something like pacman, I'll usually not only have finished fetching the list of updates but also performed the upgrades, and that's updating my entire system, not just a few userspace packages that I have installed. I can understand that some people prefer macOS and I don't think they're wrong for doing so, but it's hard for me to believe that someone has given Linux a fair shot when they cite package management as something that macOS does better.


In my experience this is partly because Homebrew has steadily moved to a "knows better than you" stance and every time you ask it to do one thing, it goes off and does a bunch of other things first.

MacPorts is well worth a spin if you're on a Mac and have trouble with Homebrew. I switched sometime last year and am very happy. I think I've actually been using it more because I know it's not going to be a pain every time.


I've been a MacPorts user since the days of Fink vs. MacPorts and tried and failed to use brew many times.

The biggest argument at the beginning for Homebrew over MacPorts was binary distribution and now MacPorts has that and can still compile custom versions if necessary.

MacPorts is always the first thing I install on a new Mac and has been for over 15 years. The several times I've seriously tried Homebrew, I get frustrated (terrible jargon, missing packages, spewing files all over my system, lax security) and go back to MP.


I've been using Homebrew for years and, although I've not had any major problems with it, I do find the slowness irritating. And also the fact that it tends toward ridiculous bloat by keeping previous versions of everything installed, unless you remember to do some spring cleaning every now and then.

I'm up for giving MacPorts a go. But is there a pain-free way to transition from Homebrew to MacPorts, without having to reinstall everything over again? I've installed a load of stuff with Homebrew over the years, and I don't fancy having to pick through it all again to try and remember what I installed and why.


While Homebrew is absolutely slow, the experience of package management on macOS (homebrew gets you bleeding-edge versions of the crap you install, whereas Apple manages the system and keeps it stable) is IMO better than Linux where the choice of stable vs. bleeding-edge is systemwide.


Obviously it requires significant setup, but this[1] seems to solve the issue you're talking about on Linux.

[1] https://bedrocklinux.org/


Yeah, sigh, Homebrew has definitely slowed down in recent years.


> in that time on something like pacman

If you're using Pacman (and presumably Arch or some derivative), that might explain your better experience on Linux. You're on a rolling distro so you get up to date packages (and I believe AUR gets things even quicker).

Homebrew is much closer to that rolling always-up-to-date (within a day or two usually) experience than Apt or Yum on Debian/Ubuntu/Fedora (most people's linux experience), where you have a choice between packages that are 3-6 months out of date or hunting around on the web for a 3rd-party repository to add which contains more up to date packages.

Homebrew can be slow. But the fact that I can `brew install X` or `brew upgrade X` and get on with something else while it's doing its thing means it doesn't generally take up much of my time.


> Homebrew is much closer to that rolling always-up-to-date (within a day or two usually) experience than Apt or Yum on Debian/Ubuntu/Fedora (most people's linux experience), where you have a choice between packages that are 3-6 months out of date or hunting around on the web for a 3rd-party repository to add which contains more up to date packages

It still has a centralized list of the versions of the packages it supports (on github) though, right? I feel like it should be able to get this list with a single HTTP request to the Github API; I don't see why having a faster or slower schedule to update the packages should affect how long it takes to actually download the list of updated packages with `brew update`.
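Not quite: `brew update` is (roughly) a `git fetch` and merge of each tap's clone, not a single manifest request, which is much of why it feels slower than pacman's flat database sync. You can see the git machinery directly; this is a guarded sketch that is a no-op on machines without Homebrew:

```shell
# Show that Homebrew's update path is plain git, not a one-shot API call.
if command -v brew >/dev/null; then
    git -C "$(brew --repository)" remote -v   # Homebrew itself is a git clone
    brew tap                                  # every tap is another git clone
    note="brew inspected"
else
    note="Homebrew not installed"
fi
echo "$note"
```

Setting `HOMEBREW_NO_AUTO_UPDATE=1` in your environment also stops `brew install` from running an update first, which hides most of the day-to-day slowness.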


> The visuals of Mac are vastly superior to any flavor of Linux I've tried.

This is subjective. I think the macOS UI looks and acts like a toy. My Linux machine has a Solarized theme (I can toggle light/dark) with minimal window borders (i3, polybar). I think it's much more tasteful than macOS. GNOME looks better too, and it's themeable, unlike macOS. But this is all of course just my opinion.


I agree. I dislike the Mac UI, particularly Finder. The one Mac tool I use and love is the screenshot shortcuts. Other than that, 99% of my time is spent in the terminal, Sublime Text, and Firefox.


At this point FreeType is generally better (faster, more featureful) than Apple's Core Graphics, although FT is very customizable and I prefer Apple-like defaults to the ones most Linux distros use. You can definitely get Apple-like rendering out of FreeType; disabling hinting and making sure stem darkening is on gets you most of the way there.
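For anyone wanting to try the Apple-like setup described, it can be sketched as a user fontconfig file (the path ~/.config/fontconfig/fonts.conf is the conventional location) that turns hinting off while keeping antialiasing:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Apple-like defaults: antialiasing on, hinting off -->
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting" mode="assign"><bool>false</bool></edit>
  </match>
</fontconfig>
```

Stem darkening is a FreeType driver property rather than a fontconfig setting; exporting `FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"` in the environment turns it on for CFF and autofitted fonts. Exact results will vary by distro and FreeType version.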


> hardware drivers on Mac just work

Same for GNU/Linux if you choose the hardware for it (or buy preinstalled).


Try getting hardware accelerated video in Chrome, or FF. Not supported.


Interestingly enough this causes all kind of weird distortion on my brand new 16" MBP. Not sure if it's a hardware issue, but it only happens in the browser with GPU acceleration turned on...


Hardware acceleration has been supported in Firefox since version 75; however, your display scaling has to be 100% on X.Org, or you can use it under Wayland. The Linux desktop would've been perfect if it weren't for the fragmentation. BTW, I'm currently running Ubuntu 20.04.


In my experience this argument is BS. Maybe the computer works OK on day one, but it's downhill from there as you maintain and update.


Huh, in twenty years of using Linux, hardware support has improved with each update, not gone downhill.

What I do agree with is that on day one support might not be as good for new types of hardware (especially for more niche hardware like a fingerprint reader for example). Even that has improved a lot over the years though.


It really does depend on the hardware.

Thinkpads usually work really well out of the box (even the nipple mouse works without a hitch).

I have plenty of horror stories about other computers though. Usually involving being deep in some page tree on a company's website that they don't seem to have actually expected anyone to go to, trying to figure out which specific vendor id maps to the actual hardware I have.


> ... but it's downhill from there as you maintain and update

I've been maintaining a Fedora desktop installed 8 years ago without a snag, updating every year or so as time permits. Not a single problem.


I did exactly that with an Asus netbook; then Canonical and AMD broke the whole experience by replacing working binary drivers with WIP FOSS alternatives, because Free is Good™.


Until you have to connect to a 12-year-old printer at work.


Mac and Linux actually use the same printer stack: https://en.wikipedia.org/wiki/CUPS


Most corporate printers are network-attached, and many also run under CUPS.


> Applications and hardware drivers on Mac just work.

... until Apple decides to break an API and not provide equivalent functionality, and then you end up with a perfectly functional soft-bricked paperweight. I can't use my external USB monitor because of this.


Except when they broke most Homebrew installations a few years back (Mojave, maybe? I've been using Linux for a while now too; that was my tipping point).


I switched to Ubuntu for work about a year ago, I've done zero fiddling since the day I set it up. Linux has really come a long way even in the last 5 years, IMO. I still have a 2015 MBP that I use for my personal work and it's still solid.


Same, as of late last year I now run an Ubuntu desktop and the Macbook together. What I really like about this setup is the ability to sidestep some problems instead of having to solve everything, allowing me to focus on doing paid work instead of troubleshooting the platform.


Everything on Ubuntu just works, too.


Except for fractional scaling if you use the proprietary NVIDIA driver. This was an issue on the latest LTS release.


It became worse for me when I upgraded to 20.04. Now the central screen in a 3-screen setup gets panned, which can only be fixed from the Nvidia control panel plus a graphics-manager restart. It breaks on screen lock. Don't upgrade to 20.04 yet, if that's what you have.


I've been running Linux on Macs since my 2005 12" PowerBook. There were a couple of years in the middle where I switched to OS X with Linux running in a VM due to hardware support issues (the VM was a pain in the ass to use; there was no end to the performance problems). I never liked OS X as an OS all that much. It was fine, but I found it got in my way more than it helped me.

In 2016, after getting tired of various hardware bits never working just right on Linux, I gave up and tried someone else's hardware. It was amazing that everything just worked with Linux. I didn't have to fight with bleeding-edge, just-reverse-engineered drivers. I didn't have to wonder if the laptop was going to wake up again after being put to sleep. But I chose somewhat poorly: the hardware I chose was decent, but rough. The battery expanded to the point where it warped the case, which caused the keyboard illumination to intermittently not work, and eventually made the touchpad impossible to use.

So I gave up in 2018 and got a MacBook Pro again, though it was a secondhand 2016 model. Once I installed Linux, I found that the keyboard and touchpad didn't work, and I had to build an out-of-tree driver for it, ditto for the WiFi. After that I found that audio and suspend-to-RAM didn't work. It turns out I was "lucky" that I had an older model; the keyboard and mouse setup in the 2017 and 2018 models hadn't been reverse engineered yet and didn't work at all.

In 2019 I got fed up with having to use the USB-C headphone adapter from my Pixel in order to hear sound, and to remember to shut my laptop down completely if I was going to be away from AC power for more than a few hours. I again left the Apple hardware world. It's not perfect, but again I marvel at how everything just works (with Linux) without tweaking. I hear so many colleagues complain about various inexplicable broken things on macOS all the time, and I think I've gotten the better deal.

Is Linux as polished as macOS? Of course not. I agree with you that the graphics aren't as smooth, font rendering isn't as good (though this is debatable; personally I find Linux rendering to be more crisp and readable than macOS), HiDPI support varies depending on what toolkit any given app was written with. But it works well enough. I think one of the biggest problems that has kept (and will continue to keep) Linux from going mainstream is that even though for the most part it works just fine, when it does break, it breaks spectacularly to the point that you need specialized knowledge to fix it. Fortunately I have that specialized knowledge, so I'm fine, but your average home user does not.

I don't have to deal with Apple's keyboard design issues, random WiFi failures (yes, it is incredibly funny to me to find that Linux WiFi can work better than macOS), Apple's overzealous "system integrity protection", or Apple in general deciding they know better than I do as to how I want to use my computer. It makes me happy, and I feel productive.


You are quite persistent with Mac hardware. Why?

It's a hostile platform; it's bound to be rough. I use a 14" Dell Latitude with no problems at all, a beautiful machine. I'd probably stay on it or try an Asus ZenBook.


Because the hardware is beautiful and the build quality is (was?) excellent. Recent fumbles like the butterfly keyboard design have definitely reduced that perception, though.

I have a Dell XPS 13 now, and while I like it, it's just not as pretty as a MBP. I recognize that that's a highly subjective judgment, but that's a thing that matters to me. The XPS's build quality is good, but not amazing; for example, I've had it for less than a year, and one of the rubber strips on the bottom of the laptop has been peeling off for the last few months.

Overall it just makes me sad that "hostile platform" is even a thing. At this point I've decided that the alternatives are good enough for me, and the better hardware support tips the scales in favor of them.


Good experiences with the ZenBook line here. Pop!_OS worked out of the box with everything but the fingerprint reader, including the Optimus GPU.


On my 2011 MacBook Pro I have run various Linux distributions since 2013, mostly Debian and now Solus. Both have been a joy. I have not experienced the same issues that you describe. All my hardware just works out of the box.


Same story here


The window animation delays on the Mac are killing me: in a month you lose hours of screen time to animations. Also, macOS has ZERO use case as a cloud OS. Anything you learn is meaningless except the Unixy parts, where Linux rocks.


1) You can turn off window animations, and it seems strange that they would be a performance bottleneck.

2) macOS is not a cloud OS. It can certainly be a portal to the cloud quite easily, via Terminal, iTerm2, ssh/mosh, VNC, etc. But you know that already.

You can also run VMs, KVM, Docker, et al. on it as well.
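For point 1, the knobs people usually reach for look like this; the `defaults` keys are undocumented and may change between releases, so treat each key name as an assumption (the script is a no-op on anything that isn't macOS):

```shell
# Commonly cited defaults for reducing macOS window animations.
if command -v defaults >/dev/null; then
    defaults write NSGlobalDomain NSAutomaticWindowAnimationsEnabled -bool false
    defaults write NSGlobalDomain NSWindowResizeTime -float 0.001
    defaults write com.apple.dock expose-animation-duration -float 0.1
    killall Dock || true              # restart the Dock so changes apply
    state="animations reduced"
else
    state="not macOS"
fi
echo "$state"
```

The supported route is System Preferences → Accessibility → Display → Reduce Motion, which among other things replaces the space-switch slide with a crossfade.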


1. It's better to avoid macOS totally. 2. You're repeating what I said. Linux is better in every detail.


Software that is unique to MacOS:

- HoudahSpot (search files / documents)

- DEVONthink (document management system)

- Ulysses app (a markdown writing app; I know there are many, on many platforms)

- iA Writer (another markdown writer)

- Alfred App (automating)

- Keyboard Maestro (automating)

- AppleScript (automating)

- MailMate

- Things3 (task management)

- OmniOutliner

- OmniFocus

- OmniGraffle

- PDF Expert (reading / annotating pdfs)

- iTerm

- aText (text macros)

- Timing App (auto time tracking)

and many more.

Many of those apps have their equivalents in the Linux/Windows world, but their Mac counterparts are IMHO much better.

Finally, it's the overall experience. Everything is streamlined, everything works across iDevices, everything (mostly) works. I can't say it in a few sentences. Sitting in front of this machine, coding, writing, listening to music: that's a joy. I didn't have this feeling with a Windows machine (I was a Windows and Ubuntu user until 2015).


Agreed about the Windows feeling, though the reason I have a Windows laptop (which I use personally more than my MacBook) is that outside of development tools it does far more than Mac or Linux machines.


Have you noticed how all your arguments are fine, yet only personal preferences? (Except the data loss.)

The UX, UI and 'feel' is what makes people use one thing over another. Unless you can replicate that, or can do better, nobody's getting swayed either way (and that's fine, do what you want).

There is no objective 'my pc is better than your pc' because you're not the same person as the one you'd be attempting to compare against.

Better hardware and better drivers are useless arguments if someone wants to go to a shop, take a device home, plop it down, and do some work. None of that betterness is going to get the work done for them, or undo the experience.

It's like telling someone their 24-pack of toilet paper is stupid because your shipping container full of toilet paper is cheaper and you have more than they have. (especially when all they wanted to do was wipe their butt)


How is a better hardware-to-price ratio and driver performance a "personal preference"? Don't you want those things? If you're saying that having a particular UI is important, that's another matter. I'm just explaining what caused me to ditch the Mac ecosystem.


No, those things aren't the main driver for my choices.

Say you have two options, same price, different vendor, and the main difference (for the sake of argument) is that one comes with a mobile i5 and the other with a mobile i7. Technically the price-to-performance ratio is better for the i7 model. But the driver for me to buy any device at all is whether the device can do the task I need it to do. If the i5 does the task the same way the i7 does, what extra value does the i7 supposedly bring?

Often people will make a 'what if' argument here, because "what if you need to do something different" sounds like a nice argument for getting more than what you need. In reality, I don't often see anyone changing their tasks mid-lifecycle and then having a sudden requirements change. It hasn't happened for me at all.

The same parallel can be made with the cargo capacity of a vehicle; technically every Lamborghini ever is a stupid choice because you can't move your furniture with it. Or every Fiat Panda is stupid because you can't drive at 200 km/h speeds. Make it more absurd: every boat is stupid because it doesn't fly like a rocket.


Linux is objectively more configurable too (obviously).

This is probably my biggest complaint with macOS.

I'm still shocked by the inability to disable inertia scrolling on a usb mouse, as well as acceleration.

These things were removed on purpose.

If you use a macbook with usb devices, there are all sorts of oddities. It's annoying. I can use it, it's just not for a power user. For me, power user means "configure it", shape it to maximize your workflow speed.

MacOS is just not that. With linux there is a lot more flexibility.

I don't care too much about eye-candiness, I want my UI to be incredibly fast. MacOS has those damn animations when switching spaces that are annoying to say the least (I switch spaces very often).


I use Safari as my main browser now due to the password generation/sharing with my iPhone. It's a killer feature for me.


That's a commodity feature at this point.


Assumptions: 1) All my devices are Apple 2) I'm not planning on leaving Apple

For all the negative replies I wonder how many people actually have used Keychain. It's very good. I tried 1Password, Google's, and Firefox's, but none integrates as well among all my devices and with zero effort in the OS as Keychain. Anyone leaving negative comments about Keychain, at least point out that you've given it the ol' try in the Apple ecosystem.

Or we have different assumptions and you have devices outside the Apple ecosystem or don't want to be tied to it therefore the benefits I see to keychain don't apply to you.

Also, yesterday's WWDC showed some cool new features around bad passwords (but I think other password managers already do this).


Pray you never have to bulk export your passwords. It is extremely painful.


Firefox does this too FYI


I like Firefox but I don't use it on my iPhone. I use the Apple pw manager to log into websites on both phone/laptop a lot. The syncing to my iPhone is great.


As does chrome and edge.


Almost every browser has this feature...


But "almost every browser" doesn't sync your saved passwords seamlessly between devices (mobile and desktop) with an end-to-end encrypted database, and automatically offer to autofill them, with the autofill itself protected by Touch ID/Face ID on the devices that support those.


For one, Bitwarden does all of that and more (I guess except the Face ID thing, but I don't really want that so I'd disable it anyway). It has web, mobile and desktop apps, and browser extensions. All available in the free plan. Oh, and it's open source and self-hostable.

https://bitwarden.com/

(Not affiliated with it, just a pretty happy (free) user!)


Neat, and glad to know about it! I've actually been surprised (and a little frustrated) that there haven't been any alternatives to iCloud Keychain for genuine seamless, e2e encrypted, password management.

Does it actually work the same as Mobile Safari's interface on iOS? Automatically presenting options both for creating and autofilling passwords on webpages? (If it can do that in iOS Firefox, I might consider it...I've been using desktop Firefox for a while because of some bizarre idiosyncratic issues I have with Safari on my laptop, that I can't reproduce anywhere else, but are extremely irritating where they do happen...)


Yes, it does. The integration with both Safari and Firefox on iOS is excellent. In Settings -> Passwords & Accounts -> Autofill Passwords you just change the backend from iCloud Keychain to Bitwarden, then it otherwise behaves substantially the same.

A big benefit of switching to Bitwarden is that you can also get it up and running on non-Apple machines. Kicking myself for not doing it earlier.

Just be aware that bulk exporting your passwords from iCloud Keychain is non-trivial.


Lastpass does this, but works with apps as well. Third party password managers are absolutely underrated, because most of them are truly cross-platform.

If you create a Netflix account on your desktop in Chrome, it can generate a password for you, or auto-grab whatever password you typed in, and will autofill that password when you open the Netflix app on your phone. (Based on a built-in association between "netflix.com" and the Netflix android app.)

Super handy feature when you get a new mobile device and need to get logged in to all your apps for the first time.


Chrome does that. If you set a passphrase for sync, then your data is e2e encrypted.

https://support.google.com/chrome/answer/165139


Use a password manager. It's safer than trusting Apple or Google who are after your personal data.


Apple never gets your saved login credentials. iCloud Keychain is end-to-end encrypted, and Apple hasn't been shy about advertising that fact.


> Apple never gets your saved login credentials.

Yeah, keep telling yourself that. They are a part of PRISM.


Also worth noting is that with recent iOS versions, third party password managers are recognized at the OS level, and can be used with the same seamlessness as iCloud Keychain.


Apple and Google aren't going to sell users' login credentials, and if law enforcement is requesting information, they will go directly to the provider who holds the data they're seeking.


Apps shipped with macOS are actually good imo, or there's just not a lot of better alternatives. Never found a better alternative for iTunes or Mail (used Thunderbird and mutt for a while and they're either bloated or have bad UX). macOS designers seem to be good at cramming in functionality (I know it's not the Unix philosophy) while making it not look bloated at all.


Mail is a pretty awful client imo, but still better than most others. Canary and Airmail however are absolutely amazing (I recently switched to Canary from Airmail) and there really isn't anything comparable on other platforms.


Mail.app is an appalling client. Having recently switched email providers, I tried copying several thousand emails from one mailbox to another. Mail would invariably crash after moving several hundred, and then could not be restarted without immediately crashing. I lost count of how many crashes I reported to Apple (aside: has anything ever come of these reports?). Mail is so bad, it did what I didn’t think was possible - I switched to Outlook. And guess what? Outlook handled my email-moving task faster and without error.

Internal quality control for macOS and built-in Apple apps has fallen off a cliff in recent times. Catalina has been disastrous in that regard. Having used Macs since OS X Leopard, I’m not jumping to Big Sur. I switched from Windows because it was a buggy, hot mess. I’m not hanging around and risking my work as macOS quality control keeps falling. Apple’s developers have lost my trust.


I once had a similar need but Outlook failed me hard. Never tried with Mail.app but I'm sure it wouldn't go well. In an ideal world I would just make all the software I use myself, because I've only seen maybe 10 genuinely good pieces of software in my life.


> Apps shipped with macOS are actually good imo, ...

As long as your standard is low, sure. Scanning with Image Capture results in needlessly large files and it has turned out to be quite a piece of crap. Moved my scanning pipeline back to Linux, and now I've got text recognition for free!


Some are good. But many have better alternatives. All Mac apps leech your data to the cloud though. And that should be a huge no from all of us.


I rely on Time Machine to back up my Mac.

AFAIK, Windows and Linux still don't have such a tool that can backup/restore without pain.


>and my downsides are... what exactly

Less cohesion, competing GUI libs/configuration systems/etc, still not 100% GPU support, the Wayland situation, almost non-existent multimedia (video, audio) options (not to mention support for peripherals like audio interfaces), and being at the arbitrary mercy, worse than Apple's, of your favourite desktop environment's programmers (e.g. abrupt transitions and bizarro decisions in GNOME and KDE major versions).


> still not 100% GPU support

I'm a graphics engineer so I'm not sure I understand this argument. When I talk about weaker and worse-performing hardware options and drivers on Mac, I'm referring _mainly_ to GPUs.

> Non-existent multimedia (video, audio) options

This... is simply not true (?)

Also, regarding Gnome/KDE, I dislike both and don't use either. Something like i3 with zero-to-no configuration is already pretty great out of the box.


>When I talk about weaker and worse-performing hardware options and drivers on Mac, I'm referring _mainly_ to GPUs.

And when I talk about weaker support for GPUs on Linux, I mean the Wayland / compositor situation, getting 3D to work without planning what specific model of laptop to buy to use with what specific distro, and so on...

>This... is simply not true (?)

This is simply true. It can't run Cubase, Logic (ok, that's expected), Pro Tools, Studio One, Reason, FL Studio, Maschine, and tons of other things besides. Your best bet is some niche players like Bitwig and Tracktion, or crossing your fingers with Wine... And let's not get started on drivers for professional audio interfaces and peripherals...

Same for NLEs...


There are no good mail clients for Linux that I can find. All of them are great, except for one or two things that make them completely awful.

> (seriously, safari is the new internet explorer).

I'm not sure what this is supposed to mean. Safari is holding back the web because despite its monopoly on users it doesn't support new technologies? That's not true at all. Especially because it's got such a small market share, it should be up to them to deliver what they feel makes the most sense, and let the market decide the rest (though sadly the market seems to keep using Chrome, which is terrible).

Honest question though: as a Safari user almost exclusively for the past 15 years, what am I missing out on that I could have experienced if Safari had supported WebGL 2? I don't know that I've encountered any sites or use cases where Safari was holding me back in that regard, but I'm curious to know what was out there that I couldn't see before.


Does your benefit come with battery power and portability? I couldn't care less about speed when I have a 64 vCPU box at 10ms delay.


Linux is great for development work and casual web browsing, but imo it doesn't feel polished enough for most people's personal stuff.

> Itunes, Mail, etc, none of the software shipping with MacOS is actually any good.

It's better than the open source stuff on Linux. While it has gotten much better over the years, it's still not good enough for normal people like my parents or my sisters who are not techies. That's just reality for now at least. Whether or not you agree with it is beside the point.


I maintain an old Mac for corporate purposes, but yes a Mac is just tedious. While things like printers “just work” in Linux, they really struggle in OS X, requiring “drivers” like it’s 1999.

The other benefits of Mac hardware (MagSafe for example) have all been removed leaving not-a-lot.

YMMV.


For me, the React Native experience has been instructive. Android is quick and easy to develop, debug, and deploy. The iOS side includes multiple hoops, constant frustration stemming from Xcode, long app release times, and minutes-long builds for production-size apps.

Our team, without any policy dictating daily development habits, has coalesced around Linux/Android development with iOS testing before release.


As a consumer, I've never used a React Native app that I liked using. It's always janky and compromises UX in ways that made me simply use the app less or had the effect of teaching me to "get in and out of the app as quickly as possible".


> As a consumer, I've never used a React Native app that I liked using.

If it is a good react native app you won't know it is RN.

If you notice it, it is by definition poorly done. There are some things (e.g. specific animations, UI elements) that RN just isn't great at, the best way to avoid making a janky app is to avoid even trying to do those things.


Non-native menus and poor keyboard focus handling and shortcuts are the giveaway, and extreme memory usage!


That’s your bias. Most consumers have no clue what React Native is.


As with everything, it depends on the implementation. There are some things I'd like to clean up about our app, but overall feedback has been very positive.


> Our team, without any policy dictating daily development habits, has coalesced around Linux/Android development with iOS testing before release.

Interestingly our team has gone the opposite way (again, no official policy). The deciding factor in our case has been the relative performance of Xcode/iOS Simulator vs Android Studio + Android Emulator. Android Studio on its own slows my computer down more than Xcode compiling AND the iOS Simulator running at the same time. The Android emulator is all but unusable on my MacBook.

I agree there's a lot of painful bits around signing/release on iOS though.


Sometimes it feels like I live in another dimension. I worked with dozens of developers using macs that never make complaints beyond little niggles, then I come on here and you’d think the entire ecosystem was a dumpster fire. It’s enough to make me think I don’t work on the same system they’re describing for 60+ hours a week.


I recently ran into some trouble with some tools installed in /usr/bin a while ago, and wanted to try making some changes.

Even with root, you can't touch /usr/bin. It took me way too long to try to figure out what was going on, and all the fixes were hacks.


I may be missing something here, but why would you need to change the stock installed tools in /usr/bin? Seems like an easy way to screw up your OS installation. And it's not like a VM where you can rollback to a snapshot or relaunch.

If there really was an issue there then you either need to file a bug and have them mainline a fix, or yes hack/shim a fix on top for your needs. Perhaps leveraging PATH precedence.
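To make the PATH-precedence idea concrete, here's a minimal sketch (the tool name `mytool` and the `~/bin` directory are hypothetical; any user-writable directory works):

```shell
# Install a patched wrapper in a user-writable directory instead of
# touching the SIP-protected /usr/bin.
mkdir -p "$HOME/bin"
cat > "$HOME/bin/mytool" <<'EOF'
#!/bin/sh
# Hypothetical replacement; it could also wrap the stock tool, e.g.:
#   exec /usr/bin/mytool --extra-flag "$@"
echo "patched mytool"
EOF
chmod +x "$HOME/bin/mytool"

# Prepend the directory so it shadows /usr/bin in this shell;
# add the same line to ~/.zshrc to make it permanent.
export PATH="$HOME/bin:$PATH"
command -v mytool   # now resolves to ~/bin/mytool
```

Nothing in the OS is modified, and the stock binary stays available at its full path if anything depends on it.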


Embedded in your response is the general attitude one hears when concerned about not being able to do "a thing" in the Apple ecosystem: why would you want to do that?

Questioning the use case and insisting that one doesn't actually _want_ to do a thing instead of allowing the user to control their own system is the quintessential Apple experience.


You can still do this thing, you just do it a different way that doesn't fundamentally risk breaking the OS. I understand that there may be some non-zero sized group of people who absolutely want to screw with protected OS files, and even for this group of people you can go and disable SIP and mess with the OS all you want (one of the Macs I have is a hackintosh, which requires some decent modification).

However most people, including me, and I'd venture most engineers too, would rather have a hardened system.


> Embedded in your response is the general attitude one hears when concerned about not being able to do "a thing" in the Apple ecosystem: why would you want to do that?

Funny, that's a quote I hear a lot in Linux Desktop ecosystems as well. I think it is just the nature of people so accustomed to a certain way of thinking that any other use case that comes along is automatically considered to be doing it wrong.


Linux doesn't have a single desktop ecosystem. Apple very much has One Apple Way.

If Linux is going to get attacked for not having a single GUI, it would be nice if it weren't also attacked for having a repressive GUI monoculture. /s


Apple's open source devtools are old as dinosaurs, so that might be a common case.

But more importantly, it's the question whose laptop it is. I continue to think that the tools I build go to /usr/bin, because that's the way I like it. Apple is telling me I'm liking it wrong.

As for filing a bug with Apple - good one. Every single Apple dev considers rdar:// a black hole, and the chance of getting a fix from Apple because of bugs filed (vs. Apple wants to fix it anyways) is slim to none.

Overall, Macs are more and more machines that want to prevent shooting yourself in the foot, at the price of less flexibility and access. This is a good choice for some, it's not a good choice for me. (And many other people who like hacking their machines)


>Apple's open source devtools are old as dinosaurs, so that might be a common case.

Yeah, but there's a certain expectation that the tool that you have installed in /usr/bin is a certain version. There's a reason why tools like Homebrew generally do not overwrite built-in tools.

If you just replaced /usr/bin/python with Python 3, you'd probably break all kinds of things.
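A quick illustration with a hypothetical script (the file path is made up, but the failure mode is real):

```shell
# A script written for the Python 2 that ships in /usr/bin dies
# immediately if "python" suddenly points at Python 3.
cat > /tmp/legacy.py <<'EOF'
print "hello"    # valid Python 2; a SyntaxError in Python 3
EOF
python3 /tmp/legacy.py   # fails with a SyntaxError
```

Multiply that by every script the OS and third-party installers ship, and the case for leaving `/usr/bin` alone is clear.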


The point is, it is my machine to break. Apple is more and more deciding that I don't get to do that. It's a choice that benefits a large class of customers, but it's detrimental to people like me.

macOS used to be "Unix, but with a great GUI". It is turning into "iOS, but with a few command line tools".


> I may be missing something here, but why would you need to change the stock installed tools in /usr/bin?

Because it's my computer that I paid for with my hard earned cash, and I want to.

How far we have fallen.


Then you should probably disable SIP and go do whatever the hell you hope to accomplish with that.


I can't agree with this more! It's a bleak future. Back to Linux I guess.


I had other reasons that I needed my PATH to be in the order it was.


> Even with root, you can't touch /usr/bin

Yes, you can. If you disagree with Apple's System Integrity Protection, you can take two whole minutes of your life (if even that, considering how quickly OSes boot these days) to turn it off permanently and mount the entire drive as read-write. For some strange reason, though, I tend to encounter far more people complaining about it than actually taking the steps needed to fix the "problem".
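For reference, the procedure (as of Catalina) is roughly the following; note that `csrutil` only works from the Recovery environment, not a normal boot:

```shell
# Reboot while holding Cmd-R to enter macOS Recovery, then open
# Utilities -> Terminal and run:
csrutil disable   # turn off System Integrity Protection
reboot

# After booting normally, remount the (read-only) system volume
# read-write for the current session:
sudo mount -uw /
```

You can undo it later with `csrutil enable` from Recovery, which is worth doing once you're finished, since running with SIP off permanently widens your attack surface.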


Perhaps what you need is the GNU core utilities - https://www.gnu.org/software/coreutils/coreutils.html ?

Just install them with MacPorts and you can use all the tools by prefixing them with a g - gls for ls, gdd for dd, etc. Some of these tools are newer versions than on macOS and hence improved (as GPLv3 prohibits Apple from including newer versions).


Like I said, I could hack around it eventually.

It was annoying that I couldn't do the basic things I expect to be able to do on a machine as root


which is what /usr/local/bin is for, right?


I can't speak for why the OP feels this way, but I think one of the main gripes I've seen from "power users" of the latest MBPs are the Touch Bar and Apple's continued insistence on it, and the larger trackpad that some find obtrusive (these features are generally embraced by end-users). Then of course there's the lack of USB ports that some take issue with and the fairly disappointing built-in webcam on the latest models. These are not problems for everyone and most have workarounds, but some "power users" are turned off by this. The abandonment of the butterfly keyboard is definitely a step in the right direction.


I would also consider myself to be a power user. However, one huge annoying problem I've encountered is sharing homebrew between multiple users. It seems like it's not possible for homebrew and all of its libraries/binaries to be owned/usable by more than one user at a time.

Apple should roll their own package management tool or officially support homebrew.


The homebrew developers (of the homebrew framework itself, not all the random app maintainers) have been "doing it wrong" since day one. It is utterly and unforgivably wrong to install system-wide binaries owned by a normal user outside of the user's own home dir. That's what they were doing for years until fairly recently, when Apple had to simply change the OS to break their install, forcing brew to change it, finally, against their will.

Macports did it right, but "brew ..." is just a cooler catchier name.

It literally comes down to that. The more fun name.

When someone is new and confused and overwhelmed, a slight difference like that, or the design of the website, or the charisma of the people talking on forum posts, is the tipping factor in which of the 11 possible things they try.

And as long as the first thing they tried worked, they keep doing it and quickly gain a rewarding feeling of accomplishment and confidence doing that thing.

At that point this is their home and their comfort. They aren't bothering with anything else when it doesn't seem to be any different. They conclude they chose wisely the first time (which is a good feeling that anyone can always simply decide to reward themself with, ie confirmation bias).

So, no one should have ever used homebrew. Homebrew "won" anyway, but not by being the better-architected more technically correct system with wiser engineers.

I forgive all the users for not realizing that the directions they followed are terrible and broken. There is no excuse for the developers writing that system.

So, as for your problem with homebrew: I say the problem is homebrew, not Apple.


> It literally comes down to that. The more fun name.

I can't speak for anyone else, but I switched from MacPorts to Homebrew, after having previously switched from Fink to MacPorts many years ago, despite not only being told but having actually experienced that Homebrew was slower and more fragile. And the reason was not "the more fun name." The reason was that both MacPorts and Fink were incredibly slow at expanding and updating their ports tree. Stuff that I knew from my Linux and FreeBSD days, or cool stuff that I read about on weblogs or in places like Slashdot and HN, was nearly always right there in a current or near-current version in Homebrew. That was rarely true of MacPorts: often it wasn't there at all, and if it was, it was often several minor or even a full major version behind.

This may have changed since then -- I hope it has, since I suspect it's been over a decade at this point -- but IIRC, the last straw was trying to install the terminal version of a program (either Emacs or Vim, I think) that also had a GUI version available and, after going to make coffee and coming back to the computer, discovering MacPorts (a) didn't care that I had explicitly asked for the terminal version, it was going to install the GUI version for me anyway, and (b) because of MacPorts' aggressive "never depend on the system version of anything no way no how", it was building a new version of the entire bleeping XFree86 from source for me. Yes, I understand why MacPorts takes that "no system dependencies" approach; yes, I get that was probably a badly-specified port file. But I ripped it out and never looked back.

Well, I take that back. I think I looked back once, about four years ago, to see if everything that I had installed under Homebrew on a machine could be installed with MacPorts, because maybe it was time to give it another chance. The answer was no, everything could not be installed. So the answer was no, it was not time to go back.

I don't care that Homebrew has a fun name. I care that it has the stuff that I want kept reasonably current with upstream versions. Homebrew has, as far as I can tell, gotten a lot better over the years at not being fragile. It's still no speed demon, but the truth is that I don't actually run it that often. If a more native macOS package for a given piece of software is available, I'll install that in preference.


Have you tried Windows with the Linux subsystem (WSL) before? I've noticed significantly better performance from Windows 10 on a Lenovo T580 (from 2018) than my MacBook Pro (from 2019). I love the Win 10 + subsystem combo 10x more than my Mac, so much so that I've now moved all development and daily work over to my Windows box. I only touch the Mac as a test box before running things in production, when it used to be the other way around.


>> I consider myself a power user, and precisely because MacOS feels like the best of Unix combined with customer focus [...] I simply can't relate to the statement that Mac is a "platform that contends with me every step of the way" and I'm curious to know more specifically why you feel this way?"

I just plugged an extra SSD into my old ThinkPad yesterday. My boot volume is mSATA. The hard drive bay was empty, and I added it with one screw. Now I have a VM storage pool. It now has more storage than my gf's Mac, and it cost 1/6th as much.

I tried something similar with her computer yesterday. Plugged in a USB SATA SSD. ext4 isn't supported. Alt-tab is broken. Using the touchpad for everything is mandatory because hotkeys are counter-revolutionary I guess? The GNU userland is there but like half of it is broken or fake, like fs/mtab aren’t real? The console clipboard doesn’t seem to work. All the apps are already running instances without a window or something? Is this supposed to be like a phone?

I just want it to fullscreen tmux and leave me alone, but I keep having to search the internet for the cutesy name of the GUI thing that is the only way to do <trivial task>. I could format the disk and put the data back after, but I'm now afraid file IO in Python will be even more fun than I am already having, so I go find another computer.

I keep trying to learn how to use a Mac, but it's always so much harder than just using any other computer that is around that I just can't seem to make time. I find the UI willfully contrarian, and more alien than anything I have ever used. I'm not trying to be an OS fanboy here, I know lots of smart devs who say they like it. I want to "get it" but I just don't "get it". Discoverability seems poor, especially as a user-friendly tool for beginners, but the UI and locked-down software ecosystem seems so limiting for devs. Like, who is this for?

My gf told me to stop trying to think and just click on things… She’s a modeler/dev and she hates her Mac, but her employer won’t let her use Linux anymore. None of this feels like a dev or “power user” experience. It feels like using Windows XP because the boss made me.

Is this like that thing Neal Stephenson said about the difference between "easy to learn" and "easy to use"? Like if I learn to use it properly I will start having "ah ha!" moments and become more productive?

EDIT: I really don't mean to be flippant! OP asked why people find it limiting. They seemed like they know what they're doing, and might be a dev, and lots of devs use Macs.

OP called it "the best of Unix combined with customer focus", which sounds great to me! I just don't understand what went wrong...


I have the exact same experience. I've developed on Windows, and for the past 10 years on Linux. The experience for the latter has changed so much in terms of usability that I struggle to understand why alternatives are at all desirable for developers. Out of necessity I had to do iOS development on OSX, and out of the gate that already leaves a bad impression. I can cross-compile and target anything from anywhere and run in emulators or virtual machines and whatnot; the hardware can do it, the software allows me to use it, it's all good. Except the Mac ecosystem. So, purchase of a Mac it is, install of Xcode and the whole shebang. Very little is enjoyable, and less is particularly intuitive. I love my grandmother, and for non power-users who mostly want to use a browser, a GNOME-based Ubuntu seems a better option than OSX, and definitely also Windows.

As for how the experience was for me, in terms of developer and power user, everything mac has from unix seemed a bit half way. It certainly surprised me repeatedly, finding some versions of core utilities to be older and not supporting flags here and there. But this was fine. Homebrew felt sluggish. But this too was fine. Then comes iOS development, and oh lord what a shit show. This was not fun, and the release process and signing, and all that.

I've reached the point in my professional life where I want things to work. I don't wish to struggle with technical limitations caused by politics and marketing decisions. Development on OSX feels that way. You can't virtualize the OS, and can't emulate an iPhone. You have to go through way too many hoops to do what should be simple.

So, I honestly, and for no inflammatory motivation, don't see why people like to work OSX. I understand that you wish to do so if you are limited by software and tools that restrict your choice, making it either that or Windows. And I dare say Windows is, in total, even worse.

For usability and ease of use, as an OS, for use with free software and otherwise software that exists for the platform, it's, in my honest opinion and in an effort to not be too biased or fanboy-y (again, I wouldn't push my fanboyism on my grandmother), simply easier. For a power user, and again not limited by language or tech, there is no competition. None. I'd choose to work for a different company if it meant I didn't have to develop on OSX or Windows. Life is too short.


"Power user" may mean as little as changing the Windows theme. Not much is allowed, so the bar is not high. Throughout the page, again and again: "users don't care".

Linux changes expectations. Everything is possible, there should be a package for that, deconstruct, throw away, replace, and most of the time someone already did it.


>> I don't wish to struggle with technical limitations caused by politics and marketing decisions

So much this. When I depend on proprietary stuff, the care and feeding of licenses and serial numbers and all that stuff never seems to end, and it's shackled to one computer or hostname or MAC address or something like that, which changes faster than Calvinball.

It's not even about money. Who has time for all that busywork?

If the Mac platform were freely redistributable and Linux were proprietary, I would switch to Mac in a heartbeat even if I hated using it, just to protect my time from doing all that IT stuff :p


This - precisely. I live on a Mac all day every day. Does everything I want, exactly how I want it, and makes me so much more productive than if I had to fight with Linux all day long.


I am a developer and power user. I am incredibly happy with every update from Apple. Unix with good design is exactly what I want from my OS.

Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that. I literally can think of no reason I personally would want to do this, although I imagine there are a tiny fraction of users who would and you may be among them.


Easy. A new application you have needs more RAM. Your battery is not holding up very well and you want to replace it. Your SSD will fail soon and you want to replace it. I had every single one of those issues on a laptop, and fixing them cost me from 25$ to 100$, about a fifth of what the Apple Store or OEM was quoting me. Instead of being without my laptop for a week or two, I got all of them over in an hour and a half tops.

Everyone has a device fail them. Then you have the choice of changing it or repairing it, repair is almost always a better option if you're at all technically competent.

I can't imagine my SSD breaking down or my RAM failing and having to change the entire mainboard when I could just take 45 minutes and fix it. It's literally absurd to me.


Your scenario definitely exists, but it also doesn't hit the likes of the user you respond to as much as you might expect.

SSDs don't fail the way they used to, RAM requirements don't change as much as they used to, and while batteries do die, at this time I've seen more people switch machines before any of that happens.

The last time I wanted or needed to replace memory or storage in a laptop must have been 7 years ago, and I don't see anyone else doing that either. Even desktop upgrades are somewhat dead at this point, except for a GPU upgrade if someone is a gamer, or storage upgrade if people still use direct attached storage and didn't get enough when they initially got the machine.

Just because a scenario that affects a small percentage of DIY users is getting harder doesn't mean things are worse for everyone else as well.


If SSDs don't fail, more RAM isn't needed, and so on, then why do people feel the need to buy a new laptop every three years?

I think the percentage of users that would buy a new $1,500+ machine instead of a small $100 upgrade is much lower than one expects when living in a bit of an echo chamber. There are a lot of people who get burned by outrageous repair prices, or who buy new computers when they could simply upgrade a part.

Ask yourself: if you've never felt the need to upgrade a laptop or to replace a broken part, then why would you replace the whole machine?


> why do people feel the need to buy a new laptop every three years?

Beats me. You will often see comments in these threads about people (like me) using nearly decade-old Macbook Pros to this day.

I stopped replacing laptops every 3 (or less) years after I stopped buying budget non-Apple laptops.


Well, a lot of people aren't upgrading old MacBooks because the new ones have been such shite.

I'm on Linux, and a laptop I can take apart and replace bits of, so I can see myself using this one for a few years more.

My Dell XPS 15 is suffering from Windows Rot after only two years. When I get a free afternoon I'm going to have to back it all up and clean it out, hopefully avoid reinstalling from scratch. For all the negativity I've had towards OS X, at least it's better than that.


Some people probably have some write-off system, or perhaps resell them while the devices still have some value, so they always are somewhere 'in the middle'.

I personally tend to keep my most recent device and the new device at the same time for about 2 years because of vendor lag (happens a lot with those classical software vendors that take months between software releases) and because I need the ability to compare between versions of both hardware and software. The newest device gets promoted to daily driver (usually many benefits there, as they often are lighter yet more powerful). In general it means 2.25-ish devices per 10 years.

Other hardware, like SBCs tend to rotate out slower, but generic x86 platforms rotate out faster because if a critical component fails the labour for finding parts and replacing them is too much vs. buying up to date replacements. Luckily, due to lower usage of those machines they last longer, so technically the Apple hardware made the non-Apple hardware have a longer lifecycle in my case.

There is some irony in there as I do provide board-level repairs and the machines I work on for other people are ones I'd never personally invest in.


Because of overall degradation. Sometimes I want 10% more of 'everything', plus a new feature, plus less weight to carry around. That's practically been the only driver of my personal upgrades over the last 15 years.

And by degradation I don't mean the existing device degrades per se, but the degradation of productivity on the current device vs. new device.

At the same time, the way work is done has changed a lot for me and the people around me: heavy workloads are almost never done on a laptop anymore.


> but the degradation of productivity on the current device vs. new device.

What does this mean? Do you mean your current device starts working poorly (but that would mean "existing device degrades per se", which you ruled out) or that the existence of a new device automatically makes you perceive your device as being of "degraded productivity"?


For example if I run something that can take advantage of AVX512 and my current CPU doesn't have that but a new CPU does. Same goes for TB2 vs. TB3, very useful when you want to connect an external GPU. It does work on Thunderbolt 2 but the extra bandwidth of Thunderbolt 3 is a nice improvement.

Say you change your workload model: there might be ~20% improvements between bare metal, virtual machines and containers. If you simulate a part of your infrastructure using containers you may not need more RAM, but more CPU would be nice. But when you then want to do a lot of recording/capturing and process that, RAM gets more important. Just upgrading the RAM wouldn't help much, because without the CPU to generate the data you might as well offload the whole thing.
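As a side note, whether a given machine's CPU actually exposes a feature like AVX-512 is easy to check before deciding an upgrade is worthwhile. A minimal sketch, assuming Linux (it parses /proc/cpuinfo; on macOS you'd query `sysctl machdep.cpu` instead):

```python
# Minimal sketch: check whether the CPU advertises a feature flag.
# Linux only -- parses the "flags" line of /proc/cpuinfo; returns
# False if the file is unavailable (e.g. on macOS).
def has_cpu_flag(flag, cpuinfo="/proc/cpuinfo"):
    try:
        with open(cpuinfo) as f:
            for line in f:
                if line.startswith(("flags", "Features")):
                    return flag in line.split()
    except OSError:
        pass  # not Linux, or /proc unavailable
    return False

print("avx512f supported:", has_cpu_flag("avx512f"))
```

On an AVX-512-capable part (e.g. Skylake-X or Ice Lake) this prints True; on older or non-x86 CPUs it prints False.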


> For example if I run something that can take advantage of AVX512 and my current CPU doesn't have that but a new CPU does. Same goes for TB2 vs. TB3, very useful when you want to connect an external GPU. It does work on Thunderbolt 2 but the extra bandwidth of Thunderbolt 3 is a nice improvement.

This is definitely an echo-chamber/bubble point of view. The vast majority of users out there don't even know or care what AVX is, don't use external GPUs, and don't know or care about the difference between Thunderbolt 2 and 3.

If you personally need these things and want to buy a new machine every few years, then that's great, you should do that. But there are a ton of people who would benefit from an easily-repairable, easily-upgradable (RAM, storage) machine that end up dropping $1500 every few years instead of the couple hundred they could instead spend for a reasonable upgrade.


Seems you are responding to the wrong thread here. User the_af was asking me why I replace a machine, and I answered with some reasons specific to me. This was a deeper dive into the point that some people don't need to upgrade at all, because they don't do anything different between day 1 of their usage and day 1780. And they don't need to, because the laptops of the last decade don't fall apart as much as they used to, and a Mac specifically tends to work well during its entire lifecycle. This is also why there aren't as many people interested in modifying their computers mid-lifecycle.

While I bet that there are a lot of people that do want to modify their systems, they are such a minority that it's not very logical for a large multinational to invest in that to the detriment of other goals. It might simply mean that you are not the target audience for their product(s).

Some other manufacturers/brands have the same, while others do a mixed portfolio to cater to smaller groups as well. We also have large manufacturers that cater to the classical enterprises which still run on the old idea that you need a fleet of identical machines and then swap out parts all day long, so machines that have facilities for that exist. Most notably Lenovo, HP and Dell do that.


Apple devices do hold their value fairly well (certainly much better than non-Apple hardware), so the cost is offset somewhat.

I whinge about the non-upgradability of our devices as much as the next guy, but it's not like people are throwing their 3 year old macbooks in the trash can.


How many upgrade their personal laptops every 3 years?

Work one, sure. But I haven’t upgraded my Mac in years.


Work one gets at least 4 years of life, it's not good for value beyond that due to write-offs etc, which is somewhat strange when you think about it. After that, they are sold to whoever wants them (we wipe them, clean them and unlock them).

Most people get close to 8 years before they actually want a new one when their work doesn't change much -- if your requirements don't change and your tool fits, no reason to change.


Most normal people I know have 10+ year old laptops. People who buy laptops every 3 years are pros or enthusiasts who want the best performance.


> I've seen more people switch machines before any of that happens.

My opinion is that people switch machines more often precisely because they're not upgradable/repairable. I think if most people could go to their local computer repair shop to get RAM/storage/etc. replaced or upgraded, we'd have a lot less electronics waste, and consumer computers would last a lot longer.

At this point, software CPU and GPU needs aren't increasing all that much year by year, unless you're a gamer or do HPC. For the rest of everyone -- the majority -- a CPU and GPU made 5-10 years ago is still just fine for what they want to do. These old machines just need more RAM and sometimes more storage.

I get that it's hard to fit sockets for replaceable modules when everyone wants a super thin laptop, but overall this is a source of so much economic and physical waste.


There are plenty of people switching machines that can be completely taken apart and replaced or upgraded component by component. Some other post around here mentioned eBay: plenty of completely modular HP and Dell laptops (and not even in bulk) for sale.

Most computer repair shops around here have disappeared because it's no longer the way people buy and use their computers. At least not enough people to keep those shops running.

Sometimes people upgrade because they feel like it. Doesn't always have to do with the numbers.


I agree in that, these days, I don't see myself wanting to upgrade a laptop's RAM before I'd just replace it with a newer machine. While it would be a nice bonus to have the option, realistically I imagine the 8 GB of memory that comes standard with the baseline MacBook Pro model would be enough for anything I'd personally want to do with a laptop in its lifespan, and I would be willing to sacrifice memory upgradability for a slimmer design, for example.

However, I think Apple (and other laptop manufacturers) cross the line when they ship soldered-in batteries with their laptops. Batteries are consumable components that are guaranteed to degrade over time. Unlike RAM, which, assuming Apple sources it from quality vendors, you can expect to last for many years, laptop batteries will definitely begin to degrade within a few years, and will, over enough time, render the laptop unusable. Apple is producing $1500+ laptops with a non-replaceable component that will, without a doubt, eventually break down. Your options at that point are to buy a new laptop, or to send it to Apple so they can charge an arm and a leg to replace the entire top case instead of just the battery (as that'd be impossible). You should buy a new laptop when you want to, not because you "might as well" without the option of battery replacement, and with Apple's "repair" option costing a quarter of the price of a new one.

Though to a lesser extent, it also irks me that the SSDs are soldered in, not necessarily because I'd want the ability to upgrade them post-purchase, but mainly because your valuable data lives there, and it will become trapped on a dead board if something goes wrong with an unrelated component. Even if failures like this are rare, the threat of data loss while using a MacBook would still concern me more than while using a laptop with a replaceable SSD. I believe Apple has a data recovery procedure, but you'd still have to send your laptop to them and cross your fingers in the event it dies, whereas if the SSD wasn't soldered in, you'd have the option of manually recovering your data, or taking it to a repair shop where they could do it for you.

I would love to see a MacBook Pro with, at the very least, a replaceable battery (but especially with a replaceable SSD as well).


Why are there so many Apple devices with 8GB RAM for sale on eBay then? The RAM prices for new Apple devices are 3x the normal rate, so people get disappointed with all the Electron apps eating RAM and sell them!


I'm not sure that is an indicator of anything at all. You could speculate that people sell their laptops because they want more RAM and that is the one and only reason this is happening. But then that's like what, 100 MacBooks? 1000 maybe? Hardly large in numbers compared to any other laptop from any other brand on eBay.

I know that a lot of people are very emotional about their RAM and SSDs and it can be a PITA, but it doesn't affect as many people as you'd think. At least that's what the available data shows.


This is very true, thanks.


All that, and also -- Apple sets prices about once a year. Meanwhile, the price of CPUs, RAM and graphics cards drops. For any price level that Apple establishes, I can afford a Hackintosh desktop of much greater power or I can save a lot of money and get equivalent power.

Though, to be honest, I don't bother unless there's a particular application which isn't available on Linux.


Hackintosh is a non-starter if you are using the computer for commercial use.


Turns out there are some very small niches where it makes sense. Basically, if you need to maintain compatibility with a larger organization but your profit is independent from theirs, and you have necessary technical resources at hand.


and you don't mind violating the licensing agreement.


Photoshop


Runs on WINE or in a Windows virtual machine.

The latter works fine for me.


Premiere

Final Cut Pro


That’s it? Great!

Sounds like non-iOS software developers that are stuck on macOS can just hop right off without any problem whatsoever...


Did Apple start using soldered on storage or RAM for (non-air) Macbooks?


I'm pretty sure the MBPs have had soldered-on components for quite a few years now.


Is it you, Elon? Did you return from a few years' trip to Mars? ;)


Lol I have never heard this. For me mbp=sodimm and easily replaceable drive while air is soldered on! What year did it change? I swapped my hdd for an ssd and upgraded the ram in my current mbp (which is a Sandy Bridge...)


> I literally can think of no reason I personally would want to do this

Being able to swap out RAM and storage and battery from factory-provided can extend the lifetime (and even increase the performance capacity) of a given machine at nothing more than the cost of the new parts. And Apple used to make it pretty easy to do this, and enough people did that there was a subeconomy of vendors selling for this purpose, although of course there were full-service options for people who literally couldn't imagine doing it themselves.

This also meant that downtime wasn't controlled by Apple Store service availability (literally had a drive fail on me once, had my bootable backup, swapped it into the machine, good to go).

And on a number of occasions I've done more complex replacements, a DVD drive here, a keyboard there. I can see why most people wouldn't want to do those, they're a giant pain, but having the option can be empowering.

What's actually hard to think of, if one is thinking, is what Apple has gained in return for this. I can kinda squint and see that irregular (and therefore, perhaps, less efficiently swappable) battery shapes have some credible advantages, but the rest of the marginal ounce-gains and dimensional-golf scores belong to a category of diminishing returns.


> Being able to swap out RAM and storage and battery from factory-provided can extend the lifetime (and even increase the performance capacity) of a given machine at nothing more than the cost of the new parts.

It isn't just the cost of new parts. It's also the cost of those parts being larger in order to accommodate user-accessibility and replacement. When everything is surface-mounted on the PCB it can be made a lot smaller. This allows Apple to shrink the laptop as well as allocate more space for the battery.

No one is asking for phones where we can swap out the RAM and storage. Why do we want this from laptops? I'm typing this on a 2020 Air which I configured with 16GB of RAM and a 1TB SSD. I really don't anticipate a need to upgrade either of these things before the machine reaches end of life anyway.


> Why do we want this from laptops?

Why shouldn't we? I would gladly sacrifice a few mm of thickness for a machine where I could upgrade the RAM and storage (and replace the battery) every few years. Not only would that save me money, but it's much more sustainable from a manufacturing and waste perspective.

I just finally threw out my old (sadly broken) 12" G4 PowerBook, and marveled that it had a removable battery (you don't have to disassemble it; you just flip a latch on the external bit, and the battery slides out). And I remembered that I'd done a trivially easy disassembly at one point to upgrade the HDD. I got that laptop secondhand, when it was already a couple years old, and it lasted me a good five years, and probably would have lasted longer had my then-girlfriend not dropped it on concrete, which somehow fried the drive controller.


My Air is 2.8lbs, far thinner and lighter than a 12” PowerBook G4 (which I used to own as well) with way better battery life. I don’t want to go back to an old brick of a laptop like that. I think most people don’t.

You’re part of a very small niche. The tinkerer who doesn’t want to jump to Linux. This is a tiny number of people, sadly, just as the number of people who want to tinker with their cars is very small. Most people just want something that works and is very convenient. They prefer to leave the service to service people.


> My Air is 2.8lbs, far thinner and lighter than a 12” PowerBook G4 (which I used to own as well) with way better battery life. I don’t want to go back to an old brick of a laptop like that. I think most people don’t.

Not saying you'd have to. Battery technology has improved since then; we can make lighter batteries that last longer. Hard drives have been replaced by small PCBs with a few chips on them; again, much lighter. Further miniaturization of the internals means a smaller chassis which means less material; again, much lighter.

It'd be perfectly possible to build a laptop with similar dimensions and weight to the current crop of laptops, but allow for easier battery replacement, and even RAM/storage replacement. I'm not saying it'd be as simple to swap as it was in 2005, but it'd at least be easier/possible and not involve special tools.

Apple seems to implicitly claim that their anti-repair/anti-upgrade stance is about "delighting customers" with smaller, lighter hardware, but I suspect it's mostly driven by their desire to lock down their hardware against tampering, and to keep people on the upgrade/purchase treadmill every few years.

> You’re part of a very small niche. The tinkerer who doesn’t want to jump to Linux.

I actually do run Linux (previously on Mac hardware, but I finally gave up on it last year), so that's not the niche I'm a part of.

But that's not really the point; I'm not speaking for myself, I'm speaking for average users. I see a lot of posts here claiming that average users "don't think about" this or "don't care about" that, and I posit that the average user doesn't care about an extra few millimeters of thickness or an extra few tenths of a pound of weight. Especially if it doubles or triples the useful life of the device through a RAM/storage upgrade and easy/cheap battery replacement every few years.

> They prefer to leave the service to service people.

That's fine too, but Apple is actively against allowing a robust, price-competitive field of independent repair shops. And even if they weren't, soldering in the RAM chips and NVMe drives means the only thing those repair shops could do would be to swap out batteries and main boards, rather than do cheap, targeted upgrades.


The 12” PowerBook G4 was 4.6 lb.

Yes, modern Apple laptops gave up the “brick battery that slips out with a latch.” But in place of that we have terraced internal batteries that can fill any volume.


Huh? Most sold phones probably have a microSD slot.


It used to be the case I would swap MacBook parts out more when laptops were simply clunkier maybe 10 years ago (added a SATA SSD back in 2010). But lately with 16 GB RAM being fine for most developers even, NVMe SSDs being standard, Macs still being lame for GPGPU tasks besides mining cryptocurrency, and a lot of compute intensive tasks simply put into the cloud ... I can’t really say I need to do many hardware updates on a Mac besides perhaps the battery anymore. What really helps me most now is battery life and better screens to reduce eye fatigue, and that’s basically what Apple has done fairly well compared to the PC laptop market.

I’m kind of curious to see how usage goes for AWS Graviton 2 instances in the future as Apple laptops transition and there’s less friction to deploy along ARM based tool chains and artifacts.


Last time I had a laptop with replaceable RAM modules, there weren't any bigger modules on the market, and the CPU's memory interface had a maximum capacity equal to the installed amount. I'm not sure expandable RAM in a laptop is all that useful, unless you intend on getting the lowest-spec version and upgrading it later.


> Being able to swap out RAM and storage and battery from factory-provided can extend the lifetime (and even increase the performance capacity) of a given machine at nothing more than the cost of the new parts.

While that is true, how many users of any kind of device actually consider that an option these days?


Changing the hardware or software yourself is not the point. The point is the monopoly of the manufacturer to repair and modify with huge negative consequences. The lack of right to modify the software leads to unjust power over users [0,1], while the lack of ability to repair hardware leads to unreasonable repair prices, monopoly and environmental damage [2].

[0] https://www.defectivebydesign.org/

[1] https://www.gnu.org/philosophy/free-software-even-more-impor...

[2] https://www.ifixit.com/Right-to-Repair/Intro


No, that isn't the point either. The point is happy users, and there will always be some unhappy users because they wanted to do something that isn't supported or actively barred.

Throwing in words like 'right' and 'unjust' helps in movements and call to action etc, but in this context it's about someone turning on their device, doing some work, and turning it off again.


Since I can't reply to the replies, I'll reply to myself:

All of those points are fine, but they are points separate from a user using a device.

If you want it for environmental reasons, say that. If you want it for freedom, say that. Both are valid reasons, but neither has anything to do with someone using their machine, unless that 'usage' means doing stuff with the insides. And before someone jumps in to say that that is the main activity: great, but most people buy those machines to do work that doesn't entail opening them up.


I am not allowed to use a browser I want on iOS. Do you count this as “using their machine”? The reason is the lack of freedom to do what I want with the software.


Not sure what that has to do with hardware, but I'll bite: users don't care. They don't know what a browser is, what a client is, or a URL or URI or address bar. They want a device that when they turn it on and type in 'tictoc dance dog' it shows them something that resembles what they had in mind. Doesn't matter if it runs WebKit or Gecko or if someone thought it'd be cool to emulate a Trident-based engine on iOS on ARM in a custom browser.

Freedom 'to do what you want' has not had much to do with what works in the world of selling to the masses when it comes to user experience (and yes that is ironic).

Regarding 'using' and 'their' and 'machine': I meant an activity based scenario which is what most people see themselves doing. They mostly have no concept of ownership, actions or device semantics. The reason is simple: it's not needed to be a user and it's not needed to get the same results as other people in your group have. And for a lot of people all that matters is doing the same as the rest of the local group.

Edit number 3: you can apparently select your default protocol handler in iOS 14. I bet a double-digit percentage of Google users want that for Chrome, and a few sub-1% of users want Firefox. But that's still not freedom, because they all use the same renderer and JS engine. On the other hand: it's unlikely that people care about that, and the people that do are not likely to run iOS at all. Luckily, you don't have to use iOS. Or a phone. So some aspects of freedom are unchanged.


> users don't care. They don't know what a browser is, what a client is, or an URL or URI or address bar.

All true but irrelevant. The problem is that artificial lack of choice prevents competition and innovation. Microsoft from the 90s is here again. And users don’t care, again.

I do not count another UI as a new browser. It should be different under the hood.


> I do not count another UI as a new browser. It should be different under the hood.

Can you imagine any general user ever repeating that? I can't. As those users would say: nobody cares.

I personally would care, but I'm not the main marketed target for Apple. I also care more about getting stuff done than what engine my browser uses.


It doesn't matter if a general user would know about this stuff. The point is that they are being hurt by lack of competition and don't even know it. Even if they wanted to stick with Safari on iOS, that version of Safari would likely be better and have more features if it were forced to experience some competition.

This is why healthy competition and anti-trust actually matters. It's not because your average user cares about the details, it's because they're being impacted even when they don't know it.

This subthread has a fairly trivial example of this, but when you get into right-to-repair it becomes even more important. Your average consumer could be spending a lot less money to have a much better experience. The fact that they don't know about that is part of the problem; lack of knowledge doesn't make the issue irrelevant.

And we don't even need to get into the environmental arguments around reducing waste to consider this aspect of the issue.


> Not sure what that has to do with hardware but I'll bite: users don't care

Sure, if you keep a population ignorant, they won't care about the pettiness of their lives, as they don't know about freedom.


Those "happy" users are just ignorant users who don't know what to do with their money. Their happiness wouldn't be any less if Apple provided ways to fix their laptops.

You should watch Louis Rossmann, who does repairs, showing how purposefully Apple makes any repair hard.

[1] https://www.youtube.com/watch?v=-uYUB8DZH2M&t=2649s


What you are saying is the same thing. “Unhappy users” are unhappy because of lack of the freedoms I mentioned. One has to fight for them.


Considering the environmental damage that this causes, it should indeed be a right.


> I literally can think of no reason I personally would want to do this

If you personally care even a little bit about the environment, there is a pretty evident reason why being able to swap a single piece of hardware is better than replacing the whole unit because everything is soldered together.

Right to repair would also mean that once your device goes out of guarantee you would have more options to get your device repaired for cheaper.


Not only that, but also:

* Apple charges +$800 for 64GB of RAM, which on Amazon would cost you around $300-350.

* Apple charges $1,600 for 4TB of SSD, which on Amazon would cost you around $450-600.

* Soldering the SSD to the motherboard is an enormously bad engineering decision for servicing, backups, and data recovery.


Exactly. I feel more of us should speak up like this and not let ignorant users or shills try to sell this message that soldering down components and making it extremely difficult to replace parts like a battery is a "great thing" being done for the benefit of the consumer.


Reliability is far more important than repairability. I don't need to worry about replacing components if they don't go wrong.


> Reliability is far more important than repairability.

And why do you think they are exclusive of each other? I have a, I kid you not, 15+ year old washing machine that I am still using. I had to replace two parts, due to wear and tear, in the last 5 years, and it still runs as well as ever. I am glad that I could repair it for a fraction of the cost rather than dump it and buy a new one.


Thinner devices are great for the consumer.


> Soldering the SSD to the motherboard is an enormously horrible engineering decision

And a good sell-more-laptops business decision?


You assume that people have to throw away millions of laptops, but his comment (and mine) refer to the fact that a recent laptop works fine for 6+ years as-is. The lifecycle doesn't shorten but the need to toy with the innards has gone away for a lot of people.


> You assume that people have to throw away millions of laptops

Hmm yes I do: https://twitter.com/RDKLInc/status/1275100376350384132

Have you also never had anyone you know spill liquid on their keyboard after the warranty expired? Or drop their laptop?

So in the end yes, I am pretty sure that there are literal millions of devices being thrown away because repairing them is too expensive.

Edit: I'll also add that in 6 years of using a Mac, Apple replaced my MacBook entirely twice for issues that you would hope could be fixed without throwing everything away. So, sure, the customer is happy because they got a brand new device. But personally I would rather stick with my 1-year-old device and reduce environmental impact.


Those are not laptops, those are iPads, which weren't modular to begin with. Also, not broken, but locked, because the previous owner didn't unlock them before discarding them. Are there millions of devices that get thrown away in general? Sure. But I doubt it's only devices with RAM that was soldered down and branded Apple.

Regarding spilling on keyboards: I don't know anyone personally who does that, in my local area people don't eat or drink next to their devices. I do have plenty of people I don't personally know that did ruin their computers that way and I have replaced plenty of Apple keyboards and entire topcases to solve it. Works fine as a side-business.

Repairing is not the same as upgrading, and recycling is not the same as throwing away, and neither is reusing. The difference is important, as the solution changes with the case.

Overall, this would be more fitting in a general discussion on environmental impact, and not so much one about software, the hardware it runs on, and human behaviour. If you want to solve a big problem you need to generalize instead of making it about one brand, as far as I know.

To dive in a little deeper: to 'fix' this, people need to spend less time on how beautiful their stuff is, or how cool they look, and go a more utilitarian route. But since the American economy is built around the opposite, and the western European one has a tendency to follow it in some aspects, that's unlikely to happen any time soon. Changing people is hard, but it's also the best way to solve or fix or change anything.


> Those are not laptops, those are iPads, which weren't modular to begin with.

Sorry, I thought this was illustrative enough to make my point, but if we have to be pedantic, then here you go: the exact same thing, but with MacBooks:

https://twitter.com/RDKLInc/status/1251533085734252549

> in my local area people don't eat or drink next to their devices

Interesting local area you have there; at my office, and all the offices I've worked in in the past, everyone has a cup of coffee right next to their laptop.


> https://twitter.com/RDKLInc/status/1251533085734252549

Again, two MacBooks and not millions, as explained in the twitter thread. Again, not defective but locked by the previous owner. Not even remotely related to 'right to repair'.

As an owner, I want to be sure that when I lock a device, it's actually unusable to anyone else. So this is good (for me).


> As an owner, I want to be sure that when I lock a device, it's actually unusable to anyone else. So this is good (for me).

You want your data to be inaccessible. Not providing a way to factory-reset a perfectly functioning device (say, by replacing the modular storage) is just irresponsible.


No, you want it to be unusable, because this greatly reduces the incentive for theft.


Well, I want my device to only be usable to me unless I say otherwise. When I sell it I'll sell it unlocked.


> I am incredibly happy with every update from Apple.

I am incredibly disappointed with every update from Apple.

I bought my MacBook Pro in 2011. The most basic 13.3" machine that was available for sale. 320GB 5400RPM HDD, i5, 4GB RAM, Intel graphics.

I like backups. The optical drive had to go, so I moved it into an external USB enclosure, put the original HDD in the optical bay, and put a Samsung SSD in the HDD's place. The HDD contains my data archives and a macOS installation image.

Whenever macOS would become cluttered, I would wipe the SSD and enjoy a fresh macOS installation. Whenever a new macOS version would be released, I would update the installation image.

This enabled me to work on the go with no fear of data loss and no network dependencies.

SSDs are basically long-lasting consumables. I've had to replace the drive twice, painlessly, and each time I got a faster, more robust unit with more capacity. Sure, I could carry an external drive with me. I have no desire at all to do that.

I like the mini jack, so that I can plug my favorite headphones. The audio-out port also doubles as an optical-out. Sure I can carry a couple of dongles with me. I have no desire at all to do that.

Sure I could have bought the top of the line 8GB RAM model back in 2011. I had no desire at all to do that. I could just buy 16GB of RAM and perform an upgrade Apple claims is not supported.

The battery lasts long enough. While coding, I get drained before the battery does. I can let the battery age with no fear of it bulging because it's contained in a shell. Replacement is easy.

I totally get where you are coming from, and agree with you that "a tiny fraction of users" want access to the hardware. However 100% would want access to the hardware when their 16" MacBook Pro SSD goes.

Don't professionals require machines that cater for their needs? If the current Apple offered choices like the ones I mentioned above, would you not pay a premium to have them?

I certainly would. And for that reason I still rely on the same 2011 MacBook Pro and not the latest MacBook GoodEnough.


I upgraded my current laptop from 8 to 16 to 32 GB of RAM since 2014. I replaced the 750 GB HDD with a 1 TB SSD. Then I replaced the DVD drive with another 1 TB SSD. It was as easy as sliding the bottom cover off and loosening a couple of screws. OK, I concede that replacing a worn-out keyboard was not as easy as on YouTube, but I like serviceable hardware.


For me the problem can be summed up as: "Every new MacOS release has less Unix and poorer (for a power user) design".

Apple has been slowly whittling away at the MacOS value proposition for developers.


> Apple has been slowly whittling away at the MacOS value proposition for developers.

Yes, I think the success of the iDevices has made them change focus. They are no longer interested in making devices for the power user. They now make more money selling iDevices, especially via recurring revenue. And so ignorant consumers are now their target market.


This makes no sense! Storage and RAM requirements have steadily increased, and costs have gone down dramatically. On the other hand, keyboard technology has not improved massively, and CPU speeds aren't significantly different.


I'm not bottlenecked on those components.

I think on my 2016 MacBook at home, the only time I get bottlenecked on performance is CPU while playing EU4. Oddly, all the powerusery stuff I do could probably be done equally well on a 10 year old device if it wasn't for USB-C.


> I am incredibly happy with every update from Apple.

You are happy with the sluggishness in Catalina from security checks over the network before running applications?

> Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that. I literally can think of no reason I personally would want to do this

The cost of repair is higher because independent shops cannot repair the boards anymore or recover data when your storage, memory, and everything else is soldered directly to the board.

I don't doubt that you enjoy the platform, but I would be hard pressed to say that I would be happy with everything from any tech company.


> I am a developer and power user ... Sure I can't disassemble my laptop and change the hardware. I have no desire at all to do that.

That's a lie. Every power user tries to get the most out of their machine, and so desires an upgradeable system. No power user likes to throw away a system because s/he can't replace its battery or upgrade its RAM or HDD / SSD. At some point, everyone likes to upgrade their existing system without spending a gazillion bucks on it.

Yes, it is true that with phones and tablets, corporations like Apple and Samsung have managed to convince the general public that not being able to upgrade your phone or tablet is a NORMAL thing, and that soldered RAM and soldered SSDs are the only option, but we power users know that's a lie that Apple and Samsung (and others) would like us to believe.

And that is why their marketing pushes exactly the message you are parroting:

> I imagine there are a tiny fraction of users who would and you may be among them.

I can bet that even someone who is not a power user, just an ordinary consumer / user, would appreciate the option to upgrade their phones / systems and reuse them. That is why we have efforts like the PuzzlePhone in Europe.


Wow. He doesn't want to disassemble his laptop, so he's lying about being a power user?

Have you ever heard of the No True Scotsman fallacy?


You don't need to disassemble a modern laptop or a PC to upgrade the RAM, battery or HDD. Only most Apple systems require such crude and difficult disassembly for the simple task of replacing these parts, because they are deliberately designed that way.


Hell, even an average user would likely appreciate being able to bring their laptop to the local repair shop, give them $200, and walk out with the same laptop, but with more RAM, storage, and a new battery.

The only current option is to drop $1500 on the new model. It's not only more expensive, but wasteful.

I guess there are a lot of younger folks around here nowadays, but it was perfectly normal in the 90s and early 00s for regular non-technical home users to get out a screwdriver, open up their desktop PC, and replace the RAM, add a new/larger HDD, add a CD/DVD drive, even swap out or add a graphics or sound card. And there was a perfectly functioning computer shop industry that would do that for you on the cheap if you weren't comfortable.

In the past 15-20 years manufacturers have gotten us into the cycle of buy->discard->replace to the point that people don't even know it was ever different.


Every time there’s a new Apple announcement, I’m happier and happier that I switched to Linux a couple of years ago.

Power users want freedom and control of their own machines and hardware. The moment that Apple started soldering parts into place, I started looking elsewhere because it was a signal of the thought process.


> The moment that Apple started soldering parts into place, I started looking elsewhere because it was a signal of the thought process.

Very true! The first thing I did after buying a Mac Mini was to upgrade the RAM and change the HDD to an SSD, and I even added an extra HDD. The Apple options for the same were nearly triple the cost. Only an idiot would believe that having soldered CPUs, soldered RAM and now soldered SSDs is a good thing. It is a good thing only for the bottom lines of Apple, Samsung, et al.

And with ARM processors, you can be sure you'll get a PC with a locked bootloader that won't even allow you to install other operating systems. And the offered OS will also be a dumbed-down system like iOS that will only allow you to install from a locked app store, all the while leeching your personal data to the cloud.

I too have started exploring other alternatives.


On the flip side: does anyone still buy a low-specced machine and upgrade it mid-lifecycle? I've seen it less and less over the last 6 years, and in the last 2 years it hasn't happened in any work setting.

I know people like to have the feeling of control (do you really control your laptop if you can't replace your EC, CSME, AGESA, SSD FW, NIC FW, VBIOS?) and the concept of shuffling parts around, but the need for that (at least in my circle) has pretty much faded.


Why wait for mid-lifecycle? Buy a low-specced machine and then drop in larger storage and memory rather than paying a huge premium to the vendor. It's not about control as much as just saving money.


Mid-lifecycle because saving 10% on an expensive device isn't what most people are interested in when they are buying something to work on.

Yes, people exist that do that or want that or really need to save every cent possible, but to many others that is not even interesting as a thought exercise.

So I said mid-lifecycle because people who don't initially try to go cheap usually only swapped parts when the machine was still good enough and replacing the whole machine was too much of a hassle: that's when upgrades made sense.


> On the flip side: does anyone still buy a low-specced machine and upgrade it mid-lifecycle?

I've upgraded hard drives and RAM on every single Mac I've ever owned, except the last Mac I purchased, a 2015 MBP, where it wasn't an option.

It's looking like this one will be literally the last Mac I purchase. Apple's practice of charging premium, bespoke prices for commodity hardware was insulting, but at least there was a reasonable workaround before they started the soldering crap. Soldering things down is just a brazen money grab.

Can't believe I'm looking at going back to Windows but here we are.


Same here. My 2012 MBP has been upgraded with more RAM and an SSD, and the battery is replaceable (and it has actually lasted better than my work 2017 model; it's healthier!), but the cost of the new devices is so astronomical it'll be the last one I buy. Since I would have to invest in a new ARM device just to develop for their new platform, it's goodbye Mac and back to Linux (I've already got Windows as a side dev machine).


Apple is making it a pain to do even basic things like replace the battery, which is a consumable part.

I've personally upgraded the hard drive and replaced the battery twice on my old 2011 MacBook Pro.


I have done that on a 2009 as well, but that is a slow machine at this point; I don't think I've used it beyond 2014. I have a 2015 around that I still use and it works perfectly fine, but so does the 2018, which is lighter. Neither needs upgrades or changes to do the work they are doing.

I did upgrade someone's 2015 in 2018 with a bigger SSD because they needed 2TB and it wasn't available at the time (they were on 512), and they used it as a local buffer for footage before offloading. Because they only have USB3 or TB3 drives and no TB2 drives, doing multiple transfers on that MBP was too time-consuming in the field.


Why does it have to be low-spec? I bought my current laptop with 16GB of RAM in it (soldered to the mainboard, sadly) because that's the most the manufacturer offered at the time. If I had the option today, I'd open it up and swap out the modules for 32GB. (I think they now offer a 32GB model.)

Ditto for the storage; I have 512GB in there now (probably wasn't the max when I bought it, but seemed sufficient and was a good price trade off), but I wouldn't mind swapping it out for a 1TB part. Actually, this bit might be possible to do myself; I should look into it.


Apple usually has a large premium to upgrade the specs. If you can change your own memory or SSD you could save a considerable amount. But now they are soldering everything.


That is only something people take issue with because it used to be modular. You don't see people making the same argument with phones, printers, iPads, TVs, Cars, Smartwatches etc.

I'm not saying soldering things down is ideal, but it's only a problem because people perceive an option that was there but now isn't as 'bad faith' or 'taking away' something. Imagine the riots if we used to be able to change the CPU in a laptop. (In theory we 'used' to be able to, but in practice that never worked well, because the microcode didn't fit in the firmware and the cooling solution only allows a small amount of TDP variance: popping in a faster CPU meant it overheated so quickly it was useless.)

Would it be cool if all devices were composable? Sure. Is it something users care about? Not really.

I'd like it if Apple found a way to allow swapping of components within the design-envelope constraints they created for themselves, but it also won't have an impact on what I'd buy.


> does anyone still buy a low-specced machine and upgrade it mid-lifecycle?

Yes, they do. Sometimes a technology like SSD itself gives a great boost in performance equivalent to a new costlier system.


I’ve certainly bumped up the RAM in the past.


No, this is not what I want, and I'm a power user. I want my machine to solve my problems and not get in my way when I am creating solutions. I do not want to be messing around with the internals; I have better things to do. I pay for it to just work, in an environment I hope is vaguely private.


> I want my machine to solve my problems and not get in my way when I am creating solutions.

That's literally why I switched to Linux in 2018, after a decade on OS X.


Linux does not solve my problems (as a power user with Linux). It does not work with the proprietary tools I need to do my job and I’ve broken apt way too many times. I’m not remotely interested in debugging this at this stage of my life.


Proprietary tools, I get. But "breaking" linux is something I don't buy.


How are you managing to break apt not once, but "way too many times?"

I haven't used Linux as a main driver in a decade but I do use it very frequently and I am intrigued by all the many ways people online seem to be able to break it. I have never once run into a problem with apt.

It really makes me wonder how much time people who complain about Linux have really put into learning it. I'm not saying you need to read a book on it, but after a few months of just using it, I would think one would be reasonably proficient?

I understand the claims of needing proprietary software. I, too, wish that the Office suite were a first-class citizen. That said, it absolutely baffles me the number of weird ways people have issues.


> How are you managing to break apt not once, but "way too many times?"

*Raises hand* My home server (running Debian) is currently in a weird state where a whole bunch of packages won't install or update because some library is stuck on a version they don't like, and the version they want isn't available. I'm pretty sure I haven't added any nonstandard repos to it; all it really runs is Docker, so anything odd comes in as a Docker container rather than a system package (in part because I don't trust Linux package managers not to break my system when installing non-critical software, from long experience of exactly that happening over and over, go figure). I'm not sure what I did to cause it, and since it's still running Docker fine, my incentive to spend an hour tracking down and fixing whatever-it-is has been zero. I'm just glad it's not a system I depend on for anything serious or need to really do anything with.

I spent a few years running Gentoo as my main machine in the 00s. On a laptop. Got suspend-to-disk working, even. I had to manually install GRUB on this same Debian home server when I built it last year, in a chroot to the newly-installed system, because the installer kept failing to do it (my process to fix it was very ordinary and encountered no problems, AFAIK, so I still don't know why the Debian installer couldn't do it; I've never seen it fail quite that way before, but it did so repeatedly, so I had to give up and open up the engine, as it were). So I'm not entirely clueless.

Nonetheless I don't really feel confident using a Linux desktop I can't snapshot for emergency rollbacks or rebuild in a few minutes from a script, because damned if weird problems don't crop up when you upgrade. Or don't upgrade. Or reboot, forgetting you'd installed (through the blessed package manager!) a new kernel, and now it can't read your encrypted root anymore, so it's unbootable and you get to spend some time figuring that out. And so on.

I feel no such anxiety on macOS. Not that it never breaks, but it's rare enough that I don't worry about it. FWIW I'm back on desktop Linux now due to Macs' insane prices, but I suspect I made a mistake, and the time I've already lost would have made spending an extra $500 on a significantly worse-spec'd machine well worth it. It's a little here, a little there, but it adds up.


FWIW, I actually manage my Linux laptop with Ansible.


I can't tell you about all of the times, but I will tell you about one time it broke and why. I did a clean Debian install. Then I added the Spotify repository. Then I apt-get installed Spotify. The script hung and then apt got into a weird loop. I had to add exit 0 to a script to fix it. This isn't what I want to do. I have over 10 years of experience with Linux. I've even done Linux From Scratch. I learned it already; that is not the problem. The problem is that Linux doesn't currently have the stability that satisfies my needs and life goals.


The latest T series ThinkPad has soldered RAM. Granted, it still has one non-soldered slot, but still. It feels like the world is closing in for those of us who value power and control.


As do most other high-end laptops...


There are a lot more options for high-end laptops that run Linux.


Apart from the Dell XPS13 what other options are comparable to the MacBook Pro?


I used to use macOS because it was a Unix that looked good and just worked, but both observations have become less and less true with time.

In that same time, Linux really caught up and diverged such that it offers features in its kernel, like cgroups and kernel namespaces, that macOS doesn't match, leaving me to feel as if the old FreeBSD roots of macOS are less of a perk and more of a kludge in 2020.
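As a small illustration of those kernel features, every Linux process exposes its namespace membership and cgroup assignment under /proc, so you can inspect them from any unprivileged shell (Linux-only sketch, assuming procfs is mounted):

```shell
# Each symlink here names a namespace (mnt, uts, pid, net, ...) that the
# current process belongs to; two processes sharing a namespace show the
# same inode number in the link target.
ls -l /proc/self/ns

# The cgroup(s) this process is assigned to; on a cgroup-v2 system this
# prints a single line like "0::/user.slice/...".
cat /proc/self/cgroup
```

Container tools like Docker build directly on these two primitives, which is part of why native Docker is Linux-only.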

Today, I run Linux on my MacBook because it is a modern Unix clone that looks good and works well with all of my development tools. Native Docker is a treat.


What flavor of Unix, if I may ask? Do you have driver issues? TIA! Cheers.


I'm using Kubuntu, and I haven't had any driver issues yet, but I'm not doing anything crazy with hardware or using proprietary drivers. It's been a pretty pleasant experience.


I've always used Apple products because they just work. UNIX compatibility has been great: I've run Docker, run virtual machines, given presentations, and I love the hardware. Most of the restrictions are tradeoffs for security, and the real problem is that I have tried having workflows on other systems and tried to switch off of it (I had Linux on my MacBook Pro 2017). Below are my most frustrating situations:

1. My MacBook Pro 2017 had some weird problems due to CPU voltage. Somehow they misplaced the laptop, but they eventually did a full top case replacement. It was weird when someone came out of the store in handcuffs while I was in the Apple store. With this laptop I wasn't disappointed by the keyboard or the Touch Bar and never experienced reliability issues. I sold it out of frustration, but I shouldn't have done it.

2. My AirPods have been replaced twice due to buzzing. This could have been me not cleaning them; I still experience buzzing sometimes, but cleaning them seems to work. I got both earbuds replaced twice.

3. iCloud got messed up on my MacBook 2016 when I installed a beta version of Catalina. That was my fault; I had to go through support, and they never got back to me about an engineer looking at my iCloud.

4. My MacBook Pro Retina (around 2012) had a zebra pattern on the monitor. I took it to the Apple store twice and they just gave me a new laptop.

With all of these issues over the years, Windows is still much more of a pain to even use. On my gaming computer, Wi-Fi can still take minutes to start. I upgraded to Wi-Fi 6 with an external adapter, and that may have fixed it.

My current laptop (MacBook Pro 16) has good thermal management now. Plugging into a monitor does increase the thermals to 60 °C. Sometimes I turn off Turbo Boost. Other than that, having a laptop with 64 GB of RAM and a 5500M graphics card is pretty nice, especially with a 2.3 GHz 8-core Intel Core i9. Somehow I was actually able to fine-tune a few iterations of GPT-2 on the CPU, but that was by mistake while I was cleaning data.

I like the Touch Bar because it lets me not change to using the trackpad sometimes due to RSI. I also have an external trackball mouse.

Wi-Fi and Bluetooth connect with no issue. The system is really responsive, even compared to my Windows machine. The screens have always been amazing and color-accurate. Emacs runs great too. Terminal.app keeps getting better. The UI looks like it's going to be cleaned up in macOS 11. Having ARM is going to be very exciting, and Metal is even more exciting and might be the only thing that will start to compete with NVIDIA.


Interesting that you mention Emacs. The Mac port of Emacs runs noticeably slower than on Linux, depending on the operation (magit especially is very slow). This is due to the much slower fork call on OS X, I believe. I found Emacs so painfully slow on a MBP 16" that I gave up on OS X completely. It might just be that you are used to it and it's good enough. For me, OS X feels like a laggy mess.


I have experienced Linux being more responsive than macOS, both on my gaming / deep learning computer and when I loaded it on my MacBook Pro 2017. The problem was that the rest of the OS didn't work or was a nightmare. On a MacBook Pro, Linux would never suspend correctly. On my gaming / deep learning computer, Ubuntu keeps updating the kernel and ruining my Linux install, to the point that I gave up multiple times. And by the time the Dell Developer Editions started being competitive, I would be SSHing into AWS or doing something completely different.

My workloads are getting to the point where a Colab Pro notebook sounds like a good idea, or setting up my gaming / deep learning computer so I can RDP into it from Windows 10. But by now I would probably even want to install Windows Server 2019, because the NVIDIA drivers would support the card, and when I run video games I wouldn't have to manually kill background services that subtract 30 FPS and make them unplayable on a Titan RTX.


Docker on macOS has been pretty bad for me, too...


This somewhat echoes some of Steve Jobs's own complaints about the Apple he was kicked out of: https://youtu.be/Gk-9Fd2mEnI?t=2078

Also note that at the time he was creating NeXT, a Unix company.


Thanks for linking the video. Interesting to listen to Jobs -- seems to me too that he argues against what Apple is doing nowadays.

I actually had a look 10 minutes ago at how much a 32 GB Mac costs nowadays, and it's about 2x more than a 32 GB PC. That's less than some years ago, when a Mac was more like 3x more expensive, if I remember correctly. Looks like the trend Jobs mentions in the video, about lower and lower ASPs. (ASP = average selling price, right?)


I second this. My 2015 MBPs are the best machines ever. The only things that could make them better are upgradeable RAM (16GB can be tight with VMs) and an NVIDIA GPU for faster DL prototyping offline. Apple Unix has a long history (A/UX), but Mac OS X was most def the best OS ever. As Apple moves away from this Unix power user's paradise, MS seems to be moving in, with tighter and tighter integration, now offering easy GPU passthrough to the Linux subsystem... Open-sourcing the rest of 10.14 would be really awesome of Apple. (One can only dream.)


I came to the Mac because copy/paste into a Windows CMD terminal was abysmal in 2010. I didn't know about PowerShell at the time, so I don't know if that would have made life nicer.

The other reason I switched was because all of my developer friends knew nothing about Windows, and getting stuck on a maybe Windows problem meant I had to fend for myself when I needed help the most.


I have a 13" MacBook Pro. It really is the pinnacle. I don't even know which OS X it is, but it just feels a lot more 'serious'.

I'm not a fan of the new OS X; I guess mobile is more prevalent, and maybe Apple is hoping the iPhone and iPad can be gateway drugs into MacBooks / desktops by being more of the same.


Maybe someone more knowledgeable about Apple history can correct me, but isn't MacBooks' attractiveness to developers, thanks to the Unix layer, a happy accident? Apple never planned for that, and I don't think they care about that demographic any more than before. I've used Macs for nearly 30 years and I find it useful that I can now also develop on them, but I can totally envision switching for work, just like in the pre-Intel era. Apple was always about end-users; don't fight it.


It all went downhill once the computer became a smartphone / tablet and the masses accepted that not being able to upgrade them is normal. And now we have Mac minis with soldered RAM, soldered CPUs and even soldered SSDs, and batteries that cannot be removed. The huge amount of waste this generates is so upsetting.

Wish more efforts like the PuzzlePhone in Europe would bear some fruit.


> And now that balance has tipped away from the power-user, the developers, to the end-user

What do you mean by that? (Personally, I can't think of anything I can't do now that I could do five or ten years ago on my Mac. Yet a lot of things are better.)


I personally think that the MacBook Pro 16 inch is the world’s best laptop, and look forward to see what the next generations can do.

But the whole point has been MacOS. I’m not sure what to think of the changes. If they stabilize the mess that was Catalina, then they’re doing the right things, even with the increased iOS-ification of the Mac. If they just slap more Windows 8 style touch gloss on deep and unresolvable system level issues, then Apple has reverted to all the reasons that Classic MacOS was so unstable.

That said, after 20 years with the Mac I recently added an Alienware Aurora R9 desktop to my life. Wanted to PC game again, and use VR. Also wanted to catch up with Windows, as I hadn’t really used it regularly for 10 years. Early Windows 10 did not impress me.

I have been surprisingly really happy with Windows 10 build 2004 and Windows Subsystem for Linux v2. To the point it’s my main dev box now. (With COVID there’s less need for a laptop!).

Some weirdness with cut and paste, but file system sharing and networking have mostly been smoothed out from what I remember. Docker runs well integrated, as does Kubernetes. Everything is blazing fast. The main thing I miss is Spotlight search, but there are third-party replacements for that.

I really can’t see myself moving back to Linux, which was my go to prior to the Mac from 1995-1999. I tend to follow the JWZ adage, Linux is only free if your time has no value. I use it as a server OS plenty, but don’t really need it on my desktop.


> Linux is only free if your time has no value

Hah, that's great. I'll have to tuck that one away.


That was probably a fair assessment a decade ago, but Linux seriously "just works" now. About a year ago I bought a Dell XPS 13, I've thrown Pop!_OS on there, and it worked out of the box, no driver downloads, no sneakernetting Wi-Fi drivers, no codec installs, no track-pad problems (two fingers to right click works!), no scaling issues, no 3rd party repos. It just works. (I'm sure FOSS folks may dislike the distribution of 3rd party binaries, but I don't use Linux because I'm a FOSS purist.)

Pop!_OS comes with a built-in tiling manager you can toggle on that completely changes the game for development. If you have never used a tiling manager, you owe it to yourself to do it at least once. The built in software manager is excellent, and is the one-stop-shop for installing VSCode, Android Studio, Steam, Chromium and IntelliJ all at once. No finding/adding repos, automatic updates, it just works.

Even gaming (on Steam and Lutris) is a breeze. It honestly requires zero tinkering (on Steam at least) to get 90% of my library to run (including AAA titles.) If you haven't used desktop Linux since 2010, it is a radically different experience. It honestly is easier to use and get what I want it to do than a MBP today.

(Disclaimer, I'm sure mileage varies depending on your environment, hardware, and chosen distro, just sharing my experience. I prefer to compare the XPS + Linux experience to MBP + macOS)


As someone who started using Linux instead of a Mac 2 years ago, this is definitely not the case. External monitors are always glitching out (not detected on resume), sleep didn't work at all, and adding hibernate support took days of research (due to whole-disk encryption). I once completely hosed my system by installing the latest NVIDIA drivers (Windows will automatically fall back to a built-in driver if third-party ones crash); it took me days to get it working again, and then it never worked the same. Tearing while scrolling in Firefox is a disaster, even after trying every X11 trick in the book.

The one thing I really, really like is using i3 to make every app run on a specific monitor and tiling all windows. When I restart my machine, every app I use launches on the right monitor. This customization has been awesome.
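For anyone curious, pinning workspaces to monitors and apps to workspaces in i3 is only a few lines of config; the output names and window classes below are illustrative (check yours with xrandr and xprop):

```
# ~/.config/i3/config: pin workspaces to physical outputs
workspace 1 output DP-1
workspace 2 output HDMI-1

# always send matching windows to a given workspace
assign [class="Firefox"] 1
assign [class="Slack"] 2

# start the apps on login so everything lands in place
exec firefox
exec slack
```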


Ironically, I first used it in 2010.

It's definitely gotten better over the past decade, and maybe the jab is a little unfair if you're using off-the-shelf Ubuntu on very vanilla hardware. But that's mostly what I do when I use Linux, and yet I've never had it "just work" without several asterisks.

Sure, the mouse and keyboard just work. The display and network adapter (usually) just work. Audio may or may not just work. A second display may not play very nicely. Setting up GRUB/dual-booting always takes some effort to get it running smoothly enough for daily use. Installing my dev tools, there's always one random things that goes wrong. Sometimes I'll install my browser the wrong way and it doesn't get an icon in my launcher; so I have to go look up how to configure that manually. I still have this weird issue where whenever I boot Linux and then boot back to Windows, my system clock (in Windows) is off by 6 hours; I gave up trying to fix it because it wasn't worth the effort.
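(For what it's worth, that 6-hour offset is almost certainly the classic dual-boot disagreement over the hardware clock: Linux assumes the RTC stores UTC, while Windows assumes local time. One fix on the Linux side, assuming a systemd distro with timedatectl, is to tell Linux to keep the RTC in local time:

```
# Keep the hardware clock in local time, matching Windows (run as root)
timedatectl set-local-rtc 1 --adjust-system-clock

# Verify: the output should now show "RTC in local TZ: yes"
timedatectl
```

The cleaner alternative is making Windows store UTC via the RealTimeIsUniversal registry value, but that means editing the registry.)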

It's a death by a thousand cuts. I'm sure officially-supported installations like those on the XPS 13 work just fine, but if you're installing it yourself you're going to get pulled off track from what you were trying to do with your computer, and into forums trying to find someone else who had your exact issue. Some people revel in this experience, and I've been known to take a sense of accomplishment from it before, but most of the time I don't want to be messing with my computer, I want to be using it.


I feel like Linux people have got so used to sorting out those thousand cuts, they don't really think about it as a problem anymore, just part of initial setup. But as soon as Joe Bloggs installs it and his wifi doesn't work or his graphics card has a minor glitch it's a complete dealbreaker.

And I agree with your assessment that I don't think I've ever installed even Ubuntu and had everything work perfectly out of the box. In contrast, Windows has plenty of things you'll want to change configuration on, but usually nothing that's outright broken.


That's true, and macOS has plenty of things I reconfigure whenever I set up a fresh install too, so I guess you could argue that takes up just as much time. I would say it's a do-once-and-forget thing, but then I'm sure there are people who would say the same about Linux issues.


I've never had an experience with dual booting that wasn't absolutely miserable.

> It's definitely gotten better over the past decade, and maybe the jab is a little unfair if you're using off-the-shelf Ubuntu on very vanilla hardware

> I'm sure officially-supported installations like those on the XPS 13 work just fine

That's fair, my use cases pretty much line up with that. I do view the XPS with Pop to be a MBP killer. I did install it myself (a clean install, no dual boot; getting it onto a USB drive is harder than literally the rest of the setup combined) and that wasn't bad at all. Dell's Project Sputnik ensures Linux compatibility with the XPS line, and Pop!_OS noticed and applied firmware updates for my laptop. I compare the experience to choosing between a MBP and a Hackintosh. XPS + Linux seems to work better for dev work, better for software compatibility, far better for gaming, and is arguably a better experience overall compared to the MBP.

Now, I do also run Pop!_OS on a desktop that I put together (Ryzen 5 & GTX 1060, no dual booting) and have had a similarly flawless experience. Pop!_OS installed my NVidia driver on the initial install. Better experience for dev work than Windows (haven't given WSL a shot yet, though), but a small number of gaming titles don't work. Nowhere near the experience of setting up a Hackintosh, and compatible with far more software than Catalina.


Personally I couldn't care less about either one for gaming; I would never spend money on gaming hardware and then put anything other than Windows on it.

I would certainly do Linux over a Hackintosh any day, because for me the entire point of Apple-anything is having the complete package that's been designed and ensured to work well together. At least with Linux they know you're trying to assemble something yourself and there's a large community of forum-goers giving you a good chance of finding all those little fixes you're going to need.

Personally I use Windows for gaming and a Mac for dev and my Linux partition for that rare dev use-case that just doesn't quite work on the Mac (happens maybe once a year).


> I still have this weird issue where whenever I boot Linux and then boot back to Windows, my system clock (in Windows) is off by 6 hours; I gave up trying to fix it because it wasn't worth the effort.

I fixed this pretty quickly after install. The reason for the issue is that your Linux machine uses UTC to store time on the hardware clock whereas Windows uses the local timezone. The solution is to either let Windows use UTC or let Linux use local time.

In my case I went with Windows on UTC based on this link: https://wiki.archlinux.org/index.php/System_time#UTC_in_Wind...

Hope that helps
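For anyone else hitting this, a sketch of the two common fixes (pick one side only; the registry value name is the one from the Arch wiki page above):

```shell
# Option A - make Linux keep local time on the hardware clock,
# matching what Windows assumes by default:
timedatectl set-local-rtc 1 --adjust-system-clock

# Option B - make Windows treat the hardware clock as UTC instead.
# Run on the Windows side, in an elevated Command Prompt:
reg add "HKLM\SYSTEM\CurrentControlSet\Control\TimeZoneInformation" /v RealTimeIsUniversal /t REG_DWORD /d 1 /f
```

Option B is generally the recommended one, since a local-time hardware clock can confuse Linux around DST changes.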


I bought my 2018 XPS 13 with a Windows 10 license, but tried installing Dell's official Ubuntu image on it after a few days, and couldn't live with it. Trackpad acceleration was awkward, fingerprint reader unlock didn't work, and font scaling was too small (along with other UI elements). I wouldn't be surprised if the trackpad and font scaling issues were fixable, but I no longer have the interest in fixing those issues. So while it's very usable, Linux is still far from parity with MacOS or Windows 10 on the "it just works" front, and I don't expect it to reach parity in the foreseeable future.


Pop!_OS is far more polished than Ubuntu nowadays, scaling is better on Hi-DPI displays and theming is far more consistent (largely due to the fact that their vastly superior store defaults to either DEB or flatpak.)

I will concede that the fingerprint reader doesn't work, and I haven't spent any time trying to make it work, I don't have much patience for tinkering with stuff like that these days.


That is what I have been seeing since I used Slackware 2.0, followed by Yggdrasil, Debian, Mandrake, Red Hat, Ubuntu, ....

It always goes as follows:

Complain: I am trying distribution X, having problems A, B, C

Standard Reply: Why are you still using X? Distribution Y is much better now.

When Slackware 2.0 came out is left as an exercise for the reader.


It only just works for laptops that are slightly older, and with Linux friendly hardware.

For example: I got a Lenovo X1 Extreme near its release date and Linux was a bit of a pain to get working... I actually bricked my first laptop due to a simple BIOS config change, which was actually Lenovo's fault - but I don't have to change those settings to install Windows.

Don't get me wrong. I love Unix/Linux. But saying it just works is misleading.


> but Linux seriously "just works" now

I agree it’s far better than it used to be but until I can install programs without having to crack open terminal I feel like it still has a ways to go.

The latest LTS version of the largest distro doesn’t even let you drag and drop to the desktop.


You don't need to crack open a terminal.

System76 maintains a list of trusted repos that allows you to install all of your software from the Pop!_Shop. All of the examples I gave are real, you can install VSCode, IntelliJ, Android Studio, Steam, and much more from official sources and official repos with one click on the Pop!_Shop. It is essentially a pretty, managed app store that handles updates, installs, and repos for you.


And if you want to install a game or other program not in the limited Pop!_Shop?


The Pop!_Shop isn't like the Linux app stores of yesteryear (namely Ubuntu's graveyard that is Software Center.) It has tons of software in it. Should you need software that isn't in the Shop (which almost never happens for me) you can install stuff the same way you would on Windows: download the DEB/flatpakref, double-click, and the Pop!_Shop will take over from there.

For non-Steam games, I use Lutris, which is probably the least polished part of getting software installed, but I still don't have to crack open a terminal. If that doesn't immediately work, I don't bother with it anymore. I can get 90%+ of my titles running this way.


>And the Pop!_Shop will take over from there.

From installing my vpn app to games from gog I find I have to go to the terminal for a good percentage of my installations.

Even with simple utilities though, I find that the app store is FULL of garbage. Why are there multiple VLC entries in the store published by people other than VideoLAN?

There is a Gog galaxy client published by a random developer that is just a wine wrapper of the windows client (and also doesn't work). This is not a good experience and I don't think the average user is ready for that sort of thing.

> I can get 90%+ of my titles running this way.

I'll agree that a majority can come from the app store (for me its probably more like 70%) but it needs to be way higher imo to be ready for the average user.

Along similar lines, I am currently having an issue with my Ubuntu 20 install where I have to change the audio device every time I boot my computer; there doesn't seem to be a way to set the default device - again - without popping into the terminal. It is frustrating to deal with all these idiosyncrasies, which is why I prefer to work on macOS when I can.
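The terminal workaround, for what it's worth (this assumes PulseAudio, Ubuntu's default; the sink name below is just an example and will differ per machine):

```shell
# List the available sinks (output devices) and their names:
pactl list short sinks

# Pin one as the default for the current session
# (substitute your own sink name from the list above):
pactl set-default-sink alsa_output.pci-0000_00_1f.3.analog-stereo

# To make it stick across reboots, append the same directive to
# /etc/pulse/default.pa:
#   set-default-sink alsa_output.pci-0000_00_1f.3.analog-stereo
```

Which rather proves the point: none of this is discoverable from the GUI.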


> That was probably a fair assessment a decade ago, but Linux seriously "just works" now.

My anecdotal experience tells me otherwise. During confinement, I tried to install openSUSE on an old laptop I had around for the kids to do their classes online. The laptop kept losing wifi credentials. Able, but not wanting, to troubleshoot, I proceeded to install Ubuntu. That was all good until I went into the GUI to enable auto-login. Now I cannot log in. Not big issues - I have the skills to fix them, but not the time.

edit: typos


> two fingers to right click works!

I really live by trying not to be snarky but this is testing my principles given how nice a mac trackpad is... (used Linux for a long time and now I've been on macos for almost the same amount of years, I love Linux and appreciate all the work people put into it, to be clear)


It may support more, I just don't know what else to do with it (I usually just use my MX Master 3.)

Tell me a gesture you use or rely on and I'll give it a shot and reply with my results. I'm ignorant on the cool stuff one can do with a Mac track pad. I thought the two fingers to right-click was pretty neat.


My AMD GPU on my travel netbook begs to differ with Linux seriously "just works" now.


Counterpoint: what some might considered time wasted, others could consider time learning, and when a task gets repetitive, you can automate it. That's what made me fall in love with Linux and BSD.


You can't automate hunting through forums and applying the incantations found therein, and most of the things you learn by doing so (in this context) are only useful for solving a very narrow subset of adjacent problems; they aren't usually general principles you can get a lot of mileage out of later.


That's the BS that Apple and Microsoft sell, because today's Linux can meet the general needs of most users nowadays, for free.


It's true if you want to run GUI apps more complex than xeyes (anything that draws text).


> Linux is only free if your time has no value.

Linux has cost me less time than either Windows or OS X. But that's me; everyone's personal workload and usage pattern is different. My experience with OS X was that any time I tried to color outside the lines, the system worked against me, and that cost me more time. (My experience with Windows has generally been that it just sucks.) I haven't had that experience with Linux.


For me, I can spend an entire day tweaking every bit of config in my environment, from my editors/IDEs all the way to my desktop environment/window manager (which of those would I even want to use? That's a week - at least - each to evaluate and twiddle with configuration). Hell, I could spend months evaluating which distribution I want to use.

And sure, I'm doing productive work in there as well, but probably less than I do on my Mac. I still spend too much time tweaking my dev environment and switching between vim and emacs and IDEs, but that's all I really can choose between, which saves time to do actual productive work.


You can also just use a given distribution as-is, the major ones are usable out of the box just like macOS. Just because you _can_ spend a week tweaking every possible setting doesn't mean you have to.


I don't think that (most) Linux users see "free" as the main reason for using Linux.

Personally if Linux had a price tag, and Windows were free I would still be a Linux user.


That is true. If I put it into monetary terms, I would gladly pay 50 EUR per month to have an operating system that did the following:

- UNIX command line integration and generally following UNIX philosophy

- Docker integration working out of the box

- no spying on me

- fractional scaling support

- Nvidia drivers support (for gaming)

Running Windows 10, Ubuntu 20.04, Mac OS Mojave and Mac OS Catalina on different computers because of reasons and all of them lack something.


I'd pay $250 or even $1000 a month just to use Arch. Bear in mind I actually do: I donate $1200-plus to "Linux" or open source in general, per month, because I like it so much.

Haphazardly donating roughly $1200 per month, and many times more, but that's about it. It's a donation system, I know this: put your money where your heart is, which could be anywhere. That's been my plan, in retrospect, for 14 years now. Donate, people - support whatever you want, whether it's Candy Crush for Windows 8 or some other project, wherever you want to see either change or stability.


I'd rather have the Nvidia support for ML. But more importantly, the OS would need to support a multi-touch trackpad and the Adobe suite.


Sure they do, otherwise selling desktop software to Linux users would actually be profitable.


That's a chicken and egg problem and has nothing to do with Linux users being cheap. It has more to do with nobody actually using Linux on the desktop. Why would you devote a ton of dev time to supporting a platform with "dozens of us!"?


Selling desktop software to Linux users is only unprofitable because it often sucks and when it doesn't the market share isn't there. Have you ever seen anyone actually pay for Windows or Mac OS? No? right.


Plenty of times actually, that is why they sell Windows OEM CDs and macOS used to be sold on stores as well.


They basically sold recovery keys.

As far as Windows, 95% of the people I know pirated Windows. The only people that buy windows are enterprises.


Windows has been sold in boxes since the early 90s - 30 years by now - and while piracy is or was rampant in some countries, not every country was a piracy dream.


The kind of people that need to buy Windows (i.e., those that build their own computers) are the kind of people that either pirate it or buy illegally sourced volume keys for $3.


Keep telling that to yourself.


macOS is free and has been for a long time, though obviously there are issues with getting it running on non-Apple hardware.

Most people do purchase Windows, it's just that the license is included in their hardware purchase.


I don't count that as purchasing Windows. In the same way the cost of Linux support for my computer is included in the hardware purchase because AMD paid engineers to write drivers.

I'd count consciously giving money for a license as paying for the license, because the argument is about the attitude.


I didn't know macOS license allowed you to install it on anything other than a Mac.


> Linux is only free if your time has no value.

I don't get this. I have more issues with Windows 10 than I do Ubuntu. Linux used to be fiddly, but not anymore. Windows 10 forgets stuff. You have bluetooth? Nope, maybe if you reboot you'll get it back. Hibernate your laptop and take it to the coffee shop. Black screen. Have to reboot.

I don't have issues like this with Linux.

Now, some of that could just be Lenovo. But Linux on Lenovo is fine.


These sound like hardware driver issues with your Windows 10, these are not normal experiences.


> Linux is only free if your time has no value. I use it as a server OS plenty, but don’t really need it on my desktop.

To me this means "I don't know how to get Ubuntu working". Even Arch Linux requires less maintenance for me than my 2013 Macbook Pro did. As soon as you get the setup done correctly you never have an issue - unless you can't set it up correctly.

Windows is nice when everything works, and if you don't do anything too advanced for your setup and never have an issue it's good. But fixing issues is much easier in Linux than Windows as long as you know what you are doing.


Your argument doesn't really prove anything, you can just flip it around:

Linux is nice when everything works, and if you don't do anything too advanced for your setup and never have an issue it's good. But fixing issues is much easier in Windows than Linux as long as you know what you are doing.

Anything will be easier to do "as long as you know what you are doing."


But it's not. Windows is not an open operating system; many of its issues simply cannot be fixed, and the only solution is to basically reinstall Windows. The error codes are often opaque, and much of the time there is no documentation as to what's happening behind the scenes. Whereas on Linux I've never run across an issue whose official solution was a reinstall.


Have you ever read Windows Internals? Or MSDN? It has more documentation than any Linux system, ever. Even Microsoft's documentation of Apple's SDKs is better than Apple's. Windows Internals books are extremely extremely detailed.

I've run Windows since 3.11 and have never ever had to reinstall Windows to fix things. Perhaps you should take a look at Event Viewer in more detail to see why things are broken.


I worked as a Windows sysadmin. I'm familiar with Windows Internals and Sysinternals. But when the bootloader decides it's going to go into automatic repair right now, or when the indexer decides that it needs to use 100% of your HDD for the next year, there's nothing you can do about it but reinstall. When some Windows services fail exclusively if your locale is set to French, there's also no explanation, and the log gives you no clue about what's going wrong. I could go on and on. With every update the already incomplete Internals documentation slowly gets more obsolete. And that's not even taking into account the clusterfuck it becomes when you need to modify anything.

There's a reason why when you need something to be stable you don't use Windows.


Perhaps I have not had these problems because my locale is UK English.

I've used Windows for decades and it has been very stable for me for the past 7+ years.


> As soon as you get the setup over with correctly you never have an issue, unless you can't set it up correctly.

Eh... I don't know about that. I still have problems with GPU drivers when updating Ubuntu (and I don't think that's significantly different for other distros)


I personally don't have such issues due to rolling distros, but FWIW I had the same problems when upgrading Windows as far as graphics drivers. Though yes, on every OS a major system upgrade might require you to repeat part of your set-up. The only OS where I haven't had such issues are Linux rolling distros.


> To me this means "I don't know how to get Ubuntu working".

Yes, I'm sure Jamie Zawinski didn't understand Unix systems well enough to get them running.


He said that over twenty years ago. Things changed. A lot.


I spent $800 on a new Win 10 laptop with an entry-level NVIDIA GPU plus an extra memory stick. Installing the memory took about 10 minutes and was easy enough for my 11-year-old to do herself. It can stream and video conference at once without even getting warm. It can play mid-tier 3D games pretty comfortably. The MBP is so hot it's painful. 15 minutes of Zoom or Minecraft is too much. I only need it because I need to support Safari/iOS and Apple doesn't allow cross-platform emulators or VMs.


Nothing can be upgraded on Macs - SSD, memory. Maybe Mac users do not care, but the most annoying thing about this is that others are copying Apple. Even the legendary ThinkPads - some of them are soldered and non-upgradeable. Even the screws on Macs are non-standard. We have to use dongles for everything.

For this alone, it is better to get windows hardware. They are pretty good these days and cost a lot less. Plus there are lots of interesting things going on - second screens for example.


What laptop did you get?


Acer Nitro 5


I feel like I'm posting the same comment over and over again over the past few weeks. :)

I've been using Linux on laptops since the late 90s, on a variety of Thinkpads: T42, T500, T440p, X1E gen 2.

Yes, 10 or 15 years ago you had to fiddle a lot, I mostly even compiled my own customized Kernels.

This was mostly due to the hardware manufacturers not releasing specs for anything, so you had to fiddle around that.

At some point I stopped compiling my own Kernels, it wasn't needed anymore, but there was still a bunch of software I wanted and compiled from source. Then that too became less and less, since what I need is now typically shipped with the distributions (I've been using Fedora).

Then it required fiddling with the video drivers to support dual video cards (Intel and NVidia). No longer needed. Now you just add the rpmfusion repos and install the drivers.

Everything just works. (New network printers or other devices are detected, the right drivers found, etc, etc.)

Cutting-edge hardware (perhaps laptops released within the last half year or so) sometimes still gives problems.

With Lenovo actually shipping Linux with its new laptops even that will no longer be a problem - at least with these laptops.

In addition, if you want to, you can know what's going on if you use Linux.

In fact I find myself getting annoyed with both MacOS and Windows (when I look at problems on my friends' machines). Everything is so opaque.


Counterpoint: I've spent a total of an hour (ish) futzing around with my Debian install, for my home/hobby laptop. I feel like my work (OSX) laptop gives me much more to do, but that could be because I'm expecting more of it. My Linux workstation(s) require basically 0 maintenance.


Depends on your environment a lot. My last colleague who tried Debian literally never got it to work with the corporate WiFi.


> The main thing I miss is Spotlight search, but there are third party replacements for that.

Such as?

I've been looking for a decent launcher for Windows but all the options are pretty mediocre compared to Spotlight (or Alfred which is what I use).


Microsoft is adding one to PowerToys: https://github.com/microsoft/powertoys


Oh this one looks nice. Thanks!


Launchy[1] used to be good, but hasn't been updated in a decade. I'm not sure if it works in latest Windows 10.

[1] https://www.launchy.net


Yeah... both Launchy and Wox are the ones that I tried.


Everything from voidtools.com is a great search tool.


Yeah but it's only for files. No web search or other features.


What I think a lot of us programmers miss is the experience of the end-users who are likely to buy a Mac.

The fact of the matter is, 99.9% of people who buy a Mac are not going to know what subpixel hinting is. Or care too much that they can't delete Chess.app. To them, the Mac is nicer looking and easier to use than ever.

Will iOS devs care? Not at all - developing for the Mac has become easier with Catalyst. It's easy and cheap to get a new audience for your apps. Will web devs care? Not while homebrew and all of the packages in it still work fine. I, as a web dev, have everything working great on the Big Sur beta already.

And as for some saying that MacOS is becoming more iOS like: Yes, it's been happening since Mac OS X Lion with the fullscreen Launchpad. Nothing has changed.


And I’ll continue to buy them.

Why? Because I have better things to do with my time than fiddle with my OS, and because it’s still Unix like without having an awful user experience.

And my employers will continue to buy macs for me for largely similar reasons. My last colleague who tried to use Linux literally never got their laptop to work with the WiFi authentication, so they had to wire in.


Exactly this. Let’s assume you have a computer because it helps you in getting tasks done.

In Windows, antivirus bloatware and ads in your start menu stand between you and that goal. Third party software is often of very disputable quality and design.

In Linux, it’s always a surprise what thing/dependency is suddenly broken on this boot, and what arcane commands I need to fix it.

On macOS generally I boot my machine, click on the app I need to perform my task, the app will be of sufficient to very good quality, I do my task and I’m done.


Been running Manjaro Linux for 2 years now on 2 desktop workstations and 1 laptop. No fiddling needed. Everything just works. (I had to install one utility called libinput-gestures to use multi-touch gestures on the laptop.)

Matter of fact, my life is easier with this setup than it ever was with macOS or Windows since every piece of software I need is in the package manager including official Chrome, Slack, Thunderbird, Zoom, Beyond Compare, VS Code, Node.js, .NET Core, Docker, MySQL Workbench, Azure Data Studio and many, many, many more...

I avoided Linux for years because every time I tried, it would break but Manjaro/XFCE never broke once on me. I can't recommend it enough. So glad I'm no longer a slave to Apple or Microsoft.


Buying a laptop to run Manjaro is a minefield. I have had some Just Work. Those were generally older laptops without dedicated graphics. I have had newer laptops fail to properly switch between graphics, rip through my battery, not reconnect to wifi after waking up (forcing me to shut down and boot up every time I wanted it to sleep), have middling-to-poor touchpad drivers, and not have the webcam work out of the box. For those who get lucky, great. For those that want to fix that stuff - which is sometimes easy, sometimes hard, other times nearly impossible without writing drivers yourself - good for them too. I don't want any of that.


That's a good point.

I think the Arch wiki is a good resource for this. My laptop is listed here - https://wiki.archlinux.org/index.php/Acer_Aspire_E5-575

I have the exceptionally cheap Acer E5-575G model (quad-core i5) with RAM upgraded to 32 GB and a 500 GB Samsung EVO SSD. The touchpad is an Elantech.

If anyone knows another good site, please let me know - I'll be shopping for an upgrade early next year.


I tried manjaro after living on debian based distros for most of my life. I somehow ended up in dependency hell after only a month.


Meh, I just load up manjaro with XFCE and call it a day. Not much fiddling involved, very little about the user experience that I don't like.

Only thing I ever really tweaked was running a more up-to-date kernel for an old soundcard I used to use. Which took a couple of clicks.

Sorry for your colleague who had trouble with wifi, but. /shrug


> And as for some saying that MacOS is becoming more iOS like: Yes, it's been happening since Mac OS X Lion with the fullscreen Launchpad. Nothing has changed.

I'd say a lot has changed, because everything is a matter of degrees. As soon as I realized Launchpad was a piece of crap, I put an Applications folder stack in my dock a la the Leopard default, and forgot that Launchpad exists.

Can't really ignore it anymore...


My solution is to just use Spotlight Search to launch everything. My hands are on the keyboard much more than a mouse or touchpad, and it's a good habit for when I switch between the laptop (where the touchpad is more convenient) and a desktop setup where they're more physically separated. Hunting for things on a screen with a cursor is tedious and annoying to me, when I can always just type it. But maybe that's a habit from growing up with MSDOS and (later) Linux and FreeBSD starting in college.


Spotlight search is one of the only patterns I really took away from using a mac for work.

I even mapped Super (win key) + Space to the application finder in XFCE. It's just muscle memory at this point, just like the vim keybindings.

And, really, any excuse to use the mouse less is a win.
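If anyone wants to replicate the binding, here's a sketch (this assumes a stock XFCE setup; `--collapsed` is what gives the app finder its small, Spotlight-ish type-to-search prompt):

```shell
# Create a custom keyboard shortcut: Super+Space launches the
# XFCE application finder in its compact form.
xfconf-query -c xfce4-keyboard-shortcuts \
  -p '/commands/custom/<Super>space' \
  -n -t string -s 'xfce4-appfinder --collapsed'
```

You can do the same thing from the Keyboard settings GUI, but the command is easier to drop into a dotfiles setup script.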


You can also just right-click the Launchpad icon on the Dock.


I've been using a MacBook for ~5 years full-time for all sorts of development (including frontend JavaScript, backend APIs, big data involving hadoop, etc.) and I've never used launchpad. I don't see why it can't be ignored? I don't even have it in my menu bar.


Sorry, I wasn't clear.

Launchpad can absolutely still be ignored†, but the overall iOS-ization of macOS cannot be. It is ever-present, in every part of the system.

---

† If it even still exists in macos 11.


It looked like Launchpad was not in the Big Sur install. It wasn’t in the docks for the demos. I suspect that it has been deprecated if not fully retired.

I’m also seeing UI elements from MacOS coming to iPads in the sidebar components they are adding.


The median user who sees their computer as a means to an end is using Windows and saves $300+ doing so and ends up with a computer that stays cool even if it clogs up with dust after a while.


Or they end up like my parents, who had the screen hinge break off on a $400 HP laptop just from daily opening and closing after only a year and a half.

Or you end up like my dad who nearly couldn't open a Bid Document because Windows randomly decided to deregister Word's file extensions and spit out an "Unspecified Error" whenever he tried to open a file.

The point is, you are very eager to ignore the flaws of the other side.


That's a very silly anecdote when so many people in the past 2 years had the keyboards on their $2000+ Macbook Pros break within a year.

And it's an HP, we all know they are terrible.


Yes, but at least Apple paid for the repair. HP had no mercy for the hinge.

Tell me, outside of a Thinkpad, are there really any good Windows brands? Microsoft's Surface division just opened a repair program because the glass on their screens was spontaneously cracking.


When my son burned his Dell's CPU and mobo while carelessly overclocking it, Dell sent a tech guy to my house the next day with the parts and replaced it on-site for free. No questions asked.

My limited time on Earth is worth a lot and Dell seems to acknowledge that.


The problems I have faced with my 2018 MacBook Pro that make me dread working with it (I have to due my job) are not development issues but mostly end-user ones: https://ivanca.tumblr.com/post/615979862803562496/15-flaws-o...


> I know you can Opt+Cmd+V but thats not the point, which is extremely poor UI/UX.

Cutting files is generally bad UX, and Windows (which has the function) handles it rather awkwardly. To be honest, many of the issues you point out have been tenets of macOS since the very first release of 10.0, and in many cases earlier.


> Cutting files is generally bad UX

Says who? Apple? Where was it established to be "bad ux"? 30 years using it on Windows and have yet to find anything awkward about cutting files.


Because on Windows the functionality is semantically inconsistent.

When you cut text you are removing it from one place and pasting it into another. As soon as you choose the action the text disappears.

When you cut a file, nothing happens to the file unless you paste it somewhere - and for good reason: it would be insanity for a file to disappear in a similar manner. And since the majority of uses for this function are to simply move a file (something most modern filesystems can handle quite gracefully), it would also be insanity to copy the file from the filesystem into RAM and then back into the filesystem.

The macOS paradigm makes more literal sense - you copy the file and then you can choose to either paste it somewhere or move it to another location.


Text in buffers is disposable (e.g. cut/pasted text) but files are not - losing one can even be a tragedy without a backup - so I don't see much logic in treating them the same, especially since the mental overhead becomes a burden on the user ("oh, I'm pasting a file, so I must remember to use the special command to cut a file") instead of putting the burden on the machine, especially since reducing mental overhead is arguably the sole purpose of the machine.

And the focus of that complaint was that the Edit menu does not clarify in any way that the "Cut" command there is only meant for text; it should be renamed "Cut Text" or something the user can easily understand to be intended only for text.


Overheating is still a concern for me.


I recently purchased a new 13" pro, upgraded to the top processor. I installed Bootcamp and played hours of The Witcher 3 (on low settings, of course, but still). There were no overheating issues. Fan was going full blast the whole time, but it kept up just fine. Hopefully my little anecdote instills a little confidence. The new 13" is as close to a perfect machine to me as I have ever owned. It's compact, portable, light but a completely full powered functional 100% dev workhorse (with a great keyboard AND a physical escape key to boot). I couldn't be happier. Kudos to Apple for this one.


> what the company perceives as a weakness: the paucity of apps in the Mac App Store.

I think that the issue with the Mac App store, is that it imposes restrictions on a once-free-for-all development environment. I know many app developers that have abandoned it, after giving it a damn fair shake.

Mac developers just got used to being able to wander all over the house, poking their noses into whatever they liked.

iOS, on the other hand, started off restricted. iOS developers have never known the freedom that Mac developers have had.

To be completely fair to Apple, the iOS model is the one that needs to be followed, if security and privacy are the goals, and I think that a big part of MacOS 11 will be to add restrictions.

I suspect that non-App Store apps for the Mac may be seeing the twilight.

I am not looking forward to the restrictions on the Mac, but it has been quite some time since I've stepped outside the walled garden, myself.


On the other hand, what a modern desktop is sorely lacking right now is protecting users and developers from themselves. Whenever I run "npm install" I have to close my eyes and ignore the chills down my spine thinking about how a thousand random npmjs.org account holders just ran their postinstall scripts on my computer with full access to ~/Documents/. The traditional multi-user Unix permission system is no longer appropriate for the modern watering-hole malware world. Nobody cares if you break root and hack system binaries; they can always be restored. The gold is in the user's content. We need a system where regular users don't need to worry about defending against software developers who don't carry the user's best interests in mind, and we need a system where even the cowboy I-know-what-I'm-doing developer doesn't point a gun at their foot, while still providing the freedom to tinker and share.


scripts on my computer with full access to ~/Documents/ ... Nobody cares if you break root and hack system binaries

This is exactly it. Unix is a multi-user system trying to cope in a single-user world and it's not working. The Unix threat model is students on a university mainframe trying to break into each other's accounts. It's totally unprepared for a world of single-user devices where non-technical people want to run untrusted software.


Maybe what we need is a granular permissions model for all executables, including those in the terminal. Instead of the binary choice between "this code is allowed to rewrite my operating system if it wants to" and "this code can 'only' read and write all of my user files", what if we had to explicitly give binaries access to specific directories? Network access? The desktop environment? Drawing to the screen anywhere outside of their designated windows?

IMO this is the real innovation of iOS: not simply locking everything down, but locking everything down by default and putting the user in control of where to release those valves. Catalina introduced a fairly ham-fisted version of this that doesn't work super well in practice, but I think they've got the right idea.

Improvement ideas for the Catalina model:

- A terminal interface. Right now if you run a program that tries to access Documents you get a pop-up. If you ran it from the terminal, why not get a terminal prompt, a la sudo?

- A central place to manage all of the permissions you've allowed and disallowed (this may already exist and I just don't know about it)

- No special status for folders like "Documents" and "Downloads"; a generalized way of specifying any arbitrary directory (recursive and non-recursive, maybe with patterns). For regular users there can be a simplified UI overlaid on top, while power-users could manage these directly.

- Separate read and write permissions for files and directories

Imagine if you ran all those NPM install scripts and they only had write permissions to their subdirectory of node_modules, only had read permissions to your path, and would dynamically prompt you if they had a reason to request something beyond that.

EDIT: What if you could also designate certain programs as trusted "authorities" that can then automatically delegate these permissions to other programs? Running with the example, you could allow NPM to dole out narrow permissions to postinstall scripts as needed, coming back up to prompt you in unusual cases. This would hugely improve the user-experience issue and these "authorities" wouldn't really have any more power than they already do today.
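The "authority" idea above could be sketched roughly like this. This is a toy userspace model with made-up names, purely to illustrate the delegation flow; a real implementation would enforce the checks in the kernel or a sandbox layer, not in Python:

```python
# Toy model of the "trusted authority" idea: an authority holds broad
# permissions and hands out narrower subsets to child programs.
# All names here are hypothetical; nothing is actually enforced.

class Authority:
    def __init__(self, name, grants):
        self.name = name
        self.grants = set(grants)  # path prefixes this authority controls

    def delegate(self, child, requested):
        # A child may only receive paths that fall under the authority's
        # own grants; anything else is silently dropped (or would prompt).
        allowed = {p for p in requested
                   if any(p.startswith(g) for g in self.grants)}
        return Sandbox(child, allowed)

class Sandbox:
    def __init__(self, name, allowed):
        self.name = name
        self.allowed = allowed

    def check_write(self, path):
        if not any(path.startswith(p) for p in self.allowed):
            raise PermissionError(
                f"{self.name}: write to {path} denied (would prompt the user)")
        return True

# npm, acting as the authority over node_modules, delegates only the
# package's own subdirectory to its postinstall script.
npm = Authority("npm", ["./node_modules"])
script = npm.delegate("leftpad-postinstall", ["./node_modules/leftpad"])

print(script.check_write("./node_modules/leftpad/build.log"))  # inside grant
try:
    script.check_write("./Documents/taxes.xlsx")               # outside grant
except PermissionError as e:
    print("blocked:", e)
```

The key property is that the sandbox can never hold more than the authority itself was granted, so a compromised postinstall script is bounded by npm's own footprint.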


IMO there are two problems.

The first is that no one actually wants to manage that kind of access. Especially not your every day computer user.

The second, related, problem is that if every app is asking you to review 20 permissions, people are going to stop reading the list and blindly accept. Or at least the people who need the guardrails the most will blindly accept.

For all the crazy granular permissions I just don't see a way for it to work other than a source like Apple doing the review, saying that this app is trying to read your documents for no reason and so it's rejected. But now you're back to all apps going through the app store sandbox.


A highly-granular system can be organized under a simplified, streamlined, user-friendly interface. I fully believe that this could be made to work well.


And the end result would be Catalina ;-)


> Imagine if you ran all those NPM install scripts and they only had write permissions to their subdirectory of node_modules, only had read permissions to your path, and would dynamically prompt you if they had a reason to request something beyond that

This sounds entirely possible to do within the existing macOS sandboxing framework.


That does exist: we have ACLs, containers (no, not docker, but the filesystem views and isolation in macOS - and to some extent windows), chroot etc. Beyond that there are systems like AppArmor and SELinux on the Linux side of things.

Problems with all of those: it becomes too much overhead and the additional moving parts create additional places for things to break (and they do break - even in a preconfigured distro).


And this is what Apple has always been best at: taking technologies that already can work and making them into a product that does work, without hassle, for most people.


Yup. My thoughts, exactly.

The iPhone is a success, because developers are not its target audience.


Vagrant gives this for dev environments.


pledge(2) from OpenBSD pretty much solves this problem.


Seems like docker solves all those issues


Docker is not meant to be used for security.


i don't understand your point about random npm account holders running their scripts on your computer?


When you npm install a single package like vue or react, it drags in a dependency tree that sources a thousand random npmjs packages. Each of these packages may specify a postinstall script that executes at install time. Time and time again, some random leftpad-like dependency might lurk five directories deep and spring a surprise script on you, either because the author is malicious, or they handed over (or sold out) to someone else, or perhaps most likely their npmjs account was simply compromised.
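For anyone unfamiliar with the mechanism: the hook is just an entry in the package's manifest, and npm runs it with the installing user's full privileges. A minimal example (the package and script names here are made up):

```json
{
  "name": "innocent-looking-util",
  "version": "1.0.3",
  "scripts": {
    "postinstall": "node collect.js"
  }
}
```

Passing `--ignore-scripts` to `npm install` (or setting `ignore-scripts=true` in your `.npmrc`) disables these hooks, at the cost of breaking packages that legitimately need a post-install build step.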


gotcha. oof. :/


Meh. I still can’t find most apps for my iPhone that I had on my Mac. Sure, there are lots of “webpages but in Objective-C with notifications” and the occasional interesting project that disappears after 2 years because the developer can’t work full time to keep up with API churn, but lots of arguably critical stuff is just missing.

There’s no IRC client with working notifications that doesn’t cost money.

I haven’t found a working gvim port (occasionally broken ones show up.)

I haven’t found an ssh client I like (has key support, has sftp support, has a vte that doesn’t suck) (and no, ish doesn’t count. I shouldn’t have to reinstall my stuff after having my files deleted with no warning every couple weeks.) This is pretty bad for an OS that’s been around for over a decade. I’m pretty sure both palm OS and windows had decent ssh clients after 10 years.

I haven’t found a replacement for wish (which is super useful for flinging together a quick GUI app when someone needs one)

There’s nothing that really replaces AppleScript (shortcuts is more like the goofy graphical automation thing OS X had that uses the same API)

There’s no Arduino IDE

There’s no Syncthing port, or anything that could even replace it.

No BitTorrent client, no useful mail clients (none of them can download mail in the background), no good alternative podcast/RSS clients (see mail), no cron.

And of course lots of little tiny (and some not so tiny) shell utilities that are trivially easy to find on OS X are at best reimplemented badly in awkward apps and more often totally unavailable.

The Mac App Store is emptier than the iOS store, but both are nothing compared to the OS X software library.


> I shouldn’t have to reinstall my stuff after having my files deleted with no warning every couple weeks.

iSH shouldn't really be doing this. Do you have any more details?


You get kicked out of the beta pretty fast if you don’t use it. This means you absolutely can’t rely on it.


"I suspect that non-App Store apps for the Mac may be seeing the twilight."

What about MacOS 11 makes you think this? Literally almost nothing of consequence has changed since Catalina in OS security. Gatekeeper hasn't gotten tighter, you can still disable SIP, homebrew works great.


The move towards "iOSing" the OS.

It won't be GateKeeper. It will be policies. They won't make it obvious at all, because they know how unpopular it will be.

Let's see what the future brings. I'm not thrilled with it (for example, I have one app that I've written, that uses ffmpeg[0], and ffmpeg no likee sandbox).

[0] https://github.com/RiftValleySoftware/RVS_MediaServer


People have been saying that MacOS was going to be App Store only eventually for over a decade.


Mac App Store was announced in October 2010 (and released Jan 2011) so that's not yet possible, hahaha ;)


But once again, the only "iOS-ing" that has happened so far is the theme. Yes, the theme looks more iOS-like, but that's about it. There are almost no technical differences whatsoever underneath since Catalina and almost no security changes.


I know Apple fairly well. They have...control issues. They want as much control as they can get, over every aspect of the experience.

It can get stifling, but I actually support it, and it's one of the reasons that I have always had faith in the company.

The HIG (Human Interface Group) used to be nicknamed "The Blue Meanies" in the 1980s.


IIRC, the Blue Meanies were simply the System 7 core developers, not a HIG-specific group.


I recall them as the HIG group. I first encountered it at MacHack, in 1987, or so. There were a couple of HIG engineers, there, and they referred to themselves that way.

The reason was, because they insisted that app developers follow HIG guidelines.


Are you sure? The story I've always heard is that the Blue (Mac OS) vs. Pink (Taligent) team split originated in a planning meeting around 1988, and the "Blue Meanies" were core members of the Blue team.


Maybe. I just remember people referring to them as "Blue Meanies" because of the characters in Yellow Submarine[0]. They were the ones that hated music and creativity, and folks didn't like HIG telling them they couldn't have fuchsia scrollbars in their new, five-thousand-dollar color Macs.

But that said, I think you are correct[1]. At the last MacHack I attended (maybe 1989?), Dean Yu hadn't yet been hired by Apple, and was infamous for writing system-crashing INITs ("Kill Dean's Inits!"). I think he wrote the "Energizer Bunny" INIT that could have the EB running around every computer on the network.

[0] https://en.wikipedia.org/wiki/Blue_Meanies_(Yellow_Submarine...

[1] https://en.wikipedia.org/wiki/Blue_Meanies_(Apple_Computer)


Because they're focusing on the arch change. And the big change to end users for the arch change? iOS apps run unmodified in catalyst.


Apple is the chef, and the Mac user is the frog...

Apple just started the burner on low, changing the UI only. The next phase will be to add just a little more restriction over time until one day it is iOS, and most users (the ones that did not leave) will not even notice it happened.

That is how the game is played, slow methodical change.


"The move towards "iOSing" the OS."

People have been claiming this for literally a decade+ now. The death of macOS is perpetually imminent.


It's remarkable how overwhelmingly negative the comments on HN are to everything Apple does. The top comment is always some "this is the final straw... I'm leaving the platform" screed (if you look back, those people have usually made that declaration a dozen times), and everyone who claims anything to the contrary is downvoted into transparency.

HN needs to go out and take a walk and get some perspective. What a bunch of sour assholes.


I can't speak for anyone else, but I have been writing Apple software since 1986. I have stuck with them through a lot of abuse.

They just turned 1.5 trillion. I don't think they are losing a whole lot of sleep over it.


> The death of macOS is perpetually imminent.

The same for PowerPC processors, Intel processors, and now ARM processors.

Remember when the first rumors of Apple switching to ARM surfaced? I think it was about five years ago.

To be fair, Apple has been on its "deathbed" a hundred times, in the last three decades. I have been programming Apple since 1986. I have seen quite a bit.


To be fair, they nearly did go under until Microsoft stepped in with that loan (for entirely non-altruistic reasons on their part, of course).


This is true. I think they also bought stock. I wonder if MS hung onto any of it? It would have been a pretty good investment.


And we have made more strides towards that in the past decade than in the years prior.

SIP and custom delivery of software are increasingly difficult.


Did you miss the big announcement that ARM MacBooks will run iOS apps unmodified?

Do you think you'll be able to run a non-Apple-signed kernel on aarch64 MacBooks?


I guess we'll find out in a couple of weeks, once the DTKs ship.


Maybe? The Intel DTKs had a very different boot process with a normal BIOS instead of EFI.


I am really curious what will happen to applications like Steam and similar. Will the Mac suddenly find itself in the same restrictions as the iOS App store? Will there even be a separate Mac App Store?

Steam allows users to apply activation keys paid for outside the client; normally these are sold directly by the game developer, but they can also be the result of a giveaway. Can that happen in the current App Store?

The promises of better performance and all are nice, but I am waiting to see if the walls come with it, and whether Apple will pass the savings on to the consumer or claim the increased performance as the reason prices stay the same or get higher.


>the iOS model is the one that needs to be followed, if security and privacy are the goals

How so? Why should a centralized store be the zenith of these goals? Even if you assume the only way to be protected is for Apple to anoint binaries as good and pure, you would still gain privacy through decentralized distribution and secure signatures even if Apple controls the whitelist.


I don’t think that part of their comment was referring to the App Store, but rather to the way that apps on iOS are sandboxed preventing them from accessing system wide settings and files outside of their sandbox. That’s the way I read that part of their comment anyway.


Yup.


"I suspect that non-App Store apps for the Mac may be seeing the twilight."

Obv someone doesn't shop on Steam.


I do, and being a Mac user on Steam sucks. Epic is even worse.

Apple doesn't seem to be particularly worried about it.

For all the hate dumped on Mac as being a "toy" computer, it is a terrible gaming computer. I use it to do work.


I tried so hard to be a gamer on macOS, and I kept telling myself it would eventually get better. But I had to give up. You can't even turn off mouse acceleration, which is a bare-level requirement for RTS games like StarCraft.

There's something about an OS that won't let me decide how my mouse works that just irked me to no end. For a while there was some sketchy kernel extension you could pay $20 for to kill the acceleration, but even that stopped working.


Valve doesn't seem particularly worried about it either. Steam's macOS support has always felt like an afterthought -- they've put more effort into supporting the minuscule number of users on Linux (through efforts like Proton) than macOS.


From my understanding it's because Valve really wants to directly interact with Apple as infrequently as possible.


Oh yeah, I can't even run Steam anymore (because they require 10.11 or newer) so I can't play any of the hundreds of games that WOULD run on my machine, because the downloader (Steam) can't run... :)


My x86 Macbook Pro has been just extraordinarily capable, dual-booting Windows and using an eGPU has been great for gaming. Maybe some of this will survive through virtualization if we can assign eGPUs to virtual machines.


Considering that 80% of Mac games got axed with the removal of 32-bit support, and approximately 100% of the remaining ones with the ARM transition, I'd say they don't use Steam at Apple either.


x86 games will run under emulation on ARM hardware.


Not well, of course. Just passably.


That's not something we know at this point. All that we know is that Apple has demonstrated a game running acceptably well on their current development hardware. Performance is likely to improve in production versions of the OS, and/or with production hardware.


I used the App Store to install a couple apps once, and I didn't see the value add. It offers nothing (that I care about) and comes with trade-offs I don't need, such as untimely updates.


> I suspect that non-App Store apps for the Mac may be seeing the twilight.

I doubt it. There's literally no reason at all to transition macOS to be like iPadOS... since iPadOS already exists.

If they wanted to do that they could literally simply discontinue macOS and only sell devices with iPadOS. Boom, done. Of course, there are a million reasons to not do that, which is why macOS will continue to exist.


I agree.

I just can't see Adobe, Autodesk, etc, distributing their desktop apps via the Mac App Store.


I think Apple wants to achieve the “philosopher’s stone” of software development: write once, deploy everywhere. Securely and robustly.

I’m not sure that making money (in this case) is their biggest goal. I think having as much control as possible, over every aspect of the process is more what I suspect they want. That’s fairly typical “strong brand” stuff, and Apple has one of the strongest brands in the world.

I’m not sure they can achieve it, but everything they are doing (SwiftUI, etc.) points that way. Since they have full control of the hardware, SDKs and operating system, as well as a huge pile of money, they may actually pull it off.

There’s no reason to think they wouldn’t establish “special deals” with Adobe and Microsoft, to allow their software to be distributed through the store at a reduced (or possibly even subsidized) rate, as it would definitely be to Apple’s advantage to have the software in the store. Big corporations do that all the time. In a previous job, Intel gave us tens of thousands of dollars worth of help in optimization, for similar reasons.

They seem to have done a number of really difficult things. I’ve learned not to write them off. They are fairly good at playing the long game.


Isn't it more the web model?

I mean, the web is older, it's locked down, and browsers get more and more features.


I wouldn't call the Web "locked down." Maybe to JS coders, but it's a real leaky bucket.


I think the useful similarity between the web and iOS is that neither a rogue app nor a rogue website can access your documents without asking you.


Snow Leopard (version 10.6) was the peak of OS X. Everything since then has been downhill. https://en.wikipedia.org/wiki/Mac_OS_X_Snow_Leopard


I would not go that far. Security changes. iOS integration with AirDrop and Messages. Laptop trackpad gestures.

But note what I'm highlighting for the most part: mobile features. They've added good stuff for laptops, and for interfacing with iOS devices. But for desktops, and laptops living on a desk? Not so much. If you're using a full keyboard and a mouse, sans trackpad, I can't think of a lot of real UI enhancements since 10.6.

(Which makes total sense, given they sell so, so many more laptops than desktops. But still. I get it.)


Snow Leopard was a development bump for all the things that were broken/unoptimized in the switch from PPC to x86. I wouldn't call it the peak of OS X.

In hindsight it was the last true developer release. Most of the restrictions weren't in place. As they moved to polish macOS for customer use, they also buttoned it up.


As a younger hacker, who has never been into the Apple ecosystem and doesn't know much about Steve Jobs, the video presentation of the funeral for MacOS 9 sort of blew me away. It was so funny and theatrical, but got the message of Apple's new direction across so clearly. You can even see people in the front rows standing up and gawking, trying desperately to figure out what was going on. What a fascinating production.


Got a good link?


It's the first video in the original post. https://videopress.com/v/cTvJLHm8


I’ve been unhappy with macOS and Apple for the past few years and in January switched back to developing on a windows PC. I could not be happier with that decision and with the “roadmap” Apple provided yesterday, I expect it will be a long time before I go back to macOS, if ever.


I don't understand how you can get anything done without fighting windows nonsense proprietary tools.


I've had to help people in my classes fight Apple nonsense proprietary tools because no one could understand why Apple won't let you open a file using your own code. If there wasn't anyone there that understood the issue (and that was almost the case) they'd have to use Windows.

Trust me, macOS is much worse on that front than Windows is. At least on Windows if you have issues with your OneDrive account folders won't start disappearing from your Desktop.


What does that mean? Windows ships with an entire subsystem for Linux; you can literally install any Linux distro you want and it will run natively inside Windows, with access to each other's file systems, ports, etc., with basically no performance penalty (that I see in WSL2).

It is more Linux than OS X is by orders of magnitude.

And at their last conference they previewed native X Windows support in Windows. So you can literally start gimp in Windows from a zsh shell inside Arch. Couple that with their new faux tiling window manager in Powertoys (FancyZones), VSCode, native docker and kubernetes support, NFS support, their new terminal, etc - developing in Windows from a unix perspective is way way way way better than OS X in modern times.

What proprietary tools are you talking about?


You're complaining about proprietary tools in advocacy of Apple? Developing on Windows is a breeze. Unless you're specifically talking about Microsoft's dev stack (ie Visual Studio).


Not sure what you are trying to say, windows like macOS is not perfect but with patience I was able to replicate my existing development workflows and improve some.

Which “nonsense” tools are you taking about?


Uhhh.. what? Like, can you give some examples?


Windows 10 with WSL on premium hardware like Surface is more Mac than Mac at this point


As someone still on Mojave and really disappointed with the Mac strategy from Mavericks onwards, I've resigned myself to the idea that my next laptop will not be an MBP. I was enamoured with the original OS X look and feel and its unixy underpinnings; the former is now dead for good (the Big Sur icons are 2001-Gnome-level horrendous, and everything is flatter than flat), and the latter dies a little bit more with each release.

I have to admit that I dread going back to the utter boredom of the Microsoft ecosystem, though. I would dearly love a Linux laptop that doesn't make me deaf with fan noise, can last 3 hours of Firefox and Pycharm on a battery charge, and doesn't freak out when waking from sleep. Does such a thing exist yet?


I have the System76 Oryx Pro (last year's model). I'm moving from my 13" MacBook Pro (2015), which I really like, but it was time for a new machine (should have bought more RAM/SSD space).

It's pretty great. Compared to the MacBook it's much faster and has a ton of ports, and everything just works (that's why you pay a premium for a Linux notebook). Build quality isn't as good, but not bad.

It's not perfect. When running on the Nvidia GPU the fan does spin up (it's not quiet, but not that bad). The battery life is terrible (probably about 3 hours), but it runs JetBrains software really well. I've only had one problem waking from sleep in the year I've owned it. Even Steam runs well on this thing (the video card is a beast).

It's the year of Linux on the desktop (again... maybe).


Just go dell or thinkpad, you'll get your wish list no problem.

I'm on a ThinkPad X1 Carbon and Pop!_OS. I would recommend Fedora, though. I'm too lazy to switch, but everything just works on Fedora better than on any other distro I tried, and I tried all of them.


One of the big turn-offs from Apple for me was when they gave me no choice but to upgrade an iPad 2 I had loaded with cool music apps. The iPad was one more instrument in my studio; it could trigger MIDI or run good apps. I can't remember why I had to upgrade, but I waited as long as I could. After the upgrade the iPad became unresponsive; they slowed down the processor, but that did not come out until they got sued. I gave the iPad to my mother and never bought one again. I still use a Mac Pro in the studio. I run El Capitan with newish Pro Tools and Ableton, and have another partition with 10.6 with legacy Pro Tools and a ton of nice plugins. But when I need to do office work or programming I boot into Linux. For music it was, and is, amazing to be able to run without trouble; it was always a hassle to update, as music software was at least a year behind. Apple was cool; now it is just another big corporation.


As someone who likes to save money by postponing buying a new phone, this has been a problem for me as well. My iPhone 4S became very slow with iOS 9 and had a quickly draining battery. I eventually stopped using it when the speaker broke as I could no longer use it as a phone.


Apple never got sued for slowing down the iPad.


Nor did they force any iPad upgrades...sigh


The issue was that I could not sync my iPad with iTunes anymore, I believe, so I could not, for example, load music onto my iPad. So they did not "force" me, you are right, but I did not get another choice because they disabled some core functionality. I'd appreciate it if you elaborated on the meaning of "sigh"; as you might have noticed, English is not my native language. And please do tell about your experience on the subject.


My original iPod Touch from 2007 could still sync to my computer as of about a year ago.

Also as of about a year ago, I could still download “the last compatible version” of apps like Hulu, Netflix, and Crackle on my first gen iPad and they worked. Music I bought from iTunes was still playable and still synced automatically to it. I didn’t try to sync from my computer.


Whether or not Apple "forces" users to update, on current iOS they will prompt users to update on a daily basis and there is no way to turn off those notifications. There is also no way to prevent the automatic downloading of updates unless you block them at a firewall level or have limited free storage space on your device.



I took the time to read this article and also check my phone settings. I do have automatic updates disabled but have been getting daily update notifications.

From the article:

"Turning off Automatic Updates and deleting the latest update from your Storage will prevent you from getting the alert windows in the future, although some users have reported that iOS re-downloads the update when the iPhone is connected to Wi-Fi.

If the 'Install Now/Remind Later' alert re-appears, check in your Storage & iCloud Usage and delete the update again.

You'll still get alerts from the App Store when Apple releases a new version. But these only come along every few months; not daily."


True. "Fined", and it was iPhones: https://www.bbc.com/news/technology-51413724 but the exact same thing happened to my iPad. After the upgrade it was slow.


Again, that's not true either. It was specifically about iPhones and specifically about older batteries. Apple still doesn't have that type of battery management for iPads.

The article you cited specifically lists the devices affected.


I believe Apple has faced multiple class action lawsuits over this type of issue.

Here is the pretty broad result of one:

"Friday’s settlement covers U.S. owners of the iPhone 6, 6 Plus, 6s, 6s Plus, 7, 7Plus or SE that ran the iOS 10.2.1 or later operating system. It also covers U.S. owners of the iPhone 7 and 7 Plus that ran iOS 11.2 or later before Dec. 21, 2017.

Consumers contended that their phones’ performance suffered after they installed Apple software updates. They said this misled them into believing their phones were near the end of their lifecycles, requiring replacements or new batteries."

[0] https://www.reuters.com/article/us-apple-iphones-settlement/...


So in response to a post saying that “Apple never got sued for slowing down iPads”, you post another link where they got sued for slowing down a number of iPhones.


Yes, because your original argument was with someone who said their iPad was slower after a software update. This is evidence that hundreds of people had iOS devices slow down noticeably after an update and Apple decided to pay them to remove the case from court.

As a software developer I am aware of how adding features to an operating system can indeed slow it down on existing hardware or raise the minimum specs. Many of those issues would affect both iPads and iPhones.

I understand that you're trying to 'debunk' something here but the honest truth is that some people have had bad experiences with Apple products.


No. “Words Mean Things”. The original poster said Apple got sued for slowing down iPads. I said one sentence.

Apple never got sued for slowing down the iPad.

He claimed that Apple got sued; that wasn’t the case. I never said that iPads didn’t slow down because of iOS updates. My first gen iPad became very unstable because of iOS 5. I didn’t buy a new iPad until 2017.

So how are two citations about Apple getting sued for slowing down iPhones relevant?

Not only that, the lawsuits were because Apple slowed down phones when batteries got old. The slowdowns on the phones when Apple was sued were caused by the battery not the OS.

The iPad has never had the power management features that the iPhone has for old batteries.


> Apple never got sued for slowing down the iPad.

I agree.

> So how are two citations about Apple getting sued for slowing down iPhones relevant?

This is proof that iOS devices like the iPad and the iPhone can be slower after an Apple software update. As the case was settled without Apple admitting any wrongdoing, we can only hypothesize about the performance issue.

I agree that in the case of iPhones battery management is a factor in performance throttling. I also suggest that added features in later iOS updates will slow the device down. Some of the most obvious examples of this are when they have improved their UI and rendering or added additional services that run in the background.

If you really believe that an iPad 2 running iOS 4.3 would have the same performance when it is running iOS 9.3 then I would recommend you buy it as your next tablet.


The citations are proof of nothing related to OS upgrades. It’s true that iOS devices do slow down with newer operating systems, but the articles that were cited about lawsuits were about slowdowns on iPhones caused by battery management. The lawsuits were about Apple not informing users that they could have fixed the specific slowdown by buying new batteries.

I’m not arguing that older devices don’t get slower with newer OSes. I’m arguing that the citations have nothing to do with OS upgrades, and they have nothing to do with iPads.


I am glad you strive for accuracy in posts. I did assume that the issue with slowing down was the same as the one they were fined for. I guess I will never know. The point of my posts is conveying a bad user experience, which you seem to have had yourself too, and posted several posts later. So the "sued" part is not relevant anymore. We can agree on the slowing down/bad experience aspect :)


And I’m not trying to be nitpicky.

If you had a 6s and noticed a slowdown, the answer could be to buy a new battery and still be able to run iOS 14 decently when it comes out.

If you had an iPad that was slow, the answer would never be “buy a new battery”.


OS X (the name) has been gone for a while; it was replaced by macOS. The numbering change now is inconsequential.


Most people probably don't even refer to the macOS versions by number but by codename.


Sure, but version names are hard to order: was Catalina before or after X? Who will remember? (Maybe if it at least was like Ubuntu, with names following alphabetical order.)


Nit: Big Sur is a marketing name, not a codename. Apple has their own names for these things ;)


Yeah, like look at iPad OS... it's now iPadOS 14 :|


Yes, they have spread the theme across all of their OSs but they are still distinct members of a family. It looks like they brought some of the best new features from iPad to the Mac and I see they are bringing sidebars from the Mac to iPad. Makes sense.


macOS, not MacOS.


Thanks, I always forget that Apple uses reverse capitalization.


In my country, the old school papers use initial capital letter, then lowercase, regardless of name.

I.e., IBM becomes Ibm, IKEA would be Ikea, and macOS would become Macos.

The reasoning is that companies shouldn’t be able to dictate flow and attention in an article by manipulating capitalization in brand names.

(Just an anecdote, obviously not applicable to comments here)
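For what it's worth, the rule described above is simple enough to sketch in shell. (This is just a rough illustration of that newspaper convention; the `house_style` function name is made up for the example.)

```shell
# A rough sketch of the old-school paper rule described above: first
# letter capitalized, the rest lowercased, regardless of the brand's
# own styling. ("house_style" is a made-up name for this example.)
house_style() {
  first=$(printf '%s' "$1" | cut -c1 | tr '[:lower:]' '[:upper:]')
  rest=$(printf '%s' "$1" | cut -c2- | tr '[:upper:]' '[:lower:]')
  printf '%s%s\n' "$first" "$rest"
}

house_style "IBM"    # Ibm
house_style "IKEA"   # Ikea
house_style "macOS"  # Macos
```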


At least they didn't call it "MAC", haha


Interestingly in iOS 14 the feature that lets you change your MAC address labels it as your "Wi-Fi Address".


OS X was a marketing name and has been dead for a while now.

Besides the CPU arch change and a slightly refined UI, it is pretty much the same.


Homebrew packages work fine, you can still disable SIP, Gatekeeper hasn't gotten tighter. Everything is almost exactly the same as Catalina. There's really no substance whatsoever to the claims that MacOS 11 is locking the OS down. You could make this claim for Catalina, but Big Sur isn't changing anything of consequence at all.


Maybe people are only just now starting to see Apple's intentions of locking down their system. I remember it being a bad omen a few releases back when i found i could no longer (easily) delete the Chess.app.


I see some smart decisions here then. They don't have to admit to the issues with the changes they made last time. Apple would probably argue that developers haven't moved quickly enough in the right direction while those outside would argue differently.

It'll be interesting to see what things look like later, since this architecture transition will weed out some problems Apple sees, but in a different way. On the other side of the transition they can make further changes to lock down the OS with fewer issues. This has to be Apple's ultimate goal because it's what serves their needs best from many perspectives. I wouldn't suggest it serves the needs of many devs though.


What many UNIX fans that abandoned their Linux and BSD distributions for OS X fail to understand is that Apple culture was never about CLI, shell scripts and daemons.

Mac OS was never like that. A/UX was a failed attempt to enter the UNIX market while keeping the Mac OS spirit, and for NeXT it was a way to embrace the UNIX workstation market and extend it with NeXTSTEP Objective-C frameworks.

Steve's opinion regarding UNIX is well known.

And then there was BeOS in the race, and the failed Copland project, both without any major UNIX story.


Apple marketed directly to UNIX and Linux power users with their famous advertisement:

https://julxrp.files.wordpress.com/2015/08/apple_unix_ad.jpg

In this era, they were all about marketing to UNIX people.


When the ship is sinking anything goes; they also had sessions at CERN back in the day, but that never played a central role in Apple culture, nor among its users for the most part.


I would argue that their UNIX focus was a major contributor to the popularity of the (post-OPENSTEP) Mac.

It attracted UNIX nerds, which attracted other nerds and all those nerds spread the word to family, friends, and coworkers and helped spur adoption in a significant way.

That's how I seem to remember it, anyway.


That ad was well after the rebound.


I mainly program web apps on my MacBook Pro 2017, and upgrading to Big Sur hasn't changed almost any of this for me. I can still install brew and do my work. The UI is a bit rough around the edges in some areas, but I actually don't really mind, and some of the new features are pretty nice.


That family tree is wrong, right?

1. Unix was first, then BSD came from Unix... then Mach was built quite a bit later.

2. NextStep was then built by Jobs after he was let go from Apple by Mr. Soda Man.

3. Jobs comes back to apple, brings NextStep- reads the tea leaves with OS7-9 and sets up NextStep as the successor with OSX. Apple branded NextStep becomes OSX.

Is my chronology correct?


The family tree is slightly off, but it’s not completely chronological. A 40 year old man might still marry a 25 year old woman and have children with her and the tree would put them on the same line.

Your chronology is correct though.

I think it was an aesthetic choice to make the grandparents symmetrical, but the only relation the Macintosh has to any other prior system is the Lisa; you could say the Lisa was influenced by the Xerox Alto. The Macintosh Project was founded by Jef Raskin, and his vision for the Macintosh was entirely different before Jobs essentially pushed him out. A lot of the Lisa influence comes via Bill Atkinson, who designed both QuickDraw and the Lisa user interface and worked on both Lisa and Macintosh.

The tree discounts both Copland and Rhapsody as well. While Copland never shipped, much of the tech developed for it did, and while Rhapsody never shipped, Mac OS X Server 1.0 did.

The only relation the Apple II has to any of this is they’re computers.


I read somewhere that nobody had ever identified a technical person who had worked on Rhapsody. Now that's vapourware.


It was realized more or less in Mac OS X Server 1.0, which despite its name is more Rhapsody than Mac OS X, and enough people bought it to complain when Apple quickly moved ahead with Mac OS X as we know it and didn’t offer any kind of discount on Server 10.0.

EDIT: Not the best quality, but here’s a demo I found on YouTube for those interested: https://youtu.be/HrHNub_vJtA

Basically imagine NeXTSTEP/OPENSTEP, but with a Platinum skin, the window server is still a Display PostScript interpreter, there is a Classic Environment in the Blue Box, and instead of the Finder you have NeXTSTEP’s Workspace Manager and its still App Kit and Foundation Kit. Carbon isn’t a thing yet, or it’s in early development and hasn’t been announced yet.

I don’t recall if the kernel had replaced Driver Kit with IO Kit yet though.


I believe your chronology is correct, but I don't see how it differs from the (primarily hierarchal) family tree given in the article.


Excepting a minor detail - NextStep brought Jobs to Apple rather than vice versa. Gil Amelio chose NextStep rather than Apple's own Copland as the successor to System 7, bought NeXT, and never knew what hit him.


I think your chronology is correct, yes. The family tree in the article is not a chronology, I think, so your chronology augments it with useful information.


I think so.

OSX was bought, NT was bought, Android was bought.

Which big OS vendor builds their own OS? Maybe Google with Fuchsia?


NT wasn't bought.


Maybe it's shorthand for Dave Cutler plus what Wikipedia said: "While creating Windows NT, Microsoft developers rewrote VMS in C. Although they added the Win32 API, NTFS file system, GUI, and backwards compatibility with DOS, OS/2, and Win16, DEC engineers almost immediately noticed the two operating systems' internal similarities; parts of VAX/VMS Internals and Data Structures, published by Digital Press, accurately describe Windows NT internals using VMS terms. Instead of a lawsuit, Microsoft agreed to pay DEC $65–100 million, help market VMS, train Digital personnel on Windows NT, and continue Windows NT support for DEC Alpha"

Also https://www.itprotoday.com/compute-engines/windows-nt-and-vm...


Having used VMS in the 80s at my university, I can say it was a solid foundation for the proper OS that NT became.


It's a charitable gloss. One can argue that it was bought since DEC accepted compensation in the form of a ‘strategic alliance’ rather than go to court.


OS X was bought from a co-founder in exile to revive the hardware platform championed by said co-founder, so the situation is a fair bit more complicated than most acquisitions.


First OS X was big cats, then it was mountains.

Very surprised they didn't switch to a new naming theme!

I guess an architecture change was as good a reason as any to revert back to a normal, sane versioning system where the first number increments like most programs. But it's just weird how "unannounced" or unacknowledged it is.


Not mountains, but California landmarks.


Big Cats was way better IMO. I was surprised they switched to landmarks, and I'm more surprised that they haven't changed to something else entirely yet.


> I guess an architecture change was as good a reason as any to revert back to a normal, sane versioning system where the first number increments like most programs. But it's just weird how "unannounced" or unacknowledged it is.

There are many ways to version software.

They changed the major version from 9 to 10 when they went from classic Mac OS to the NeXTSTEP-based Mac OS X.

Arguably they could have bumped the major version a couple of times between them and now, for example when they dropped support for the PPC architecture back in 10.6 Snow Leopard. Likewise, when they deprecated Carbon in 10.8 Mountain Lion (alternatively, with the release of macOS 10.15 Catalina when Carbon was officially discontinued and removed entirely). And probably they could have had other major version bumps along the way too.

Nonetheless, the fact that they didn’t hasn’t really seemed that strange to me for a long while, given like I said, that there are a lot of ways to version software.

Choosing Big Sur as the name for the first major version bump in a long while seems like no coincidence to me.

From the Wikipedia article about Big Sur, California:

> It is frequently praised for its dramatic scenery. Big Sur has been called the "longest and most scenic stretch of undeveloped coastline in the contiguous United States," a "national treasure that demands extraordinary procedures to protect it from development" and "one of the most beautiful coastlines anywhere in the world, an isolated stretch of road, mythic in reputation."

> The stunning views, redwood forests, hiking, beaches, and other recreational opportunities have made Big Sur a popular destination for about 7 million people who live within a day's drive and visitors from across the world. It is among the top 35 tourist destinations world-wide.

> The region receives about the same number of visitors as Yosemite National Park [...]

Seems to me that the release of this version of macOS is a big deal to Apple, and I would agree that it is. With the introduction of their upcoming computers that will be running ARM CPUs, I can see how they would want to signify that this is something really worth noting both in the version number and in the name.

Personally I am really excited to see these new upcoming computers, and to use them and the macOS 11 Big Sur.


The naming scheme is "places in California" not mountains.


This blog post is 90% prelude/buildup toward these two claims:

* MacOS X was a very good OS for developers,

* MacOS X 11 is not.

The first claim is discussed in the prelude, but irrelevant for the conclusion. Three arguments are given for the second claim:

* macOS11 runs on ARM

* macOS11 UI supports touch

* macOS11 is not open and good [like macOSX] but easy and good enough

That's it. This leaves me quite disappointed.

macOS 11 running on ARM, RISC-V, x86, or PowerPC has little to do with whether it is good for developers. I have an Intel Mac and do a substantial amount of ARM development on it: qemu works just fine. I expect qemu to work just as well on ARM for developing for x86.

macOS11 supporting touch does not say anything either about whether it is good for developers. Windows UI supports touch, and that's quite good, even for developers. I also have a Lenovo Yoga, and I can fold it, and sketch on it, which is great for web meetings. All thanks to having a good touch screen and good OS touch support. So IMO this change could make macOS11 better for devs than macOS10.

I haven't heard any information suggesting that macOS11 is less open and less good than macOS10. macOS10 was never great. It has quite old BSD utilities that are often subtly incompatible with their Linux or modern BSD variants for apparently no reason. macOS11 does not make this better or worse. macOS10 requires pretty much every process to grant permissions for everything nowadays, which makes it a bit of a pain to do something in the terminal without granting all permissions to the terminal, but that's only one click away, already "prepared" by apple, and you only have to do this once, etc. Each release of macOS 10 has made it more painful to use for development in some ways, and also better in other ways (zsh as default shell, C++17, C++ modules, dropping CUDA support, no native Vulkan support...). It was never great, it was always kind of a mixed bag.
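One concrete, well-known instance of those "subtly incompatible" BSD utilities, for anyone who hasn't hit it: in-place editing with sed differs between the BSD sed shipped with macOS and GNU sed on Linux. A small sketch (the file paths are just for illustration):

```shell
# BSD sed (macOS) requires a backup-suffix argument after -i, even if
# empty; GNU sed (Linux) treats the suffix as optional and attached:
#
#   BSD/macOS:  sed -i '' 's/foo/bar/' file.txt
#   GNU/Linux:  sed -i 's/foo/bar/' file.txt
#
# A portable workaround is to skip -i and redirect to a new file:
printf 'foo\n' > /tmp/sed_demo.txt
sed 's/foo/bar/' /tmp/sed_demo.txt > /tmp/sed_demo.out
cat /tmp/sed_demo.out   # bar
```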


So Apple changed their OS numbering system. Is there any more significance to this event?


From reading the article, it is more about the migration of macOS toward an iOS model in philosophy and, occasionally, system design.


why exactly did they change the version number? i dont get it


I don't think Apple has explicitly said anything, but the fact that this version will support a new CPU architecture seems like a pretty good reason.


Not really; OS X supported PowerPC and Intel (32- and 64-bit) at various stages in its lifetime.


macOS 11 will support Intel and ARM.


A CPU architecture change wasn't enough for the 10.x line to bump the version number. And it was a much bigger deal then than it is now.


Wow, I totally missed that version number in the About box. Very interesting! I guess 10.16 will be the last version of "OS X" as we knew it (even though it was already called macOS). The end of an era, indeed.


10.15, you mean?


Yeah maybe! Some other commenters were saying the intel version is still called 10.16 but it depends where you look in the OS... My guess is it will be updated everywhere to be called 11.0 , so I guess yeah 10.15 would be the last of the "OS X" generation (even if it was renamed macOS already).


I have fond memories of when the first OS X CDs arrived at our college computer lab, and we installed them on a PowerMac G4. The aqua UI felt so futuristic, and it matched the hardware!


Before leaving on my morning hike, I cleaned up some storage, getting 50G free on my MacBook, and downloaded the 9G installer while I was hiking. It turns out that I was still about 12G short in required disk space.

I mention this to save other people from lost effort.

A question: lots of my 256G of storage is taken up by remote mobile storage in ~/Library. Is there an easy way to tell macOS to temporarily not keep local copies, and get them back later from iCloud?


Did you happen to have Time Machine enabled? (To answer your question, I think what you're looking for is brctl evict.)


I deleted ~/.stack and ~/.mvn and had plenty of room. I usually do Haskell dev on a very fast VPS anyway.

I am running the macOS Big Sur beta and also the iPadOS 14 beta. Things seem to run slowly on my MacBook, but it seems like some work is done the first time apps are opened. My $3400 Common Lisp kit does not work on the Big Sur beta, so I will probably revert in a few days, but I am having fun with the beta right now.

I have had no problems with the iPadOS 14 beta so far.


Mac OS 11 feels like it might end up being the Windows 8 of Mac OS. A company is taking/took a well-loved and highly productive desktop/laptop optimized operating system (OS X for Apple, Windows 7 for Microsoft) and is making/made it more like a tablet/phone operating system. Unfortunately, that made it less optimized for the desktop/laptop use case.


I don’t see anything that makes it less optimized for desktop. Yes, the styling joins the family look of the iPad, but this doesn’t mean that it becomes a focus on touch. Mac is still very committed to a mouse/trackpad/keyboard interaction model. I think that 11 looks good and brings in some useful features from iPad and lends some good UI elements like sidebars to the iPad. It’s a good marriage but not a merging.


That could be true of macOS 11, but it's too early to tell. They haven't completely removed Finder.app and blown it up into a new user interface for macOS, the way Windows 8 did with the Start menu, which destroyed the user experience. But we'll see.

If that's the 'Windows 8 of Mac OS', I don't want to see what the 'Windows 8 of the Linux Desktop' is.


I think Gnome 3 is the "Windows 8" of Linux: It's pushing touch, it's moving away from tried and true desktop concepts and its release led to the creation of numerous "alternative" desktop environments for people who liked the old Gnome.


What's interesting to me after reading a bunch of the threads here is that consumers are constantly re-evaluating your product.

Mac became popular, not because of hackers, but because it was a great platform for doing graphic design and artwork. Apple put a lot of time and energy into the human factor and it paid off. That and the iphone basically saved the brand from extinction.

It's clear that Apple is moving in the same direction that Windows did years ago, putting the mobile OS on the desktop. I'm wondering how this is going to pay off for desktop/server users. Apple's fortunes are in mobile, and they're basically ballooning their mobile device inventory with multiple form factors.

I think a lot of what keeps Apple users buying more expensive laptops/servers is that they simply 'work'. However, Apple has been sneaking in a lot of annoying features that I would expect on mobile devices but have come to dislike on a desktop. No, I don't want my email address tied to my purchases; no, I don't want some random device in my house popping up with an MFA code; no, I don't want what I do on my home laptop synced to every mobile device I own.

Usability and human design have always been the 'win' for Apple, and it's what Linux/Windows can never match (because they don't control all the parts for a uniform customer experience). However, there has always been a line between phones and other devices. That line is the fact that my laptop doesn't have a GSM/CDMA cell radio in it, because it would be "silly" to take calls on a laptop (I guess). I'm not sure I really like this line blurred. I'm not really sure I want my laptop to be just another IoT device running the same OS, making annoying popups and tweets that I have to cancel while I'm working.

Anywho, there is an intangible distinction between a laptop and a phone, and by blurring this line I think some people might begin to dislike it. There are other options out there.


Am I the only one who wants to call it MacOS 11 Big Bottom? Any Spinal Tap devotees here? Oh never mind.


You can have one chuckle for that.


I predict that Apple will go on from this in strength. Moved some of my VITAX directly into AAPL.


So what’s the significance signaled by the numbering change?

I often like Thomson’s writing (and have him in my RSS feed) but this meanders a bit too much for me to follow. Can someone help me out?


This is just the 2020 version of Windows 95+NT = XP.


Wait... so in a few updates will MacBooks be locked down like iPhones, without a possibility for "sideloading"?


I hope not. In the presentation they mentioned homebrew, so I suppose that's something.


Does anybody have any reasons as to why they totally downplayed this major version release of their flagship OS?


They did?


You spend $6K on a laptop and then Apple kills the product line. Sweet.


I switched in 2002 from a Mac PowerBook G3 "Wall Street" to Windows. I will switch to a Mac again at the end of the year, when a Mac with an Apple chip comes to market. Exciting times :)


> an open platform on top of the tremendous hardware innovation being driven by the iPhone sounds amazing.

macOS is anything but open though. If anything, it's completely the opposite, in many senses of being not open. If you want an open platform, use Linux.


Depends on what you call open and what the 'opposite' you refer to is.

Open as in 'welcoming to users'? Open as in 'it has hinges on the device that open'? Open means many things.

If you refer to Open Source in terms of the OSI, then no, macOS is not that kind of open. It's also not entirely closed as there is plenty of 1-to-1 source code available, also under OSI-approved licenses (but not all of it).

If you refer to Open in terms to 'the ability to add and remove at least some components so you can do your work', then macOS is plenty open. Want to get rid of Pages and install Microsoft Word? No problem.


Apple for example refused to support Vulkan and don't allow anyone else to provide drivers for it. It means they have control over what features you can use - you have no say in it. Even Windows being as closed as it is doesn't have such restrictions. So I'd say macOS is closed to a sickening level and it's completely disingenuous to call it open.

May be if you compare it to iOS which is closed in even worse ways, you could say macOS is "open". But it's like comparing something that's rotten to something that's rotten even more. Both are rotten.


That's a rather extremist view. Apple doesn't support Vulkan because they tried to get OpenGL and OpenCL working; Nvidia didn't play ball, and AMD wasn't in the picture back then. The result was that Apple had to come up with something else (Metal), and that has worked out great for them.

But unless you are writing a graphics engine (which almost no user does) that doesn't even matter.

Regarding your openness, you still haven't provided with a description as to what you describe as open or closed. Is Windows open? No. None of it is, except a few example apps and things like the calculator. You don't get kernel sources, you don't get library sources, and you can't run any custom versions. With macOS you can.

Same goes for DirectX if you want to stay with graphics stacks (Direct 3D to be more specific). That's about as closed as it gets: you can't remove it, you can't use it elsewhere, you can't get the sources and you can't modify it.

You can still run Vulkan on macOS if you want -- you are free to implement it, run it, make shims or layers if you wish etc, just not with Apple's support.

What you call open or closed isn't clear, but the fact that you use the word 'rotten' signals to me that you are emotional about it and probably not really looking for a discussion.


I don't really care for their excuses why they don't do it. Even if anything of that applied in the past, they have no excuse today.

The bottom line - they don't do it and don't allow let's say hardware makers to provide drivers, like they can do even on Windows. That alone is an indication that it's anything but open.

> You can still run Vulkan on macOS if you want -- you are free to implement it, run it, make shims or layers if you wish etc, just not with Apple's support.

You aren't free to do that. Hardware drivers for macOS are all controlled by Apple. Otherwise you'd see AMD, Intel and Nvidia providing native Vulkan for macOS already. The only thing you can do is to provide translation into Apple's Metal lock-in (like MoltenVK, gfx-rs and so on).

macOS is one of the worst examples of closed systems.


> You aren't free to do that. Hardware drivers for macOS are all controlled by Apple.

No, they are not. Apple ships some drivers themselves because they obviously support the hardware they deliver, but you are free to get your text editor out and write one yourself, compile it, and load it.


Apple won't allow you to run any hardware driver without them signing it. And Vulkan would require GPU driver support for it. So they are the gatekeeper who is preventing this from happening. Keyword here is gatekeeper. "Open" system much? In open systems gatekeepers aren't in the way of the user.


it is frankly unbelievable how unusable the play store is.


I know this is about the coming MacOS 11.0, but Catalina was the end of OS X for me, and the end of Macs in general for me. I will not be migrating from Mojave to Catalina, and will use my current generation of Macs until they are obsolete, while finally making the transition to Linux. Mac and I had a good run, but I'm clearly no longer the target demographic and their corporate goals aren't aligned with mine as a customer any more.


I suspect you're being downvoted because this is an empty, meaningless comment without some elaboration on why you've decided this. As it stands, there's no room for discussion: the only paths forward are for others to simply agree or disagree, or to try their best to guess at what your concerns are in order to further the conversation.


The comment states the why clearly when read within the context of the article: "I'm clearly no longer the target demographic and their corporate goals aren't aligned with mine as a customer any more"


Did we miss a memo where Apple suddenly changed their target demographic and corporate goals?

I don't think they did.

The comment seems like an emotional, nostalgic conclusion reached after watching the WWDC, without any mention of why they think the future will be worse than what we have now or what we had in the past.


Those things are rarely (publicly) unambiguously stated, so no, we didn't miss the memo because there wasn't one. That does not mean a target demo can't change for a company over the course of time, though. Apple would hardly be the only company where it did.

Pity though, Snow Leopard was something to be envious of.


Maybe no statement as such, but at the very least the move to non-user-upgradeable hardware feels like such a move.

For me personally, the ability to upgrade RAM and storage, and graphics card on a desktop are central to a professional workflow, and part of my purchasing decision as a prosumer.

Coupled with a million other issues I've had over the past few years, including reliability and support, I no longer feel the added cost of a Mac versus a Linux machine is justifiable.


That statement is still basically meaningless. It’s a conclusion, not reasoning.


> to try their best to guess

Or perhaps ask for clarification.


I mean, I'm still on Yosemite, and I didn't even want to "upgrade" to that (but had to, to use the iPhone 8 I bought). I would rather have stuck with Mavericks. Every new macOS update seems to bring new freedom-limiting changes which I fundamentally disagree with, and it's getting harder and harder to justify using such a closed platform...

Stratechery's mention of what hackers/programmers move to as being an indicator of what's next is interesting, as currently it feels like these types of people are largely moving to more open platforms. I see it all over my social networks where super talented engineers are embracing not only open software, but open hardware, too... We shall see, I guess.


Same here. Still in 10.14. Work machine is already on Linux. Personal one will be soon. Sad to lose some apps (e.g. Dash, Affinity, Alfred, Fantastical, Postbox, 1Password, ...) but in the end it will be more powerful and fast with Linux and i3 as I spend a lot of time inside the terminal and text editors. My 2014 MBP is slowly dying from hardware problems (screen goes blank at random, keyboard wears out, cursor jumps at random from time to time, ...). I am bored. Made a detour with Apple for a few years and it was fun (beautiful and Unix based) but I'm coming back to Linux with a minimal setup. When it comes to servers, I've never left Linux. And last but not least I prefer free software, open source, the mentality around it than closed proprietary stuff. I have a dying iPhone too.


Same here. Currently I'm doing a Linux desktop and a minimal Windows laptop for remote work via SSH. Once I'm convinced I'll never realistically need to run mainstream apps I'll go full Linux.


Same here, although it will be 10.13 for me, which is the last version with good subpixel hinting. The best possible font display is of utmost importance to me, and retina+subpixel rendering is optimal for now. I know that it is possible to re-enable it in 10.14, but it works only so-so. And with 10.15 it's certainly gone.


I was bothered that they removed 32-bit support, so my desktop is still on Mojave. I suppose I can understand it. With the ARM transition, it certainly makes things simpler.


> I will not be migrating from Mojave to Catalina, and will use my current generation of Macs until they are obsolete, while finally making the transition to Linux

I've recently been doing this, after some time (ahem, a decade) away from "serious" desktop Linux. Some observations/pointers:

- Running ultralight window managers with minimal services still feels the most "right" and predictable/stable way to have a Linux GUI. So that hasn't changed. Unfortunately this means manually screwing around with scaling and size problems on 4K displays now. Like, really weird stuff like various windows being drawn at different scales or part-scaled-part-not and drawing your cursor a different size when you mouseover. It's no fun sorting all that out. Boo.

- Wayland is less horrible than I expected but does give the impression of being a lovely new way for any ordinary application to crash your entire windowing system. As if Linux didn't have enough of those already.

- It's still easier and more predictable to just run the damn thing in a Virtualbox VM under Windows, if you've got the horsepower for it, letting Windows handle the drivers for the actual-real-hardware. This may not be true on stuff like System76 or oldish Lenovo machines or something, but probably is if you've got, say, a dedicated recent-vintage graphics card, unusual USB devices, anything like that. It's just way less crashy in a VM, on the same hardware, unless you're damn lucky and probably running fairly old kit. Was true in 2010. Still is, it seems.

- Gnome3 has gotten so incredibly resource-hungry and unstable that it's finally driven me to KDE (see above about handling scaling issues in the lighter WMs being a PITA for why I'm not in Awesome or XFCE or something), which is a DE I've never liked since I first tried it two decades ago. It is a little better now, admittedly, and doesn't make me feel ashamed or like I'm Doing It Wrong for not running exclusively K* applications like it used to. The settings panels are still some kind of bad joke, organizationally. It's not that there are too many options, it's that they're trollishly organized. Friggin' gnome-shell. What an absolute garbage fire. Which is a shame because aside from being even harder to customize meaningfully than macOS (!) Gnome3 looks fairly pretty. It's seriously, no joke, awful though. I wish I could use it but it's simply broken, at present.

- All the new packaging stuff I've tried sucks. AppImage is closest to being good, but it's exactly as good as having a bare exe file on Windows or downloading a static binary on Linux, so... it's fine, but does nothing for you like adding console launch commands ("code ." for example) or configuring your environment, as far as I can tell. None are as good as Homebrew and Homebrew-Cask (yes, Homebrew is on Linux, but there are way fewer eyeballs on it, so fewer packages, and they're often broken) for managing user-level, not system-level, packages. Snap's complete crap, like, they should just give up, it seems irredeemable at this point. Flatpak is significantly better but still not great. Usable for some unimportant things if you don't mind them doing weird, broken stuff sometimes or just not working when you need them to (see again: unimportant things).
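Since AppImage won't put a launch command on your PATH for you, you end up doing it by hand. A minimal sketch (the filenames and directories here are my own conventions, not anything AppImage mandates):

```shell
# Park the AppImage somewhere stable, then symlink it into a
# directory that's on PATH so it launches by name from a terminal.
mkdir -p ~/Applications ~/.local/bin
mv ~/Downloads/Example-1.0.AppImage ~/Applications/
chmod +x ~/Applications/Example-1.0.AppImage   # AppImages must be executable
ln -sf ~/Applications/Example-1.0.AppImage ~/.local/bin/example
# Assuming ~/.local/bin is on PATH, "example" now launches the app.
```

That gets you roughly what a "code ."-style launcher gives you, minus anything the app would normally register for itself (desktop entries, file associations, and so on).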

Incidentally, I've settled (back) on Debian after trying Fedora 32 due to tons of recommendations on Reddit (I... did not enjoy the experience) and already knowing I didn't care for Ubuntu (anymore—it was great in the late 00s) as I'd tried that in VMs a few times during my Linux hiatus and not liked it. I also toyed with Void Linux on a crappy little machine for a while and damn is it nice for low-end hardware. I tried Manjaro but I-forget-what serious thing was broken about it on my rig and was going to be a pain to fix. Great installer, though. 10/10 on that.


Thanks for the great tips! I’m not super thrilled at the idea of running Linux on Windows, though that may be the best option. It would be nice to have more games available, but I find using Windows a grating experience with its constant popups and annoyances. It’s a thousand little things that annoy the heck out of me when using it. I had a Windows work box for 3.5 years and the only nice things I have to say for it were Outlook, Excel, and SSMS, which are fine products.


Strongly agreed, I haven't even kinda liked Windows since Win7. Still, it's not as bad as the constant oops-rebooted-there-goes-your-work fest I recall from early Win10. I forget it's there most of the time, until I suspend my work VM and fire up Steam for some games.

I don't really trust a ZFS root on Linux not to, you know, suddenly break in ways that keep it from booting up, so absent that the snapshotting you can do for your whole OS when it's running in a VM is very handy, too. Though of course you could do that with any OS playing host—Windows is just convenient as the best driver shim available on generic x86 hardware, really. Takes a bunch of the headaches out. "I want to upgrade this, will my Linux installation become way less stable if I do?"—who cares, it sees what VBox shows it.


I'd be interested in Linux were it not for the fact that I also want to use commercial applications, which are a major Linux blind spot.

I wonder if Windows has matured as a developer-friendly environment since I left with all of the WSL stuff that's been happening.


To me, the first indications that Apple was no longer interested in software professionals were: 1) Removal of USB-A ports, 2) Removal of the ESC key. I bailed out of the Mac world right then. The subsequent software issues (dropping of 32-bit, instability in Catalina, this new MacOS 11) just confirmed my suspicions. The Linux Desktop is still frustrating in so many ways compared to the MacOS, but I can be reasonably productive.


Lots of pointless user-facing "innovations," some of which break things with no warning, and very little improvement of the base OS that a Unix person would notice, combined to cause stagnation.

When Apple moved from editable file comments, which were attached to the file, to 'tags', they literally nuked thousands of annotations of mine. I was never warned that they would be blown away. What kind of company just does this?


Intel version is still 10.16, new ARM version is 11.0.


I can confirm this is false. I have MacOS Big Sur on my MacBook Pro 2017, and it identifies as 11.0.


What does it say if you run:

sw_vers -productVersion


10.16.

However, it is clear that Apple wants to move everything to 11. Xcode also says 10.16, but Apple Developer Docs say 11. My suspicion is just that the developers of Xcode and other parts of the OS weren't aware of the switch yet.


Ahh interesting! Yeah, it doesn't surprise me. It's 10.16 now, but I wouldn't be surprised if all of those version numbers will be updated to say 11.0 upon the final production release.


It's a marketing number, just like System 7.7 got renamed to Mac OS 8 at the last minute so that Jobs could weasel out of System 7 contracts.


My Intel MacBook Pro says 11.0 Beta (20A4299v). So, I don't think it's 10.16 for Intel.


Source?


I know that Java is not hip nowadays (was it ever?), but at some point Jobs bragged that Macs were the best Java workstations. Clearly Apple not only does not care about Java but hates it. macOS Catalina is also pretty scary.

So, I am enthusiastically waiting on getting a new laptop with the Ryzen 4800 CPU to run Linux on. I do some Java tooling projects and I will just use the existing Mac gear to support them, for a while.

As weird as it sounds, even Windows is starting to look like a good target OS for somebody spending lots of time in the terminal and doing some Java.


Apple is a huge Java shop. They're quite secretive but a very large portion (the great majority?) of their backend is Java.

OpenJDK has a Linux/AArch64 port, and it will have a macOS/AArch64 port, too.


Do they currently run their backend on MacOS?


No they do not. They (as some would maybe not expect) even use Oracle software as backend for some systems. But not macOS. Unless you count developer workstations and things like Xcode build servers or their ASD replacement.


A lot of Java software is stuck in JDK 1.8. I wonder if anyone will port old versions like this to AArch64 and macOS 11.


1.8 applications run and compile fine on JRE 11 and up
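For the compile direction, it's worth knowing that newer JDKs can also explicitly target 8. A quick sketch (the file name is just an example):

```shell
# Cross-compile for a 1.8 runtime from a JDK 9+ toolchain.
# --release pins both the language level and the linked class library,
# unlike the older -source/-target pair, which could still compile
# against newer APIs and then fail at runtime on a JRE 8.
javac --release 8 Main.java
```

The resulting class files run on any JRE 8 or later.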


Not necessarily. Apache Spark, for example, did not work with anything later than JDK 1.8.x prior to the just-released 3.0.0. Apache Cassandra 3.x does not work with JDK 11; version 4.x will support it.

Things are not that simple, unfortunately.


Apple has supposedly prepared a patch for OpenJDK to make it work on Apple Silicon.


There are several BSD ports of OpenJDK to AArch64; anything Linux-specific has already been fixed.


That was in the OS X early days: as Apple wasn't sure developers would buy into Objective-C, they jumped into Java as an alternative, with their own VM and Cocoa bindings.

As it became clear that they would gladly use Objective-C, they killed the other horse.


Well, they lost interest in it. They didn't actually kill off Apple Java until the Oracle Thing happened, and they weren't alone there; a lot of companies backed slowly away from Sun stuff after its acquisition (remember when MacOS was going to have ZFS support any day now?)


That happened during the Sun days still, and they keep using WebObjects among other Java stuff on their backends, going by their job postings.


A lot of their backend is Java. Prior to acquiring Apple for one Steve Jobs (and getting 400 million dollars in change), NeXT had a Java web framework called WebObjects.

