M4 Macs can't virtualise older macOS (eclecticlight.co)
177 points by Brajeshwar 8 days ago | 158 comments





Commenters lament that Rosetta will go away before users are ready.

In my opinion, Rosetta should be more heavily gated* to push everyone from Adobe to Steam to build for aarch64. Countless tools claiming to be "Apple Silicon native" require Rosetta under the hood for dependencies, or even (bless their hearts) just for their installer.

* Like the right-click-to-open dance for packages or apps installed from outside the App Store, except the Rosetta dialog should note that once it's installed, any other old software not made for this system will run without warning. Turns out avoiding Rosetta is a great way to ensure not just your apps but your CLI tools are all up to date... Alternatively, make Rosetta sandboxed per app, and/or let it be turned off again more reasonably than the safe-mode surgery required now.


> In my opinion, Rosetta should be more heavily gated* to push everyone from Adobe to Steam to build for aarch64.

Honestly, I see game developers simply abandoning macOS (more than they already have) if they need to do more work on their back catalogues. Nobody at Adobe cares if CS3 doesn't run on the latest macOS; it's no longer a revenue source for them. EA would care a lot more if all their games older than four years needed work to run on macOS. There's a lot more long-tail revenue in games than in the typical B2C cloud-SaaS-attached apps the App Store model is designed to serve, though much of it is spread across aggregated back catalogues where the revenue per title is pretty low even if it sums to a significant amount.


The final straw for me was Apple deprecating modern OpenGL. There's no quick and easy solution to migrating even to Metal. So, they shafted us, and I'll never recommend another game developer bother with an Apple device ever again.

Use Windows for games.


But wasn’t macOS the only major gaming-relevant platform still using OpenGL anyway? If you’re already writing a Mac-specific backend, why not just write directly for Metal instead of OpenGL? Or, at the least, write for Vulkan and use a translation layer like MoltenVK?

> But wasn’t macOS the only major gaming-relevant platform still using OpenGL anyway?

At the time, there was a decent number of non-gaming cross-platform applications that relied on OpenGL for rendering. OpenGL wasn't perfect (especially compared to DirectX 9) but it was a good-enough solution for simpler apps and games that wanted the write-once-run-anywhere treatment.

> If you’re already writing a Mac-specific backend, why not just write directly for Metal instead of OpenGL?

Because a lot of people don't write Mac-specific backends in the first place. Unless the app was designed to be Mac native from the start (a rarity in the professional world), there is very little impetus to rewrite everything to work with Metal and/or AArch64 targets. OpenGL suggested a future world where this would be unnecessary, and people liked it. With Metal as the only option now, a lot of people feel like Apple slammed the door on those who wanted to write cross-platform apps supporting the Mac.

> Or, at the least, write for Vulkan and use a translation layer like MoltenVK?

MoltenVK is too slow for games (compared to DXVK it is an utter slouch) so most people don't even bother. There are a few apps that you can run in it, but for the most part it is a toy that rightfully isn't relied upon to deliver industry-standard experiences.


How is Mac no longer a revenue stream for Adobe?

Poster specifically mentioned CS3, which was a perpetual license. Adobe is not incentivized to keep a version of their software someone purchased once seventeen years ago working when they would much rather sell a monthly subscription.

The scuttlebutt is that Steam is not going to get ported, and Valve has given up on Apple since the 32 bit drop in general and the Metal/Vulkan mess.

I have seen a huge downtick in games with Mac releases too… even for stuff where it seems like it would be possible with an extra platform export.

Apple seems to care about gaming for about 15 minutes every year, and one day they will figure out it can work on their stuff if they’re willing to accept that their platform will be nobody’s priority.


> the Metal/Vulkan mess

But what about MoltenVK?


Going from DirectX to Vulkan to Metal is simply too many translation layers to work well. You're almost always going to end up with an annoying bottleneck and poor hardware utilization. MoltenVK alone might be fine if Vulkan was more widely supported by shipping games.

I think it's more plausible that Valve decides to make a Proton for Mac using the D3D to Metal translation layer from Apple's Game Porting Toolkit—but that would be going against Apple's intended purpose for the toolkit.


I don't know, this is just stuff I heard, but I think some of it is Valve just being _annoyed_. Like Apple made this choice despite reaching a point where the highest end graphics card they ship is equivalent to a high-end... laptop graphics card! You shouldn't be in charge of a thing if you're not going to cover a wide spectrum.

At least with USB-C Apple "donated" their designs.


> At least with USB-C Apple "donated" their designs.

What do you mean? I've always thought that it was designed jointly by the same standards committee as the previous iterations. The first USB-C device I know about and have used was the Nexus 6P. This was early enough in the standard's life that no one had any USB-C cables (or had any idea that it's a thing) so I had to carry my own one at all times in case I wanted to charge my phone. Apple started putting USB-C ports into MacBooks a year or so later, iirc.


My understanding of the narrative was that Apple showed up with a lot of stuff regarding USB-C at the outset, even if they weren't the first person with it on their device.

This was just stuff I heard though. Vaguely, many years ago.


What about Game Porting Toolkit? To be honest I feel like that's resulted in fewer native ports and more Mac players enjoying the x86 library that would never be ported natively.

I don't see how. The performance of Game Porting Toolkit is in my experience very poor.

It's ok for older games but those would be very unlikely to receive a native port anyway, and for anything new GPT just sucks...


> to push everyone from Adobe to Steam to build for aarch64

Not happening. The other commenters are right - many developers are content letting their apps get abandoned if the choice is between a Universal binary and x86. A lot of software, particularly legacy software and older games, doesn't even have the opportunity to build for aarch64. The moment Apple put an expiration date on Rosetta, they were confronting you with the inevitability that your software would one day die. There is no convincing some people - for Christ's sake, four generations of Apple Silicon came and went and Steam is fine leaving their client x86-only. They know all their macOS users are playing through Game Porting Toolkit's Windows version anyway.

From where you're standing, it must feel like an 18-carat run of bad luck. Truth is, the game was rigged from the start.


The Linux Steam client is also still 32-bit, so it's more like 20 years with no 64-bit support.

Why should all the other companies and developers spend money to help a $2T company make even more money when it was that company that broke all the existing software? Apple users will throw their money at Apple every opportunity they get, but won't spend any money on new versions of third-party software that Apple has broken with its changes. I would abandon Apple if I had any software released on their platform.

Apple doesn't care to make games compatible, and Valve can't fix Apple's design decisions. Why do you think they're investing so hard in Linux, while the Mac is already half the Linux userbase?

> It seems that M4 chips can’t virtualise any version of macOS before 13.4 Ventura

13.4 was released on May 18, 2023. That's actually not very far into the past.

Anyway, what would be the most common use cases for this? And how common are those?


If you're building macOS apps, it's common to want to test them on all system versions you support. Especially so considering Apple's attitude towards backwards compatibility.

Are there any Macs that can run 13.4 but can't run 13.5?

I don't think so, but no Mac before 2016 can run 13.x: https://everymac.com/systems/by_capability/maximum-macos-sup...

Virtualizing an older ARM version of macOS was never going to be sufficient to QA x86 applications running on older Intel Macs. For that, you'll always want real x86 hardware.

And yet Monterey is EOL. Very few apps still support it. I wonder if it's just something that wasn't tested exactly for that reason.

EOL or no, there are many people who still use older OSes because they own older computers, can't afford an upgrade or don't want one (it works just fine!), and can't be bothered with figuring out running a newer OS on officially unsupported hardware.

Case in point: I built a macOS app that implements Google's Nearby/Quick Share (an AirDrop-style file sharing thing on Android). Multiple people tried running it on Catalina and were disappointed that it wanted a newer OS. So I did end up backporting it to Catalina.
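
For illustration, a backport like this mostly comes down to runtime availability gating plus a fallback code path. A minimal sketch - the two helper functions are hypothetical stand-ins, not from the actual app; only the #available check is real Swift:

    import Foundation

    // Minimal sketch of availability gating for a Catalina backport.
    // The two helpers are hypothetical stand-ins for real code paths.
    @available(macOS 11.0, *)
    func shareUsingModernPath(_ url: URL) { print("Big Sur+ path for \(url.path)") }

    func shareUsingCatalinaFallback(_ url: URL) { print("Catalina path for \(url.path)") }

    func share(fileAt url: URL) {
        if #available(macOS 11.0, *) {
            shareUsingModernPath(url)       // newer API surface available here
        } else {
            shareUsingCatalinaFallback(url) // backported path for macOS 10.15
        }
    }

    share(fileAt: URL(fileURLWithPath: "/tmp/example.bin"))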


>Very few apps still support it

Citation? I use Monterey at work and every app I need works on it still.


CI is the big one, or similar testing against older versions for backwards compatibility. Usually good enough to just compile for it on MacOS, but sometimes we get a surprise in a third party library or something that only shows up on an older release.

I once used an old Mac OS X in order to extract a Cisco AnyConnect certificate from Keychain that wasn't possible to extract in any other way.

https://reverseengineering.stackexchange.com/questions/6043/...

I first tried recompiling the Keychain app, but it had too many dependencies that were not trivial to build, so using an older Mac was, in that case, the easiest way to get the private key out of my own keychain.


Dev environments, and testing.

Macs are the best at virtualizing Macs. (Look up Hackintosh to see how many hoops must be jumped through on non-Mac hardware.)


You don't need a Hackintosh to virtualize a Mac - you can actually download the MacOS image directly from Apple and boot it right into QEMU with the proper configuration. I've used a few scripts over the years that could have an OSX image running on Linux in less than 15 minutes.

Any tutorial?


Official support for any macOS version < Ventura has been dropped (Monterey support ended 2024-09-16),

so I wonder whether Apple will consider this a bug...


Seems like a bug if it only affects certain processor cores.

They added custom instructions to Apple silicon to more easily emulate x86 behavior (e.g., https://developer.apple.com/documentation/virtualization/acc...). They may have now removed them because their analytics say that Rosetta2 use on new devices is minimal.

Virtualizing older macOS on M4 hardware has nothing to do with Rosetta 2. And it would be ridiculous for Apple to remove hardware features that Rosetta 2 relies upon before they're actually ready to retire Rosetta 2—that would force Apple to invest more software engineering effort in updating Rosetta 2 to work without those features.

I guess there will always be a previous processor supporting something that the most recent one doesn't.

But when the bug report is about supporting a software version they no longer support themselves, I personally don't think they will give it any priority.


A few years ago we added `GOAMD64=v3` [1] to how we build our Go binary into Docker images, as all our production servers support it and it can give us a free performance boost.

Then it turned out Rosetta does not support that, so those Docker images could no longer run on developers' Mac laptops, and we had to revert it. This is why we can't have nice things (we don't use any ARM servers, so we never bothered to build multi-platform Docker images).

[1]: https://go.dev/wiki/MinimumRequirements#amd64


From the reference: "GOAMD64=v3: all v2 instructions, plus AVX, AVX2, BMI1, BMI2, F16C, FMA, LZCNT, MOVBE, OSXSAVE."

Were those supposed to have been included? Which of those are not emulated by Rosetta?

I'm struggling to follow the chain of how this results in us not having nice things.

IIUC, paraphrasing, it sounds like Go made an assumption about user requirements that turned out not to be true when the ARM Macs came out? Wouldn't the ARM Mac users of Go prefer Docker images that don't need to be emulated anyhow?


https://en.wikipedia.org/wiki/X86-64#Microarchitecture_level...

amd64 v3 covers instructions included in CPUs starting from 2013, so basically any modern amd64 CPU supports all of them. As mentioned on the Wikipedia page, QEMU 7.2 also implemented all of them. They are more efficient instructions, thus the "free performance boost".

But Rosetta doesn't. Which instruction(s) it doesn't implement doesn't really matter (I can't remember either). What matters is that when it hits an instruction it doesn't implement, it throws an illegal instruction error and the code hard-crashes.

So because of Rosetta, we can't build code with amd64 v3 enabled, and can't have the free performance boost (nice things).


Rosetta now supports AVX and AVX2 on Sequoia, but I believe all the others are missing (F16C, BMI, OSXSAVE definitely are not there)

I sure hope this is addressed by a future OS update. I’m not a hardware person, so I can’t imagine why this wouldn’t be possible to address. If someone with more experience in that area can offer theories, I’d love to hear them.

Apple broke the compiler suite on Mac OS X 10.3. As part of a security update they shipped a new version of libstdc++ -- which was built using a different ABI than the compilers for 10.3, and they weren't going to update the compilers for 10.3. (Some sort of GCC ABI flag day happened.)

You could still build programs for 10.3 -- but you needed the latest version of Xcode and the compilers for that, which Apple only shipped for 10.4 and up. So you had to set Xcode for 10.3 compatibility mode and then you could build your app.

If you develop for Apple's platform, you need to be running the latest version, period, end of statement. If you aren't, there's no telling what may break.


So, is this assumed to be a bug rather than a removed feature? If it's a bug that could be fixed, I would treat it differently from "no more support for older macOSes".

Based on the virtualization framework documentation, this seems very intentional.

My guess is that they do this because their development processes only assure that the macOS version shipped with the hardware works on that hardware. And the virtualization layer is really thin.

They see no reason to spend the extra QA time, and would rather have everyone upgrade to the latest macOS version their hardware supports.
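
For what it's worth, the framework surfaces this directly: when you load a restore image, it reports whether the current host can virtualize that macOS version at all. A rough sketch of the check (the IPSW path is a placeholder, and real use requires signing with the virtualization entitlement):

    import Foundation
    import Virtualization

    // Rough sketch: ask the Virtualization framework whether this host can
    // virtualize the macOS version inside a downloaded restore image (IPSW).
    let ipswURL = URL(fileURLWithPath: "/path/to/old-macos.ipsw")  // placeholder

    let done = DispatchSemaphore(value: 0)
    VZMacOSRestoreImage.load(from: ipswURL) { result in
        switch result {
        case .failure(let error):
            print("Could not read restore image: \(error)")
        case .success(let image):
            if let req = image.mostFeaturefulSupportedConfiguration,
               req.hardwareModel.isSupported {
                print("Virtualizable here: needs \(req.minimumSupportedCPUCount)+ CPUs and \(req.minimumSupportedMemorySize) bytes of RAM")
            } else {
                // Presumably what old enough images report on M4 hosts.
                print("This host cannot virtualize the macOS version in this image")
            }
        }
        done.signal()
    }
    done.wait()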


Do you think it could be simple to fix, even by a hackish third party?

In this case, it's an unintentional bug.

> In this case, it's an unintentional bug.

How do you know? Maybe it's in the mentioned bug report 15774587; I tried to look at it, and was sure I could log into Radar at some point, but now I can't seem to dig up any valid login.


Unless you work at Apple, none of us outsiders have even been able to look up other people's bug reports

> Unless you work at Apple, none of us outsiders have even been able to look up other people's bug reports

Are you sure? I'm pretty sure that I have in the past, and even that it used to be standard practice when discussing an ignored Radar report to ask other sufferers to go to the bug and do whatever the "me too" action for Radar is (I don't remember). Certainly there is a login at https://radar.apple.com, although that doesn't prove that it's meant for outsiders.


I missed how the VMs are managed. UTM or something else?

Macs in CI are an absolute nightmare. For some reason (well, I do have a reason: they want to sell you more Mac Minis) macOS is the only modern OS that has no real container solution built in. Windows has Docker and true containers, FreeBSD has jails, Linux has a bajillion solutions, and Darwin (macOS)? Nothing. They've ripped half of FreeBSD already, just pull in jails too!

It at least has the virtualization framework now. There’s a product called Anka that plugs into Jenkins and lets you deploy macOS VM images as build agents on top of physical Apple hardware. While slower than containers, and limited to 2 VMs (?!?), you can have reproducible and sane build environments via VM images.
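
For anyone curious what those tools wrap, here's a rough sketch of a macOS guest configuration with the Virtualization framework. The VM bundle layout, paths, and sizes are placeholders; error handling and the graphics/network/keyboard devices a real build agent would add are left out for brevity:

    import Foundation
    import Virtualization

    // Rough sketch of a macOS guest for a CI runner (not a complete setup).
    func configureMacGuest(bundle: URL) throws -> VZVirtualMachine {
        let platform = VZMacPlatformConfiguration()
        platform.auxiliaryStorage = VZMacAuxiliaryStorage(
            contentsOf: bundle.appendingPathComponent("AuxiliaryStorage"))
        platform.hardwareModel = VZMacHardwareModel(
            dataRepresentation: try Data(contentsOf: bundle.appendingPathComponent("HardwareModel")))!
        platform.machineIdentifier = VZMacMachineIdentifier(
            dataRepresentation: try Data(contentsOf: bundle.appendingPathComponent("MachineIdentifier")))!

        let config = VZVirtualMachineConfiguration()
        config.platform = platform
        config.bootLoader = VZMacOSBootLoader()
        config.cpuCount = 4
        config.memorySize = 8 * 1024 * 1024 * 1024  // 8 GiB
        config.storageDevices = [
            VZVirtioBlockDeviceConfiguration(
                attachment: try VZDiskImageStorageDeviceAttachment(
                    url: bundle.appendingPathComponent("Disk.img"), readOnly: false))
        ]

        try config.validate()  // rejects guest/host combinations the framework won't run
        return VZVirtualMachine(configuration: config)
    }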

It's limited to 2 VMs because of Apple's software license agreement for macOS: https://www.apple.com/legal/sla/docs/macOSSequoia.pdf

    to install, use and run up to two (2) additional copies or instances of the Apple Software, or any prior macOS or OS X operating system software or subsequent release of the Apple Software, within virtual operating system environments on each Apple-branded computer you own or control that is already running the Apple Software, for purposes of: (a) software development; (b) testing during software development; (c) using macOS Server; or (d) personal, non-commercial use.
Apple just really doesn't care about you, and as a developer, you're just a sucker to extract money from.

Realistically you can run more than 2 VMs with some work [0], but legally, companies that provide CI and other virtual solutions can't buy one Mac and then get a license to run 100 virtual Macs.

0: https://khronokernel.com/macos/2023/08/08/AS-VM.html


(insert name of any corporation ... listed on any stock market) just really doesn't care about you, and as a developer, you're just a sucker to extract money from.

https://darwin-containers.github.io/

Also, if you want to cross-compile in Linux instead of run a container: https://github.com/shepherdjerred/macos-cross-compiler


Don't worry, as soon as this becomes production-grade enough to justify buying fewer Mac Minis in data centres, Apple will shut it down faster than the speed of light. They could have done this themselves if they wanted; the reasons were never technical to begin with.

You WILL buy more apple stuff, whether you like it or not.

Why do IT professionals keep insisting that their consumer electronics are built with the intention of supporting their professional goals?

Because corporate policy requires these IT professionals, developers, and related engineers to use these consumer electronics in the workplace?

You are forced to use macOS if you want to make apps for macOS. If there's some difference between an older macOS and the latest one, you can't run said older version, get to the bottom of what's happening, and patch your application to fix it on the M4 Macs.

Because Apple spent a lot of money marketing them as tools for professionals?

Come on, Apple. What are you doing? I was thinking just the other day that Apple should virtualize older iPhones within the latest iPhone system software, so you could seamlessly open old apps and games (32-bit, anyone?) in their own containerized environments. I can't think why they haven't added this feature for any reason other than money grubbing.

You could even customize the containers to be completely closed off from the rest of the iPhone—no contacts, no Internet access (or high security Internet access), etc.

Come on, Apple. Do something good for once. Oh and bring back the headphone jack.

-Mark


For better or worse, it's never been Apple's MO to keep software working forever; that's Microsoft's schtick. PPC OS X software is gone, x86-32 OS X software is gone even on hardware that could still run it natively, AArch32 iOS software is gone, and if history is any indication it's only a matter of time before x86-64 macOS software is gone too.

One time I had to run a very old version of Eagle CAD on Linux, and it turned out that even though I had a native Linux version, it was easier to run the Windows version in Wine! I guess stable interfaces have their advantages.

The community has been joking that Win32 is the most stable Linux API.

I have a half-joking, half-serious thought: has anyone written a desktop environment for Linux that uses the Win32 API? Since Win32 is much more stable than Qt and GTK, it would be easier to target that API. The side bonus is API compatibility with Windows.

This might not have been viable 25 years ago when KDE and GNOME were in their infancy, but WINE has come a very long way since then. Standardizing on Win32 would eliminate the churn of dealing with Qt and GTK major version revisions.


> Standardizing on Win32 would eliminate the churn of dealing with Qt and GTK major version revisions.

What makes it so hard to write a GUI toolkit that is long-term (say, for 25 years) backwards compatible? If Microsoft is capable of doing this, why can't open-source developers?


In the Linux desktop world, there is no single entity in control over the entire software stack ranging from the kernel all the way up to the desktop environment. In a typical Linux distribution, you have the Linux kernel (run by Linus Torvalds), various command-line tools written by many different developers and managed by different projects (some of them are part of the GNU Project, but others aren't), some type of display system (this used to be solely X11, but Wayland is growing in popularity these days), one or more GUI toolkits (Qt, GTK, some custom ones), and a desktop environment (typically KDE or GNOME, but others exist). The goal of a Linux distribution is to take these disparate parts and present a coherent system to the user.

The problem, though, is that because the Linux desktop is made up of disparate parts from separate teams that have separate, often competing visions for their roles in the Linux ecosystem, often major changes are made with little regard to how they affect the system as a whole. This is the essence of the lack of control over the entire software stack. Thus, the developers of X11/Wayland, Qt, GTK, and other infrastructure can make breaking changes, and application developers relying on those subsystems have to either adapt or lobby for forks. Thus, the churn.

By comparison, Microsoft is in full control over Windows, and Apple is in full control over macOS. Even the BSDs are in full control over their base systems (for example, OpenBSD isn't just a kernel; the OpenBSD team also has control over the command-line tools that make up the base system), though I'm not aware of any BSD (besides macOS) that is in full control over GUI environments. It's not to say there is no churn in these environments; indeed, macOS does not prioritize backwards compatibility like Windows does and thus there's some churn macOS developers need to deal with in order to keep up with the latest releases. But there seems to be a lot of churn at the GUI level in the Linux desktop ecosystem.


ReactOS comes to mind, although it's not Linux.

> has anyone written a desktop environment for Linux that uses the Win32 API?

No, but window managers who use Xt, Xaw or Motif are ok.


Linux (the ecosystem; not necessarily the kernel) is actively hostile to binary software.

It baffles me as to why. I think it’s hilarious how Linus is so careful to not break user space (for good reason) and all the user space libraries break every week.

Because every distro runs its compilers with a variety of flags and safety features and library versions and ABI quirks that make supporting stable ABIs a pain in the butt.

Distribution maintainers pretty much do whatever they want with the OS level ABI. That on top of whatever those user space libraries want to do anyways makes native application ABI stability basically impossible.

Again, how do Windows and macOS (to an extent) solve this? It’s possible, just not incentivized. Show me the incentives and I’ll show you the outcomes.

I think the main thing is that it's common for apps to bundle their dependencies. The default for Linux is to use the system libraries for everything - not just glibc but also things like zlib, libpng, etc. As a result you have to go to significant extra effort to make a portable binary app, e.g. linking against musl.

That's one of the attractions of Go, and to a lesser extent Rust; it's way less work than C to get a portable binary.

I think 90% of the problems I've encountered are due to glibc. They could easily fix all of them by adding a GCC flag that would allow you to compile against old glibc versions.

They'll never do that though because they are ideologically opposed to it.


Of course.

Source software is the way to go (compiled specifically for one version of the targeted OS, you don't have many issues).

Distributing opaque binaries is indeed not the best way, even if you can do it easily with static linking.


Back when IDA Pro for Linux & macOS was finally released, they decided to make every OS a separate license purchase. The net result of this was that every single person I knew who used it just kept buying Windows licenses and using it under WINE when they wanted to use it on their other computers.

You make a good point. It was kind of the breaking point for me when Apple killed 32-bit executables, because it meant even old steam stuff couldn't run.

But that's a casual consumer viewpoint. It's valid to buy them if they solve your problems in the here-and-now. (I used one for a year at work and it was a bad experience, but a lot of that was having x86 libraries I had to use, so... Bad choice for here-and-now.)


Interesting juxtaposition against yesterday's front page: Valve updates Half Life 2 for 25th Anniversary.

If the requirements are still accurate, it will run on XP with 512MB RAM.


In addition to this, Steam provides an option for developers to expose older game versions to players, which Valve themselves make active use of. So if you have a specific old mod that's not compatible with the new update, or don't want to deal with the update in the middle of a playthrough, you don't have to upgrade.

Maintaining the build artifacts and pipelines and testing backward compatibility for a long-lived project like HL2 must be pretty difficult, I would think? That’s a great example and counterpoint.

I think it's likely x86-64 support (via Rosetta) will continue for quite some time.

Rosetta is giving Apple a competitive advantage by being able to run x86-64 binaries in VMs (Linux or maybe even Windows) at near-native speeds. This enables doing cool things like running MS SQL Server in a Docker container - which enables developing on a full local .NET stack on a Mac.
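
Under the hood, that's a Virtualization framework feature: the ARM Linux guest gets Rosetta exposed as a virtio-fs share and registers it as a binfmt_misc interpreter for x86-64 binaries. A rough sketch of the host side of what Docker Desktop-style tooling sets up (the tag name and error are placeholders):

    import Foundation
    import Virtualization

    // Sketch: expose Rosetta to an ARM Linux guest as a virtio-fs share (macOS 13+ host).
    // `config` is the guest's existing configuration.
    func attachRosetta(to config: VZVirtualMachineConfiguration) throws {
        guard VZLinuxRosettaDirectoryShare.availability == .installed else {
            throw NSError(domain: "Rosetta", code: 1,
                          userInfo: [NSLocalizedDescriptionKey: "Rosetta for Linux is not installed"])
        }
        let device = VZVirtioFileSystemDeviceConfiguration(tag: "rosetta")
        device.share = try VZLinuxRosettaDirectoryShare()
        config.directorySharingDevices.append(device)
        // Inside the guest, the "rosetta" tag is mounted and registered with
        // binfmt_misc as the interpreter for x86-64 ELF binaries.
    }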


Maybe I'm missing something but I run SQL Server in Docker under Windows WSL2 at near native speed.

What's the competitive advantage here?


Having tried both, I would in fact say that WSL is a huge advantage for Windows over the Mac in many cases. Sure, the Mac is a *Nix, but there are lots of small differences from Linux that cause issues. WSL runs very, very well.

I suppose allowing developers targeting x86_64 Linux to still use Macs and the power efficiency of ARM CPUs, since I don’t think (maybe wrong) that ARM Windows machines support emulation inside WSL.

But that’s more feature parity with x86 Windows machines, not an advantage.


How is feature parity not an advantage? If it were to be removed, the lack of it would be a disadvantage.

Using macOS rather than Windows.

Perhaps there will be an intermediate step where they drop support for x86-64 executables, but retain support for x86-64 virtualization. That would still let Apple slim down the macOS system frameworks to a single architecture like they did previously when they dropped the x86-32 frameworks.

There is no support for x86-64 virtualization on ARM Macs. Do you mean dropping support for Rosetta for macOS apps but keeping support for Rosetta for Linux VMs (which run ARM Linux kernels and use Rosetta to support x86 userspace software)?

Yeah, that's just their MO. I think it's easier to run old Windows games on a Mac than to run old Mac games.

And architecture aside, at one point I had to install an old version of iWork (I think it was '09) to update a file so the latest iWork could read it. They had code on hand that could read those older files, but decided not to integrate it. They don't prioritize backwards compatibility.


It's definitely for the worse that they've gone so far in that direction.

32-bit ARM and AArch64 are wildly different instruction sets. 32-bit ARM may as well be x86 or MIPS as far as running it on AArch64 hardware goes; it is going to require just about the same level of emulation (the memory models may be similar, which would help, but that's about it).

Unlike x86-64, the 32-bit silicon is entirely gone in most AArch64 chips.


I wonder why Intel and AMD still keep the 32 bit and even 16 bit parts. Are there people still running many legacy apps?

As a consumer, yes. Old steam games are a big deal.

In business... not where I work, but I hear stories of shops that still have lots of old 32-bit COM stuff, etc.


Intel has proposed dropping 32-bit and 16-bit support in the future.

https://www.intel.com/content/www/us/en/developer/articles/t...


The proposal doesn’t remove 32-bit user land or (I think) virtualization.

X86S allows 32-bit ring3 (userland) but even VMs are stuck in long mode and only support 32-bit code for userland. Booting a VM for a 64-bit OS that has a legacy bootloader with 16-bit or 32-bit code would require software emulation of that code.

On windows, a lot of installers are 32-bit even if the application they're installing is 64-bit so that they can fail gracefully on 32-bit OSes.

Why would you care that the installer fails gracefully?

It's helpful for the users

The OS already throws a specific error message, and it is the OS that should be responsible for this.

This gives you no opportunity for a customized product-specific upgrade UI!

Choosing to install the 32-bit version could also be an option I suppose.


32-bit applications are still pretty ubiquitous, including Office add-ins, and there is no particular benefit on x86 in removing support for 32-bit on the application side.

Yes

>32bit ARM and aarch64 are wildly different instruction sets

Maybe for the CPU implementation, but having written a lot of ARM2 assembly, the disassembly of Aarch64 is far more readable than x86_64 to me.


Apple does a lot of good stuff, but remember that their whole business model is selling hardware. They have no financial interest in making it easy to continue to use old phones.

They have to maintain a balance that still incentivizes current purchases. Otherwise it’ll be a constant refrain of “don’t buy now, support might not last.”

For some value of 'old' perhaps.

In terms of length of official support, and aftermarket value, Apple is at the top of the game. Those strike me as the most important metrics here.

And while you might think that once official support is over, that's the end of the story, this is far from true. Those phones end up in developing markets, where there's an entire cottage industry dedicated to keeping them going. Jailbreaking is getting harder, so that might stop being an option eventually, but so far it's viable, and that's what happens.


This isn't as true as it used to be, now that Apple is getting increased revenue from subscriptions. If your old iPhone continues to work well, then Apple has a better chance of selling you Apple Music, Apple TV, etc. etc.

Old phones no, but old apps yes. If a developer has abandoned an app and hasn't been investing in the update treadmill, but end users still care about it, that can make people feel negatively about Apple.

> Old phones no, but old apps yes. If a developer has abandoned an app and hasn't been investing in the update treadmill, but end users still care about it, that can make people feel negatively about Apple.

On the other hand, it is well within the standard Apple approach to say "here's how we want people to use our hardware. We are well aware that this is not consistent with how some potential and past users want to use the hardware, but we are comfortable with losing those customers if they will not adapt to the new set-up."


I know it's not the Apple approach, I'm just pointing out an interpretation that it isn't particularly focused on end user needs in this area.

I feel like it's mostly an attitude about where to focus engineering resources, which is very "inside baseball", but people have post hoc justifications that it's really what end users want anyway.


In this case, they only broke their newest hardware.

Or any interest in reducing the incentives to buy their $200 Bluetooth headphones.

> Apple should virtualize older iPhones within the latest iPhone system software, so you could seamlessly open old apps and games (32-bit, anyone?) in their own containerized environments

What is the practical, broad use case for this? (And can't you virtualize older iOS version on a Mac?)

> bring back the headphone jack

The article is about Macs. If you want a headphone jack, get a 3.5mm:USB-C converter.


Speaking of headphone adapters. It’s crazy to me that something like an iPod released in 2005 will output better audio when playing a lossless file than the most state of the art $2,000 iPhone with Apple’s most state of the art $549 headphones in 2024.

The remarkable thing is that 90% of listeners don’t seem to notice.

Their reference point is a lossy 128kb/s file from a streaming service double transcoded over bluetooth so that must be what music sounds like. Who would have thought technology would progress backwards.


What streaming service even does 128 kb/s? YouTube is the only one that comes to mind, and that's for free usage only. Paid accounts get 256 kbit AAC.

Spotify uses OGG Vorbis codec and streams at 160 kbps at standard bitrate and 320 kbps at high quality

In addition to AAC, the entire Apple Music catalog is now also encoded using ALAC in resolutions ranging from 16-bit/44.1 kHz (CD Quality) up to 24-bit/192 kHz

Amazon Prime Music at 256 kbps

That's about 99% of the streaming music market people actually use


Tidal, SoundCloud, Deezer, and Bandcamp offer lossless support.

> remarkable thing is that 90% of listeners don’t seem to notice

That's not a remarkable thing, it's the reason.

(And out of the remaining 10%, a good fraction may notice but prefer the new convenience. Those who remain can find the difference between most and all, or go corded.)


>Their reference point is a lossy 128kb/s file from a streaming service double transcoded over bluetooth so that must be what music sounds like. Who would have thought technology would progress backwards.

The only major streaming service that doesn't do lossless is Spotify.

Further just about no one is going to be able to tell the 256kb/s AAC that the iPhone sends to headphones across bluetooth from the lossless audio file.

Also, portable headphones have progressed leaps and bounds since 2005, and they'll all basically sound better playing over Bluetooth than the portable headphones that were out in 2005.


Of course to make this strawman argument you have to ignore the previous comment that says you can do wired connections just over a different port type.

Let’s also ignore any understanding of the DAC quality between older iPods and newer iPhones, where even the dongle Apple sells is considered a high-quality DAC.

Let’s also ignore any advances in Codecs in that time, or advances in audio hardware itself.

Let’s also ignore that most iPod users would have bought low quality MP3s or medium quality AACs at the time. Not to mention that most customers use streaming today so wouldn’t even be able to take advantage of the higher quality codecs today.

Finally let’s ignore customer preferences and what niche set of customers would have bought high end enough audio equipment and have the hearing to appreciate and also not want wires today to even fall into your narrow band description.

Who would have thought that if you ignore all aspects that are inconvenient to an argument that you could make any argument you want?


And they’re listening on AirPods or whatever stuck on their ear. I have AirPods 2 Pro and sure, they sound nice. Less sweaty on the treadmill. But even a $100 DJ headset from a $200 streamer blows it away.

That’s a bit apples to oranges because you’re comparing different form factors completely.

Form has a huge impact on acoustic properties and comfort.

You’d want to compare them against IEMs.


>I have AirPods 2 Pro and sure, they sound nice. Less sweaty on the treadmill. But even a $100 DJ headset from a $200 streamer blows it away.

Doubt. Doubt. Doubt. Airpods Pro 2 are actually decent headphones worth the amount of money they are. They are most definitely better than $100 DJ headsets.


Sure they are quite good. My DJ headphones are Sennheiser HD25s (so $125). Night and day difference. The bass is completely different.

Yeah.. Those aren't as good or as accurate as the Airpods Pro 2. They are very bass heavy and muddle mids and highs where the Airpods Pro 2 are much more neutral in sound signature.

Now if you like it that way, great! But that doesn't mean the headphones are objectively better than the Airpods Pro 2. It just means you like those ones better.


Ha ha. Bollocks. I mean they are good for little things you stick in your ears, but ha ha.

Believe what you want, but I dare you to state your opinion in r/headphones and see how that goes for you.

I mean, if you need the backup, you do that and I’ll get popcorn. I’ve never seen anything on Reddit get agreement. Even r/FuckTedFaro had someone arguing he was just misunderstood.

iOS dropped 32-bit support because the CPU itself no longer supports AArch32. Virtualization won’t help.

I don't think almost anyone actually wants a headphone jack anymore. That's just the consumer reality in 2024. I do, obviously, but it's clearly too rare for them to bother with.

Why do you want a headphone jack? Not a rhetorical question, just genuine interest. Is it about audio quality, avoiding batteries or something else?

I would find it so cumbersome to use a cable on a handheld device nowadays. But different things for different people! :)


Not who you are replying to, but my $30 wired Apple earbuds (came with my 6S) have outlived all of my co-workers’ half-dozen $160 AirPods. That’s reason enough for a lot of people.

For $8.99 you can buy a high quality USB-C (or lightning) DAC with a 3.5mm output directly from Apple.

It's tiny and lightweight. I keep one in the back of my headphone case.


There are never enough USB-C ports.

You can buy Apple’s wired earbuds with the Lightning connector for $18. Or the Lightning to 3.5mm adapter (that’s what I have, because I also still have my decade-old original earbuds).

Yeah I agree, that is a very valid reason by itself.

My wired Shure in ear monitors have much better sound quality, and battery management on AirPods is pretty annoying. Even when they’re not running out mid-trip, it’s just unnecessary mental overhead to keep another thing charged.

For me it’s because

- I already have high quality earphones, same set for many years

- they don’t require charging

- audio quality is great

- they’ll work on any device with a 3.5mm jack, no proprietary lock-ins

- I have never lost a set of earphones and if I did replacement wouldn’t break the bank


I don't daily drive my phone for commuting anymore, but the trade-offs aren't exactly new:

- battery frustrations

- cost of a dozen cheap but good quality headphones vs a wireless equivalent

- easier to lose wireless headsets when you put them down somewhere (wired too, but way cheaper so less of a big deal)

- audio quality? Who knows

For people that demand noise cancelling, you need an active power source, but I personally hate noise cancellation and always turn them off. Maybe valuable in a plane with lots of engine noise.


Easier to store vs bulky charging case and charger and charger cable etc. The wired solution is more portable, believe it or not.

Simpler interface to debug and fix than bluetooth.

Yes people seem happy to buy not only new phones every couple of years but new accessory devices as well. I don't understand it but it's quite apparent.

I find it very practical with small Bluetooth earbuds, but I agree on the consumption aspect of it. I really don’t like that I can’t change the batteries in my AirPods. I would even be semi-okay with having to hand in them to a technician for battery exchange, for a reasonable cost. But the current battery exchange for airpods is just another name for buying new earbuds. And the third party solutions that actually change the batteries cost about as much as new buds.

Someone would have to maintain all the old OS versions that would be required to run those old apps, and keep those OS versions reasonably secure. That sounds like a maintenance nightmare to me.

No, the old versions would not have access to anything else, so the only thing that needs to change is the part running the container. Present one folder as the whole disk drive to the old OS. Send touch-screen input to the old OS. That's about it, really.

If you're trying to play older 32-bit games, chances are you'd like to have sound, too.

Who is going to develop the virtual audio device drivers (for each OS) that are required to virtualize sound? Who is going to run all the tests needed to make sure that the guest-side 32-bit iOS 7 (and 8, 9, 10) drivers are going to play well with the host-side driver?

Who is going to accept the risk that someone takes over the guest VM after exploiting one of the well-known and unpatched kernel-level vulnerabilities present in iOS 10? What happens if malware finds and exploits a bug in that new audio driver virtualization pipeline so it can escape the VM and then compromise the host iOS?


Yeah, it's much better to lose access to games you've paid for. This is why I stopped buying apps on Android - if it's not open source, I'm not interested. I also don't buy DRM'd stuff unless I know I can remove the lock.

It's probably worth finding out whether this is a bug or an intentional decision before making assumptions.

For Apple, Hanlon's razor rarely holds; they're both too competent and too anti-consumer for this sort of thing not to be intentional malice in an attempt to sell more stuff. Maybe not, but it's very unlikely, and even if it's a bug they'll probably call it a feature and keep it.

[flagged]


This is one of the things I dislike about hacker news: People responding to what is clearly emotional hyperbole as though it were a literal statement.

The OP expresses disappointment with Apple -- exactly what the cause was is unstated. People are allowed to have such feelings. I've had them myself. In recent years, Apple has killed things I liked and pushed a lot of services/login crap I have zero interest in. Other people like the new changes. That's OK too.


Nobody wants to read emotional hyperbole when it's just an excuse to lie or exaggerate instead of being truthful or accurate. It's more common than not to find people expressing disappointment with Apple online who are lying, ignorant to the point that their complaint is just incomprehensible, using long outdated and incorrect information, exaggerating the problem to the point of absurdity, or applying expectations or standards that literally no company or alternative product on Earth meets. A good chunk of the time you can even get people to reply in a way that betray the fact that they never even owned or used the Apple product they're complaining about, they're just seeking upvotes or karma with easy and popular sentiments even if they're factually wrong.

Holding Apple accountable and having a negative opinion is fine, but it's a fairly rare thing to find someone online doing it in good faith.


I agree that feelings are okay. But the internet and society are also so overloaded with emotional hyperbole. What I like about HN is that a lot of people make the effort to be a bit more diplomatic, less aggressive, and more grounded in facts than in most online communities.

> OP expresses disappointment with Apple

I read it as polarisation against Apple. As in what follows is not to be taken literally, but as a diatribe. The message being, in essence, "I don't like Apple." If that wasn't the intended message, OP is right--the comment is stronger without it.



