After initially rejecting it, Apple has approved the first PC emulator for iOS (theverge.com)
237 points by Shank 3 months ago | 181 comments



I just ran genuine 64-bit Linux on a non-jailbroken iPhone! I have waited so, so, so long for this. Finally. Thank you, EU regulators!!


The things we could have had if it wasn’t necessary to use regulation to force Apple into doing the right thing.


The things we had. Full and complete service manuals to TVs, VCRs and cars. Access to parts etc. and actual ownership of what we purchased.

We had all that. It was taken away.


>Full and complete service manuals to TVs, VCRs and cars. Access to parts etc.

To be fair, most electronics these days are so tightly integrated (literally IC, Integrated Circuits!) that there's not much you can do even with a full on schematic.

Long gone are the days of bigly electronics parts connected by a spaghetti nest of wires and third grade soldering.


Phones, laptops, GPUs and many other modern highly integrated electronic products are repairable. Look it up, there are tons of videos on YouTube of people doing component level repairs. Leaked schematics make a huge difference here.


> Look it up, there are tons of videos on YouTube of people doing component level repairs. Leaked schematics make a huge difference here.

I've watched these videos — and the people you're referencing doing this are rarely using any kind of schematics at all to repair modern digital-logic boards. And not for lack of accessibility!

Modern logic-board designs consist of a few proprietary ICs, plus generic labelled support components (e.g. VRM caps, surface-mount resistors and diodes, etc.)

You can repair these boards, but these repairs fall into three categories:

1. bad solder joints or broken traces — which mostly just requires looking at the board carefully to notice these.

2. bad generic support components — which in theory you can identify by testing with various multimeter modes across the individual component's legs; but more often you just notice that whatever the component is in line with isn't working, and "swap out to see if that fixes it." Such a swap-out can be done by just looking at the part to figure out what it identifies itself as, then de-soldering it and soldering on a replacement.

3. bad proprietary ICs — which you determine by tapping the signal lines leading from/to the IC on an oscilloscope; and which you "fix" by buying other for-parts copies of the board, de-soldering those ICs off of the sacrificial parts boards, and soldering them onto the "almost good" board.

In none of these cases would referencing a schematic help! They're all effectively "context free" repairs — see, probe, think, do.

(A schematic can in theory help you to find test points to differentially diagnose 1 vs 2 vs 3 in the case where a board is failing mysteriously... but once you have some experience in board repair, you can get 80% of the same information by just staring at the board for a minute.)

---

Of course, if you're repairing a power supply, or an audio receiver, or some still-half-analogue electronic appliance from the 1970s — then yeah, schematics help. But these types of systems do still come with those schematics! (You just need to buy the thing directly. You aren't getting a schematic for an "embedded" PSU with a computer; but you do get the schematic if you buy that same PSU at retail at a Shenzhen parts-mall. And you get forwarded that same schematic [and more] if you make an industrial order of 10000 of them, too.)


Have you missed Louis Rossmann's videos? He appears to use schematics in troubleshooting a lot and seems to consider them to be important to his work.


Those ICs have documentation and I2C pins that allow the vendor to debug them. Providing that documentation to the user (along with any keys needed to make use of it) would be the modern equivalent of electronics service manuals.


Then how come Apple and so on are actively working against the DIY community?


Accessibility goes in, shiny new toy comes out

Humans go ooo, accessibility never comes back again..


I implore you not to trivialise and belittle “humans” for not always prioritising what you personally would like them to. Different people have different preferences, and that can and does happen intelligently; it is not just the result of being “wooed” by marketing or whatever.


Whatever you call it, it is essentially trading one's freedom for convenience. Yes, it is a result of more general social dynamics rather than mere advertising; it even follows the political trends of our times. But it is time to realise the price we pay for this sort of convenience, and that it does not come for free. Advertising does not tell you that.


In our quantified society, we might need some numbers to characterize relative freedom and convenience.


I mean, that could be true only if you don’t account for the obvious reality that the lockdown of hackability, serviceability, and access to parts is deliberate.

Personal preferences matter little in a duopoly industry. Sure, you could argue that other aspects are more important than serviceability, but it’d be insane to argue that people want less serviceability. So in short, we were better off in this aspect before. 100%.


> the obvious reality that the lockdown of hackability, serviceability, access to parts is deliberate.

I mean yes, it literally is, the law requires that smartphones incorporate a killswitch capable of resisting a device format/reload of the OS. This was seen as a consumer benefit in 2014, and did succeed in bringing muggings and phone theft down by as much as 80%.

https://www.pcworld.com/article/440002/10-things-to-know-abo...

https://www.theguardian.com/technology/2015/feb/11/london-sm...

https://www.pcworld.com/article/431818/drop-in-smartphone-th...

This is what the sibling means about tradeoffs: people at the time wanted the tradeoff of fewer muggings. The killswitch was seen as a benefit, and largely Apple is in fact just responding to the legislative requirements pushed upon them.

And if you have to have a killswitch that can resist an OS reformat… there has to be some component-pairing mechanism that works underneath the OS layer, naturally. That is the sensible way to implement it; otherwise you'd just swap in a new motherboard and hey, phone's working again. Or strip the phone for parts.

The idea of parts being anti-theft/strip-resistant is pretty much inherently in contradiction with right-to-repair. And again, I think people probably understand that... but never forget that theft-resistance was a legislative initiative pushed over the objections of Apple and other phone vendors at the time, and it has had some unforeseen consequences. People actually did want this, people lobbied for this, as much as modern readers may not believe it.

https://www.cnet.com/tech/mobile/is-the-smartphone-kill-swit...


Serviceability is not an unalloyed good. There are always tradeoffs.


From the point of view of the consumer, removing serviceability is an unalloyed negative. That is, it can be removed without any associated (positive) tradeoff accruing to the consumer.


[flagged]


The topic at hand is not about a preference that most people would not find objectively good or bad. Why is this particular preference bad enough that one should feel bad for having it? Without statements supporting this view your post doesn’t make much sense.


they wouldn't because, like many teenagers smoking their first cigarette, they don't understand the consequences of satisfying that particular preference

people often have preferences that harm them, so the fact that a particular choice is voluntary isn't a very strong argument that it's healthy, even individually, much less collectively

this probably isn't a productive place to try to reason out the societal consequences of people acceding to these particular losses of freedom


[flagged]


This subthread could exist verbatim in any comment thread because it's devoid of substance. "Some positions are better than others", "why?", "because {poignant but vague analogy}".

Why specifically is it bad for people to prefer devices that are small and tightly integrated to devices which are repairable but make trade-offs to get there? I own a Framework laptop for its repairability but would warn anyone interested in it that the battery life won't work for all use cases. It's a trade-off that I've made that doesn't make sense for everyone.


>Why specifically is it bad for people to prefer devices that are small and tightly integrated to devices which are repairable but make trade-offs to get there?

For repairability, upgradability, hackability, the environment, waste, tighter vendor control, and so on...

Do you need more reasons?


Reasons is a good start. Explaining and justifying those reasons would be better.

I choose hackability and upgradeability and less vendor lock-in because I myself value those things, but I see that as a personal preference and not something I want or need to impose on the industry as a whole. Benefits to the environment and reducing waste are clearly valuable, but it's not obvious to me without seeing data that such benefits would be high—many people tend to upgrade phones long before batteries need replacing anyway, so I would need to see data demonstrating that a substantial number of devices would be saved from the landfill by virtue of higher repairability.


>Reasons is a good start. Explaining and justifying those reasons would be better

The question was "Why specifically is it bad", and I gave 6 reasons. Does the "justifying" demand stop at some point, or do we get all the way down to justifying why being alive is better than being dead, why anything ever matters, and so on?

I think the reasons are self-evident - i.e. it shouldn't take any argument as to why repairability and upgradability are good, why being more at the mercy of the vendor is bad, or why protecting the environment and being less wasteful matters.


I brought up counterpoints to each of your reasons, but if you don't want to have a conversation about them that's fine. I'm not particularly invested in my counter position, I just get very tired of the zeitgeist where people think it's okay to dismiss a popular perspective because it's "obviously" wrong.

And no, flat-earther comparisons aren't going to fly, I'm talking about perspectives held by tens of millions.


Waste and environment are the only reasons you gave that speak to the preference being bad in any sort of objective sense. The status quo ante iPhones was lack of repairability in cheap, small devices. The devices cost little and there was little economic friction to just getting another one. I don’t know for sure but I imagine the average length of use for Nokia devices was less than that of iPhones.

The lives of people with wealth (which by world standards is just about everyone in the U.S.) are enormously wasteful and unsustainable. Instead of focusing on one aspect of waste in our lives, we should take a holistic approach.


>I don’t know for sure but I imagine the average length of use for Nokia devices was less than that of iPhones

Wasn't and didn't have to be. Ancient Nokia mobile phones would still be perfectly usable, protocols aside, for basic phone functions like talking and messaging. And you could even use them to bash on and mend a dent in your anvil.


it isn't


It’s unclear who the “they” are in your comment, and even if you did make it clear, your post is less meaningful than the one I responded to. What was the goal with this retort?


this probably isn't a productive place to try to reason out the societal consequences of people acceding to these particular losses of freedom, but i thought it was worth pointing out that people often have preferences that harm them, so the fact that a particular choice is voluntary isn't a very strong argument that it's healthy, even individually, much less collectively

i appreciate your critique of the clarity of my comment and have edited it to clarify


And the fact that your preference is different than someone else's doesn't automatically make yours the better one.

This line is copy/pasted into dozens of comment threads across dozens of topics, and while it's very representative of the zeitgeist it's a dead end for an interesting conversation and frankly it's a toxic, elitist mindset.

Defend the position, don't just assert that some positions are better than others and leave it to the readers to guess which one you're claiming is a priori superior.


>And the fact that your preference is different than someone else's doesn't automatically make yours the better one

Of course, it's better for other reasons, not because it's mine.

>it's a toxic, elitist mindset.

I'm all for elitism, it means celebrating and promoting higher standards.


Then defend your position. "Not all opinions are equal" is a lazy line that does nothing to further any cause, and if anything just makes you and your cause seem dismissive and devoid of critical thinking.


>"Not all opinions are equal" is a lazy line that does nothing to further any cause

My primary aim with this comment about not all positions being equal wasn't to "further a cause" but to stop the bad argument that amounted to "people just have different preferences", as if that made all of them OK.

If we agree that "not all opinions/preferences are equally good" is a better idea than "all preferences are equally ok" my work is done!

As for the specifics, others (and I, elsewhere in this post's threads) have argued several reasons why this tight integration is worse. In fact, I gave six of them.


People's different preferences all being equally valid should be the default position, with evidence required to say otherwise. Simply stating that not all preferences are equal does not contribute to the conversation as much as reminding everyone that different preferences exist does.


So the claim that there are good ideas and bad ideas, good plans and bad plans, good things and bad things, beneficial things and detrimental things, and that if you like the former your preference is a good one and if you like the latter your preference is a bad one, is controversial in 2024 and shouldn't be the default position?


I too like to argue by claiming that my opinion is the default position that doesn't need to be backed up with reasons, while everyone who thinks otherwise had better come up with an ironclad proof that theirs is better. Checkmate.


The things we could have if we finally accepted that regulation is important for keeping markets from being hostile to consumers.


What real world problem did this solve for you?


It's a step in the right direction: being able to squeeze all the perf out of these devices to run proper full-fledged OSes, which Apple seems not to want. It's okay to get excited over stuff like this, and I'd bet a lot of the good shit that solves your real-world problems started from hobbyist adventures (see this little project called Linux).


I'd say Linux created far more problems than it solved, and would count as one of the largest net negatives in the economy of the technology sector historically.

I think it's fine to get excited, but there's also getting too excited if there's no rational need for the software. Which I can't really see, since there always have been other devices to buy.

Like if I need to do spreadsheets I will get a device made for that, not demand that regulators force Apple to let me install Excel on my phone.

Hackers and Europeans framing this as some kind of civil-rights or human-rights issue is to me about as ridiculous as regulators forcing every restaurant to make their menu gluten-free. Which will probably happen soon, since the mental children are at the helm.


Could you explain why you think that Linux created more problems than it solved? In my eyes it is the backbone of most of what I do as a developer and I can’t imagine trying to work without it.


The accumulation of wasted man-hours for developers and users. Without the allure of "free", I believe there would have been ample competition for more user friendly paid software doing what Linux is used for now. With the bonus that the developers would have been paid for their work and that tech giants would have to pay for their tech rather than use unpaid labour for free. (I know some of them also contribute to open source)


The point of view that Linux is a net negative for the world shows an incompetence in understanding the field. To believe that a commercial offering would be absolutely better than an open-source offering is so childish. A world without Linux could be better, sure. But there’s only a tiny little chance it would be.


It can only be speculation, but it's my belief. Linux became a server system for big applications. Those who benefit most from such software are huge businesses. Those who pay for their benefit are the Linux developers doing very specialized and competent work for free.

Why is there no consumer-friendly solution for having a server connected at home and publishing your own website to the world, without having to be very skilled at computers? There are consumer-friendly solutions for everything else: office suites, e-mailing, photo editing, all sorts of creativity. Consumers are now relegated to posting their online stuff on social media, because that is what is accessible to them. And I want to blame Linux partly for this, because the great allure of "free" halted any consumer solution.


Entirely irrelevant.


That response tells a lot, doesn't it?


Yes: being able to do what we want with the things we bought is reason enough.


Feels like this app alone makes iPadOS more usable than all the past effort from Apple combined.


Don't get too excited. It's the SE "Slow Edition." Apple won't let apps access JIT acceleration, let alone the iPad's virtualization capabilities, so this is all interpreted and barely usable.


Baby steps :)

We know that recent (M2+) Apple Silicon has hardware support not only for virtualization, but nested virtualization. Time may improve Apple's risk/reward for enabling virtualization in specific iPad use cases.

Thanks to the a-Shell and iSH emulators for pioneering Linux-alike experiences on iPads, at the cost of overheated silicon.

Thanks to Asahi for bare-metal Linux, including virtualization, on Apple Silicon.


> Time may improve Apple's risk/reward for enabling virtualization in specific iPad use cases.

It's more likely that the EU will improve it.

As things stand today, Android allows for JIT acceleration on all apps, essentially unrestricted. The DMA only allows for security measures that restrict APIs insofar as those measures are "strictly necessary and proportionate".

Since Android manages just fine on that front with full JIT, Apple's defense relying on that DMA exception will fail.


So why does my Android phone not allow sudo or chroot, and why does my bank application stop working when I root it, citing the EU payment services regulations? Virtualization is not enabled either, even though the CPU supports it.


Recent Pixel devices support Android pKVM nested virtualization, with optional 2-way isolation, https://android-developers.googleblog.com/2023/12/virtual-ma...

  Even if Android is compromised all the way up to (and including) the host kernel, the isolated VM remains uncompromised
This allows virtualized Linux VMs on supported Pixels, running alongside Android, https://www.xda-developers.com/nestbox-hands-on/ & https://www.esper.io/blog/android-dessert-bites-13-virtualiz...


When will that come to non-Pixel devices?


Possibly when Qualcomm brings Oryon (with silicon support for nested virtualization) from laptops to smartphones in late 2024, https://www.androidauthority.com/qualcomm-snapdragon-oryon-c... & https://www.qualcomm.com/developer/blog/2024/01/gunyah-hyper...

> we have replaced pKVM with Gunyah Hypervisor Software – our own, more versatile hypervisor – on our Qualcomm Technologies’ chipsets. With Gunyah, we can use one hypervisor across use cases as varied as automotive, mobile broadband, IoT and wearables. We’ve upstreamed Gunyah for general use by the Android community.


Chroot works fine though


No, it doesn't. It requires root, and that breaks all the apps.


>Android manages just fine on that front with full JIT

Is this the same Android that has a 5x+ share of malware compared to iOS?


I'd be interested to see how much of that malware targets and is effective on big OEMs' devices using Google Mobile Services compared with non-GMS Androids mainly in the China domestic market. In principle the non-GMS devices can be just as secure, but in fact do they give privileged access to vulnerable code?


how much of that malware do you think actually uses the jit?


"... Since Android manages just fine on that front with full JIT"

How do we know this? No vulnerabilities/exploits reported using these?


There’s a reason that you’re falling back on imprecise language: “manages just fine”. That is not sufficient grounds to responsibly invoke this regulation. There’s an obvious, reasonable security argument for Apple to make here. To say otherwise is just letting this stupid OS war get in the way of whatever actual engineering competency you may have. Be fair.


You aren't really being fair by letting a multitrillion dollar company get away with such a simple excuse


Even for apps in 3rd party storefronts? I always thought these rules are somewhat relaxed for external publications but perhaps I was mistaken. It’s not like we have proper “side loading” yet.


These restrictions are enforced by the OS, not Apple's store rules. Even in the EU sideloaded apps can't do these things. It remains to be seen if their hands will eventually be forced, my understanding is that the EC is still working on striking down the Core Technology Fee, and these battles are going to take a while to catch each and every way Apple is trying to retain control of people's phones.


Who put the restrictions into the OS in the first place?


Tim Cook?


JIT will only be allowed for alternate web browsers in the EU.


If Apple loses the notarization battle then that's impossible. Any app that wants it would be able to request the relevant privileges.


Yeah. Even with sideloading before the EU stuff, you couldn't use JIT. AltStore (not the EU version) managed to get it working, I think, but it was very finicky and I think it doesn't work in newer versions of iOS.


what's the reason behind being that restrictive?


App store monopoly.


Yeah but still, why would you restrict apps for no genuine reason?


If there's a question, the answer is usually money in America...

Things they see as problems:

Reducing need for MacBooks,

Virtualised app-stores that use creative accounting or some other trick to circumvent apple tax,

True free-software culture on their platform,

Lack of control re privacy / data collection / platform guidelines / software publishing,

DRM circumvention...


With nested virtualization, iOS DRM/appstore VM can be safely isolated from OSS VMs.


Usable? In what sense? Not in the normal sense of the word, because if it weren't usable, very few would own an iPad. Instead, only a few care about running Linux.


Some people think Apple's JIT restrictions standing up to the DMA is a given, but I heavily doubt that will be the case.

The DMA allows for security exceptions only when measures are "strictly necessary and proportionate", Article 6(4). Apple argues that this ban is crucial for security, yet Android permits JIT compilation and maintains robust security on that front for 99% of its user base without significant incidents.


>Apple argues that this ban is crucial for security

Foxes argue the ban on chicken wire is crucial for their security.


There are some tricks to make interpreters very fast.


the best i've been able to do with things like numpy, threaded interpretive forth, and ocamlc is about 15–20% of native performance. is that what you mean by 'very fast'? i wouldn't call that 'very fast' but it is an order of magnitude faster than cpython i guess

for example, this ocaml program runs 6× slower when compiled with ocamlc (which uses an excellent bytecode interpreter) than with ocamlopt

    let rec fib n = if n <= 2 then 1 else fib (n-1) + fib (n-2) in
    let n = 40 in
    Printf.printf "fib(%d) = %d\n" n (fib n) ;;
the ocamlopt-compiled version is about 5% slower than a handwritten assembly version (and orders of magnitude slower than a version compiled with gcc, which is why i wrote the assembly version)


Well, we’re talking about emulation here, not a bytecode interpreter for a high-level dynamically typed language. CPython is fairly slow; it’s written in C in a fairly understandable way, effectively using a large switch statement. There are around three jumps per executed bytecode, and it probably uses some jump table.

"All the tricks" for an interpreter would mean 1 jump per interpreted instruction, or perhaps 0.5 for some of them. It would probably mean no explicit jump tables. It would probably mean the interpreter is set up pipelined like a real CPU: loading instructions in parallel with executing previous instructions. Using all the tricks, you can get perhaps 5-15% of native speed from an interpreter. QEMU can maybe get 20-50% (not sure these days actually, I haven’t followed improvements for a while). Rosetta can get close to 100%, using a JIT but also hardware that is specifically designed to aid emulation (memory accesses that behave like on x86).

Given that CPUs nowadays execute something like 3-5 instructions per cycle, getting 5-15% of native performance from an interpreter would be very fast and very difficult to achieve, but possible, if it were designed from the ground up for performance.


i think it's unusual to hit 3-5 ipc sustained in normal code, but it seems like we're in broad agreement about quantitatively what kind of performance an interpreter can achieve. i mentioned three interpretive systems that do reach about 15% of native single-threaded performance with very different approaches: the ocaml bytecode interpreter, threaded interpretive code like gforth, and numpy

it's just that i call 15% of native performance 'very slow' and you call it 'very fast', because you're comparing the moped to the walker you're used to, and i'm comparing it to a sports car because that's what you're paying for

instruction set jitting getting a speedup instead of a slowdown goes back to last millennium with hp's dynamo https://dl.acm.org/doi/pdf/10.1145/349299.349303 and was central to transmeta's business plan. qemu's jit is sort of simpleminded to make it easier to maintain


A CPU emulator will only get to 15-20% with a lot of tricks. Usually a CPU interpreter would get something like 5%. It's not like NumPy, where you can increase performance by offloading as much execution as possible into native code. When emulating CPU instructions, the atomic units of work are tiny. But as I said, Python is a poor comparison, because the interpreter is very slow and there is a lot of overhead.
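
To illustrate the point (a toy sketch in C, nothing like QEMU's actual sources): in a switch-dispatched CPU interpreter, the real work per emulated instruction is often a single host add or move, buried under the fetch/decode/dispatch loop wrapped around it.

    #include <stdint.h>
    #include <stdio.h>

    enum { OP_MOVI, OP_ADD, OP_HALT };

    typedef struct { uint8_t op, dst, src; int32_t imm; } Insn;

    /* Toy register-machine interpreter: the useful work per instruction
       is one host add/move, dwarfed by the fetch/decode/branch around it. */
    static int32_t run(const Insn *code) {
        int32_t reg[4] = {0};
        for (const Insn *i = code;; i++) {      /* fetch */
            switch (i->op) {                    /* decode + dispatch */
            case OP_MOVI: reg[i->dst] = i->imm;        break;
            case OP_ADD:  reg[i->dst] += reg[i->src];  break; /* 1 host add */
            case OP_HALT: return reg[0];
            }
        }
    }

    int main(void) {
        const Insn prog[] = { {OP_MOVI, 0, 0, 2},   /* r0 = 2   */
                              {OP_MOVI, 1, 0, 3},   /* r1 = 3   */
                              {OP_ADD,  0, 1, 0},   /* r0 += r1 */
                              {OP_HALT, 0, 0, 0} };
        printf("r0 = %d\n", run(prog));             /* prints r0 = 5 */
        return 0;
    }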


although i've written compilers and interpreters, i don't think i've ever written an emulator for a real cpu. i'm interested to hear about your experience


Those tricks are called compilation (whether that be AOT, JIT, or a mix of the two).


No, then it’s not an interpreter anymore.


Yes, that was the point.

The tricks to make interpreters fast are to make them not interpreters.

This one is already using just about all the tricks in the book to make a "fast" interpreter, and its performance is a trickle compared to a JIT version.


QEMU's interpreter uses all the tricks? I very much doubt that. QEMU is a JIT first, built around maximum flexibility, not performance.

A dedicated, optimized x86-on-ARM interpreter could probably get to within 20% of the performance of QEMU's JIT, if not more.


Do you actually have any experience with systems emulation, JIT/AOT compilation, the x86->ARM hardware assistance that Apple's JIT interfaces offer, etc.? Or are you just speaking off the cuff in a very general manner?

Because it very much sounds like the latter.

I have, quite extensively. I can assure you that the source set they are using most definitely does use all the "tricks" (the chief two being threaded code [no, not that threading] and preoptimized jump tables). I can also assure you that losing both the hardware acceleration offered by Apple's extensions and the general performance boost JIT/AOT offer costs much more than 80%.

But, sure. For argument's sake, let's accept your (incorrect) premise. Taking a potential 1 GHz emulated target processor down to 200 MHz is a fairly drastic drop, disallowing a whole slew of modern code/target OSes and verging the usability on nil.


i wouldn't describe sacrificing 80% of the machine's native performance as 'very fast'. a moped goes 40 km/hour; a ferrari goes 200 km/hour. you're paying for a ferrari but getting a moped


An interpreter that runs at 20% native speed would be considered a very fast interpreter. That would be an order of magnitude faster than a "trickle".


what i'm saying is that, because an interpreter that runs at 20% native speed would be considered a very fast interpreter, which is still very slow, there are in most cases no tricks for making interpreters very fast


Apple built iOS with the assumption that apps can’t use JIT. They can migrate to a new set of constraints, but I’m sure they’re right that right now it’s necessary for security. It could take a while to patch all of the security holes from opening up JIT.


How does MacOS survive attacks while supporting JIT and a subsystem for running iOS/iPadOS apps?


It’s a different operating system.


https://en.wikipedia.org/wiki/Darwin_(operating_system)

  Darwin is the core Unix-like operating system of macOS (previously OS X and Mac OS X), iOS, watchOS, tvOS, iPadOS, audioOS, visionOS, and bridgeOS. It previously existed as an independent open-source operating system, first released by Apple Inc. in 2000. It is composed of code derived from NeXTSTEP, FreeBSD, other BSD operating systems, Mach, and other free software projects' code, as well as code developed by Apple.


I’m aware. They’re still different.


Kernel is largely what matters here


So somehow they managed to create a shittier system despite having 30 years of OS building under their belt, is what you’re saying?


“Shittier” is your word, not anyone else’s. Can you really not relate this to any experience you’ve had developing anything?

Different goals? Different system gets built. Different trade-offs. This isn’t hard to understand. You’re being overly harsh for ideological reasons.


> Different goals? Different system gets built. Different trade-offs. This isn’t hard to understand. You’re being overly harsh for ideological reasons.

Poor Apple with its virtually unlimited resources. Somehow Google, Microsoft, Linux and Apple themselves managed to make it work with JIT, but Apple can’t do it.


You are arguing about the resources of a company that has an endless amount of them. The limitations mentioned could be changed/"fixed" with little time and effort.


Can you teach me how it’s meaningfully different in terms of securely running a JIT compiler? What components should I be learning about?


The no-JIT rule is just a rule enforced through the App Store review process, so it's not going to be part of the security architecture. It could make some static analysis fail, but there are plenty of ways to do that.


I doubt that's true because it goes against every modern OS security practice. Apple heavily practices defense-in-depth. Sandboxing, PAC, AMFID, etc. All those are even more onerous than Android. So no, giving away JIT privs wouldn't do much.

EDIT: What's with the downvoting?


My understanding comes from what jailbreakers say about the OS, what Apple says, and time spent working on Android, where people who know what they’re talking about told me about iOS security.


Your misunderstanding comes from a group of misinformed individuals, then. Or you've made them up.

Either way, you're wrong.


>Either way, you're wrong.

This claim is based on what expertise?


There are about a dozen other responders who agree with that assessment, with technical reasoning embedded in their messages. Read those.

Suffice to say, I don't feel the need to rehash why the basic premise of "the OS wasn't built to handle JIT apps" is directly contradicted by it, you know, allowing JIT in exactly one app (Safari).

If they offered a basic technical assessment or reasoning beyond "I know people", perhaps then I would bother to dig into some credentialed argument. So, in the same way it's not worth delving into "proving God doesn't exist", it's not worth proving how OP clearly has no idea what they're talking about. They bear the burden of their claims, not me.


Well, right now this technically works, but is unusable - melted off 15% of my iPad’s battery life just logging in to XFCE and doing an apt upgrade (Debian ARM64), and is very slow overall. I’m going to try turning off the GUI and decreasing the core count to 2, but I don’t expect this to beat a-Shell on practicality or even iSH on usability.

In short, I’m going to keep using an ARM sidecar: https://taoofmac.com/space/blog/2023/10/07/1830

Apple doesn’t let us JIT nice things, and it’s just sad.


How about we ban Apple from imposing arbitrary rules to cripple what is just a general computing device? They are too big and too monopolistic to be allowed this degree of control.


> just a general computing device

No, it's an Apple device. It _should_ be a general computing device, but Apple ensures that it will never be, and you should be aware of that when you buy it.


I don’t know about that. My washing machine could be a general purpose computer but I didn’t buy it for that.

I didn’t buy an iPhone to be a general purpose computing device for the same reason


And there should be a ban on big washing-machine makers locking down their washing machines. People should be able to use the hardware for what they want.

It's a pretty pointless thing to say "I didn't buy an iPad to do X, so why should it be able to do X?". Of course you didn't buy your iPad to do X; it's not capable of doing X, so you'd be pretty stupid if you bought it for that purpose. That has nothing to do with whether the manufacturer of the iPad should be allowed to control whether you can do X.


Apple have no obligation to allow their devices to do X even if the device is capable of it. It’s well understood when you buy an iPhone you’re getting a closely managed experience. If that isn’t acceptable, buy something else


> Apple have no obligation to allow their devices ...

Pretty sure that once people have bought the device, it's literally theirs and not Apple's.

At that point, what Apple wants for the device should just be advisory rather than something they can exert any actual control over.


The EU disagrees with your statement, at least for some X. So Apple is indeed under an obligation to allow users those X if it wants to sell its devices in the EU.


And I guess technically correct is the best kind of correct, but rather than give money to Apple for some fantasy feature they will fight tooth and nail against, just buy something else.


These are not fantasy features. Just look at the simplicity of having USB-C on the iPhone. The only reason it's there is because of the EU. iPhones would still be in the dark ages with Lightning were it not for the EU.


An iPhone is a computing device; a washing machine is a machine capable of computing. Big difference.

Jobs literally presented iPhone with “Desktop class applications & networking”.


There are plenty of options for handheld computing that Apple don’t make. I am confused by the attitude as if Apple don’t make it very clear what you are buying. Buying an iPhone and expecting it to be something other than a locked down walled garden is wishful thinking


Stupid consumers don’t understand they’re supposed to abandon their devices and ecosystems instead of demanding that Apple make changes.


Are consumers (outside this relatively niche environment) demanding Apple make changes?

I'm personally wishing the EU would shut up, but I can't demand they do anything.


> Are consumers (outside this relatively niche environment) demanding Apple make changes?

Relatively niche environment shouldn’t have a voice?

> I'm personally wishing the EU would shut up, but I can't demand they do anything.

What exactly bothers you when EU requests improvements from Apple?


The vocal minority of tech enthusiasts who want a more open computing platform are the same people who can make great use of the alternatives.

And the EU isn't "requesting" things, nor are they definitively "improvements". Apple has worked very hard to create a safe platform, a safe ecosystem, and the EU is demanding that they weaken some of those defense mechanisms.

I personally feel Apple has struck a reasonable balance, and I value the benefits.


> The vocal minority of tech enthusiasts who want a more open computing platform are the same people who can make great use of the alternatives.

Or like any other consumer they can do a feature request.

> And the EU isn't "requesting" things, nor are they definitively "improvements". Apple has worked very hard to create a safe platform, a safe ecosystem, and the EU is demanding that they weaken some of those defense mechanisms.

I’ll assume you’re talking about alternative app stores for the EU. So how exactly is having an alternative store going to weaken defense mechanisms for those who stay within the walled garden?


Then iPhone is a phone capable of computing.


It's a computing device with a phone peripheral.


Just like computing device with washing machine peripherals.


Not so sure I agree. In the case of an iPhone, making voice calls is probably a small share of all use of the device; it's more a computing device with a phone peripheral. But a washing machine's main function is to wash, and the computer controlling it is really secondary to that function. The computer could be removed and it would still perform essentially the same function with relays and timers. If you took the computer out of the iPhone and replaced it with relays and such to control the "phone" part, it wouldn't resemble an iPhone at all, and would have virtually none of the functionality an iPhone has.


It’s a simple theory to test.

Take away the phone part and how many people would still use an iPhone? Given how ubiquitous modern messengers are, I’d say between 80 and 95% would still use it.

Take away the washing part and how many people would still use a washing machine?

Therefore the iPhone is a computing device with a phone peripheral, and a washing machine is a washing machine with computing capability.


You’ve decided to ignore the part where the guy who invented it said it is capable of “desktop class applications”, I see.


That guy is dead


Why? Why would you want an iPhone or iPad for another OS? Why not get an Android, or one of its derivatives?


People install Linux on Mac, Windows on Mac (Bootcamp) or Mac OS on PCs (Hackintosh) all the time.

Because it's their machine. They decide what to do with it.


That doesn't answer the question. People microwave their phones, fine. But it's not a reason to demand microwaveable phones.


It does answer the question: the answer is "it's none of your business". And equating installing Windows on a Mac to microwaving phones is ridiculous. (Bear in mind Apple developed Bootcamp, not third-party developers.) You understand that.

If people are genuinely interested in real-life use cases (in which case the question should be phrased like that, instead of with the current half sarcastic and condescending tone), consult reddit or ChatGPT. There are tons of use cases out there that are well explained.


You're talking about less than 0.000000000000001% of people.


I can only tell you the Bootcamp and Hackintosh community is (was) much bigger than you think.

Or let me say this -- if there were 1000 Bootcamp users -- likely < 0.01% of the Mac user base -- Apple would never have bothered to create Bootcamp, write a bunch of drivers, and keep those drivers updated. Apple would have completely ignored that use case.

But they did all those things. Which means even Apple sees a business need to do something to allow people running Windows on Mac computers.

You think you are smarter than Apple?


Apple discontinued Bootcamp. But it's not the only feature they made that saw little use. Automator comes to mind.


Isn’t this kind of a huge deal? You can literally install and run Flash on an iPhone, at last.


You already could play Flash in iOS Safari for quite some time with Ruffle. Or do you mean Flash the authoring program?


You already could, technically. A few iPhone models had working Android versions (through a terribly roundabout way) which you could load an old version of Flash onto.

I can't say I see the use case for this on iPhones, but this does unlock a lot of options for iPads. Too bad this thing is still locked to JIT-less mode.


Ruffle already works in Mobile Safari, at least if the website embeds it, though we don't have an extension build for it yet.


My tinfoil theory is that Apple banned this specifically because you can use it to get a desktop on iPadOS, which offends their "fingers can't touch mouse apps" sensibilities.


Far more likely that Apple doesn't like emulators because they allow for the distribution of software that bypasses the 30% revenue cut, as well as enabling content that they don't want (adult content, extreme political content, etc.)

It likewise seems clear that Apple relented in this case because a not-particularly-easy-to-use, interpreter-only PC emulator doesn't enable anything that isn't already possible with a Web app.


They don't enable that any more than a web browser does.


Apple allows VNC apps, which access desktop OSes through a touch screen.

No need for making up new theories with no evidence, I think. They just don’t want to grant a third party JIT permission because of security concerns


> They just don’t want to grant a third party JIT permission because of security concerns

The initial rejected UTM submission had no JIT compilation.


UTM SE is specifically a no-JIT version.

And I actually do have evidence for my conspiracy theory: iDOS 2. Apple was perfectly fine with it up until someone in tech media posted a guide on how to run ancient versions of Windows on it, then they pulled the app.

Furthermore, "fingers shall not touch mouse apps" was a Jobs decree. It's specifically the reason Apple got into touch UI. Jobs designed the iPad out of spite toward a friend of his who was on the Windows tablet team around the turn of the millennium. Jobs was so angry that they were just putting Windows on a tablet - and handing people styluses to work around the small touch targets on XP - that he made Apple design a tablet computer demo around finger-only UI. That was then grafted onto the Purple project (which at that point was a clickwheel iPod with a phone modem in it) and we got the iPhone.

So my suspicion is that Apple has an internal "things which make Jobs spin in his grave" list that they would never allow, come hell, high water, or the European Union's antitrust regulators.


iPads might even eventually become usable devices!


You can already get a desktop on iPadOS, though. A remote one. Apple has never had a problem with RDP/VNC/etc clients for iPadOS.


iPadOS has a built-in feature called Sidecar that lets you use it as a Mac external display with mouse-apps and all.

https://support.apple.com/en-us/102597


“It doesn’t come with any operating systems, though the app does link to UTM’s site, which has guides for Windows XP through Windows 11 emulation, as well as downloads of pre-built virtual Linux machines.”

Huh… so I could run Linux on my phone!


Does anyone know what this is: “QEMU TCTI implementation was pivotal for this JIT-less build.”


I was curious, so I looked into it.

- TCG = Tiny Code Generator; QEMU's framework for emulating CPU architectures via translating instructions[1]

- TCI = Tiny Code Interpreter? The source code says: "(TCG with bytecode interpreter, experimental and slow)"[2]

- TCTI = Tiny Code Threaded-Dispatch Interpreter? The source code says: "TCTI (TCG with threaded-dispatch bytecode interpreter, experimental and slow; but faster than TCI)"[3]

So, apparently, it's some kind of optimized interpreter. Exactly what "threaded-dispatch" means here is unclear; there are some surprisingly tricky-looking things going on[4]. Does "threaded" refer to OS threading, or does it mean something a bit more like a cached interpreter? I wonder if it's even more clever than that.

[1]: https://wiki.qemu.org/Documentation/TCG

[2]: https://github.com/tctiSH/qemu/blob/1e4d72b004c26724cd049798...

[3]: https://github.com/tctiSH/qemu/blob/1e4d72b004c26724cd049798...

[4]: https://github.com/tctiSH/qemu/commit/1e4d72b004c26724cd0497...


Threaded code is a way to make an interpreter run slightly faster without requiring a JIT, which iOS bans. https://en.wikipedia.org/wiki/Threaded_code


Threaded interpreters are a kind of interpreter that runs code by having an array of jump addresses in a row representing the ops to interpret so that you can amortize out the decode step.

It works kind of like ROP/JOP gadgets.
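
For the curious, a minimal sketch of that idea in GNU C (computed goto is a gcc/clang extension; this is illustrative, not QEMU's code): the "program" is pre-decoded into an array of label addresses, so each op ends with one indirect jump straight to the next op, with no central dispatch switch.

    #include <stdio.h>

    static int run(void) {
        /* Pre-decoded program: each slot is the address of the label
           implementing that op, so decode is amortized up front. */
        static void *program[] = { &&op_add2, &&op_add3, &&op_halt };
        void **ip = program;
        int acc = 0;

        goto **ip;                    /* enter the chain */

    op_add2: acc += 2; goto **++ip;   /* do the work, jump to the next op */
    op_add3: acc += 3; goto **++ip;
    op_halt: return acc;
    }

    int main(void) {
        printf("%d\n", run());        /* prints 5 */
        return 0;
    }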


Ah, I either didn't know about the term threaded code or forgot about it. Thanks for the pointers. (Replying to you but also the sibling comment since both were posted around the same time.)

At first I was thinking of cached interpreters as often seen in video game console emulators, but actually, this reminds me more of the "virtual machines" used in executable packers/obfuscators like VMProtect and Themida.


> but actually, this reminds me more of the "virtual machines" used in executable packers/obfuscators like VMProtect and Themida.

Yeah, totally. Only changing the stack at runtime is helpful for avoiding the ire of anti-virus.


It's a backend for QEMU's CPU JIT that doesn't actually JIT code for the host CPU; instead it emits something that's simply more performant to interpret than the target CPU arch: a chain of JOP/ROP-style gadgets. It's so it works as performantly as possible without the ability to set pages executable, as on iOS.

https://github.com/tctiSH/qemu/blob/with_tcti_vectors/tcg/aa...
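
Roughly, in plain C (an illustrative sketch of the gadget-chain idea only; the real TCTI backend strings together pre-built AArch64 gadgets): each gadget is a small pre-compiled host function that does one unit of guest work and then branches on to the next gadget named in a data array, so nothing executable is ever generated at runtime.

    #include <stdio.h>

    typedef struct Ctx Ctx;
    typedef int (*Gadget)(Ctx *);

    struct Ctx {
        int acc;             /* stand-in for emulated CPU state */
        const Gadget *ip;    /* next gadget in the pre-built chain */
    };

    /* Each gadget does one unit of work, then branches to the next one.
       Compilers usually turn these calls into tail jumps at -O2, which
       gives it the JOP-gadget flavor. */
    static int next(Ctx *c)   { return (*c->ip++)(c); }
    static int g_add2(Ctx *c) { c->acc += 2; return next(c); }
    static int g_add3(Ctx *c) { c->acc += 3; return next(c); }
    static int g_halt(Ctx *c) { return c->acc; }

    int main(void) {
        /* "Translating" guest code just lays out gadget addresses in data. */
        const Gadget chain[] = { g_add2, g_add3, g_halt };
        Ctx c = { 0, chain };
        printf("%d\n", next(&c));    /* prints 5 */
        return 0;
    }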


I wonder why the prebuilt Mac OS 9.2.1 VM has disappeared from the UTM gallery


From X:

> Also shoutout to @ktemkin whose QEMU TCTI implementation was pivotal for this JIT-less build.

Does this mean that it is just a JIT-less build? How is the performance for Linux emulation compared to the previous UTM SE build?


Tried it myself on an iPad Air 4. The x86 Arch Linux installer is unusable. It took ~5 min to boot and doesn't respond to commands.


I'm using Debian 11 with LXDE on an M1 iPad Pro and it's surprisingly usable. I can use this. I'd recommend trying it out and seeing for yourself.


Do things like the internet connection and access to peripherals pass through? Could I use this to have better ssh access to some capabilities of an iphone, for example?


I use UTM on Mac, it's a great experience. Donated a little to them using github sponsors because of how useful the tool is, and how easy it is to obtain and install unlike VMware Player (see: https://matduggan.com/the-worst-website-in-the-entire-world/ ).


The AltStore team helped them but it’s not on AltStore? I’d really like to see a nice strong offering there, more pro and privacy/foss oriented perhaps, similar to F-Droid.


The article shows a tweet where it is mentioned coming soon to AltStore.


Unfortunate that we're not affected here in Switzerland.

Although I do expect it soon, since especially opening up NFC could benefit the nationwide contactless payment solution (twint).


I kind of wonder what the performance impact of Apple's ban on JIT would be. How does iOS without JIT compare to Android with JIT?


This is fantastic. Does anyone know how to run Alpine Linux on it?

Last time I tried through AltStore, I couldn't get it working.


I misinterpreted the title as Apple approving development of iOS apps on PC. That would’ve been an exciting development.


With mouse support coming in visionOS 2, this could be an interesting experience on the Apple Vision Pro.


1.66GB!


Finally a meaningful way to use a chunk of that 160 GB of free space I have on my iPad...


Great.

Now Apple just needs to support JIT


Maybe we can back door that into this app… quietly


This is cool.. macOS can of course access the iPad App Store so … we can run this on a MacBook Pro :)


But why? There exists a full JIT- and virtualization-enabled version of UTM for macOS.


  $ brew install --cask utm



