>Full and complete service manuals to TVs, VCRs and cars. Access to parts etc.
To be fair, most electronics these days are so tightly integrated (literally IC, Integrated Circuits!) that there's not much you can do even with a full on schematic.
Long gone are the days of bigly electronic parts connected by a spaghetti nest of wires and third-grade soldering.
Phones, laptops, GPUs and many other modern highly integrated electronic products are repairable. Look it up, there are tons of videos on YouTube of people doing component level repairs. Leaked schematics make a huge difference here.
> Look it up, there are tons of videos on YouTube of people doing component level repairs. Leaked schematics make a huge difference here.
I've watched these videos — and the people you're referencing doing this are rarely using any kind of schematics at all to repair modern digital-logic boards. And not for lack of accessibility!
Modern logic-board designs consist of a few proprietary ICs, plus generic labelled support components (e.g. VRM caps, surface-mount resistors and diodes, etc.)
You can repair these boards, but these repairs fall into three categories:
1. bad solder joints or broken traces — which mostly just requires looking over the board carefully to spot them.
2. bad generic support components — which in theory you can identify by testing across the individual component's legs with the various multimeter modes; but more often you just notice that whatever the component is in line with isn't working, and swap it out to see if that fixes it. Such a swap-out can be done by just reading the part's markings to figure out what it identifies itself as, then de-soldering it and soldering on a replacement.
3. bad proprietary ICs — which you determine by probing the signal lines leading to/from the IC with an oscilloscope; and which you "fix" by buying other for-parts copies of the board, de-soldering those ICs from the sacrificial boards, and soldering them onto the "almost good" board.
In none of these cases would referencing a schematic help! They're all effectively "context free" repairs — see, probe, think, do.
(A schematic can in theory help you to find test points to differentially diagnose 1 vs 2 vs 3 in the case where a board is failing mysteriously... but once you have some experience in board repair, you can get 80% of the same information by just staring at the board for a minute.)
---
Of course, if you're repairing a power supply, or an audio receiver, or some still-half-analogue electronic appliance from the 1970s — then yeah, schematics help. But these types of systems do still come with those schematics! (You just need to buy the thing directly. You aren't getting a schematic for an "embedded" PSU with a computer; but you do get the schematic if you buy that same PSU at retail at a Shenzhen parts-mall. And you get forwarded that same schematic [and more] if you make an industrial order of 10000 of them, too.)
Those ICs have documentation and I2C debug pins that allow the vendor to debug them. Providing that documentation to the user (along with any keys needed to make use of it) would be the modern equivalent of electronics service manuals.
I implore you not to trivialise and belittle “humans” for not always prioritising what you personally would like them to. Different people have different preferences, and that can and does happen intelligently, not just as the result of being “wooed” by marketing or whatever.
Whatever you call it, it is essentially trading one's freedom for convenience. Yes, it is a result of more general social dynamics rather than mere advertising; it even follows the political trends of our times. But it is time to realise the price we pay for this sort of convenience, and that it does not come for free. Advertising does not tell you that.
I mean, that could be true only if you don't account for the obvious reality that the lockdown of hackability, serviceability, and access to parts is deliberate.
Personal preferences matter little in a duopoly industry. Sure, you could argue that other aspects are more important than serviceability, but it’d be insane to argue that people want less serviceability. So in short, we were better off in this aspect before. 100%.
> the obvious reality that the lockdown of hackability, serviceability, access to parts is deliberate.
I mean yes, it literally is, the law requires that smartphones incorporate a killswitch capable of resisting a device format/reload of the OS. This was seen as a consumer benefit in 2014, and did succeed in bringing muggings and phone theft down by as much as 80%.
This is what the sibling means about tradeoffs: people at the time wanted the tradeoff of fewer muggings. The killswitch was seen as a benefit, and largely Apple is in fact just responding to the legislative requirements pushed upon them.
And if you have to have a killswitch that can resist an OS reformat… there has to be some component-pairing mechanism that works underneath the OS layer, naturally. That is the sensible way to implement it; otherwise you'd just swap in a new motherboard and hey, phone's working again. Or strip the phone for parts.
The idea of parts being anti-theft/strip-resistant is pretty much inherently in contradiction with right-to-repair. And again, I think people probably understand that... but never forget that theft resistance was a legislative initiative pushed through over the objections of Apple and other phone vendors at the time, and it has had some unforeseen consequences. People actually did want this, people lobbied for this, as much as modern readers may not believe it.
From the point of view of the consumer, serviceability is an unalloyed good. That is, removing it never comes with an associated (positive) tradeoff for the consumer.
The topic at hand is not a preference that most people would find objectively good or bad. Why is this particular preference so bad that one should feel bad for having it? Without statements supporting this view, your post doesn't make much sense.
they wouldn't because, like many teenagers smoking their first cigarette, they don't understand the consequences of satisfying that particular preference
people often have preferences that harm them, so the fact that a particular choice is voluntary isn't a very strong argument that it's healthy, even individually, much less collectively
this probably isn't a productive place to try to reason out the societal consequences of people acceding to these particular losses of freedom
This subthread could exist verbatim in any comment thread because it's devoid of substance. "Some positions are better than others", "why?", "because {poignant but vague analogy}".
Why specifically is it bad for people to prefer devices that are small and tightly integrated to devices which are repairable but make trade-offs to get there? I own a Framework laptop for its repairability but would warn anyone interested in it that the battery life won't work for all use cases. It's a trade-off that I've made that doesn't make sense for everyone.
>Why specifically is it bad for people to prefer devices that are small and tightly integrated to devices which are repairable but make trade-offs to get there?
For repairability, upgradability, hackability, the environment, waste, tighter vendor control, and so on...
Reasons is a good start. Explaining and justifying those reasons would be better.
I choose hackability, upgradeability, and less vendor lock-in because I myself value those things, but I see that as a personal preference and not something I want or need to impose on the industry as a whole. Benefits to the environment and reducing waste are clearly valuable, but it's not obvious to me without seeing data that such benefits would be high; many people upgrade phones long before batteries need replacing anyway, so I would need to see data demonstrating that a substantial number of devices would be saved from the landfill by virtue of higher repairability.
>Reasons is a good start. Explaining and justifying those reasons would be better
The question was "Why specifically is it bad", and I gave 6 reasons. Does the "justifying" demand stop at some point, or do we go all the way down to justifying why being alive is better than being dead, why anything ever matters, and so on?
I think the reasons are self-evident - i.e. it shouldn't take any argument as to why repairability and upgradability are good, why being more at the mercy of the vendor is bad, or why protecting the environment and being less wasteful matter.
I brought up counterpoints to each of your reasons, but if you don't want to have a conversation about them that's fine. I'm not particularly invested in my counter position, I just get very tired of the zeitgeist where people think it's okay to dismiss a popular perspective because it's "obviously" wrong.
And no, flat-earther comparisons aren't going to fly, I'm talking about perspectives held by tens of millions.
Waste and the environment are the only reasons you gave that speak to the preference being bad in any sort of objective sense. The status quo ante iPhones was a lack of repairability in cheap, small devices. The devices cost little and there was little economic friction to just getting another one. I don’t know for sure but I imagine the average length of use for Nokia devices was less than that of iPhones.
The lives of people with wealth (which by world standards is just about everyone in the U.S.) are enormously wasteful and unsustainable. Instead of focusing on one aspect of waste in our lives, we should take a holistic approach.
>I don’t know for sure but I imagine the average length of use for Nokia devices was less than that of iPhones
Wasn't and didn't have to be. Ancient Nokia mobile phones would still be perfectly usable, protocols aside, for basic phone functions like talking and messaging. And you could even use them to bash on and mend a dent in your anvil.
It’s unclear who the “they” are in your comment, and even if you did make it clear, your post is less meaningful than the one I responded to. What was the goal of this retort?
this probably isn't a productive place to try to reason out the societal consequences of people acceding to these particular losses of freedom, but i thought it was worth pointing out that people often have preferences that harm them, so the fact that a particular choice is voluntary isn't a very strong argument that it's healthy, even individually, much less collectively
i appreciate your critique of the clarity of my comment and have edited it to clarify
And the fact that your preference is different than someone else's doesn't automatically make yours the better one.
This line is copy/pasted into dozens of comment threads across dozens of topics, and while it's very representative of the zeitgeist it's a dead end for an interesting conversation and frankly it's a toxic, elitist mindset.
Defend the position, don't just assert that some positions are better than others and leave it to the readers to guess which one you're claiming is a priori superior.
Then defend your position. "Not all opinions are equal" is a lazy line that does nothing to further any cause, and if anything it just makes you and your cause seem dismissive and devoid of critical thinking.
>"Not all opinions are equal" is a lazy line that does nothing to further any cause
My primary wish for the comment about not all positions being equal wasn't to "further a cause", but to stop the bad argument that amounted to "people just have different preferences", as if that made all of them OK.
If we agree that "not all opinions/preferences are equally good" is a better idea than "all preferences are equally ok" my work is done!
As for the specifics, others (and I, elsewhere in this post's threads) have given several reasons why this tight integration is worse. In fact I gave six of them.
People's different preferences all being equally valid should be the default position, with evidence required to say otherwise. Simply stating that not all preferences are equal does not contribute to the conversation as much as reminding everyone that different preferences exist does.
That there are good ideas and bad ideas, good plans and bad plans, good things and bad things, beneficial things and detrimental things, and that liking the former makes your preference a good one while liking the latter makes it a bad one: that's controversial in 2024, and shouldn't be the default position?
I too like to argue by claiming that my opinion is the default position that doesn't need to be backed up with reasons, while everyone who thinks otherwise had better come up with an ironclad proof that theirs is better. Checkmate.
It's a step in the right direction, being able to squeeze all the perf out of these devices to run proper full-fledged OSes, which Apple seems not to want. It's okay to get excited over stuff like this, and I'd bet a lot of the good shit that solves your real-world problems started from hobbyist adventures (see this little project called Linux).
I'd say Linux created far more problems than it solved, and would count it as one of the largest net negatives in the history of the technology sector.
I think it's fine to get excited, but there's also such a thing as getting too excited when there's no rational need for the software. Which I can't really see, since there have always been other devices to buy.
Like if I need to do spreadsheets I will get a device made for that, not demand that regulators force Apple to let me install Excel on my phone.
Hackers and Europeans framing this as some kind of civil-rights or human-rights issue is to me about as ridiculous as regulators forcing every restaurant to make its menu gluten-free. Which will probably happen soon, since the mental children are at the helm.
Could you explain why you think that Linux created more problems than it solved? In my eyes it is the backbone of most of what I do as a developer and I can’t imagine trying to work without it.
The accumulation of wasted man-hours for developers and users. Without the allure of "free", I believe there would have been ample competition for more user-friendly paid software doing what Linux is used for now. With the bonus that the developers would have been paid for their work, and that tech giants would have to pay for their tech rather than build on unpaid labour. (I know some of them also contribute to open source.)
The view that Linux is a net negative for the world reflects an incompetence to understand the field. To believe that a commercial offering is absolutely better than an open-source offering is childish. A world without Linux could be better, sure. But there’s only a tiny little chance of it happening.
It can only be speculation, but it's my belief. Linux became a server system for big applications. The ones who benefit most from such software are huge businesses; the ones who pay for their benefit are the Linux developers doing very specialized and competent work for free.
Why is there no consumer-friendly solution for having a server connected at home and publishing your own website to the world, without having to be very skilled with computers? There are consumer-friendly solutions for everything else: office suites, e-mail, photo editing, all sorts of creativity. Consumers are now relegated to posting their online stuff on social media, because that is what is accessible to them. And I want to blame Linux partly for this, because the great allure of "free" halted any consumer solution.
Don't get too excited. It's the SE "Slow Edition." Apple won't let apps access JIT acceleration, let alone the iPad's virtualization capabilities, so this is all interpreted and barely usable.
We know that recent (M2+) Apple Silicon has hardware support not only for virtualization, but nested virtualization. Time may improve Apple's risk/reward for enabling virtualization in specific iPad use cases.
Thanks to the a-Shell and iSH emulators for pioneering Linux-like experiences on iPads, at the cost of overheated silicon.
Thanks to Asahi for bare-metal Linux, including virtualization, on Apple Silicon.
> Time may improve Apple's risk/reward for enabling virtualization in specific iPad use cases.
It's more likely that the EU will improve it.
As things stand today, Android allows for JIT acceleration on all apps, essentially unrestricted. The DMA only allows for security measures that restrict APIs insofar as those measures are "strictly necessary and proportionate".
Since Android manages just fine on that front with full JIT, Apple's defense relying on that DMA exception will fail.
So why does my Android phone not allow sudo or chroot, and why does my bank's application stop working when I root it, citing EU payment-service regulations? Virtualization is not enabled either, even though the CPU supports it.
> we have replaced pKVM with Gunyah Hypervisor Software – our own, more versatile hypervisor – on our Qualcomm Technologies’ chipsets. With Gunyah, we can use one hypervisor across use cases as varied as automotive, mobile broadband, IoT and wearables. We’ve upstreamed Gunyah for general use by the Android community.
I'd be interested to see how much of that malware targets and is effective on big OEMs' devices using Google Mobile Services compared with non-GMS Androids mainly in the China domestic market. In principle the non-GMS devices can be just as secure, but in fact do they give privileged access to vulnerable code?
There’s a reason that you’re falling back on imprecise language: “manages just fine”. That is not sufficient grounds to responsibly invoke this regulation. There’s an obvious, reasonable security argument for Apple to make here. To say otherwise is just letting this stupid OS war get in the way of any actual engineering competency you may have. Be fair.
Even for apps in 3rd party storefronts? I always thought these rules are somewhat relaxed for external publications but perhaps I was mistaken. It’s not like we have proper “side loading” yet.
These restrictions are enforced by the OS, not Apple's store rules. Even in the EU sideloaded apps can't do these things. It remains to be seen if their hands will eventually be forced, my understanding is that the EC is still working on striking down the Core Technology Fee, and these battles are going to take a while to catch each and every way Apple is trying to retain control of people's phones.
Yeah. Even with sideloading before the EU stuff, you couldn't use JIT. AltStore (not the EU version) managed to get it working, I think, but it was very finicky and seems not to work in newer versions of iOS.
Usable? In what sense? Not in the normal sense of the word, surely, because if the iPad weren't usable, very few people would own one. Instead, it's just that only a few care about running Linux.
Some people think Apple's JIT restrictions standing up to the DMA is a given, but I heavily doubt that will be the case.
The DMA allows for security exceptions only when measures are "strictly necessary and proportionate" (Article 6(4)). Apple argues that this ban is crucial for security, yet Android permits JIT compilation and maintains robust security on that front for 99% of its user base without significant incidents.
the best i've been able to do with things like numpy, threaded interpretive forth, and ocamlc is about 15–20% of native performance. is that what you mean by 'very fast'? i wouldn't call that 'very fast' but it is an order of magnitude faster than cpython i guess
for example, this ocaml program runs 6× slower when compiled with ocamlc (which uses an excellent bytecode interpreter) than with ocamlopt
let rec fib n = if n <= 2 then 1 else fib (n-1) + fib (n-2)
and n = 40
in Printf.printf "fib(%d) = %d\n" n (fib n) ;;
the ocamlopt-compiled version is about 5% slower than a handwritten assembly version (and orders of magnitude slower than a version compiled with gcc, which is why i wrote the assembly version)
Well, we’re talking about emulation here, not a bytecode interpreter for a high-level dynamically typed language. CPython is fairly slow; it’s written in C in a fairly understandable way, effectively one large switch statement. There are about three jumps per executed bytecode, and it probably uses some jump table.
"All the tricks" for an interpreter would mean one jump per interpreted instruction, or perhaps 0.5 for some of them. It would probably mean no explicit jump tables. It would probably mean the interpreter is set up pipelined like a real CPU, loading instructions in parallel with executing previous ones. Using all the tricks, an interpreter can get perhaps 5-15% of native speed. QEMU can maybe get 20-50% (not sure these days, actually; I haven't followed improvements for a while). Rosetta can get close to 100%, using a JIT but also hardware specifically designed to aid emulation (x86-style memory ordering, i.e. TSO).
Given that modern CPUs execute something like 3-5 instructions per cycle, getting 5-15% of native performance from an interpreter would be very fast and very difficult to achieve, but possible, if it's designed from the ground up for performance.
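For the curious, here's a minimal sketch in C of the dispatch difference being described (GNU extensions; the four-op bytecode is invented for illustration, not taken from any real interpreter). The switch loop pays a range check, the switch's indirect jump, and a loop-back branch per op; the computed-goto version pays a single indirect jump per op, and giving each handler its own jump site also helps the CPU's indirect-branch predictor:

    #include <stdio.h>

    /* hypothetical four-op bytecode: PUSH imm, ADD, PRINT, HALT */
    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    /* plain switch dispatch: per op, a range check plus the switch's
       indirect jump, then a branch back to the top of the loop */
    void run_switch(const int *code) {
        int stack[64], *sp = stack;
        for (;;) {
            switch (*code++) {
            case OP_PUSH:  *sp++ = *code++;        break;
            case OP_ADD:   sp--; sp[-1] += sp[0];  break;
            case OP_PRINT: printf("%d\n", sp[-1]); break;
            case OP_HALT:  return;
            }
        }
    }

    /* token-threaded dispatch (GNU C computed goto): each handler ends
       with a single indirect jump straight to the next handler */
    void run_threaded(const int *code) {
        static const void *table[] = { &&push, &&add, &&print, &&halt };
        int stack[64], *sp = stack;
        goto *table[*code++];
    push:  *sp++ = *code++;        goto *table[*code++];
    add:   sp--; sp[-1] += sp[0];  goto *table[*code++];
    print: printf("%d\n", sp[-1]); goto *table[*code++];
    halt:  return;
    }

    int main(void) {
        const int prog[] = { OP_PUSH, 2, OP_PUSH, 40, OP_ADD, OP_PRINT, OP_HALT };
        run_switch(prog);   /* prints 42 */
        run_threaded(prog); /* prints 42 */
        return 0;
    }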
i think it's unusual to hit 3-5 ipc sustained in normal code, but it seems like we're in broad agreement about quantitatively what kind of performance an interpreter can achieve. i mentioned three interpretive systems that do reach about 15% of native single-threaded performance with very different approaches: the ocaml bytecode interpreter, threaded interpretive code like gforth, and numpy
it's just that i call 15% of native performance 'very slow' and you call it 'very fast', because you're comparing the moped to the walker you're used to, and i'm comparing it to a sports car because that's what you're paying for
instruction set jitting getting a speedup instead of a slowdown goes back to last millennium with hp's dynamo https://dl.acm.org/doi/pdf/10.1145/349299.349303 and was central to transmeta's business plan. qemu's jit is sort of simpleminded to make it easier to maintain
A CPU emulator will only get to 15-20% with a lot of tricks; usually a CPU interpreter gets something like 5%. It’s not like NumPy, where you can increase performance by offloading as much execution as possible into native code: when emulating CPU instructions, the individual units of work are tiny. But as I said, Python is a poor comparison, because its interpreter is very slow and carries a lot of overhead.
although i've written compilers and interpreters, i don't think i've ever written an emulator for a real cpu. i'm interested to hear about your experience
Do you actually have any experience with systems emulation, JIT/AOT compilation, the x86->ARM hardware assistance that Apple's JIT interfaces offer, etc.? Or are you just speaking off the cuff in a very general manner?
Because it very much sounds like the latter.
I have, quite extensively. I can assure you that the source set they are using most definitely does use all the "tricks" (the chief two being threaded code [no, not that kind of threading] and preoptimized jump tables). I can also assure you that the cost of losing the hardware acceleration offered by Apple's extensions, plus the general performance boost JIT/AOT offers, is much more than 80%.
But, sure. For argument's sake, let's accept your (incorrect) premise. Taking a potential emulated target processor from 1 GHz down to 200 MHz is a fairly drastic drop, disallowing a whole slew of modern code and target OSes and leaving usability verging on nil.
i wouldn't describe sacrificing 80% of the machine's native performance as 'very fast'. a moped goes 40 km/hour; a ferrari goes 200 km/hour. you're paying for a ferrari but getting a moped
what i'm saying is that, because an interpreter that runs at 20% native speed would be considered a very fast interpreter, which is still very slow, there are in most cases no tricks for making interpreters very fast
Apple built iOS with the assumption that apps can’t use JIT. They can migrate to a new set of constraints, but I’m sure they’re right that right now it’s necessary for security. It could take a while to patch all of the security holes from opening up JIT.
Darwin is the core Unix-like operating system of macOS (previously OS X and Mac OS X), iOS, watchOS, tvOS, iPadOS, audioOS, visionOS, and bridgeOS. It previously existed as an independent open-source operating system, first released by Apple Inc. in 2000. It is composed of code derived from NeXTSTEP, FreeBSD, other BSD operating systems, Mach, and other free software projects' code, as well as code developed by Apple.
> Different goals? Different system gets built. Different trade-offs. This isn’t hard to understand. You’re being overly harsh for ideological reasons.
Poor Apple with its virtually unlimited resources. Somehow Google, Microsoft, Linux and Apple themselves managed to make it work with JIT, but Apple can’t do it.
You are arguing about the resources of a company that has an endless amount of them. The limitations mentioned could be changed or "fixed" with minimal time and effort.
The no-JIT rule is just a rule enforced through the App Store review process, so it's not going to be part of the security architecture. It could make some static analysis fail, but there are plenty of ways to do that.
I doubt that's true, because it goes against every modern OS security practice. Apple heavily practices defense in depth: sandboxing, PAC, AMFID, etc. All of those are even more onerous than Android's equivalents. So no, giving away JIT privileges wouldn't do much.
My understanding comes from what jailbreakers say about the OS, what Apple says, and time spent working on Android, where people who know what they’re talking about told me about iOS security.
There are about a dozen other responders who agree with that assessment, with technical reasoning embedded in their messages. Read those.
Suffice it to say, I don't feel the need to rehash why the basic premise of "the OS wasn't built to handle JIT apps" is directly contradicted by the OS, you know, allowing JIT in Safari while denying it to everyone else.
If they had offered a basic technical assessment or reasoning beyond "I know people", perhaps I would bother to engage with the argument from credentials. So, in the same way it's not worth delving into "proving God doesn't exist", it's not worth proving how OP clearly has no idea what they're talking about. They bear the burden of their claims, not me.
Well, right now this technically works, but is unusable - it melted off 15% of my iPad’s battery life just logging in to XFCE and doing an apt upgrade (Debian ARM64), and it’s very slow overall. I’m going to try turning off the GUI and decreasing the core count to 2, but I don’t expect this to beat a-Shell on practicality or even iSH on usability.
How about we ban Apple from imposing arbitrary rules to cripple what is just a general computing device? They are too big and too monopolistic to be allowed this degree of control.
No, it's an Apple device. It _should_ be a general computing device, but Apple ensures that it will never be, and you should be aware of that when you buy it.
And there should be a ban on big washing-machine makers locking down their washing machines. People should be able to use the hardware for what they want.
It's a pretty pointless thing to say "I didn't buy an iPad to do X, so why should it be able to do X?" Of course you didn't buy your iPad to do X; it's not capable of doing X, so you'd be pretty stupid if you bought it for that purpose. That has nothing to do with whether the manufacturer of the iPad should be allowed to control whether you can do X.
Apple have no obligation to allow their devices to do X even if the device is capable of it. It’s well understood when you buy an iPhone you’re getting a closely managed experience. If that isn’t acceptable, buy something else
The EU disagrees with your statement, at least for some X. So Apple is indeed under an obligation to allow users those X if it wants to sell its devices in the EU.
And I guess technically correct is the best kind of correct, but rather than give money to Apple for some fantasy feature they will fight tooth and nail against, just buy something else.
These are not fantasy features. Just look at the simplicity of having USB-C on the iPhone. The only reason it's there is the EU. iPhones would still be in the dark ages with Lightning were it not for the EU.
There are plenty of options for handheld computing that Apple don’t make. I am confused by the attitude as if Apple don’t make it very clear what you are buying. Buying an iPhone and expecting it to be something other than a locked down walled garden is wishful thinking
The vocal minority of tech enthusiasts who want a more open computing platform are the same people who can make great use of the alternatives.
And the EU isn't "requesting" things, nor are they definitively "improvements". Apple has worked very hard to create a safe platform, a safe ecosystem, and the EU is demanding that they weaken some of those defense mechanisms.
I personally feel Apple has struck a reasonable balance, and I value the benefits.
> The vocal minority of tech enthusiasts who want a more open computing platform are the same people who can make great use of the alternatives.
Or like any other consumer they can do a feature request.
> And the EU isn't "requesting" things, nor are they definitively "improvements". Apple has worked very hard to create a safe platform, a safe ecosystem, and the EU is demanding that they weaken some of those defense mechanisms.
I’ll assume you’re talking about alternative app stores for EU, so how exactly having an alternative store going to weaken defense mechanisms for those who stay within walled garden?
Not so sure I agree. In the case of an iPhone, making voice calls is probably a small share of all use of the device; it's more a computing device with a phone peripheral. But a washing machine's main function is to wash, and the computer controlling it is really secondary to that function. The computer could be removed and the machine would still serve essentially the same function with relays and timers. If you took the computer out of the iPhone and replaced it with relays and such to control the "phone" part, it wouldn't resemble an iPhone at all, and would have virtually none of an iPhone's functionality.
It answers the question, which is "it's none of your business". And equating installing Windows on a Mac to microwaving phones is ridiculous. (Bear in mind Apple developed Boot Camp, not third-party developers.) You understand that.
If people are genuinely interested in real-life use cases (in which case the question should be phrased that way, rather than in the current half-sarcastic, condescending tone), consult Reddit or ChatGPT. There are tons of well-explained use cases out there.
I can only tell you the Boot Camp and Hackintosh community is (was) much bigger than you think.
Or let me say this -- if there were only 1000 Boot Camp users -- likely < 0.01% of the Mac user base -- Apple would never have bothered to create Boot Camp, write a bunch of drivers, and keep those drivers updated. Apple would have completely ignored that use case.
But they did all those things. Which means even Apple saw a business need to let people run Windows on Mac computers.
You already could, technically. A few iPhone models had working Android ports (achieved in a terribly roundabout way) onto which you could load an old version of Flash.
I can't say I see the use case for this on iPhones, but this does unlock a lot of options for iPads. Too bad this thing is still locked to JIT-less mode.
My tinfoil theory is that Apple banned this specifically because you can use it to get a desktop on iPadOS, which offends their "fingers can't touch mouse apps" sensibilities.
Far more likely that Apple doesn't like emulators because they allow for the distribution of software that bypasses the 30% revenue cut, as well as enabling content that they don't want (adult content, extreme political content, etc.)
It likewise seems clear that Apple relented in this case because a not-particularly-easy-to-use, interpreter-only PC emulator doesn't enable anything that isn't already possible with a Web app.
And I actually do have evidence for my conspiracy theory: iDOS 2. Apple was perfectly fine with it up until someone in tech media posted a guide on how to run ancient versions of Windows on it, then they pulled the app.
Furthermore, "fingers shall not touch mouse apps" was a Jobs decree. It's specifically the reason why Apple got into touch UI. Jobs designed the iPad out of spite due to a friend of his who was on the Windows tablet team around the turn of the millennium. Jobs was so angry that they were just putting Windows on a tablet - and handing people styluses to work around the small touch targets on XP - that he made Apple design a tablet computer demo around finger-only UI. That was then grafted onto the Purple project (which at that point was a clickwheel iPod with a phone modem in it) and we got the iPhone.
So my suspicion is that Apple has an internal "things which make Jobs spin in his grave" list that they would never allow, come hell, high water, or the European Union's antitrust regulators.
“It doesn’t come with any operating systems, though the app does link to UTM’s site, which has guides for Windows XP through Windows 11 emulation, as well as downloads of pre-built virtual Linux machines.”
- TCG = Tiny Code Generator; QEMU's framework for emulating CPU architectures via translating instructions[1]
- TCI = Tiny Code Interpreter? The source code says: "(TCG with bytecode interpreter, experimental and slow)"[2]
- TCTI = Tiny Code Threaded-Dispatch Interpreter? The source code says: "TCTI (TCG with threaded-dispatch bytecode interpreter, experimental and slow; but faster than TCI)"[3]
So, apparently, it's some kind of optimized interpreter. Exactly what it means by "threaded-dispatch" is unclear; there are some surprisingly tricky-looking things going on[4]. Does "threaded" refer to OS threading, or does it maybe mean something a bit more like a cached interpreter? I wonder if it's even more clever than that.
Threaded interpreters are a kind of interpreter that runs code by laying out, in a row, an array of jump addresses representing the ops to interpret, so that you can amortize out the decode step.
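As a minimal sketch (GNU C computed goto; the ops are invented for illustration): with direct threading, the program is literally that array of jump addresses, so there is no opcode left to decode, and "next instruction" is one indirect jump:

    #include <stdio.h>

    /* direct-threaded code: the "program" is an array of label
       addresses with operands interleaved as plain data; advancing
       to the next op ("NEXT") is one indirect jump, nothing more */
    int main(void) {
        const void **ip;
        long acc = 0;

        const void *program[] = {
            &&lit,  (void *)2L,    /* acc = 2   */
            &&addi, (void *)40L,   /* acc += 40 */
            &&print,
            &&halt,
        };

        ip = program;
        goto **ip++;                            /* NEXT */

    lit:   acc  = (long)*ip++;   goto **ip++;   /* load immediate */
    addi:  acc += (long)*ip++;   goto **ip++;   /* add immediate  */
    print: printf("%ld\n", acc); goto **ip++;   /* prints 42      */
    halt:  return 0;
    }

gforth and the OCaml bytecode interpreter mentioned elsewhere in this thread use this computed-goto technique in their inner loops, with a plain switch as a fallback for compilers that don't support it.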
Ah, I either didn't know about the term threaded code or forgot about it. Thanks for the pointers. (Replying to you but also the sibling comment since both were posted around the same time.)
At first I was thinking of cached interpreters as often seen in video game console emulators, but actually, this reminds me more of the "virtual machines" used in executable packers/obfuscators like VMProtect and Themida.
It's a backend for QEMU's CPU JIT that doesn't actually JIT code for the host CPU; instead it emits something that's simply more performant to interpret than the target CPU arch, executed as a set of JOP/ROP-style gadgets. It's there so QEMU works as performantly as possible in cases where it can't mark pages executable, like on iOS.
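This isn't QEMU's actual code, but the shape of the idea can be sketched in plain C: the "translated" trace is ordinary read-only data (pointers to pre-compiled handlers plus operands), and control flow just threads through those handlers, so nothing ever needs a page that is both writable and executable. The real TCTI backend chains hand-written assembly gadgets that jump directly into one another rather than bouncing through a C driver loop:

    #include <stdio.h>

    struct ctx;
    typedef int (*gadget)(struct ctx *);   /* a pre-compiled op handler */

    struct insn { gadget fn; long arg; };  /* one "translated" op: pure data */
    struct ctx  { long acc; const struct insn *ip; };

    /* handlers compiled ahead of time, standing in for TCTI's gadgets */
    static int op_lit  (struct ctx *c) { c->acc  = c->ip->arg;    return 1; }
    static int op_addi (struct ctx *c) { c->acc += c->ip->arg;    return 1; }
    static int op_print(struct ctx *c) { printf("%ld\n", c->acc); return 1; }
    static int op_halt (struct ctx *c) { (void)c;                 return 0; }

    int main(void) {
        /* the "translation" of a guest block is just this array:
           read-only data, never a writable-and-executable page */
        static const struct insn trace[] = {
            { op_lit, 2 }, { op_addi, 40 }, { op_print, 0 }, { op_halt, 0 },
        };
        struct ctx c = { 0, trace };
        while (c.ip->fn(&c))    /* thread through the gadgets... */
            c.ip++;             /* ...until op_halt returns 0 (prints 42) */
        return 0;
    }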
Do things like the internet connection and access to peripherals pass through? Could I use this to have better ssh access to some capabilities of an iphone, for example?
I use UTM on Mac, it's a great experience. Donated a little to them using github sponsors because of how useful the tool is, and how easy it is to obtain and install unlike VMware Player (see: https://matduggan.com/the-worst-website-in-the-entire-world/ ).
The AltStore team helped them, but it's not on AltStore? I'd really like to see a nice strong offering there, more pro- and privacy/FOSS-oriented perhaps, similar to F-Droid.