Running postmarketOS on iPhone 7 (project-insanity.org)
293 points by zdw on April 18, 2020 | 100 comments



The iPhone 3G and original iPhone both had Android ports. Back then the (internal) hardware was mostly off the shelf, so it was easier. Still, this is neat.


Do you know of any working Android port for ipad 1? I have an ipad 1 in excellent condition.


The iPad 1 has a period CPU that, unfortunately, would not run any modern Android build with any level of performance that could be considered usable.

It's e-waste, or at best an interesting artifact in computer design history.


Also, Android has much higher memory requirements than iOS due to its Java implementation, which is why Android phones have typically had a lot more RAM than iPhones. That may not be an issue depending on what you need to do with it.


It would be nice if Apple would port e.g. Linux or *BSD to that CPU. They probably never will. But it would be cool.


Interestingly, Apple apparently builds Linux for their chips for hardware bringup.


Darwin is not *BSD?


It is, but I think he means "an open OS so it isn't just a paperweight anymore"


I believe Darwin is open source because most of it was built from bits of third-party open source software.

The frameworks and GUI Apple built on top of Darwin are not open source.

Also, releasing their software as open source goes against their business model as they wouldn't be able to enforce its use only on their hardware.

People want Apple software, but that only comes with their hardware.


Darwin source was released for a long time, with no restrictions on running it on Apple hardware. I’m not sure of the status of that now. They’re under no obligation to do so, as it’s BSD licensed.


Yes exactly. Something that is FOSS and which one could tinker with.


They predated the SoC era, as did all the late Symbian handsets.


How are we defining the SoC era? Maybe before SoCs were accessible in the consumer market, but the first ARM-based SoC, the ARM250, dates back to 1992. Plus I'm pretty sure there are SoCs based on other architectures older than that.


Yeah, mobile CPUs have been called SoCs since at least the flip phone era. It's absurd that some Apple fanboys think they made it mainstream or even invented it. That's entirely false.

System on Chip are called System on Chip because they don’t expose I/O ports(like PORTA through PORTD on AVR) to the pins thus can’t interface RAM or ROM or other peripherals natively, but only over narrower buses through internal interface chips/logics.

In my rule of thumb, every surface mounted “CPU” between 80 to 200MHz across industries were called SoC from 2003 through 2010s.

Those CPUs/MPUs that do expose I/O directly to be interfaced to SDRAM, LCD or equivalents of North Bridge chip on PC were usually called either MPU or CPU, often MPU for they integrate some peripherals. Those were roughly 200MHz and above, including PowerPC for NAS or x86 for PC for which heat or footprint were not concerns.

Those under 80-120MHz were often called Microcontrollers(uC). They were also often self contained like SoC but didn’t count, also got a lot higher clocked recently but still don’t.

Intel and AMD refer to their systems as SoC since Apple started doing custom SoC, and that might give you an impression that it’s new and better. That’s not true.

SoC means CPU’s main I/O ports are not exposed to pins.


> Intel and AMD refer to their systems as SoC since Apple started doing custom SoC

Intel and AMD have referred to their systems as SoCs since they integrated the north bridge (which includes the memory controller) into the chip; it has nothing to do with Apple.

> CPU’s main I/O ports

That's a weird definition. Most architectures don't have "I/O ports" and use memory-mapped I/O exclusively. And even if we read that as "any kind of I/O pins" and go back to the

> can’t interface RAM or ROM or other peripherals natively

part: the vast majority of mobile SoCs in the 2010s use external RAM (and storage). The memory controller is onboard and the external connection is directly to DDR3/4/.. chips.


You are inventing definitions here.

> System on Chip are called System on Chip because they don’t expose I/O ports(like PORTA through PORTD on AVR) to the pins thus can’t interface RAM or ROM or other peripherals natively, but only over narrower buses through internal interface chips/logics.

What does IO port multiplexing of the AVR have to do with the definition of SoC?

> In my rule of thumb, every surface mounted “CPU” between 80 to 200MHz across industries were called SoC from 2003 through 2010s.

No. I don't see how the physical mounting of the package changes anything. Most processors are surface mount, even some BGA x86 chips.

> Those CPUs/MPUs that do expose I/O directly to be interfaced to SDRAM, LCD or equivalents of North Bridge chip on PC were usually called either MPU or CPU, often MPU for they integrate some peripherals.

No. MPU = MicroProcessor Unit.

> Those under 80-120MHz were often called Microcontrollers(uC). They were also often self contained like SoC but didn’t count, also got a lot higher clocked recently but still don’t.

So modern highly integrated microcontrollers, effectively a system on a chip with 300+ MHz core speed, SRAM, flash, EEPROM, CAN, UARTs, Ethernet, PWM, counters, DACs, and ADCs, aren't SoCs?

> SoC means CPU’s main I/O ports are not exposed to pins.

Uh, no.


Indeed, most handsets, including the old iPhones and all pre-iPhone Pocket PCs, used SoCs of some sort. In fact, the original iPhone had a part with stacked SDRAM. ARM and MIPS SoCs have been a staple of consumer devices for at least 20 years.


They were definitely SoCs.


I'm not a great person, I get more satisfaction from feeling like I know more than actually being critical about learning.

Not sure how to purge that psychology.


You’re already headed in the right direction by being aware of it. I don’t have any specific advice (my own natural know-it-all-ness started to fade when I was 30 or so) but being specifically aware of it is definitely a benefit; it at least allows me to pause and notice it and adjust.


> Not sure how to purge that psychology.

I think that's something most of us struggle with. That doesn't mean you're not a great person :-) I appreciate the humble reply.


"SoC era"?


Before Apple bought PA Semi and started working on their own A series of System on a Chip (SoC) processors. They were using ARM reference designs before.


Then what are they talking about with the Symbian phones?

And Apple didn't use ARM reference designs in early iPhones; they were using Samsung SoCs. Pretty much nobody uses the ARM reference designs except as an early prototyping target.


> Pretty much nobody uses the ARM reference designs except as an early prototyping target.

This isn't true. The majority of ARM-based SoCs today use the Cortex or Neoverse core designs from ARM. Today only a handful of vendors fully develop their own cores, such as Cavium, Apple, and Ampere.


There's a difference between ARM reference designs for SoCs (the topic in question) and ARM's designs for their CPU cores.

In the context of reference designs, you see stuff like the ARM Juno designs. I've seen people (very very rarely) just drop those whole designs into their boards. But, like I said, almost nobody uses them except as a prototyping target.


I think the grandparent comment was referring to the CPU part, not the whole SoC. ARM has never sold an entire SoC beyond a limited number of test chips; Juno is a test/evaluation chip only.


You're right, I forgot that they used Samsung SoCs with underclocked ARM1176JZ cores.


Also before the boot was totally locked down in encrypted black boxes.


Before they bought PA Semi, Samsung helped them with CPU design.

Samsung also fabbed their SoCs.

PowerVR designed the GPUs in Apple SoCs.


I assume they are referring to the time before Apple used custom system-on-chip designs


This is really cool! I hope they get the GPU running. I would _love_ to run Ubuntu Touch or something on an iPhone. I wonder how much of the responsiveness of iOS is due to the hardware.


Had an Ubuntu Touch phone. It was OK as a GUI for an OS, but the lamest thing was that they locked down the package manager so you could only install stuff from their app store, which had almost nothing in it; or you escalated privileges, and then any customization you made (apt-get packages, system changes) got wiped with each update. They seriously failed to support a real Linux OS for mobile.


Sounds a lot like the Apple UX


OP mentioned 3 main points: empty app store, need to escalate privileges, and wiping preferences. Which one exactly is representative of Apple's UX?


The locked-down App Store experience is the same on iOS... you can't install apps from different stores; they don't exist


https://altstore.io/

welcome to 2020.


I am genuinely interested in an answer to this question, even though it will sound like a troll.

If people want to sideload apps and such, which explicitly hacks around the system designed by Apple, why are they using Apple devices (which are notoriously locked down and restricted to "Apple's way") in the first place? Why not use Android or something else designed for that sort of thing?

Same kind of question for jailbreakers. Isn't the desire to jailbreak indicative that you want a more flexible platform? Why wouldn't you use something other than Apple?


I did buy an iPad 2 back when it was current. I needed something like a tablet for an extended trip where a laptop would have been too much to carry around and the only real competition at the time was the disappointing Motorola Xoom so I bought an iPad.

I ended up "jailbreaking" it because there were just so many things I wanted it to do that weren't possible with the stock OS and approved applications (adblocking, better keyboard layout, file system access, and mouse support among other things). Many of these have eventually been implemented to some extent or another, but at the time they were unavailable via the App Store but easily done via the Cydia store.

Since then, the iPad 2 aged out of usefulness and my need for a tablet has mostly dried up. Likewise, options for other higher-powered, well built, and lightweight laptops/tablets have expanded so I've not purchased any more Apple stuff for the reasons you point out. I basically "jailbroke" because it was the only way to get my device to do what I wanted it to do. Now that they exist, I just use other devices that don't require me to jump through those hoops.

Still, I wish there was a way to install some sort of lightweight OS on that iPad 2 (which is now sitting in a box somewhere).


Cool, but there are still a ton of limitations and basically no apps.

https://altstore.io/faq/

It's 2020, more than a decade after the iPhone was released, and 99.9%+ of users can still only use Apple's App Store.


Does this work well with Wine?

I don’t want to boot up Mac


My experience is that a good chunk of that comes from the limit on background processes.

At least on Android devices, changing that parameter made them much snappier, although some apps would break because they absolutely required more than one background process.


Why would you want this? The iPhone 7 has a broken bootloader, which is what makes this possible in the first place. It's impossible to be sure of the integrity of the device when you can just get root over USB. It's likely to require tethering to boot every single time, so you have a functionally useless device.


One reason one might want this is the hacker spirit. It seems more interesting and impressive to make something do what it was deliberately designed not to do.

Another reason is that Apple’s hardware tends to be very high quality. For example, on JavaScript benchmarks (which you may or may not consider a good proxy for general/single-threaded performance), an iPhone 7 performs comparably to the current latest Android phones. Also, spare parts and replacements are easy to come by.

I’m quite unconvinced that the bootloader exploit demonstrates that the platform is so insecure that you shouldn’t use it. If you want to run a different operating system on a typical Android device, you would either not need to circumvent such security measures (which doesn’t mean that they exist or work) or you would use some exploit that didn’t get such wide press.


Interesting, I wonder how much of that is the fact that Apple has the only JS runtime which supports tail call elimination:

https://kangax.github.io/compat-table/es6/
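
For anyone curious what tail call elimination actually buys you, here's a minimal sketch (the function and values are just an illustration; the constant-stack behavior only applies on an engine that implements ES2015 proper tail calls, which today means JavaScriptCore in strict mode):

```javascript
"use strict";

// A tail-recursive sum: the recursive call is the very last thing the
// function does, so an engine with proper tail calls can reuse the
// current stack frame instead of pushing a new one.
function sum(n, acc = 0) {
  if (n === 0) return acc;
  return sum(n - 1, acc + n); // call in tail position
}

console.log(sum(10)); // 55
```

Engines without proper tail calls (V8, SpiderMonkey) run this fine for small n but throw a RangeError once the recursion depth exceeds the stack limit; JavaScriptCore can run it for arbitrarily large n in constant stack space.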


I remember the real reason being the optimization of a relatively obscure array-access operation that is heavily used by JS.


I think it's more complicated than that.


Do you have more information? I can't find the relevant article.


Probably very little. Apple's SoCs are far ahead of what is in most Android phones.


"Ahead" isn't quite the right word. Making a large-die-area, high-performance ARM chip isn't hard; it's just very expensive, and only Apple has both the volume and margins to justify it.


To use as local input/output devices in your home!

An (old) phone/tablet running Linux, connected via USB/ADB/network, can be used as:

* (Additional) Webcam for your computer

* (Additional) microphone/Touch-pad for your computer

* (Additional) (status) screen for your computer/whatever_with_USB

* Security-cam/Digital-picture-frame for your whatever_with_USB

* (Additional) Network-available environmental sensors (light, temp, vibration).

* (Additional) Wifi/xG Network (scanning) device for your PC

If this doesn't make dozens of useful workflows available to you... then I don't know what you have been doing with computers before.


The hope is that someday you don’t have to do those things. If Linus had that mentality Linux wouldn’t exist.


Linus wasn't relying on a bootloader exploit in old hardware to write a kernel.


He didn't have to.

Which was a good thing.

The commenter unironically using Orwellian phrasing like "be sure of the integrity of the device" shows how far we've come.


> The commenter unironically using Orwellian phrasing like "be sure of the integrity of the device" shows how far we've come.

You don't want to know that your phone hasn't had a rootkit installed on it by someone who borrowed it off your desk while you were in the office kitchen?


You would know, because the device would have restarted.


So obviously the only way to avoid this is to give total control to Apple and Apple alone?

Not, say, disable flashing if the passcode protected lock screen is up?

I had a much more snarky answer before this one about how realistic this scenario is for most people, but I will keep it to myself.


I'd rather suffer the possibility of being pwnt than have devices that come pwnt out of the box. Manufacturers could develop systems that removed themselves from the trust chain, but unfortunately more control is beneficial in their view.


A better analogy is Microsoft Word. The original source code was lost; it's now relying on a reverse-engineered code base from someone outside of Microsoft.


Perhaps you are remembering this incident (which involved Equation Editor, a component of Office): https://www.bleepingcomputer.com/news/microsoft/microsoft-ap...


"Original source code" is an imprecise term for a project that has been under development for decades.

But your claim is verifiably false: https://computerhistory.org/blog/microsoft-word-for-windows-...


Is this true? I've never heard this before.


How often do you boot your phone? My current phone gets booted a lot because it doesn't always come back from sleep, but a phone working properly maybe needs a reboot once a month? That would be less convenient if it needs to be tethered, but not that bad.


I have an iPhone and I don't think I've purposefully rebooted it in months. I think the only time I see the boot logo is if the battery completely dies or iOS updates.

I guess an OS that's not really supposed to be running might be less stable, but I would hope most OSes are stable enough to essentially never require reboots (aside from power loss and updates).


To play with. I think I'd enjoy that. Also, the iPhones are cool hardware!

And I've used unlocked bootloader devices before. My threat model allows for that exploit to be performed on me should someone desire it. The utility I gain exceeds the risk = f(likelihood, loss).


Where is your hacker spirit?


> I wonder how much of the responsiveness of iOS is due to the hardware.

It's all software.


I hope the right to repair movement catches on.


Does anyone know why there's tape on the button?


Based on context, I kind of assume it's to avoid finger print data being collected.


iPhone finger print data isn't sent anywhere; it stays on the phone


Right, but there are a lot of people who don't trust that that's actually true, either thinking Apple is lying or thinking third-party bad actors can probably hack the software and firmware to get that data. At least that's my guess based on what I know of the typical HN crowd.


This technical achievement is awesome as is postmarketOS, but ... I can get that on an arguably better device. Has there been success getting iOS running on other hardware?


Not that I've heard of. Recently I stumbled upon this guy trying to get it to work in QEMU: https://alephsecurity.com/2019/06/17/xnu-qemu-arm64-1/


Did he put tape on the home button so we can't steal his fingerprints?


It's just sad that Apple does not let owners do this with their devices by default. We had to wait for a bootrom exploit.


Yeah. I would be buying an iPad these days if I could just install Linux on it, but no, that’s not allowed. So I’ll keep my money, thanks Apple.


Is this using some SecureROM or iBoot exploit? If not, how do they bypass the secure boot chain?


Checkm8 bootrom exploit. Checkra.in uses this for a jailbreak that should work on any iOS version as well.


In the article they describe using the checkra1n jailbreak for the exploit process.


Technically, they're just using the checkm8 exploit, not the checkra1n jailbreak.


Thanks -- looks like it's described in an article linked from the one submitted here.


I love that it’s called “Project Insantity”. It gave me a smile.


You mean "Project Insanity"


What I'd really want to see is Android phones running iOS.


Cool


Maybe link to the actual blog post about doing this instead of this blog post about a blog post about doing this? https://blog.project-insanity.org/2020/04/16/running-postmar...



The PC took over the personal computing and server markets because it was an open spec.

Apple didn't learn from its own mistakes, which almost made it bankrupt.

While it is now doing very well, iOS devices competing with Android devices is very similar to Apple competing with the PC.

People like choice, competition, and not being locked down.


When you say "open spec", what exactly do you mean? MacOS is built on top of Unix / Debian, which is a much more open system than Windows.

Apple computers have been able to run Windows for a long time.


macOS is quite unrelated to Debian, it is built on the Mach microkernel and FreeBSD kernel/userland bits.

https://en.wikipedia.org/wiki/MacOS


You're right, I fudged Darwin and FreeBSD in my head and got Debian.


Newsflash: the Mac is currently quite successful.


https://gs.statcounter.com/os-market-share/desktop/worldwide

Still less than 20% of the market. And I feel like the main reason they're even usable as a desktop OS is that 99% of most people's computer activity takes place in a web browser, not because OSX has a particularly special native app collection. Hell, it seems like the only time I fire up my mac is because I need to build an iDevice app, and nearly every "native" mac app I run is really Electron based.


> nearly every "native" mac app I run is really Electron based.

What does this have to do with the platform? You choosing to use bloated electron apps certainly isn't Apple's fault.


I really like to develop software on my MBP; I must be in the 1% then :\


Currently yes. In the past, no.

>1997: Microsoft rescues one-time and future nemesis Apple with a $150 million investment that breathes new life into a struggling Silicon Alley icon. https://www.wired.com/2009/08/dayintech-0806/


Is it really a "smartphone" yet if it can't make phone calls or use the cell modem? Not discounting the work that's been done, but the title feels a little premature.

https://projectsandcastle.org/status


I mean, it's a great technical achievement... we'll see when it hits full support.


if



