
I had a Pixel. That it took a screenshot when I switched apps makes sense. It allows the task switcher to open immediately and show the most recent state of all my apps. A screenshot of some sort is mandatory for the OCR functionality that allowed me to select text from these tiles in the task switcher (super handy!).

I’m now on iOS 15 on an iPhone 12 Pro Max. I think I’ve seen movement in the tiles in its task switcher, so I’m not sure whether it takes screenshots. But the fact that the task switcher opens with no delay suggests that screenshots might be used?

I’m only defending taking screenshots. Transmitting them to other parties is problematic.




> I think I’ve seen movement on the tiles in its task switcher, so I’m not clear if it takes screenshots.

In my experience, it seems like only the app you were in when you brought up the task switcher continues to update the screen. If you go somewhere else, like just back to the home screen, it goes static like all the rest.


This is correct. iOS snapshots the app as soon as it's moved into the background, and that snapshot is what you see. When you bring up the switcher, the foreground app isn't backgrounded yet — that only happens if you go to the home screen or actually switch apps.
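Because the snapshot happens at that moment, Apple's guidance is to obscure sensitive content before the app resigns active. A minimal sketch, assuming a scene-based UIKit app (the `privacyCover` blur view is my own illustration, not a system API):

```swift
import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?
    // Hypothetical blur overlay used to keep sensitive UI out of the snapshot.
    private let privacyCover = UIVisualEffectView(effect: UIBlurEffect(style: .regular))

    func sceneWillResignActive(_ scene: UIScene) {
        // Cover the window now; the system captures its app-switcher
        // snapshot after the app leaves the active state.
        privacyCover.frame = window?.bounds ?? .zero
        window?.addSubview(privacyCover)
    }

    func sceneDidBecomeActive(_ scene: UIScene) {
        privacyCover.removeFromSuperview()
    }
}
```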


If the app uses the Background App Refresh entitlements [1] (background fetch / background processing), then it is possible for iOS to update the app's screenshot in the app switcher periodically, even while the app is in the background.

Messages does this; you'll notice that an active conversation tends to be up to date in the app switcher.
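For reference, the modern entry point for this is the BGTaskScheduler API (iOS 13+). A hedged sketch, assuming the identifier "com.example.refresh" is declared under BGTaskSchedulerPermittedIdentifiers in the app's Info.plist (that identifier is mine, not anything from the thread):

```swift
import BackgroundTasks

// Register a handler once, early in app launch.
func registerRefreshTask() {
    BGTaskScheduler.shared.register(
        forTaskWithIdentifier: "com.example.refresh", using: nil
    ) { task in
        // Refresh content here; when this completes, iOS can also
        // update the app's snapshot in the app switcher.
        task.setTaskCompleted(success: true)
    }
}

// Ask the system to wake the app later; iOS decides the actual timing.
func scheduleRefresh() {
    let request = BGAppRefreshTaskRequest(identifier: "com.example.refresh")
    request.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60) // ~15 min
    try? BGTaskScheduler.shared.submit(request)
}
```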

[1] https://developer.apple.com/documentation/uikit/app_and_envi...


As I understand it, each iOS application is sort of like its own 3D plane within a larger environment, which is why the launcher shows up without any lag.

I hope someone can do the work of digging up the original Aqua framework overview that's probably still hiding somewhere on the Apple website. The way the combination of OpenGL (Metal?) and PDF works to render UI and elements on OS X and iOS is really quite remarkable. I think even now, 20 years later, there isn't anything comparable being done by Android/Linux or Windows. I would love to be proven wrong, however (I haven't followed this closely for the past few years).


Yeah, the iOS multitasking view traces all the way back to OS X 10.5, where the windows in Exposé were actual windows instead of snapshots, and to the parlor trick of QuickTime Player windows continuing to play video while minimized to the Dock, all the way back in 10.0 (and perhaps the 10.0 public beta, I forget). It's the kind of thing that family of operating systems has handled well for a long time.


Compiz and all subsequent compositing managers do the same thing for Linux (each app has its own surface in the GPU and can be composited in 3D), and I believe the compositing in Windows Vista and later is similar.



This is close. There was something that introduced Aqua to the world and played up the combo of PDF and OpenGL... Probably also explained in some WWDC 2001-ish video.


How have you found the transition to iOS? For me, the task switcher OCR feature is absolutely killer, one of the main things still keeping me on Android. Does iOS have anything similar?


I find the Pixel experience to be superior. But when I took each of the areas where the Pixel is better, scored their value item by item, the total came out recommending I keep the iPhone: https://www.arencambre.com/iphones-are-inferior-to-android-p...

Context: I made that right after I got an iPhone 12 Pro Max, which was running iOS 14. iOS 15 may bias the score toward Apple even more on the current phone, and the iPhone 13 biases it a bit more.

I still like Android better.


iOS 15 now OCRs text across the OS, including screenshots. So you can take a screenshot and get OCR'd text from there.
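Apps can reach the same text recognition through the Vision framework. A sketch, assuming `cgImage` holds the screenshot as a CGImage (that parameter name is mine):

```swift
import Vision

// Run Vision's text recognizer over an image and return the best
// candidate string for each detected text region.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return (request.results ?? []).compactMap {
        $0.topCandidates(1).first?.string
    }
}
```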


That's more of a process than simply selecting text on a task switcher tile.


I guess. You have to hit the screenshot combo and then tap the screenshot, versus hitting the app-switcher button. Are you doing this often enough for that 1 extra step to be a big deal?


I’m increasingly finding great value in reducing complexity of simple tasks. I thought the push button rear door closer on my minivan was silly, but it came with it, so (shrug). I’ve grown to like it!

Reducing from a few steps plus a major context switch to just one step is valuable.


Where’s the context switch?


For me, yeah this would be a much different experience. I use this feature all the time, to select anything from the title of a song on Spotify to a phone number embedded in an image on the web.


In the latter case, you could just select the text in the image directly. How often do you use this feature per day?


From what I gathered, this is only available on the newer phones, though.



