My favourite demo is hidden pretty deep in the text - here's a SQL query that shows the machine learning labels that were applied to each photo by Apple Photos:
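To make the idea concrete, here's a minimal sketch of that kind of query using Python's `sqlite3`. The table and column names (`photos`, `labels`, `uuid`, `label`) are hypothetical stand-ins for illustration, not the actual dogsheep-photos schema:

```python
import sqlite3

# Build a tiny in-memory database with a made-up schema standing in
# for the real one: one row per photo, one row per (photo, label) pair.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE photos (uuid TEXT PRIMARY KEY, date TEXT);
CREATE TABLE labels (uuid TEXT, label TEXT);
INSERT INTO photos VALUES ('a1', '2020-01-01'), ('b2', '2020-02-01');
INSERT INTO labels VALUES ('a1', 'Chicken'), ('a1', 'Food'), ('b2', 'Beach');
""")

# List every machine learning label alongside the photo it was applied to.
rows = conn.execute("""
    SELECT p.uuid, p.date, l.label
    FROM photos p JOIN labels l ON l.uuid = p.uuid
    ORDER BY p.date, l.label
""").fetchall()
for uuid, date, label in rows:
    print(uuid, date, label)
```

The same join shape works against the real database once you swap in the actual table and column names.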
I didn't realize that you could search for things in Apple Photos. That's the main reason I use Google Photos: its search capabilities are amazing. When I search for "chicken" it returns fried chicken, raw chicken, and actual chickens. I'll have to check this out.
The thing I don't like about Apple Photos is that when you sync with your Apple TV 4K, you can't actually play 4K 60 fps videos on your TV. I don't know why, but it just won't work. That's extremely disappointing to me, and I don't understand the reason, unless the Apple TV 4K is underpowered.
I'm pretty sure classification happens locally for Apple as well. They call it "on-device machine learning". It isn't mentioned for macOS Photos, but I can see `photoanalysisd` running in the background on my Mac.
I am okay with it fetching models; my issue is it uploading my files without consent. Thanks for sharing. I had no doubt they would do this, I just would like to know the whole picture. I am sick of companies uploading things without consent. What if I am a competitor? Now you've stolen my IP.
Thanks. I just read the article and it's interesting, but I think it's old. I do believe that the Apple TV can now match the frame rate of the original source, and it also supports Dolby Atmos. Unfortunately my Sonos doesn't, which is a real bummer.
The biggest issue for me is that I can upload a 4K 60 fps video to Photos, but it comes out stuttering and looks terrible. It defeats the purpose of shooting 4K60 on the iPhone if you can't actually view it on a TV.
I sync both my Google Photos and Apple Photos, and I hope they each also run their models on things I've uploaded from a different device. I wish the Google stuff was also queryable like this.
Originally my plan was to upload everything to Google Photos and then export the metadata out via their API into a SQLite database so I could query it.
Google Photos does not have a comprehensive API - and more importantly they refuse to release geolocation information through it, which killed that option entirely for me. https://issuetracker.google.com/issues/80379228
A feature I've long wanted to see in a photo management tool is ranking based on pairwise comparisons. Say you went on a trip and had 200 "keeper" photos, but that's still too many to publish in an online album, and many of them are near-duplicates anyway.
If the photo manager let you compare any set of photos two at a time, you could (with n-choose-2 keypresses) find something close to a total order of the set. Then you'd easily be able to choose the best photos of the whole set, as well as the best ones of each subject/set of duplicates. No machine learning would be required.
Has anyone encountered a tool like this?
Edit: You wouldn't get a stable total order because the pairwise comparisons might not hold transitively, but something like Elo ranking could work here.
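The Elo idea above can be sketched directly. This is an illustrative toy, not an existing tool: `prefer(a, b)` stands in for the user pressing a key to pick the better of two photos, and every pair is compared once, i.e. n-choose-2 judgments:

```python
import itertools
from collections import defaultdict

def elo_update(ra, rb, a_wins, k=32):
    """Standard Elo update for one pairwise comparison."""
    expected_a = 1 / (1 + 10 ** ((rb - ra) / 400))
    score_a = 1.0 if a_wins else 0.0
    ra_new = ra + k * (score_a - expected_a)
    rb_new = rb + k * ((1 - score_a) - (1 - expected_a))
    return ra_new, rb_new

def rank_photos(photos, prefer):
    """Rank photos from pairwise judgments (prefer(a, b) -> True if a wins)."""
    ratings = defaultdict(lambda: 1000.0)
    for a, b in itertools.combinations(photos, 2):
        ratings[a], ratings[b] = elo_update(ratings[a], ratings[b], prefer(a, b))
    return sorted(photos, key=lambda p: ratings[p], reverse=True)

# Toy example: "quality" is secretly numeric, so this preference happens
# to be transitive; real human judgments need not be, which is exactly
# why Elo ratings (rather than a strict comparison sort) are a good fit.
quality = {"IMG_1": 3, "IMG_2": 9, "IMG_3": 6}
ranked = rank_photos(list(quality), lambda a, b: quality[a] > quality[b])
print(ranked)
```

Because ratings tolerate intransitive results, you could also stop early after a subset of the comparisons and still get a usable rough ordering.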
PhotoStructure has image comparison heuristics that are used for deduping, but they would also enable pairwise similarity grouping. I'd tried some transfer learning in the past to rank photos, but found it to be unusably noisy. I'll be trying again, though, with some new approaches. FWIW, your PhotoStructure library has a carefully designed schema that makes SQLite queries like this trivial.
It's an interesting idea to let users pick the best out of near duplicates (and maybe prompt the user to work through this UI flow on album creation). I've put it on my to-do list!
Thanks, this is useful. Made me realize that I habitually hit Shift-Command-Plus when I'm browsing so I never notice the default text size of anything, even my own sites!
I understand "so small" but what do you mean by "so left"? Is there a usability enhancement in centering designs rather than left-aligning them?
Very cool -- is there a version of Apple Photos where this was implemented? Knowing that this data is queryable suddenly makes it very appealing as a photo app.
I've used Photos with a library on an external drive, but it must be HFS+, and don't expect it to keep working past ~100-200k images. My frustration with Photos helped motivate me to write PhotoStructure.
By default it keeps the photos in iCloud and only stores much smaller thumbnails on your local disk.
I've turned that option off because I wanted to be able to upload the images to S3 from the terminal, but if you just want to use it for search etc you wouldn't need to do that.
I have never managed to get iCloud Photos to work on Windows. That means I can't even actually download all the photos I have taken. It's one horrible app.
So apparently some machine learning model is secretly judging how aesthetic your photos are. I would have thought judgements of aesthetics would somehow be the last bastion of human subjectivity and human intellect. Apparently I'm wrong.
This is actually really interesting for me, because I take a lot of pictures and many are similar. Objectively I don't care which ones I keep, but when I look at them I like one aspect of one picture and a different aspect of a nearly identical one. I was recently thinking about a tool that would take all the photos from an event, filter out a handful it thinks are best, and offer to delete the rest.
Did later versions of classic have any automatic classification? You can indeed use the database to get your tags etc out, but at least the versions I used didn't have any automatic tagging/labeling features.
https://dogsheep-photos.dogsheep.net/public?sql=select%0D%0A...