It runs 24/7 alongside the "ok google" detection, and as soon as it detects a song, it shows the name on the lockscreen. So normally, all you have to do is look at your phone to get the song name [0]. This is also done 100% locally: there's an offline database of 10k+ songs which is only ~50MB [1], and it gets updated regularly.
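For anyone curious how this can run fully offline: the classic technique (from the published Shazam paper, not necessarily what Google ships here) is to reduce each song to a bag of small hashes built from pairs of spectrogram peaks, which survive noise well. A minimal sketch in Python, with all names made up:

    # Sketch of Shazam-style fingerprinting; illustrative only,
    # not Google's actual implementation.
    import numpy as np
    from scipy import signal

    def fingerprint(samples, rate=16000, fan_out=5):
        """Return a set of (hash, time_offset) pairs for a mono buffer."""
        freqs, times, spec = signal.spectrogram(samples, fs=rate, nperseg=1024)
        log_spec = np.log(spec + 1e-10)
        # Keep spectral peaks that stand out from the background.
        threshold = log_spec.mean() + 2 * log_spec.std()
        peaks = []
        for t in range(log_spec.shape[1]):
            for f in signal.argrelmax(log_spec[:, t])[0]:
                if log_spec[f, t] > threshold:
                    peaks.append((f, t))
        # Pair each peak with a few later peaks and hash the triple;
        # (f1, f2, dt) is far more robust than the raw audio itself.
        hashes = set()
        for i, (f1, t1) in enumerate(peaks):
            for f2, t2 in peaks[i + 1:i + 1 + fan_out]:
                dt = t2 - t1
                if 0 < dt <= 64:
                    hashes.add((hash((f1, f2, dt)), t1))
        return hashes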
Siri has been able to identify music (I believe using Shazam?) for 3-4 years. I suspect Shazam was shopping itself around, and Apple wanted to own the technology enough to pay their asking price.
The technology Google is using here is very different. They also have active detection in Assistant, but this one is passive, 24/7 detection. There's a local database of the ~10k most popular songs, and it's only ~50MB [0]. It's all done locally, and when it works (which is very often), you just have to look down at your phone and the song name is already there on the lockscreen, without you touching anything.
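And the reason a local match is cheap: once songs are reduced to fingerprint hashes, identification is just an inverted-index lookup plus a vote on time alignment. A sketch of that side, again with made-up names:

    # Sketch of the lookup side of on-device matching (hypothetical names).
    # The database maps each hash to the songs (and offsets) it appears in;
    # a true match has many hashes agreeing on one time alignment.
    from collections import Counter, defaultdict

    def build_index(songs):
        """songs: {song_id: set of (hash, offset)} -> inverted index."""
        index = defaultdict(list)
        for song_id, fps in songs.items():
            for h, offset in fps:
                index[h].append((song_id, offset))
        return index

    def identify(query_fps, index, min_votes=5):
        votes = Counter()
        for h, q_offset in query_fps:
            for song_id, s_offset in index.get(h, ()):
                # Hashes from the right song agree on one offset delta.
                votes[(song_id, s_offset - q_offset)] += 1
        if not votes:
            return None
        (song_id, _), count = votes.most_common(1)[0]
        return song_id if count >= min_votes else None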
As mentioned above, the microphone is already running for OK Google detection. It's also not scanning 100% of the audio; as far as I understand, it analyzes samples roughly once a minute. I'm sure they would not ship this as a feature if it had a big battery impact. In my own experience, turning this feature on or off made no difference in battery life.
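That sampling cadence is presumably why the cost is negligible: the recognizer only wakes for a few seconds at a time. Something like this toy loop, where both the snippet length and the interval are my guesses, not documented values:

    # Toy duty-cycle loop; timings are assumptions, not Google's numbers.
    import time

    SNIPPET_SECONDS = 8    # short capture per attempt (guess)
    INTERVAL_SECONDS = 60  # "roughly once a minute" per the comment above

    def now_playing_loop(record, identify, show_on_lockscreen):
        while True:
            snippet = record(SNIPPET_SECONDS)   # mic buffer, a few seconds
            song = identify(snippet)            # local fingerprint lookup
            if song is not None:
                show_on_lockscreen(song)
            time.sleep(INTERVAL_SECONDS)        # idle between attempts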
I haven't been able to determine if this is a deliberate plot or just incompetence, but Siri's language parsing for the Shazam music identification became maddeningly limited in some recent iOS update. The only phrasing I can trigger it with now is "What's this song?". "Name this song", "What song is this", etc. all give me some Marx Brothers-esque response like "Sorry, I couldn't find the song What in your library."
Oh, and the only thing you can do with the result is tap it to be bumped to Apple Music, where (if you have a subscription and the song is available), the song starts playing. Hey Apple, when in the history of ever has someone wanted to start playing the song they are already listening to? At least let me copy the damn name. Siri is a dumpster fire.
Google Assistant has also been able to do that for ages.
The Pixel does something very different: it automatically tries to identify what is playing right now and writes the name of the song on the lockscreen.
It might seem gimmicky, but I have found myself really liking this feature.
Gracenote (the people who bought and closed CDDB) offers music identification to anyone who wants to license it. I remember Sony Ericsson feature phones all had this feature, powered by Gracenote. I would assume that's what Apple is using.
Is it the one that "Google Now" uses, or "Google Assistant"? Are they the same? I don't even know with Google anymore, man, I just can't keep track of this shit.
Google Assistant has a music identification feature now (up until recently it did not).
Google Now is still there, but the search feature that used to be on the launcher is just a web search now (you can't do anything fancy with it anymore; that's reserved for Assistant).
Pixel 2 has a music identifier that is always listening and will show the name of whatever song is playing on your lock screen. It does this with a local database and does not send audio data to Google.
> It does this with a local database and does not send audio data to Google.
Do you have details on this? I'm honestly quite surprised; I'd imagine the fingerprints for enough songs to make that feature usable would take up quite a bit of space.
The one on Pixel 2 is done in the background on device, but it has a small catalog of titles. You can explicitly ask the Assistant for all the other songs.
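For a rough sense of scale, the ~50MB / ~10k figures quoted upthread work out to a few KB per song, which is plenty of room for compact fingerprint hashes:

    # Back-of-envelope feasibility check; all numbers assumed from the thread.
    db_bytes = 50 * 1024 * 1024    # ~50MB on-device database
    songs = 10_000                 # ~10k song catalog
    per_song = db_bytes / songs    # ~5.2 KB per song
    hash_bytes = 8                 # e.g. one 64-bit hash each (assumption)
    print(per_song / hash_bytes)   # ~655 hashes per song

A few hundred hashes per track is roughly the right ballpark for this kind of fingerprinting, so a small catalog fitting in ~50MB seems plausible.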
https://venturebeat.com/2017/10/19/how-googles-pixel-2-now-p...