It will be interesting to see how well it actually works. I like the idea of being able to track where my cats are and what they are doing - and, more importantly, of finding them when the fire alarm goes off...
I could also see some interesting use cases of this with home automation, particularly with lights and being able to say that someone is actually in a room instead of just sensing motion (see: cats).
That being said, I do not like the severe potential for abuse and privacy issues here. Given that it can identify specific people, I have to imagine this is a bit more detailed than just a human-sized blob? Could it theoretically identify what you are doing? Could it give you a real-time view of someone (or multiple people)? Even if not super detailed, it could still give you an idea of what is going on.
I could see this particularly problematic in apartment buildings.
> being able to say that someone is actually in a room instead of just sensing motion
I have often thought that Apple's UWB chip is going to be used for this sometime in the future. Just think: we know Timmy is upstairs watching TV, two people are in the living room, and someone just opened the garage door and all of a sudden joined WiFi.
Apple: we have real estate data inference. Time to monetize it.
Honestly, I have wondered about this for a long time. I already have HomePods all around the house; given that chip, I would imagine it would not be too difficult to say that someone is in a given room.
I imagine the hardest part is setup - would the UI just have you walk around with your phone to define walls? And what if the HomePod moves?
Maybe I am overestimating the technology, but if I could ask "where are my keys" and it said "they are in the bedroom", it would be fantastic.
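If a future setup flow really did have you trace room boundaries with your phone, mapping an estimated tag coordinate to a room name could be as simple as a point-in-rectangle lookup. A minimal sketch - the function name, room names, and coordinates are all made up for illustration, not any real Apple API:

```python
def room_of(point, rooms):
    """Return the name of the first room whose rectangle contains `point`.

    `rooms` maps a room name to an axis-aligned rectangle (x0, y0, x1, y1).
    """
    x, y = point
    for name, (x0, y0, x1, y1) in rooms.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "unknown"

# Hypothetical floor plan captured during setup
rooms = {
    "bedroom": (0, 0, 4, 3),
    "living room": (4, 0, 10, 6),
}
print(room_of((2.0, 1.5), rooms))  # → bedroom
```

Real walls are rarely axis-aligned rectangles, so a shipping version would presumably use arbitrary polygons, but the lookup step itself is this simple once a position estimate exists.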
That would be great for all of it, but I'm surely not the only person who has an Apple AirTag on their keys and wallet because of how fantastic that ability already is.
I am confused by the second part; I'm not sure what the problem with that is.
I can already ask the HomePod "where are my keys" and it knows which AirTag (keys) is mine. It just doesn't tell me the room; it pings it, and if for some reason it isn't home it gives the address.
But yeah, if there were somehow room-level tracking I would have significantly more AirTags. Easily one of my favorite pieces of tech in a long time.
I will never understand why people claim Apple is aggressively trying to monetize user data when their whole business model is not monetizing user data, and they have moved toward on-device features and greater user privacy from Apple or anyone else.
For example, iPhones around the 11 or 12, and Apple Silicon Macs, gained the ability to do voice recognition for Siri, and dictation, on-device. Apple's servers are only used when Siri is asked to do something that requires them, like a web search, the weather, etc. OneDrive, Dropbox, and Google Drive all do not support E2EE; iCloud does.
Meanwhile, Google is not only monetizing your data but also using you, your device, and your data plan as a free data-collection platform for their services.
Notice that Apple rather famously told the FBI to go pound sand when they were asked for help decrypting a mass-shooter's cell phone?
Notice that you can shut off all of the user-sourced data (like traffic congestion and wifi/cellular network location data collection) on iOS, and there are no such controls on Android?
Did you notice that one company's devices are much cheaper than the other's, just like those "smart TVs" are much cheaper than "dumb" TVs were?
That's because to Google, you and your data are the product...
> I will never understand why people claim Apple is aggressively trying to monetize user data
Because they are a company, and companies loooove money. And data is worth a lot of money. The App Store has ads at the top of every search. Apple has had a taste of that money and, if the cynics are to be believed, it's only a matter of time before the whole company succumbs. If you think Apple is somehow genetically incapable of bad ideas, remember client-side CSAM scanning?
I've wanted an applicable use of something like this from UWB (Ultra-Wideband) for a while. Apple is really well-positioned for it, but nothing has yet come to fruition.
If devices like HomePods — or ideally, something much more affordable — placed around your home can begin to triangulate you based on a wearable (phone, watch, tag on a collar, etc.), then some really cool possibilities become imaginable.
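As a rough illustration of the geometry behind that triangulation idea: given distances from a wearable to three fixed anchors at known positions, subtracting one circle equation from the others leaves a small linear system. This is a toy 2-D sketch under the assumption of noise-free ranges (real UWB ranging is noisy and would need least-squares or filtering), not any actual Apple implementation:

```python
import math

def trilaterate(anchors, dists):
    """Estimate a 2-D position from distances to three fixed anchors.

    Subtracting the anchor-0 circle equation from the other two cancels
    the quadratic terms, leaving a 2x2 linear system solved by Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = anchors
    d0, d1, d2 = dists
    # Rows of the linearized system A @ [x, y] = b
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = x1**2 - x0**2 + y1**2 - y0**2 + d0**2 - d1**2
    b2 = x2**2 - x0**2 + y2**2 - y0**2 + d0**2 - d2**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-9:
        raise ValueError("anchors are collinear; position is ambiguous")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Hypothetical anchors (e.g. HomePods) at known spots; tag actually at (2, 3)
anchors = [(0.0, 0.0), (6.0, 0.0), (0.0, 4.0)]
true_pos = (2.0, 3.0)
dists = [math.dist(true_pos, a) for a in anchors]
print(trilaterate(anchors, dists))
```

With exact distances this recovers (2, 3) up to floating-point error; with noisy ranges you would add more anchors and solve the overdetermined system by least squares instead.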
This technology is incredibly old, and has definitely already been abused. This just puts it into your hands. There's no sense complaining - anyone motivated or evil enough was already in possession of it.
That is a weird way to see it. Just because a technology already exists (from what I can tell, the ability to identify specific people is a unique part of this) doesn't mean we shouldn't worry about improvements to it, or about it being put into more consumer technology, making it easier for average people to use.
I believe I experimented with this 6 years or so ago, then I found out other people did it way better than me. If you're talking about passive radar, it's really old. I don't know. Decades.