> Microsoft had been planning to use a number of sensors on the sides of a device to detect how a phone is held by grip, allowing 3D Touch-enabled phones to block an orientation switch when you're lying down in bed.
Samsung devices do this by checking the orientation of your face through the camera. They will also stay unlocked by checking whether you're still looking (while reading a long page of text, for example), and they'll pause/resume videos when you look away.
The problem, of course, is that none of this works when the lights are off - like when you're in bed - which is one of the most common use cases.
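A minimal sketch of what that decision logic might look like, in TypeScript: only rotate when gravity and the face-orientation signal agree, and keep the current orientation when no face is visible (the lights-off case above). The `SensorSnapshot` shape and the face-orientation input are hypothetical stand-ins for whatever the camera or grip sensors would actually report.

```typescript
// Sketch: rotate only when the phone's physical orientation and the user's
// face orientation agree. All names here (SensorSnapshot, decideOrientation)
// are hypothetical; they just illustrate the decision logic.

type Orientation = "portrait" | "landscape";

interface SensorSnapshot {
  gravityOrientation: Orientation;      // derived from the accelerometer
  faceOrientation: Orientation | null;  // null when no face is visible (e.g. lights off)
}

function decideOrientation(current: Orientation, s: SensorSnapshot): Orientation {
  // Can't see a face (dark room, camera covered): keep the current orientation
  // rather than trusting gravity alone -- the lying-in-bed case.
  if (s.faceOrientation === null) return current;

  // Rotate only when gravity and the face agree on the new orientation.
  return s.gravityOrientation === s.faceOrientation ? s.gravityOrientation : current;
}
```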
They seriously do that? This alone could get me to switch. Switching from the iPad 2 to the iPad Air 2 was painful because they removed the physical switch that could be used to lock orientation. I used to use it for exactly this. Now I have to swipe up (sometimes twice, depending on the app), toggle rotation, and then swipe down. First world problems, I know, but it's really annoying. Detecting the situation with no interaction at all is absolutely amazing. It's the sort of subtle solution I would expect from Apple, specifically in contrast to Microsoft.
My Z3 Compact has a feature where it will automatically reorient the screen unless you rotate the phone slowly, in which case it keeps the orientation fixed. Sony calls it "Smart Rotation".
It works great except when it doesn't. Pretty frequently I'll lie down, use the phone for a couple of minutes, and then while holding the phone perfectly still it'll go "HEY THE PHONE IS IN LANDSCAPE I'D BETTER ROTATE" all of a sudden.
Which is to say, this kind of thing is great if it's 100% reliable, but if you only get to 95% it's a nuisance. More often than not I just manually lock it because I expect the Smart Rotation feature to flake out, and fixing it afterward is more work than locking at the start (rotate back to the right orientation according to gravity, lock the rotation, then rotate back to my viewing position).
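The gist of a feature like Smart Rotation can be sketched as gating the orientation switch on how fast the phone was turned: anything below a threshold is treated as the user shifting position rather than deliberately rotating. This is just an illustrative TypeScript sketch with made-up thresholds and a simplified sampling model, not Sony's actual implementation.

```typescript
// Sketch: only switch orientation if the roll angle changed quickly.
// Thresholds and the sampling model are assumptions for illustration.

type Orientation = "portrait" | "landscape";

const FAST_ROTATION_DEG_PER_SEC = 90; // assumed cutoff between "deliberate" and "slow drift"

class SmartRotationSketch {
  private lastAngle = 0;
  private lastTime = 0;
  private orientation: Orientation = "portrait";

  // angleDeg: device roll angle from the accelerometer; timeMs: sample timestamp.
  onSample(angleDeg: number, timeMs: number): Orientation {
    const dt = (timeMs - this.lastTime) / 1000;
    const velocity = dt > 0 ? Math.abs(angleDeg - this.lastAngle) / dt : 0;
    this.lastAngle = angleDeg;
    this.lastTime = timeMs;

    const a = ((angleDeg % 180) + 180) % 180; // normalise to [0, 180)
    const target: Orientation = a > 45 && a < 135 ? "landscape" : "portrait";

    // A slow drift into the other orientation (lying down, shifting in bed)
    // keeps the current orientation; only a quick turn switches it.
    if (target !== this.orientation && velocity >= FAST_ROTATION_DEG_PER_SEC) {
      this.orientation = target;
    }
    return this.orientation;
  }
}
```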
I wish it were possible to rotate the screen contents through a full 360 degrees, as opposed to snapping between only two orientations. I'm thinking of a "magic 8 ball": wouldn't it be cool if, when you were holding your screen at a 45-degree angle, the whole screen rotated to stay upright? Obviously you would then have a bunch of empty space at the sides, but it would feel much more physically realistic.
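On the web, at least, something like this can be approximated today with the real devicemotion event: compute the roll angle from gravity and counter-rotate a wrapper element so its contents stay upright. The sign convention, the smoothing factor, and the `#content` wrapper below are all assumptions and would need tuning on a real device.

```typescript
// Sketch of the "magic 8 ball" idea: counter-rotate the page content by the
// device's roll angle so it always appears upright, instead of snapping
// between two fixed orientations.

let smoothedAngle = 0;

window.addEventListener("devicemotion", (e: DeviceMotionEvent) => {
  const g = e.accelerationIncludingGravity;
  if (!g || g.x === null || g.y === null) return;

  // Roll angle of the device in the screen plane, in degrees.
  const angle = Math.atan2(-g.x, g.y) * (180 / Math.PI);

  // Simple low-pass filter so the content doesn't jitter with every sample
  // (ignores the wrap-around at +/-180 degrees for brevity).
  smoothedAngle += 0.1 * (angle - smoothedAngle);

  const content = document.getElementById("content"); // assumed wrapper element
  if (content) {
    content.style.transform = `rotate(${-smoothedAngle}deg)`;
  }
});
```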
I think on the whole you'd obviously rather have this capability than not, and in the hands of developers interesting stuff will happen.
But I chuckled a little thinking back to all the Fitts's Law discussions that used to go on about Mac versus Windows menus: on the Mac, the hit target is effectively much larger because you can just throw the cursor at the top of the screen. On Windows, it was always smaller, because you had to aim below the title bar. This reminded me a bit of that: full-fledged touches are dead easy to get right, hovering slightly less so. So I'd hope that if this makes it into the final platform (or inevitably gets copied), there's a lot of guidance for developers to use it only for enhancing existing gestures, instead of making it a primary mode of interaction.
Microsoft has this habit of being first to a market or technology and then either ignoring it or releasing a shitty solution. I'm not an Apple guy; I use both a Windows PC and a Windows phone.
So I'm not being biased against Microsoft when I say that the best bet we've got of seeing this tech in an actual product is for Apple to "invent" it and release it first.
I want this simply to have :hover states on mobile websites. The pre-anticipation mode makes a lot of sense and makes tapping to show controls seem archaic.
Fingers already don't have much precision when tapping; it's even worse when hovering, so hover states on your DOM elements are likely not to work in a useful way. And then just imagine a menu that opens on hover suddenly obscuring the whole site. I'd rather take just the highlighting of links in the browser, as shown in the video, than try to adapt touch screens back to interfaces that were meant for much higher-precision pointing devices.
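If pre-touch hover ever did reach the web, the least disruptive home for it would probably be the existing Pointer Events model, where a hovering finger could arrive as a pointer with zero pressure (as pens already do). A cautious sketch, limited to highlighting links rather than opening anything; the `pre-touch-highlight` class and the zero-pressure assumption are illustrative.

```typescript
// Sketch: wire hover feedback to links via Pointer Events, and keep it to a
// highlight only, to sidestep the precision problem described above.

document.querySelectorAll("a").forEach((link) => {
  link.addEventListener("pointerover", (e: PointerEvent) => {
    // Assumption: a hovering (not yet touching) finger would report pressure 0,
    // the way hovering pens do today.
    if (e.pressure === 0) link.classList.add("pre-touch-highlight");
  });
  link.addEventListener("pointerout", () => {
    link.classList.remove("pre-touch-highlight");
  });
});
```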
I am a frustrated mobile phone touchscreen user, feeling like the phone never quite knows what I want to do. This is the first time I've felt a touchscreen might actually help me be more productive rather than act as a roadblock.
If anyone from MS is reading this, though, please include the following feature: let me choose the hover-touch behaviour for various modes. If the phone is locked and the screen is off, hovering with one finger could show me the time, hovering with two might show mail/text notifications, and maybe a certain hover gesture could open the camera immediately so I can quickly take that pic, and so on.
I'd like to define certain hover patterns and responses.
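Conceptually, that's just a user-editable mapping from hover gestures to actions. A minimal TypeScript sketch, where every gesture name and action is hypothetical:

```typescript
// Sketch: a configurable table of hover gestures -> actions that a settings
// screen could expose. Gesture names and actions are made up for illustration.

type HoverGesture = "one-finger-hover" | "two-finger-hover" | "hover-swipe";

interface HoverAction {
  description: string;
  run: () => void;
}

// Placeholder actions; on a real device these would call into the OS.
const showClock = () => console.log("show clock");
const showNotifications = () => console.log("show mail/text notifications");
const openCamera = () => console.log("open camera");

const hoverBindings: Record<HoverGesture, HoverAction> = {
  "one-finger-hover": { description: "Show the time", run: showClock },
  "two-finger-hover": { description: "Show notifications", run: showNotifications },
  "hover-swipe": { description: "Open the camera", run: openCamera },
};

// Called by the (hypothetical) pre-touch layer when it recognises a gesture.
function onHoverGesture(gesture: HoverGesture): void {
  hoverBindings[gesture].run();
}
```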
The Galaxy S4 had something like this, except that no one (other than Samsung's own apps) implemented anything to make use of it.
In the end, all I can remember of it is that you could wave over the top of the screen to change home screens, and that a little ball of light would move around on the lock screen when you moved your finger over it.
Theoretically, maybe. Practically, probably not. First of all, this is (assuming perfect sensor accuracy) fairly predictable, in that identical interactions yield identical results; it's not very random. Then there's the whole point that the UI presented is context-aware, in that it knows where your finger is. Thus it has the potential to be much better to use than a static UI.
You can probably compare it to the clipboard copy button appearing near your selection instead of in a static place on the screen. While it's not in the same location every time, it's in a sensible and usable location.
Of course, this all presumes that the system never makes mistakes, such as assuming a thumb is coming from the left when it's actually coming from the right. Then the experience is much worse. However, considering how often my phone thinks it should go to landscape mode while in portrait (never), I guess that's not going to happen often either. And if things do go that wrong, it's your touchscreen that's misbehaving; you probably wouldn't worry much about UI popping up in strange places when you simply cannot use your phone at all.
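The placement logic itself is simple enough to sketch: given the predicted touch point from the hover sensor, put the control next to it, but never under the finger and never off screen. Everything here (names, sizes, the margin) is an illustrative assumption.

```typescript
// Sketch: place a popup near the predicted touch point, clamped to the screen.

interface Point { x: number; y: number; }
interface Size { width: number; height: number; }

function placePopupNearFinger(finger: Point, popup: Size, screen: Size): Point {
  const margin = 16; // small gap so the popup doesn't sit directly under the finger

  // Prefer placing the popup above the finger; flip below if there's no room.
  let y = finger.y - popup.height - margin;
  if (y < 0) y = finger.y + margin;

  // Centre horizontally on the finger, then clamp to the screen edges.
  let x = finger.x - popup.width / 2;
  x = Math.max(0, Math.min(x, screen.width - popup.width));

  return { x, y };
}
```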
There are many effects like this on your phone already, and when they're subtle, they greatly improve usability.
Take your phone, flip it upside down, and use the software lock to keep the orientation from flipping (iOS has the orientation lock switch in Control Center).
You'll start to see subtle differences in how easy it is to tap buttons and other targets. The software is adjusting for how a user actually touches the screen compared with where they think they're touching.
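A stripped-down version of that compensation could be sketched as a small, orientation-dependent nudge applied to the raw touch coordinates; the offset value and the flipping behaviour here are assumptions, not any platform's actual numbers.

```typescript
// Sketch: nudge the reported touch point toward where users perceive their tap.
// If the device is upside down but the UI hasn't rotated, the same correction
// now points the wrong way, which is why targets feel harder to hit.

interface Point { x: number; y: number; }

const TOUCH_OFFSET_PX = 8; // assumed typical correction

function correctedTouchPoint(raw: Point, upsideDown: boolean): Point {
  const dy = upsideDown ? TOUCH_OFFSET_PX : -TOUCH_OFFSET_PX;
  return { x: raw.x, y: raw.y + dy };
}
```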
Here we're seeing extreme examples, which probably wouldn't work well in practice, but subtle integrations could do wonders.
Offering up UI controls near an available finger seems like a great idea. If you're selecting a file at the bottom of the screen, you might want the controls at the top of the screen. If the phone can sense that that's what you're trying to do, it could make for a great UI.
Isn't "are magnificent" (as opposed to "would be magnificent") a bit of a stretch? I can't watch the video right now, but it sounds like it is a concept video for something that was never actually implemented. As such, it is wonderful to dream big and to reward big dreams; but that should not, I think, be confused with delivering on those big dreams.
EDIT: Thanks to dang for changing this. Just to be clear, my post referred to the original title, taken from The Verge article which was originally linked.
I would update the title to your suggestion, as I agree with you — but I want to be true to The Verge — I imagine it's classic PR spin on their part...
That said, there's one "feature" I'd like to see implemented on modern smartphones first: make them stop lagging and hanging! Seriously, this is ridiculous. Any UI improvement like this will be more annoying than helpful on today's smartphones, because the phone will randomly refuse to detect or react to a "hover".
I liked my Windows Phone a lot. It was the most responsive phone I'd used in years, and I could resize the tiles to the shape and position that I wanted, so my most common apps were more accessible.
But then they laid off 7,400 employees from the division and had no product roadmap, and it was obvious they were going to write off their investment in Nokia. So I left not because I didn't like it, or liked something else more, but because of a lack of confidence in their commitment to it.
Microsoft, you need to do a better job of announcing your direction.
Agreed – I had a Lumia that I really loved, but the lack of direction for the platform and a severely lacking app ecosystem ultimately brought me back to an iPhone.
This is a highly underrated feature.