A UI Experiment with the iPhone X’s Front-Facing Camera (fastcodesign.com)
134 points by dirtyaura on March 5, 2018 | 105 comments



The much-maligned Amazon Fire Phone had four specialized high-speed head-tracking cameras on the front, and made this effect the centerpiece of its UI. I worked on it, and it was very cool. The phone failed for lots of reasons, but this particular "gimmick" actually held some promise. The clay-molding example game that shipped with it showed that being able to perceive the image in pseudo-3D had a real application.


I worked on it too. You're right about the head tracking - it did work very well. They had some great UX ideas centered around it too. It's unfortunate it didn't pan out for those UX features alone.


What were the great UX ideas? I think I missed them... The fact that you had to move your head or the phone just to see the time was bad. The 'peek' gesture, which required moving your head to the side and looking sidelong at the phone to open a side navigation menu, was equally lame.


To be self-critical, I think there was a lot of "we've got this cool hammer, let's make everything a nail" going on, where the perspective stuff was grafted onto a lot of areas better left alone. That said, the tilt scrolling on web pages was actually very nice to use (remember it was based on the angle to your eyes, not the physical phone angle, so you could use it lying down). And like I said, applications where it was helpful to glance around a 3D object were quite compelling. But yeah, there were too many places where the effect was used where it probably didn't need to be.


The feature I miss/wish for the most is looking at the phone sitting on your desk and having the screen turn on. "Wonder if I have any notifications... Nope."


Is this a joke? "Wild experiment"? "First to take the effect mainstream"? What kind of bullshit is that?

The effect in question, and the head-tracking principle behind it, were already used in the New Nintendo 3DS, which sold millions worldwide. Not only that: the 3DS's top screen is a true parallax-barrier display, meaning it shows a 3D picture with real depth and all.


N3DS face tracking is only used to correct the stereo separation (so each eye sees a separate image without your head having to be exactly in the center), not to move the scene depending on where you are. So with a scene like the one in that experiment, you would see it skew in the opposite direction as you move your head. Also, the vast majority of games don't have that much depth anyway, even at the highest setting, probably because the scene doesn't move accordingly and it would make people dizzy.


Yet it was essentially the same technology that the article above calls a "wild experiment".

Also, visually the scene does move accordingly, mostly because the top display is technically two displays in one, rendering the scene from two different angles.


The virtual cameras don't change (the games would have to be changed for that), so the parallax effect is not there. The only thing that changes is the grid of lines that obscures one image from each eye.

The only thing in common is that both have head tracking, that's all.


Yep, and that is exactly what I addressed in my first comment: that head-tracking mechanism was already used in a mainstream piece of technology.


This reminds me of this demo from 2007 https://www.youtube.com/watch?v=Jd3-eiid-Uw

And also didn't the Amazon Fire Phone have head tracking?

Also found this, done on an iPhone in 2011: https://www.youtube.com/watch?v=7saRtCUgbNI

Or this table using a Kinect in 2012: https://www.youtube.com/watch?v=2CbiOikirrg


The "experiment" is _how_ head tracking is used, not head tracking itself. I agree it's not exactly new, but nobody bothered to make it for the iPhone X until now.


Not quite the same effect. The 3DS does have an actual 3D image, but you only get a fixed view of it, since the viewing angle for seeing 3D is very narrow. This is fake 3D (hence trompe l'oeil), but it lets you view the "3D" object from a wide range of angles, since it works via head tracking.

Both have their advantages, but they're not the same thing.

That being said, the Nintendo DSi (the predecessor to the 3DS) did have a trompe-l'oeil 3D game, called "Looksley's Line Up". It had a ton of charm, but the DSi's camera wasn't powerful enough to support the feature well. For whatever reason (price), Nintendo sticks with low-quality cameras. Still, I've always hoped that Looksley's Line Up would eventually make its way to another system.


Nope, on the New 3DS the view wasn't fixed; it actually tracked your head position and adjusted the picture accordingly.


Ah, interesting! I would expect it still to be limited to a narrow vertical band, due to the technology used to send a different image to each eye, versus a purely camera-based solution which doesn't attempt to display two different images at all.

Or could it support, for instance, rotating the handheld?


Apple is the first to innovate everything, when one's entire worldview is limited to the Apple product line.


And because Apple has the media/design world by the balls thanks to ingrained habits, those writers have blinders on for the rest of the tech world.


This[1] is the first person I saw do this effect, with a hacked Wiimote and some IR LEDs on his head. It's mentioned in the article. This definitely isn't new. However, good face tracking does make it easier. Here's a face-tracking JS lib[2] that'll do it for you if you'd like to make it happen on the web. I've been playing with this (not free) library[3] for around three months, doing face tracking to achieve this effect for AR and experimental interface purposes. Its multiplatform support is pretty good.

I worked this into an iOS display demo around 2009 using Apple's basic face tracking, the kind that powered zombie-face apps and the like. RealSense is a MUCH more solid experience, but this is not new at all.

[1] https://www.youtube.com/watch?v=Jd3-eiid-Uw&t=164s

[2] https://tastenkunst.github.io/brfv4_javascript_examples/

[3] http://www.visagetechnologies.com/
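
For anyone wanting to try this on the web: trackers like [2] hand you a face rectangle per video frame, and the core trick is converting that into an approximate eye position. Here's a minimal sketch; every constant is an assumption for illustration, not something taken from the libraries above:

    // Turn a tracked face rect into an approximate eye position (metres,
    // relative to the camera). FACE_WIDTH_M and CAMERA_FOV_X are assumed.
    interface FaceRect { x: number; y: number; width: number; height: number }

    const FACE_WIDTH_M = 0.16;   // typical face width (assumption)
    const CAMERA_FOV_X = 1.05;   // horizontal FOV in radians (assumption)
    const FRAME_W = 640, FRAME_H = 480;

    function eyeFromFace(face: FaceRect) {
      // Apparent size gives depth: the face's pixel width subtends an
      // angle, and similar triangles recover the distance.
      const z = (FACE_WIDTH_M / 2) /
                Math.tan((CAMERA_FOV_X / 2) * (face.width / FRAME_W));
      // Offset of the face centre from the frame centre, in metres at z.
      const halfPlaneW = z * Math.tan(CAMERA_FOV_X / 2);
      const halfPlaneH = halfPlaneW * (FRAME_H / FRAME_W); // square pixels
      const x = ((face.x + face.width / 2) / FRAME_W - 0.5) * 2 * halfPlaneW;
      const y = (0.5 - (face.y + face.height / 2) / FRAME_H) * 2 * halfPlaneH;
      return { x, y, z };
    }

Feed that position into whatever renders the scene and you get the same effect without any depth hardware, just less robustly.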


I've always found these pseudo-3D effects (including the slight "floating-icons" effect applied to iOS wallpapers) to be unconvincing in real life.

I suspect they look much better in demo videos like this because there's no depth information in the objects surrounding the phone either -- the person's hand, the floor/desk behind them, etc. This makes it much easier for one's brain to map "pseudo-depth" onto the entire video.

In real life, you have depth perception for everything except the images on your phone's screen, making a demo like this look "flat" even at its most convincing. Real 3D requires two distinct visual inputs; this is a gimmick.


The video does mention it works best with 1 eye closed. Maybe this is the reason for that.


definitely the reason


This reminds me of the head-tracking demo using a Wii remote that Johnny Lee made years ago. https://www.youtube.com/watch?v=Jd3-eiid-Uw


LOL, I remember getting that working at home, back in the day - gyrating in front of my monitor with (from memory) the Wii's IR transmitter bar held to my forehead.

It was a lovely little taste of what might have been... although we're still waiting!


I was more fascinated with his projected touchscreen using a pressure-activated LED at the end of a pen. He's quite the visionary.


You can play with this in browser.[1][2][3] And there's OpenCV.js.[4] But I more often simply put a small square of yellow gaffer's tape between my eyebrows, and do color tracking.[5] Accurate, reliable, and cpu/battery cheap.

And for expert UIs, one doesn't have to emulate reality. You might exaggerate parallax, be non-linear or discontinuous, etc. There might be a much larger vocabulary available than merely "look through the small portal" and "make secondary content conditional on viewing angle". Maybe.

Why might you want this? Consider your primary "desktop" environment shifting into HMDs. You'd still need to mix in screen time, to give your eyes a break, and it would hypothetically be nice to have similar UI idioms on screen, especially as your tooling becomes less and less like today's 2D tools.

[1] old similar demo (broken on FF): http://auduno.github.com/headtrackr/examples/targets.html - video: https://vimeo.com/44049736

[2] current tracking-only demo: https://trackingjs.com/examples/face_camera.html

[3] eye/gaze tracking (demos at bottom; for me they require good lighting and beard removal): https://webgazer.cs.brown.edu/ using https://www.auduno.com/clmtrackr/examples/clm_video.html

[4] opencv.js: https://docs.opencv.org/master/d5/d10/tutorial_js_root.html

[5] color tracking: https://trackingjs.com/examples/color_camera.html
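
The gaffer-tape trick is easy to reproduce with tracking.js's ColorTracker ([5] above). A minimal sketch, which also shows the kind of exaggerated, non-realistic parallax mapping I mentioned; the gain constant, frame size, and element IDs are assumptions:

    // Yellow-tape head tracking via tracking.js's ColorTracker. The library
    // is loaded from its <script> tag, so we only declare its global here.
    declare const tracking: any;

    const PARALLAX_GAIN = 2.5; // deliberately exaggerated (assumption)
    const FRAME_W = 640, FRAME_H = 480;

    const tracker = new tracking.ColorTracker(['yellow']);
    tracker.on('track', (event: {
      data: Array<{ x: number; y: number; width: number; height: number }>;
    }) => {
      if (event.data.length === 0) return; // tape not visible this frame
      const r = event.data[0];
      // Normalised offset of the tape from the frame centre, exaggerated.
      const nx = ((r.x + r.width / 2) / FRAME_W - 0.5) * PARALLAX_GAIN;
      const ny = ((r.y + r.height / 2) / FRAME_H - 0.5) * PARALLAX_GAIN;
      // Drive a simple CSS parallax on a layered element (assumed id).
      const layer = document.getElementById('scene')!;
      layer.style.transform = `translate(${-nx * 40}px, ${-ny * 40}px)`;
    });

    // Starts the camera and feeds frames to the tracker.
    tracking.track('#video', tracker, { camera: true });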


This is mentioned in the article too...


I'm torn: there's literally a link to Johnny Lee's work in the article, so your comment doesn't add anything.

But a lot of people read only the comments (or the comments before the article), so your comment does add value.

Is it a good HN comment or not? :D


It's an acceptable kind of HN comment. A link in the article is generally easy to miss, and probably others missed it too.

Admittedly the article talked enough about Lee's demo that it'd be pretty hard to miss if you read the whole thing, but I think we can forgive it still. Perhaps the parent to your comment opened the page and saw the video but didn't bother to read the article. I think that's fine too.

I always read the comments first, and in fact quite often I read the comments only. As long as we don't derail the discussion completely by starting a top-level comment about how we perceive something based on the title alone, I think it's fine. That includes it being fine to mention things the title reminds you of, as long as you phrase it accordingly. In my opinion, of course.


Oops! I admit I skimmed the text and looked at the pretty pictures.


[flagged]


Agreed that my comment has little to no value

But I was genuinely curious how other people felt :)


This is neat.

I think what would be even neater is to use the head tracking from the notch to transform the rear camera's image so it looks like you can see straight through the iPhone. Of course this wouldn't work when the iPhone is too close, or when the angle is too wide, but in other cases this would make AR really good.


I assume that with two cameras on the back, it would be possible to create a stereo video stream and implement a transformation like the one you described.
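
For what it's worth, the core of that transformation is just similar triangles. A rough sketch; the screen size, field of view, and the single scene depth are all assumptions, and a real implementation would want per-pixel depth (e.g. from that stereo pair):

    // "See-through phone": treat the rear-camera image as a plane at a
    // guessed depth D behind the phone, and crop the part an eye at E
    // would see through the screen "window". All constants assumed.
    interface Vec3 { x: number; y: number; z: number }

    const HALF_W = 0.031, HALF_H = 0.067; // screen half-size, metres
    const REAR_FOV_X = 1.1;               // rear camera horizontal FOV, rad

    // eye: relative to the screen centre, z > 0 toward the user. Returns a
    // normalised crop rect of the rear frame to display full-screen.
    function seeThroughCrop(eye: Vec3, sceneDepth: number) {
      // A ray from the eye through screen point S hits the scene plane at
      // P = s*S - (s - 1)*E, with s = (eye.z + D) / eye.z.
      const s = (eye.z + sceneDepth) / eye.z;
      const shiftX = -eye.x * (s - 1), shiftY = -eye.y * (s - 1);
      // Half-width of the rear camera's frame at depth D.
      const halfPlane = sceneDepth * Math.tan(REAR_FOV_X / 2);
      return {
        cx: 0.5 + shiftX / (2 * halfPlane),  // crop centre (sign depends
        cy: 0.5 - shiftY / (2 * halfPlane),  // on image orientation)
        halfW: (HALF_W * s) / (2 * halfPlane),
        halfH: (HALF_H * s) / (2 * halfPlane),
      };
    }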


"The notch. Everyone hates it. And what did we get for it? A rooster that sings karaoke. No thanks."

This isn't true. What we got for it is Face ID and the consequent removal of the home button, which in turn allowed the frameless design.

Not saying it's ideal. But the author has missed the point slightly.


I see it not as a notch but some bonus bits of screen for the clock and signal/battery status.


Yeah, everyone hates it so much that a ton of Android phones being released this year have their own notch... It really does just fade away for me; I don't notice it anymore.


And the Essential Phone, which had a notch before it was cool.


The Samsung S8 has face unlock and no home button...


Yo, Mark Wilson of Co.Design: were you around 10 years ago?

Let's tame the article title a bit, huh?

https://www.youtube.com/watch?v=Jd3-eiid-Uw


He talked about Johnny in the article.

>Norrby is far from the first person to play with trompe-l’oeil (also called “parallax”) in the modern age. Johnny Lee, now at Google working on augmented reality, first made his mark by turning a stock television and a Nintendo Wiimote into a wild 3D game.


Oh, OK... so he should know it's not 'wild'.

Annoying bait title.


Very cool, but I do have to laugh at his opening remark about the notch and how everyone hates it. If anything screams that Android phone makers have no imagination, it's the notch: suddenly it's on their phones too.

I do think the notch is bad UI, but it can be worked around. I would have more easily accepted a narrower band of black across the top and bottom, but Apple isn't losing sales over it because the rest of the phone's tech is that good.

I do look forward to the day when the components are barely distinguishable from the display.


_Some_ Android phone manufacturers are adding pointless notches, but it's a big market. You'll find some Android manufacturer doing basically anything.

Personally I find Samsung's current designs prettier than the iPhone X, though not by much - the thing's gorgeous.


[flagged]


I mean,

1. Face ID is wildly superior to anything else on the market

2. Apple's processors are still better than either Snapdragon or Exynos chips, though the latter is closing in

3. They still get a lot further on less RAM and battery

4. The iPhone X screen somewhat beats the S8/N8, though potentially not the S9

5. Their force-touch implementation is better than Samsung's, too.

I love my Note 8 and wouldn't part with it for any iOS device, but suggesting they're entirely noncompetitive is silly.


Still not available on the App Store. Are there _any_ apps actually taking advantage of this yet, other than the animated emoji? Kinda bummed I've not had anything nice to play with yet.


Yes—Rainbrow, a game you control with your eyebrows: https://itunes.apple.com/us/app/rainbrow/id1312458558?mt=8


Yes, we released Nose Zone: https://nose.zone

You’ll probably see more coming soon. Ours was in review by Apple for over a month.


> Ours was in review by Apple for over a month.

?! That's crazy talk - I've not seen a review process for a new app take more than a few days in recent memory.

I guess using the Face ID API marked your app for more thorough review?


Yeah, they are pretty thoroughly vetting any app that uses the TrueDepth APIs. We even had to go as far as talking with them on the phone a couple of times to reassure them we weren't using "face data" in any nefarious ways. (We're not; everything is kept on-device.)


It just occurred to me that I'm not sure how they can even be sure you were telling them the truth. Not that I'm suggesting otherwise, but within the scope of the tools available to them during review, what's to prevent you from saying "sure, we're keeping everything on device" when in fact you're building an international database of faces that you intend to sell to a nefarious government actor?


Yeah, very true. One of the things they did was make us explicitly state in our privacy policy how (and for what) we use data from the TrueDepth APIs. So I guess at least someone would have legal recourse against us in that case.


That sucks - I was on a team that released a fully legal (in the USA) gambling app a few years back, and while we had a few back-and-forths, it definitely didn't take a month to get approved.

At least you finally got it out!


>Are there _any_ apps actually taking advantage of this yet

Yes, Apple is harvesting your high-quality biometrics using this. Five years down the line someone will leak the data, and your face will end up in some weird porno series.


Paranoia aside, the data does not leave the device. All that's available to apps are the 2D images seen by the cameras, and those are hardly special.


Critical Apple comments are risky here ;)


Stupid ones are.


What is stupid about expecting this highly personal data to leak or be misused these days...macintux...?


The strong implication of the original comment was that Apple was centrally collecting biometrics and would eventually sell it off or leak it.

The biometrics are stored on each iPhone, in a secure enclave with severely limited access, and they're only useful for identifying (slowly) someone at close range.

So, yes, theoretically someone could someday hack all iPhones everywhere using one of the most sophisticated attacks ever, and collect data that might conceivably help high-resolution cameras identify faces at close range. Maybe. Although Face ID uses infrared, so wouldn't those high-resolution cameras need infrared too?

Anyway, that's why I felt it was a stupid comment.


The biometrics shouldn't be collected in the first place. A simple patch to this closed system would allow who-knows-whom to collect the data. Considering the current political situation in the company's home country alone, it's an unreasonable risk; their willingness to cooperate with dictatorships and the like adds to it. The fear is not stupid at all.

Consider also that this exists because a) Apple was unable to come up with any real innovation to sell the new product, and b) people are really too lazy to type a number and would rather give up that information.

Data minimization and data avoidance are a thing.

> "using one of the most sophisticated attacks ever"

As if we haven't heard that speech before...


Users worried about their privacy can always choose not to use it.

The more likely attack vector is simply capturing scenes through the front camera. That would give you, most of the time, an image of the user's face, not just depth-mapping information of questionable value.

But really, nearly everyone shares images of themselves online, so even that's of dubious value.

I'd challenge you to find any security expert who agrees that Face ID, as Apple implements it, can realistically result in useful biometric data leaking to a hostile party. Apple supplies whitepapers documenting the Secure Enclave; I imagine there's one for Face ID.

Now, the question of whether Face ID is secure enough for any given user's needs as a local authentication is a perfectly valid question, and clearly for some users the answer is no. But, again, it's optional, and that's not at all the threat under discussion.


> Users worried about their privacy can always choose not to use it.

You make it sound like it's easy for the normal Apple user to switch to Android. In fact it's quite the opposite, and the whole situation is even worse because most of them won't even be aware of the dangers. The major reasons people I know choose an iPhone over an Android are "ease of use" (resulting from the fact that their first smartphone was already an iPhone and they never tried anything different) and being "so confused by all the options/apps/general possibilities on Android". Those are the people who especially need to be protected. They are caged within the locked environment of a single US company. This alone should make you think.

> But really, nearly everyone shares images of themselves online, so even that's of dubious value.

This sentence, together with the approach you demonstrate in the rest of the comment, is mind-boggling, as it's the most common argument companies and individuals use to abolish digital privacy altogether. The old version of it was "I'm not afraid of X because I have nothing to hide." Horrifying, but now I understand where your attitude comes from.

Having been born in an oppressive state, this is where I would actually use the word "stupid".

> I'd challenge you to find any security expert who agrees that Face ID as Apple implements it can realistically...

As I wrote above, it may be that Face ID is not a big deal right now. We don't know for sure, since it's all locked down; we just assume it. There are, however, still ARKit and all the other APIs using those depth/facial-mapping capabilities. It's just a matter of time before those become standard in popular apps, and since you've already granted camera permissions, those features will be (or already are?) a nice extra. So you see, we don't even need to reach for hypothetical changes from the paranoid government governing Apple and its data, waiting for some patch that grants access to Face ID data. It's far more accessible than that.

I wonder: would you let your phone take a drop of blood for authentication, or where does your privacy actually start?


> You make it look like it's easy for the normal Apple user to switch to Android.

No, I'm saying Face ID is not mandatory on an iPhone X. You can use a passcode.

Nor am I saying privacy should be sacrificed on the altar of technology. I do my best to stay away from Google, and I try not to let Facebook know any more about me than necessary (and every day I contemplate ditching it, but there are a few important reasons to stick around).

There are plenty of ways to do biometric security wrong from a privacy standpoint. I trust Apple to do an earnest job of doing it right, because they have positive incentives to work for their user base rather than being a data collection/ad selling company.

And if biometrics aren't where you wish to place your faith, you can simply not enable the feature.


Would opting out of FaceID also lock down the feature completely for the APIs?

FYI: you can have an Android phone completely free of Google apps and the Google Play Store. LineageOS is the most popular alternative OS, and there are other stores you can put on your phone, like F-Droid, which hosts open-source apps.


> Would opting out of FaceID also lock down the feature completely for the APIs?

Biometrics, yes. AR-based depth mapping, no.


Really? Let's look at the App Store Review Guidelines:

https://developer.apple.com/app-store/review/guidelines/#dat...

5.1.2 Data Use and Sharing

    (i) You may not attempt, facilitate, or encourage others to identify anonymous users or reconstruct user profiles based on data collected from depth and/or facial mapping tools (e.g. ARKit, Camera APIs, or Photo APIs)
-----

Let's wrap this up here.

- A user voices his fears about the technology that is the topic here

- He gets downvoted into oblivion, but no reply follows

- I trigger a reply by pointing out the obvious pattern on every single article posted here that is, or may be, critical of Apple

- You declare the user's comment stupid, based on your assumption that one particular software use of the general feature may not be misused. You can't actually know that, because we are talking about a closed system, and the APIs allow it with no way to opt out (if you have already granted general camera permissions).

- You further state that users don't need privacy anyway because they already gave it all away, which is actually the only really stupid statement in this discussion.

- After all that, you even go so far as to damn another US company based on nothing at all: a company that allowed the world to develop its own open-source operating system and app ecosystem, while you've done everything to protect a company whose system you know only as much about as they allow you to.

macintux, I couldn't have wished for a better demonstration of what is wrong here. There is quasi-cult behavior in the Apple fan base, turning people into marketing machines ready to drop everything to protect the brand while condemning everybody else. You owe the guy an upvote. I don't care.


> I couldn't have wished for more to demonstrate what is wrong here

Then I am glad to have been of service.


I like how you ignored everything else ;)


What's the difference between this and the plethora of AR games and apps that "trick your mind into believing there's a whole world beyond the screen"?

Using face positioning instead of the accelerometer is interesting, but without seeing it in action, is it that much more compelling?


>>The notch. Everyone hates it.

These broad statements irritate me. I don't hate it - I actually think it integrates super well with the UI and makes the X look unique among the sea of identical-looking glass slabs. And I don't even own an iPhone.


As they do me - who are they to say what I hate? I most certainly don't hate the 'notch'. I couldn't care less about it, other than: hey, I've got some extra screen real estate on either side of the speaker and cameras, so the clock and indicators don't have to cut into application space.


Tech journalists, who are always looking for the next big thing and just HAVE to have a strong opinion on everything, seem very reluctant to embrace change.

No one I know cares very much. They think it's neat, they all marvel at the manufacturing of the OLED screen, and, like you, they like the extra screen real estate.

I feel there must be some sort of 'too cool for skool' bubble that journalists live in.


I certainly don’t hate it either, but I also wouldn’t say I like it specifically. I love the phone altogether, and the notch doesn’t ever bother me, but sure, in theory it would be better if all the same features could be implemented without a notch. I recognize it as a reasonable design compromise that is justified and necessary given current technology.



Everybody hates it so much the other manufacturers are scrambling to copy it.

https://www.theverge.com/2018/3/4/17077458/iphone-design-clo...


Yeah, but they're not copying it because it's a good design. They're copying it to look like Apple.

That's because the only purpose of the notch is to differentiate Apple phones from every other smartphone. It's all about branding - https://medium.com/swlh/the-iphone-x-notch-is-all-about-bran...


Your article speculates about why Apple doesn’t hide the notch, but doesn’t address why it’s there in the first place.

It's there to house the various front-facing sensors and emitters while sharing vertical space with usable screen area.

Once you decide to do that you have to answer the question of how to deal with the border line between the two.

Obviously, you don’t want something ugly.

(1) hide the border. This means anything and everything displayed on-screen next to the notch has to be on a black background.

(2) create a non-ugly border and let it show.

Perhaps the decision to go with (2) was influenced by branding considerations, but it's a secondary decision, not the "only purpose". And (1) isn't exactly a great option, regardless.


[flagged]


I didn't agree with much of what you had to say, but it was your opinion to express, and that's fine. Did you really have to break down and call people "tool-bags" at the end, though? I can promise you it only prevents whatever point you might have from being heard.


I'm okay with people not agreeing with me. It's a common plight for intelligent folk, and I've pretty much had to be okay with it since childhood.

Thanks for your suggestion though! I changed it to "ignorant people".


If you have to tell people that you’re intelligent while denigrating them, it says far more about the fragility of your ego, your anger and frustration than anything else. It might be youth, or it might be struggles with socialization, but if you want to communicate with other intelligent people without being picked out as angry and insecure, your approach could use some refining.


Literally the only purpose of the notch is to provide space for the goddamned technology in the phone.


> Literally the only purpose of the notch is to provide space for the god-dammed technology in the phone.

No, it's not. It's either for differentiation or to be able (like the Essential PH-1 before it) to claim an "edge to edge" screen (even though in Apple's case this excludes a big part of the top edge). It's not there to make room for the "technology", which would obviously fit just fine with a rectangular screen and a top bezel.


In other words, you are suggesting the technology take up more room than needed. Your loss, I suppose.


No, in other words he is suggesting that the technology take up an area of screen that makes sense instead of one that doesn't make sense at all.


Makes sense to me. Looks fine, has some extra real estate on the screen… can’t see the downside really.


The shape of the technology is more important than how much room it takes up. There's a reason we all have rectangular phones and not round, triangular, or square ones!

Your argument that "the technology [must not] take up more room than needed" is simply false. I can point to plenty of Apple products where the technology was fit into a bigger box than needed so that the box didn't look misshapen.


That sounds like both cases are a function of aesthetics and the tech. I'm looking at my iPhone SE, and I'd kill for a notch in a bigger screen that maintained the overall form factor of the existing phone. Right now about two finger-widths, one each above and below the screen, exist just to hold the home button and sensors. It seems like a marked improvement to enlarge the screen and add a notch at the top.

Maybe this is one of those things that seems awkward until you make some direct comparisons with the alternatives.


Incorrect. I just told you what its only purpose is and gave you a whole article about it.

Somehow we've had sensing instruments facing us for years now without notches. And yet, here we are. It's funny how Apple can get everybody talking about their stuff, isn't it?


That’s kind of a stupid move by Apple then, considering everybody hates it. Or maybe they don’t.


I don’t know any iPhone X owner that hates the notch. That opening bugged me and I stopped reading there. :)


When you spend $1000+ on a phone, you'll find any reason to justify its faults. If a person had a choice between the notch and no notch, with the same functionality, I'm doubtful anyone would choose the notch for aesthetic reasons.


I disagree. Think about it less as a notch in the screen, but as an extra screen around the notch. I'm looking at my phone and it has the camera and speaker above the screen, but the sides are just empty black space. If I could, I would absolutely want to have extra screen there, to display the time or notifications. So yes, I would absolutely pick a phone with a notch over a phone without it.


The assumption was that the screen real estate would be equal, minus the notch.


> I don’t know any iPhone X owner that hates the notch.

Surely that's survivorship bias? I dislike the notch so much that I chose not to buy an iPhone X, despite being due for an upgrade.


Of course that’s survivorship bias. I wasn’t saying nobody hates the notch, just refuting his annoying assertion that “everybody hates the notch”.


Journalistic hyperbole? Take something that is controversial and exaggerate the negative side (e.g. everyone hates Trump when he clearly still has supporters).


This is very similar to Johnny Lee's initial explorations with the Wii: http://johnnylee.net/projects/wii/


As much as I dislike the iPhone notch and all that, the video in the article was quite cool: it appeared very performant and is a great programming showcase. So kudos to the dev for making this.


How difficult would it be to use this as a 3D scanner (for 3D modeling)?


Haven't messed with the depth sensor on an iPhone yet, but you can do it with Kinects and other depth cams like the Intel RealSense series.

In my experiments with those, the main drawback was ergonomics: a device of varying size (depending on the sensor), connected to a computer via at least one cable, that you moved around the thing you were scanning.

On a phone or another standalone, self-contained bit of hardware, I'd imagine things get easier.


> Haven't messed with the depth sensor on an iPhone yet, but you could do it with Kinects and other depth cams

Well, if you've used a Kinect, then you've used the iPhone's tech: Apple bought PrimeSense, which developed the original Kinect for Microsoft.


I'm pretty sure an essentially identical effect can be (and has been) produced by combining a normal front-facing camera, an accelerometer, and a gyroscope.


The only piece of information required to render the scene is the position of the eye relative to the phone, which can be represented as x and y coordinates relative to the rectangular frame captured by the front-facing camera. The accelerometer and gyroscope shouldn't be needed, except perhaps to smooth out movement of the phone in the user's hand if the camera's frame rate is too low.

My guess is that the only reason this demo relies on the iPhone X is that Apple provides a face-tracking API, so there's no need to write extra code or pull in external libraries to do eye tracking with a normal front-facing camera.
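
Concretely, the standard way to use that eye position is an off-axis (asymmetric-frustum) projection, so the screen acts as a fixed window into the scene. A minimal sketch with assumed screen dimensions; this is the generic technique, not the demo's actual code:

    // Off-axis projection: skew the view frustum so its apex sits at the
    // tracked eye and its base is the physical screen. Constants assumed.
    interface Vec3 { x: number; y: number; z: number }

    const HALF_W = 0.031; // screen half-width in metres (assumption)
    const HALF_H = 0.067; // screen half-height in metres (assumption)

    // Column-major 4x4 glFrustum-style matrix for an eye at `eye`,
    // relative to the screen centre (z > 0 toward the user).
    function offAxisProjection(eye: Vec3, near: number, far: number): number[] {
      const s = near / eye.z; // scale screen extents back to the near plane
      const l = (-HALF_W - eye.x) * s;
      const r = ( HALF_W - eye.x) * s;
      const b = (-HALF_H - eye.y) * s;
      const t = ( HALF_H - eye.y) * s;
      return [
        2 * near / (r - l), 0, 0, 0,
        0, 2 * near / (t - b), 0, 0,
        (r + l) / (r - l), (t + b) / (t - b), -(far + near) / (far - near), -1,
        0, 0, -2 * far * near / (far - near), 0,
      ];
    }

    // The matching view transform just translates the world by -eye; pair
    // the two and objects "behind" the screen hold still as the head moves.

Note that eye.z appears in the math, which is exactly the sibling comment's point about needing distance, not just x and y.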


To really get a nice 3D effect you also need the distance of the eyes from the phone. Based on the video in the article, this demo actually uses that depth information. You can't easily get this depth info from a single camera, so I can certainly see how the more advanced optics in the iPhone X really help here.


I suspect the size of the phone screen isn’t large enough for distance information to affect the rendered output significantly.


I don't think so. This effect, at least as described, can change even while the phone remains static, e.g. if your head moves while the phone sits still on the table.


The parent comment mentions a front-facing camera, probably for the same reason.



