
Great job on this - it looks really simple, and I like that the CLI has a built in @frequently setting that can be used for keeping the lambda warm.

thanks don! here's the GH discussion on this topic: https://github.com/scaffoldly/scaffoldly/discussions/17

Video clips don't seem to be loading (not sure if that's an HN hug of death issue or an iOS issue)

Same problem on Android. But everything is in the YouTube video towards the bottom I think.

It takes time but it will be loaded eventually.

I'd actually prefer a plain text list of these quotes - it's a great list of quotes, but the images mean they are hard to search or skim quickly.

Thanks for the suggestion. Initially I had the text of the quotes below the designs but removed it in favor of a clean layout. But for readability it's better to add them again.

The HN FAQ makes a good point that "Did you read the article? It says..." can and should be shortened to "The article says..."


It's interesting to read this in the context of founder-mode

Every senior engineer I know has a dirty little secret they're afraid to admit: they secretly really like C#.

Eye tracking data is incredibly sensitive and privacy-concerning.

HN tends to dislike Microsoft, but they went to great lengths to build a HoloLens system where eye tracking was both useful and safe.

The eye tracking data never left the device, and was never directly available to the application. As a developer, you registered targets or gestures you were interested in, and the platform told you when the user for example looked to activate your target.
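The shape of that design can be sketched as a toy model (all names here are invented for illustration, not the real HoloLens API): the app registers targets, and the raw gaze coordinates never cross the platform boundary; only discrete activation events do.

```python
class GazePlatform:
    """Toy model of a privacy-preserving eye-tracking layer: raw gaze
    stays inside the platform, and apps only register targets and get
    notified when one of their targets is activated."""

    def __init__(self):
        self._targets = {}  # name -> (bounds, callback)

    def register_target(self, name, bounds, on_activated):
        """App-facing API: bounds is (x0, y0, x1, y1) in screen space."""
        self._targets[name] = (bounds, on_activated)

    def _process_gaze(self, x, y):
        # Runs inside the platform only. Apps never see (x, y);
        # they just receive "your target was activated" callbacks.
        for name, ((x0, y0, x1, y1), cb) in self._targets.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                cb(name)

platform = GazePlatform()
platform.register_target("play_button", (0, 0, 100, 40),
                         lambda n: print(f"{n} activated"))
platform._process_gaze(50, 20)  # prints "play_button activated"
```

The point of the design is the one-way boundary: the app declares intent up front, and the platform mediates everything else.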

Lots of subtlety and care went into the design, so yes, the first six things you think of as concerns or exploits or problems were addressed, and a bunch more you haven't thought of yet.

If this is a space you care about, read up on HoloLens eye tracking.

It's pretty inexcusable if Apple is providing raw eye tracking streams to app developers. The exploits are too easy and too prevalent. [EDIT ADDED: the article is behind a paywall, but it sounds from comments here like Apple is not providing raw eye tracking streams; this is about 3rd parties watching your eyes to extract your virtual typing while you are on a conference call]


> if Apple is providing raw eye tracking streams to app developers

Apple is not doing that. As the article describes, the issue is that your avatar (during a FaceTime call, for example) accurately reproduces your eye movements.


Isn't that a distinction without a difference? Apple isn't providing your real eye movements, but a 1-to-1 reproduction of what it tracks as your eye movements.

The exploit requires analysing the avatar's eyes, but since those are replicated rather than natural movements, there should be a lot less noise. And of course, since you need to intentionally focus on specific UI targets, these movements are even less natural and fuzzy than if you were looking at your keyboard while typing.


The difference is that you can't generalize the attack outside of using Personas, a feature which is specifically supposed to share your gaze with others. Apps on the device still have no access to what you're looking at, and even this attack can only make an educated guess.


This is a great example of why ‘user-spacey’ applications from the OS manufacturer shouldn’t be privileged beyond other applications: Because this bypasses the security layer while lulling devs into a false sense of security.


> ‘user-spacey’ applications from the OS manufacturer shouldn’t be privileged beyond other applications

I don't think that's an accurate description, either. The SharePlay "Persona" avatar is a system service just like the front-facing camera stream. Any app can opt into using either of them.


That app gets a real time Gaze vector, which unless I've misunderstood something, non-core apps don't get.


Which app?


I should have said avatar service.


But the technology is there. That is the concern.


The technology to reproduce eye movements has been around since motion pictures were invented. I'm sure even a flat video stream of the user's face would leak similar information.

Apple should have been more careful about allowing any eye motion information (including simple video) to flow out of a system where eye movements themselves are used for data input.


"technology to reproduce eye movements has been around since motion pictures were invented"

Sure, but as with everything, the impact changes when the technology becomes widespread. The technology was around, but now it could be on everyone's face, tracking everything you look at.

If this were added to TVs, so that every TV was tracking your eye movements and reporting them back to advertisers, there would be an outcry.

So this is just slowly nudging us in that direction.


To be clear, the issue this article is talking about is essentially "during a video call the other party can see your eyes moving."

I agree that we should be vigilant when big corps are adding more and more sensors into our lives, but Apple is absolutely not reporting tracked eye-movement data to advertisers, nor do they allow third-party apps to do that.


It's not a problem with the technology.

The problem is the edge case where it's used for two different things with different demands at the same time, and the fix is to...not do that.

> Apple fixed the flaw in a Vision Pro software update at the end of July, which stops the sharing of a Persona if someone is using the virtual keyboard.


"fixed the flaw"

Or

"Oops, so sorry you caught us. Guess we'll have better luck keeping this hidden next time."


Keeping what hidden? Caught who? The eye-tracking technology is literally a core part of the platform. What is it you're trying to say?


From the article's first sentence:

"…a lot about someone from their eyes. They can indicate how tired you are, the type of mood you’re in, and potentially provide clues about health problems. But your eyes could also leak more secretive information: your passwords, PINs, and messages you type."

Do you want that shared with advertisers? With your health care provider?

The article isn't about the technology, it is about sharing the data.


Who are you saying shared what data with whom?


How are they getting the data you claim is shared with them?


Does HoloLens also use a keyboard you can type on with eye movement? If not, this seems unrelated to this attack. If yes, then how would it prevent this attack, where you can see the person's eyes? It doesn't matter if the tracking data is on-device only when you're broadcasting an image of the face anyway.


Not when I used it; you had to "physically" press a virtual keyboard with your hands.


I disagree strongly. I don't want big tech telling me what I can and can't do with the device I paid for and supposedly own "for my protection". The prohibition on users giving apps access to eye tracking data and MR camera data is paternalistic and, frankly, insulting. This attitude is holding the industry back.

This exploit is not some kind of unprecedented new thing only possible with super-sensitive eye tracking data. It is completely analogous to watching/hearing someone type their password on their keyboard, either in person when standing next to them or remotely via their webcam/mic. It is also trivial to fix. Simply obfuscate the gaze data when interacting with sensitive inputs. This is actually much better than you can do when meeting in person. You can't automatically obfuscate your finger movements when someone is standing next to you while you enter your password.
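One way that obfuscation could work, as a hypothetical sketch (the function and the secure-field flag are invented for illustration, not any real visionOS API): pass gaze through unchanged normally, and jitter it while a sensitive input has focus.

```python
import random

def obfuscated_gaze(gaze_xy, secure_field_focused, noise_deg=3.0):
    """Return the gaze direction to expose to the avatar/persona.

    Normally the gaze passes through unchanged. While a sensitive
    input (password field, PIN pad) has focus, uniform jitter is
    added so a remote observer of the avatar's eyes can't recover
    which key is being fixated."""
    if not secure_field_focused:
        return gaze_xy
    x, y = gaze_xy
    return (x + random.uniform(-noise_deg, noise_deg),
            y + random.uniform(-noise_deg, noise_deg))
```

Per the article, Apple's actual fix was blunter than this: stop sharing the Persona entirely while the virtual keyboard is in use.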


You are an expert user, so of course you will demand extra powers.

The vast majority of people are not expert users, so for them having safe defaults is critical to their safety online.

> It is completely analogous to watching/hearing someone type their password on their keyboard,

Except the eye gaze vector is being delivered in high fidelity to your client so it can render the eyes.

Extracting eye gaze from normal video is exceptionally hard. Even with dedicated gaze cameras, it's pretty difficult to get <5 degrees of certainty (without training or optimal lighting).
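To put that <5 degree figure in perspective, a rough back-of-envelope calculation (the viewing distance and key pitch are assumed values for illustration, not from the thread):

```python
import math

# Assumed: a virtual keyboard ~0.5 m away, with ~19 mm between
# key centers (typical physical key pitch).
viewing_distance_m = 0.5
key_pitch_m = 0.019

# A 5-degree gaze error projected onto a surface 0.5 m away:
error_on_surface_m = viewing_distance_m * math.tan(math.radians(5))
print(f"5 deg error at 0.5 m ≈ {error_on_surface_m * 100:.1f} cm")

# How many key widths that error spans:
print(f"≈ {error_on_surface_m / key_pitch_m:.1f} key widths of uncertainty")
```

At those assumed numbers the error covers more than two key widths, which is why video-derived gaze is so much weaker than the high-fidelity vector delivered to render the avatar's eyes.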


Apple does not provide eye tracking data. In fact, you can’t even register triggers for eye position information, you have to set a HoverEffectComponent for the OS to highlight them for you.

Video passthrough also isn’t available except to “enterprise” developers, so all you can get back is the position of images or objects that you’re interested in when they come into view.

Even the Apple employee who helped me with setup advised me not to turn my head, but to keep my head static and use the glance-and-tap paradigm for interacting with the virtual keyboard. I don’t think this was directly for security purposes, just for keeping fatigue to a minimum when using the device for a prolonged period of time. But it does still have the effect of making it harder to determine your keystrokes than, say, if you were to pull the virtual keyboard towards you and type on it directly.

EDIT: The edit is correct. The virtual avatar is part of visionOS (it appears as a front camera in legacy VoIP apps) and as such it has privileged access to data collected by the device. Apparently until 1.3 the eye tracking data was used directly for the gaze on the avatar, and I assume Apple has now either obfuscated it or blocks its use during password entry. Presumably this also affects the spatial avatars during shared experiences as well.

Interestingly, I think the front display blanks out your gaze when you’re entering a password (I noticed it when I was in front of a mirror) to prevent this attack from being possible by using the front display’s eye passthrough.


"privacy-concerning"

Like checking out how you are zeroing in on the boobs. What would sponsored ads look like once they also know what you are looking at every second? Even some medical ad, and the eyes check out the actress's body.

"Honey, why am I suddenly getting ads for Granny Porn?"



Hololens 2 certainly has support for passing gaze direction, not sure about the first one.

I think the headsets are pretty much in alignment that it's a feature that needs permissions but they'll provide it to the app with focus.

Apple is a lot more protective.


I personally view this as gatekeeping, which should be outright illegal.


As far as I know eye tracking isn’t available in VisionOS[0]

This article snippet is behind a paywall but it seems like it’s talking about the eyes that are projected on the outside of the device.

So basically it’s no more of an exploit than just tracking someone’s actual eyes.

0: https://forums.developer.apple.com/forums/thread/732552


Go behind the paywall here: https://archive.ph/44zwN


The article is talking about avatars in conference calls, which accurately mirror your eye position. Someone else on that call could record you and extract your keyboard inputs from your avatar.

Enabling "reader mode" bypasses the paywall in this instance


>Why do you need to redistribute them if your goal is merely to solve your own problem?

Because "you" in this case is typically a game studio. "Solving your own problem" means shipping your game.


Huh. I have no idea what point you're trying to make.

Virtually everyone modifies Unreal C++ code. Some indie projects are blueprint only. But modifying the engine C++ source code is practically an expectation.

Can those source code modifications be easily shared with other developers? Well, they can't be publicly dumped on GitHub. Can they be shipped to consumers in a compiled binary? Absolutely.


Do you think it would make any sense for epic to forbid studios from shipping games that contain a modified engine?


Gotcha!


Tom Peters picked up the concept after it was invented by the founders of Hewlett-Packard


Please don't chase or promote conspiracy theories here. The place manages karma better than any other site, period. It's a very hard thing to do well. The fact that it's hard to do well is not evidence of conspiracies here; it's evidence that HN is in fact exceptionally well run and managed.


No theory chased or promoted, only numeric data for the historical record.

Have you seen any HN story go missing entirely from the list of stories? This is most likely a bug. If there is more than one instance of missing stories, it should be easier to find the root cause. Currently this story is:

   #276 on #new (10th page) 
   not present in first 500 stories (17 pages)
> HN is in fact exceptionally well run and managed.

Absolutely. Which is why invisible-except-to-new is such an anomaly.


After moderator review, this story has returned to HN front page.


Moved from #15 to #100 at 230 points and 100 comments, https://hnrankings.info/41440855/

