Karl Guttag on Apple Vision Pro (Part 1) (kguttag.com)
85 points by baggy_trough on June 14, 2023 | 103 comments



Excellent article, I look forward to the updates.

As a Meta Quest 2 owner and Android user, I'm a bit of an Apple skeptic when it comes to their pricing on some products; but I have to acknowledge that while the Vision Pro seems mad-expensive on first reveal, it doesn't appear to be poor value going by the previews and specs, which matches the article's conclusions.

That is where the Quest Pro seems to have failed - at least at its initial pricing it was very difficult to see the added value matching the price premium.

Apple - at several times the price of the Quest Pro here - has me rooting for it as a considerably more serious product at a relatively-justified serious price in what is an uncertain undertaking.


Yes, the Apple headset is miles more advanced, but to me, the massive selling point of the Quest 2 was also how easy it was to sideload stuff. The SideQuest store is full of amazing apps and ports, like Doom 3 for example, that for various reasons don't meet the Quest Store approval criteria, or whose devs don't want to pay the 30% fee to Meta.

I doubt we'll see such freedom on the Apple headset as it looks more of a "VR iPad" than a "VR iMac". If I know Apple well, sideloading apps will be a huge no-no and you'll only be able to run what Apple approves on it, so those hoping for immersive VR porn or other wild fantasies, might be left hanging.


Apple is being forced by the EU DMA law to allow support for alternate app stores by 2024; I imagine the same would hold true for the Vision Pro. I guess it depends on whether VR goggles are viewed as an entirely separate category of product, or if they are essentially just "smartphones with a fancy display".


>Apple is being forced by the EU DMA law to allow support for alternate app stores by 2024

As always, I'm waiting to see the final implementation before rejoicing prematurely.

Apple can always intentionally screw with the implementation of this law in so many ways to make sideloading an unattractive proposition for the majority while still staying compliant with the law, like they did with the self-repair program, or like the many GDPR cookie banner implementations that use dark patterns to annoy you into just giving up and accepting tracking.

A law is one thing. It's the implementation that makes or breaks it.

Meta left sideloading relatively open without a law having to drag them kicking and screaming into it, which is much better IMHO from a hacker perspective, as it means our vision & goals for the product usage are slightly more aligned.


Re: GDPR, enforcement matters. GDPR enforcement was left to the DPA of the country where the EU HQ of the company is located, and the Irish DPA clearly sees itself as an arm of the Irish Development Agency tasked with making Ireland as congenial a venue for multinationals as possible, not on a mission to defend consumers' privacy. The EU learned its lesson and the implementation of DSA and DMA will not be left to compromised national authorities.


How do you game on it?


Developers would have to port their games, but I don't see any reason why it wouldn't work. The only real obstacle is a lack of first-party input devices.


The Vision Pro? I'm not a game player generally, I bought my Quest 2 for exercise apps and as a platform for investigating developing for VR and working inside the new medium.

I agree with your implication this is not an appropriate price format for gaming, but I would have thought the Vision Pro is aimed at more of that latter market; though presumably future less premium models will aim at the gaming and more natural consumer markets, or gaming will emerge as a secondary use for the more expensive models like this.


It has no controller, you can't game on it


Anyone else worried that Vision Pro will simply be a large iPad and not macOS-like, i.e. a closed app store and not geared towards productivity? The fact that to run VSCode you need to mirror your own Mac simply boggles the mind; it's like Apple wants to cut the potential of the device off at the knees simply to continue selling services.


You can do VSCode Remote in a browser on an iPad so I'm not too worried.

I think the bigger factor is that Apple is trying to push this as more of a workplace productivity machine than the iPad, so third party developers will hopefully be more amenable to porting those types of apps than they were to iPadOS.


> You can do VSCode Remote in a browser on an iPad

I'm sorry but that's exactly the problem, I want the app natively (well, as native as you can get with Electron), executing code that I have on my file system. I don't want to execute remotely through a browser.


Your requirements seem pretty specific, it might just not be the device for you.


How is it "pretty specific?" Any developer is running code locally (unless you work for Google or some company where they have remote codespaces). Why should I not want a headset that instead serves as a replacement for a laptop?


You might want it. Apple either doesn’t seem to be able to provide it, or didn’t prioritize it.

I don’t work for Google but I often SSH to my desktop to run my code. Why heat up my lap or drain my battery?


Good for you, not everyone has both a laptop and desktop to get work done.

> Apple either doesn’t seem to be able to provide it, or didn’t prioritize it.

Sure, that's up to them. I can still want what I want however.


You can want whatever you want. It’s not a failure of apple that they don’t serve your specific need, however. They just built a product for different people.


Sure, that's fine. At the same time I can still call it a shame that they would have to do such artificial limiting. I'll more likely pick up the Android or Windows version of such a device without restrictions on what kinds of software I can run on my own hardware.


If they really want to make it a work machine, there are too many things that need to change from iPadOS when using a keyboard. For instance, customizable shortcuts are an obvious thing missing on iPadOS.


> not geared towards productivity? The fact that to run VSCode

Productivity !== coding. Coding is a subset of productivity, but most jobs do not require coding.

I get that you want this to be something else, but a lot of people are incredibly productive on iPads today, and adding the ability to mirror your MacBooks screen means that this platform absolutely has a future in productivity.

HN has a lot of this attitude — “I can’t code directly on it so this is a toy” — and it’s always a laugh for me. There’s … other people? Who want to do other things?


That's great for those people but yes, for devs, it's not a real replacement for a MacBook, even though it literally has the same chip inside. It's a shame.


And very importantly: in most enterprises, most MacBook users are either graphics people or devs, because they're the only communities with a legit reason to want them (especially the devs, since iOS development requires a Mac).

Most others just have to put up with Windows whether they like it or not (I work in IT for a company with 120,000 PCs and 600 Macs). And yes, most companies are like us.

"The same chip inside" doesn't mean that much though. This one pushes a LOT more pixels at really high framerates, has to handle realtime input from a ton of cameras and do live algorithms on their data etc. I think the whole realtime usecase on a more open OS like the Mac? Very hard to make that work.

So I see where they're coming from, sure. But productivity wise, iOS apps are not great. Needing a Mac to connect to isn't either considering the price of each.


Depends on the industry. Big Tech workers overwhelmingly use Macs, and not just developers.


>That's great for those people but yes, for devs, it's not a real replacement for a MacBook, even though it literally has the same chip inside.

Good thing Apple isn't marketing it as a replacement for a MacBook then.


Of course it isn't, they want you to buy a MacBook alongside, that's my entire point. They are artificially restricting one device in order to make you buy their other devices. It's the same story as the iPad, it also has the same chip as the Mac but they restrict apps like IDEs so people instead have to buy two devices. If it were a Surface-like product instead, with full macOS, I'd have bought an iPad in a heartbeat.


>Of course it isn't, they want you to buy a MacBook alongside, that's my entire point.

Hardware company entices consumers to buy more hardware. Film at 11.

>If it were a Surface-like product instead, with full macOS, I'd have bought an iPad in a heartbeat.

Ah yes. The Surface, the device that showcases the appeal of running an OS designed for mouse/keyboard input with a thin veneer of touch support spackled on top. It's amazing Apple isn't sprinting to release a product along those lines.


> Hardware company entices consumers to buy more hardware. Film at 11.

Sure, they can do whatever they want, of course. Similarly, I can also call it a shame.

> The Surface, the device that showcases the appeal of running an OS designed for mouse/keyboard input with a thin veneer of touch support spackled on top. It's amazing Apple isn't sprinting to release a product along those lines.

You're right, it works great as a portable computer that I can run anything I want on. I'm sure Apple could make an even better version if, I don't know, they created a keyboard attachment for one of their devices and made that device's OS more mouse and keyboard friendly, while still retaining the ability to run whatever software the user wants. Ah, one can dream.


Even if it could run VS Code directly, VS Code would be at a disadvantage to other editors because native visionOS apps appear to use entirely vector UI and text to ensure sharpness and readability in a wider variety of conditions, which I don't see third party UI frameworks of any sort being able to hook into easily.


I don't think web apps like VSCode will be at a disadvantage. If they're doing "normal" web rendering and not rolling their own canvas renderer like Google Docs then it should just work fine.


If they're running on top of the system WebKit yes, otherwise vector rendering will need to be implemented in Chromium/Blink.


Something is better than nothing.


Well, this is pretty clear to me: it's definitely running only iOS apps natively. I don't see why they would open the App Store for it, as it's a huge goldmine for them.

Apple still thinks people can use an iPad for all their work, though my experience at work is very different (and I managed a large fleet of mobiles).


Indeed, and it's a shame. I am looking forward to whatever is available on the Windows and Android side in the future, at least there I can run whatever software I want.


When Apple says "Pro", they mean you get to hang around with a couple of marketing people pointing at PowerPoint slides

You thought they meant actual apps that professional people use, like VSCode?


Why worry? It's already guaranteed to be the case. Apps will not have direct access to the cameras, meaning the Vision Pro is destined to remain a toy fully steered by Apple. You also cannot access any depth information, or anything that would make it useful as a general 3D computing device. You can't access eye tracking, so say goodbye to foveated rendering. And since you can't have foveated rendering, you're now rendering 2x 4K images on a relatively underpowered device.


Foveated rendering is supported by default. Most apps will rely on the OS's renderer and not their own so they don't have to do anything with eye tracking to have it enabled.

EDIT: See cma's comment below which adds more info: apps that use their own render still support foveated rendering too.


Foveated is available on native renderers/custom engines too:

(source: "Discover Metal for immersive apps - WWDC23 - Videos - Apple Developer " https://developer.apple.com/videos/play/wwdc2023/10089/)

If they try to hide it by restricting access to reading the foveation texture, you can be sure anti privacy apps like Immersive Facebook will try tricks like recovering it from GPU timestamp queries (render two transparent triangles covering the screen in separate draw calls, find which took the longest, subdivide and repeat).
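The subdivision trick described above can be sketched in Python. This is purely illustrative: `render_cost` is a hypothetical stand-in for a real Metal timestamp-query measurement, and the toy cost model simply assumes the region containing the gaze point renders slower.

```python
def locate_gaze(render_cost, x0=0.0, x1=1.0, y0=0.0, y1=1.0, iters=10):
    """Binary-search the screen for the slowest (i.e. foveated) region."""
    for _ in range(iters):
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        # Compare left/right halves; keep the one that took longer to draw.
        if render_cost(x0, mx, y0, y1) > render_cost(mx, x1, y0, y1):
            x1 = mx
        else:
            x0 = mx
        # Same bisection vertically.
        if render_cost(x0, x1, y0, my) > render_cost(x0, x1, my, y1):
            y1 = my
        else:
            y0 = my
    return (x0 + x1) / 2, (y0 + y1) / 2

# Toy cost model: regions containing the (hidden) gaze point cost more.
gaze = (0.3, 0.7)
def cost(x0, x1, y0, y1):
    return 2.0 if (x0 <= gaze[0] <= x1 and y0 <= gaze[1] <= y1) else 1.0

x, y = locate_gaze(cost)  # converges on (0.3, 0.7)
```

Ten bisection steps narrow the gaze position to better than 0.1% of screen width, which is why simply hiding the foveation texture wouldn't be enough to keep gaze data private from a determined app.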


Exactly. If you want to do anything without going through Apple's benediction, you're shit out of luck. It's one more episode in the war on general computing, except somehow it's okay when Apple restricts what you can do with your three thousand five hundred dollar ski goggles.


To me it seems like less of a war on general computing and more like Apple is hoping to prevent mixed reality or "spatial computing" from becoming an extension on the privacy disaster that smartphones became.

This seems prudent, because as personal as smartphones are, a headset like the Vision Pro dials that up tenfold. Considering what third parties have done with just the information surfaced in mobile operating systems, or heck even the web, I shudder to think of what they'd do with gaze data and a high fidelity color 3D map of your surroundings.


That’s great for the general case, but it should be also possible for a user to do whatever they want with their own hardware.


If there's a way to accomplish this without opening headset users up to being easily socially engineered into figuratively selling the farm, sure.

In this particular case I think perhaps a better approach would be to allow apps to bypass App Store restrictions so long as their source code is public and binaries match that code. This would naturally deter those with shady intentions, allow FOSS projects to thrive, allow both manual and automated third party vetting of apps, and help users better know what they're getting into.


Yeah. To state the obvious: Apps will no longer run full screen; they will run as a 2D window in a 3D space. When Apple allows it the user will be able to ask the app to go full 3D (well, hopefully).

All of this is: Apple taking further control of the experience.


It's okay for the general user.


>Apps will not have direct access to the cameras

A restriction for which I am thankful.

>Vision Pro is destined to keep being a toy fully steered by Apple.

I trust Apple a lot more than I trust the "let's figure out how to fingerprint users based on how they scroll and other metrics and use it to sell ads" crowd.

>You can't access eye tracking,

A restriction for which I am thankful.

>so say good bye to foveated rendering.

Vision Pro does foveated rendering.


Yes, yes, we know some Apple users are unable to see past the "you can't be trusted" that Apple keeps violently stabbing in your brain, you'll live in a company town with company stores and company credits and you'll be happy about it. Now can the adults use their devices?

The Vision Pro does not provide foveated rendering if you do not go through the Apple renderer. There's some extremely vague wording regarding Metal supporting it, without any API documentation out.


>Yes, yes, we know some Apple users are unable to see past the "you can't be trusted" that Apple keeps violently stabbing in your brain, you'll live in a company town with company stores and company credits and you'll be happy about it.

Apple isn't telling me that I can't be trusted. Apple is telling me that it's a stupid idea to trust third-party developers with unfettered access to my data, and they're not even a little bit wrong. This has been happening at least since the "Foursquare is uploading and storing every user's contacts" fiasco well over a decade ago.

> Now can the adults use their devices?

Of course. Feel free to bring to market your wonderful headset that allows VC-funded startups to create a never-ending torrent of privacy hellscape diarrhea apps you can enjoy until they implode.

>The Vision Pro does not provide foveated rendering if you do not go through the Apple renderer. There's some extremely vague wording regarding Metal supporting it, without any API documentation out.

That's because the API has yet to be released to developers. Perhaps you can wait until it is to get your mouth all frothy, sunshine.


> Of course. Feel free to bring to market your wonderful headset that allows VC-funded startups to create a never-ending torrent of privacy hellscape diarrhea apps you can enjoy until they implode.

You do realise you're describing 99% of the iOS app store here right?

In fact, while the Android app store is curated in an even worse manner, I find that Android does offer a lot more high-quality ad-free open source software on platforms like F-Droid. Because the iOS developer subscription makes it hard for FOSS developers to do their work for free so things become more monetised.


>You do realise you're describing 99% of the iOS app store here right?

I do. And I also realize how much worse it'd be if Apple didn't gate user data behind OS-level user consent dialogs.


> we know some Apple users are unable to see past the "you can't be trusted" that Apple keeps violently stabbing in your brain

It's third party devs doing that, not Apple. They've proven themselves unworthy of trust more times than can be counted. Where there is room for abuse, abuse will happen. We might be numb to it, but it happens all the time on desktop operating systems. Examples include Adobe putting files in corners of the system it has no business touching, practically every modern AAA game installing rootkits and scanning our filesystems, and until just a few years ago Dropbox installing a kernel extension on macOS.

The only devs that deserve any trust are FOSS devs but there isn't good FOSS software for every use case.


> Apps will not have direct access to the cameras

Is there any headset that does allow this currently? Doesn't seem like any kind of differentiating factor.


Every single headset in the world that has cameras. The HoloLens, anything that uses OpenXR and OpenVR (so, pretty much 100% of the current PCVR market), hell even the Quest has a Passthrough API.

The Vision Pro is pretty much the only one that doesn't, because Apple thinks of their users as children.


A passthrough API, but you absolutely do not get access to the headset cameras on any of those devices. Apple is very much in line with the standard here, you can make passthrough apps, but you cannot actually get the real data from the cameras.


It's the same concern I heard before the first iPad announcement: "we want macOS on a tablet!" No, you don't, because macOS apps know nothing about the touch screen interface. And they know nothing about VR either, so nothing has changed here.


Do you realize that they all run the same kernel and OS underneath that's then restricted via feature flags? macOS can absolutely understand touch since it's all the same. This is also why their Continuity feature works so well, or why iOS apps can run natively on macOS.


> Pancake (MQP) versus Aspherical Refractive Optics (AVP)

I think Guttag's section saying the lenses are not pancake and are refractive-only is wrong. Apple says: "combined with custom catadioptric lenses," which implies reflection is used, as with pancake optics.

(source: https://www.apple.com/newsroom/2023/06/introducing-apple-vis... )

> It should be noted that the AVP displays about 3.3 times the pixels, has more and higher resolution cameras, and supports much higher resolution passthrough. Thus the AVP is moving massively more data which also consumes power.

An earlier Sony presentation on their 4K microdisplays, from last year I believe, showed they had some form of foveated scan-out on the microdisplay itself, letting the display controller move less data.

(source: https://www.youtube.com/watch?v=IVUSUdzsNgY )

>Apple discussed (linked in WWDC 2023 video) how they implemented features in watchOS to encourage people to go outside and stop looking at screens, as it may be a cause of myopia. I’m not the only one to catch this seeming contradiction in messaging.

Isn't this less of an issue with displays at a farther distance (whether physically or optically)?

> I was doubtful based on what was rumored that Apple would address VAC. Like many others, Apple appears to have ignored the well-known and well-documented human mechanical and visual problem with VR/MR.

There is less vergence accommodation conflict as you age. You slowly lose elasticity of your natural lens as you age and need reading glasses to see up close and the effect isn't as big: you're at a wrong farther focus when your eyes are converging anyway. It might actually be an improvement in vision for people over 45-50.


Catadioptric does mean exactly that: a combination of mirrors and lenses. In photography, catadioptric lenses are used to make telephoto lenses that are very compact for their focal length; for example, a 500 mm catadioptric lens with two mirrors and several elements weighs nearly an order of magnitude less than a 500 mm refractive-only design.

I thought it was very interesting and unique; I have no idea what the optical design could look like. Maybe something like the Perkin-Elmer solid catadioptric design, shrunk down into a compact pancake-style lens with a long focal length.


> I’m not the only one to catch this seeming contradiction in messaging.

The contradiction being introducing a new product with a screen? Guess what, watchOS will still suggest you go outside, whether you use a MacBook or a Vision Pro.

This analysis is not particularly sharp.

The vergence issue he is very hot about, but like anything in this space the question is “is this good enough for a lot of people” not “did they solve this to my personal satisfaction (having not used the device)”


Good catch on the "combined with custom catadioptric lenses," which by definition means a combination of mirrors and refractive optics. But I still don't think they are pancake lenses; more likely something akin to the catadioptric optics that Limbak, recently bought by Apple, was famous for designing. Before the acquisition, Limbak was best known for the catadioptric design used by Lynx.


Is there a ray diagram of the Lynx optics somewhere? I never fully understood how they work (is the reflection off the sides of them?). With Apple showing the lens element shapes, is there room for reflecting off the side?


> There is no vergence accommodation conflict if you have had cataract lens replacement surgery

This is fascinating to me since I am getting vision corrective surgery soon, where could I learn more about this?


I edited that out, apparently some of the replacement lenses can flex.


> In many ways, the AVP can be seen as the “Meta Quest Pro done much better.” If you are doing more of a “flagship/Pro product,” it better be a flagship.

Ok. I'm fine with being told "you aren't the target audience" but... name 5 things I would do with it if I bought one.

Why would I put it on my face/what would I gain from it?

Gaming? Productivity? What can it do that a screen + monitor + mouse + keyboard / iPad / iPhone can not?


What could the original iPad do that an iPhone couldn't? Basically zero, other than have a larger screen. Different screen size ends up making a significant difference in how you use a product.

Vision Pro presents theoretically unlimited size and number of screens to do categorically the same kind of things you'd do on an iPad/iPhone/Mac. As someone who likes to work with multiple monitors, at the least I see this enabling me to take my work on the go (coffee shop, library, plane) without significantly compromising my screen real estate.


So that iPad made reading or watching videos better, when you don't wanna use a laptop/pc/tv.

Give me one thing that those goggles let me do or do better right now.


Arguably, the same. You can watch videos in what amounts to a very large screen, with ambiance added to the backdrop, anywhere. Watching videos on a plane or train will get a massive boost.

Given that an iPad Pro with max specs is about 2/3 of the announced price of the goggles, I think the goggles are in fact at an advantage.


Plane and train rides. That's it. Not even public transport, for several reasons.


Not sure what your reasons are, but on a 30min-1h trip on the London Tube, I can see the appeal and not many reasons to not use it. Keep the "real world" somewhat visible for awareness and enjoy. Not much different from taking my laptop out and coding on it.


Well, the trivial answer given your premise that the iPad's existence is justified because some people sometimes might prefer it over other screens is that some people might similarly sometimes prefer the Vision [Pro].

I think it's a lot more compelling than that, personally, but you didn't set the bar very high.


I only need to name one for myself: I travel a lot and work from a laptop often. The AVP feels like an obvious choice for development if it delivers on the vision (heh). Without a dedicated desk space, laptop stand, separate monitor, etc., laptop work just pales in productivity and comfort compared to my multi-monitor desk setup at home.

A wearable that acts as if I have floating monitors anywhere sounds like a massive win, in a device that appears to be far more portable than actually bringing the aforementioned accoutrements with me everywhere I go.


Funnily enough, if it works for this use case well -

From the perspective of “it can give you an experience as if you have multiple large displays” and costs the same as ~2 Apple displays, plus comes with compute built in.. it’s actually a steal.


> AVP feels like an obvious choice for development if it delivers on the vision (heh).

I wouldn't feel comfortable wearing this in public around strangers whatsoever.


Is the hesitance for social norm/style reasons?

Similarly, I wouldn't wear it at a coffee shop. But I could see it coming into style at co-working spaces. It feels like a no-brainer in hotel rooms and Airbnbs.


$3,500 is within striking distance of a very nice computer + very nice monitor + very nice camera.

The question in my mind is whether I want the thing strapped to my face for an 8-hour work day. If so, then (while still too expensive for my blood) it's not crazy; it just needs a couple of years in the oven. If not, then it's not a couple of years, it's a decade or more.


The ability to position and manipulate virtual objects "in" the physical world. It's also supposed to be capable of 3d video playback and capture.

2d screens can't really do 3d, and definitely can't put things in the room around you.

People have already dreamed up all sorts of game ideas for passthrough AR, as well as CAD and 3d design (think autocad, blender etc.).

If you're using it as a stand-in for a 2D display it may not be worth it -- it's just bigger in your field of vision.


It can have a slightly larger effective resolution, and requires slightly less movement to navigate.


The propaganda for the Vision Pro is somewhat similar to various crypto bullshit in the past years: "you're not seeing the vision", while it stays a mostly useless tool.


I’m not sure this is a fair statement given that the thing isn’t even out yet and most haven’t used it.


Speaking in terms of eye health, does it matter if the screen is right in front of my eye if the focal length is farther? Or is it still unhealthy?


I won't pretend to actually know the answer but I've seen many people saying that VR can improve myopia due to the distant focal point.

Assuming that's true, I imagine this would have similar effects.


No it does not matter at all. The optics focus it much further away. As far as your eyes are concerned the display isn't right in front of them.

In fact if those optics weren't there you wouldn't see anything except a few coloured blobs.

The amount of brightness your eyes are ingesting is also much lower than walking around on a sunny day without sunglasses. Remember our eyes work logarithmically so the outside could be tens, hundreds of times as bright.
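The "tens, hundreds of times as bright" point can be made concrete with back-of-the-envelope numbers (the illuminance figures below are assumed ballpark values, not measured device specs):

```python
import math

# Rough illuminance figures (assumed ballpark values):
sunny_day_lux = 100_000  # direct sunlight outdoors
office_lux = 500         # typical indoor office / screen environment

# Perception is roughly logarithmic (Weber-Fechner), so photographers
# count doublings ("stops") rather than absolute ratios.
ratio = sunny_day_lux / office_lux
stops = math.log2(ratio)
print(f"Sunlight is {ratio:.0f}x brighter, but only ~{stops:.1f} stops")
```

A 200x difference in absolute light is under 8 perceptual stops, which is why a sunny day doesn't feel 200 times brighter than an office, and why headset displays sit well below outdoor light levels.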


But if I understand correctly the optics can't recreate how our eyes focus on near and far things in reality. So if something is supposed to be 100 meters away it will still be physically like focusing on something 1-2 meters away due to how the optics work. That's the whole VAC issue mentioned in the article.


That's true. They still can't without eye tracking and physically moving lenses. I've seen a prototype in the news that could do that. But of course it's mechanically highly complex. Perhaps liquid/gel lenses could do the same job (similar to how our eyes work).

I've never heard of this actually causing harm though, except some questions were raised about kids whose eyes are still developing. It would be similar to looking through a microscope or binoculars for hours, which is not unheard of in human history (e.g. researchers, lookouts on ships)


The physical distance does not matter, except for the heat it produces.


I appreciate Karl as the technical Bulldog for AR, doing really good technical analysis for the field.

Even as someone steeped in this space, the fact of the 3-element AVP was news to me. That is a pretty big difference from other designs and tells me they are reaching for anything they can to squeeze optimizations out.

AR is really really hard.


>the fact of the 3-element AVP was news to me

It was known for at least 1.5 years https://www.macrumors.com/2022/01/05/kuo-apple-headset-panca...

I remember also seeing some shipment records from Apple ordering these lenses last year.


What are the current use cases for AR?


One of the best I’ve seen is for manufacturing and inspection. Eg, it highlights the next set of holes to place bolts into, then gives tightening order, etc. It eliminates the need to think about each step.


If you're eliminating thinking, you might as well use a robot.

I don't know about the ideal use cases, but it certainly appears that the SDK and dev environment are going to be the best and most accessible. So if a great use case does appear, it will probably be on the AVP.


You can use a robot for screws, but things like wiring can be difficult or impossible for a robot. Not to mention for low volume, setting up robotics isn’t worth it.


I think one of the better potential use cases for AR is for manufacturing work instructions. Typically, these are in binders or displayed on screens, but highlighting assembly steps, part orientations/locations, etc virtually would be especially nice for training or infrequent operations which don’t happen at a seated workstation. I believe Glass and some other products were exploring this space, but it could be that it’s a lot of work to set up.


Heads up display for cycling/navigating

Fundamentally though, I see AR as how we give our AI personal assistants our eyes, ears, and a real-life action-space evaluator.

Everything from highlighting objects you’re searching for to giving instant information about anything you look at that you want to know about.

It will also give you perfect recall

Hey Siri, where did I leave that toilet brush two weeks ago? How many pages of [Book] did I read on average last week? What was that pattern [person] was wearing when we were at the farmers market last week?


I've just learned about the vergence-accommodation conflict [1]. From now on, VR/AR is a no-go for me. It never came to my mind that those headsets had such a problem. We're talking about an accommodation distance of what, three centimeters? I'm already on my way to reversing my myopia; sorry, I can't take an even more unhealthy visual environment [3]. Half of the world is already on its way to being myopic by 2050 according to research [2], and keep in mind that's only the effect of smartphones, which you don't exactly hold centimeters away from your eyes. I'll stick with my plan of buying a projector if I ever want giant screens.

1: https://en.m.wikipedia.org/wiki/Vergence-accommodation_confl...

2: https://www.aaojournal.org/article/s0161-6420(16)00025-7/ful...

3: https://m.youtube.com/watch?v=XPIGDSY_xBs


The accommodation distance can be set to any value by the headset lens design. The problem he's discussing is that you can't change that distance depending on the content shown. Apple is probably setting the distance to around 1 meter, since that's where they show all their content.

https://en.wikipedia.org/wiki/Vergence-accommodation_conflic...
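
To make the comment above concrete: the accommodation distance is just the virtual image distance produced by the lens, which follows from the thin-lens equation given the lens focal length and the panel-to-lens distance. A minimal sketch; the 40 mm / 38.5 mm figures are made-up illustrative numbers, not actual headset specs:

```python
def virtual_image_mm(panel_mm, focal_mm):
    """Thin-lens equation: 1/d_i = 1/d_o - 1/f.
    With the panel just inside the focal length (d_o < f),
    the image is virtual and appears much farther away."""
    return 1.0 / (1.0 / panel_mm - 1.0 / focal_mm)

# hypothetical numbers: a 40 mm focal-length lens with the panel 38.5 mm away
print(virtual_image_mm(38.5, 40.0))  # ~1027 mm, i.e. about 1 meter
```

So a designer dials in the accommodation distance by nudging the panel a fraction of a millimeter relative to the focal length; the point is that it's fixed at manufacture, not adjustable per scene.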


Oh thanks! I understand now, but why does the image on the Wikipedia page say "VR headset screen"? Probably a mistake, right? It confused me.


No, you misunderstand. The lenses between your eyes and the screens will give it a focal distance of something like ~2 meters. So from an eye-health / focus point of view it will be like staring at a wall 2 meters away all day (which is possibly better than staring at a computer screen 0.5 m away!). This is why nearsighted people still need some sort of vision correction in VR.

The thing to realize is that there are two different kinds of focusing. The first is the one everyone thinks of: how your two eyes work together. They point toward each other when focusing close, and spread apart when focusing far (vergence). But each eye can also focus individually (accommodation). Try covering one eye and holding up a finger at arm's length for your other eye to focus on. Then switch to focusing on the wall with that eye, and back. You'll feel your eye muscle making the adjustment.

In the real world these two kinds of focuses normally operate in sync. If you want to look far, you'll normally both point your eyes further apart and adjust the individual eye focus for distance, and when you want to look close, you'll point your eyes kind of together and adjust each eye for near focus.

But in VR, your "focal distance" is fixed at the generally comfortable ~2m I mentioned. However, because it's a 3D world you can look near and far, and in that case you'll change how your eyes point but not the other kind of focus.

So I'm not sure to what extent that causes problems, if any, but it's different from staring at something a few centimeters away.
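
The mismatch described above can be put in numbers. Both demands are usually measured in diopters (1 / distance in meters): vergence demand follows the virtual object, while accommodation demand stays pinned at the optical focal plane. A sketch assuming a fixed ~2 m focal plane and a ~63 mm interpupillary distance (both illustrative, not known headset specs):

```python
import math

IPD_M = 0.063        # assumed interpupillary distance (~63 mm), illustrative
FOCAL_PLANE_M = 2.0  # assumed fixed optical focal distance, illustrative

def vergence_angle_deg(distance_m):
    # angle between the two eyes' lines of sight toward a point at distance_m
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def vac_mismatch_diopters(virtual_distance_m):
    # accommodation is pinned at the focal plane; vergence follows the content
    return abs(1 / virtual_distance_m - 1 / FOCAL_PLANE_M)

for d in (0.3, 1.0, 2.0, 10.0):
    print(f"{d:5.1f} m: vergence {vergence_angle_deg(d):5.2f} deg, "
          f"mismatch {vac_mismatch_diopters(d):4.2f} D")
```

At the focal plane the mismatch is zero; for a virtual object at 0.3 m it is about 2.8 D, which is why close-up interactions are where VAC discomfort tends to get reported.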


I understand now; the text "VR headset screen" on the Wikipedia page confused me. FYI, the latter one, each eye's individual focus, is the one that causes myopia.


I agree that the price doesn't matter as much as the experience; the Hololens 2 is just as expensive but generally people wouldn't commit to the limited field of view and lack of compelling use cases. Aside from media consumption and general computing prowess, I'm not sure if Apple made a complete case for it, but we'll see what software is ready to ship with it when it launches. Possibly people will be excited enough to develop for it that Apple can reach a critical mass of interesting "things to do" on the platform.


Analysis of Apple Vision Pro (AVP) and Meta Quest Pro (MQP)

Introduction:

- Lack of technical analysis and understanding of VR/AR issues in AVP reviews.
- Neglect of variable-focus issues and the Vergence-Accommodation Conflict (VAC).
- Meta is to present a paper on a VR headset with a retinal-resolution varifocal display.
- Neither Apple nor Meta can solve inherent limitations of the human visual system.

Advantages of AVP over MQP:

- MQP is considered poorly executed and a bridge to nowhere.
- MQP's passthrough mode is criticized for low resolution and distortion.
- AVP has higher-resolution cameras, more depth-sensing cameras/sensors, and improved processing for passthrough.
- AVP's Micro-OLED provides better black levels/contrast than MQP's LCD with mini-LED local dimming.
- AVP's micro-lens array enhances light-collection efficiency.

Price Perspective:

- Comments that AVP's price is too high lack historical perspective.
- Apple and Meta still need to prove the viability of a highly useful MR passthrough headset.
- AVP's price is subject to reduction with increased volume.

Resolution for "Business Applications":

- MQP has lower pixel density, resulting in less readable text and slower reading speed.
- AVP's higher pixel count provides around 40 pixels per degree (ppd) for improved text readability.
- There is no evidence that long-term use of virtual desktops is good for humans.
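
The ~40 ppd figure is easy to sanity-check: average pixels per degree is just horizontal pixels divided by horizontal field of view. A rough sketch; the per-eye resolution and FOV numbers below are publicly circulated estimates, not confirmed specs:

```python
def avg_ppd(horizontal_pixels, horizontal_fov_deg):
    # crude average; real optics vary ppd across the FOV (highest at center)
    return horizontal_pixels / horizontal_fov_deg

# rough, publicly circulated per-eye figures -- illustrative, not official
headsets = {
    "Apple Vision Pro (est.)": (3660, 90),  # roughly 40 ppd, as in the article
    "Meta Quest Pro":          (1800, 95),  # roughly 19 ppd
}
for name, (px, fov) in headsets.items():
    print(f"{name}: ~{avg_ppd(px, fov):.0f} ppd")
```

For reference, reading ordinary-size text comfortably is often said to want something like 40+ ppd, which is why the MQP's roughly half that figure hurts for "business" use.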

Passthrough Capabilities:

- MQP's passthrough mode is criticized for its poor execution.
- AVP's passthrough, although not perfect, is significantly better than MQP's.
- The accuracy of AVP's 3D views of the real world is unknown.

AVP Micro-OLED vs. MQP's LCD with Mini-LED Local Dimming:

- AVP's Micro-OLED offers better black levels and contrast.
- MQP's local-dimming feature has limitations and issues.
- Apple's Micro-OLED CMOS backplane is assembled by Sony, known for its Micro-OLED expertise.

Pancake (MQP) versus Aspherical Refractive Optics (AVP):

- AVP incorporates custom catadioptric lenses for sharpness and clarity.
- Apple's acquisition of Limbak, an optics design company, suggests the use of pancake or similar optics.
- MQP uses 3-element aspherical optics.
- Pancake optics are less efficient with Micro-OLED displays, requiring higher OLED power output.

Conclusion:

- AVP reviews lack technical analysis and overlook critical issues.
- AVP offers advantages over MQP in passthrough, resolution, and display technology.
- The price of AVP needs to be viewed in the context of technological advancement.
- AVP's Micro-OLED and catadioptric lenses contribute to improved performance.
- The long-term suitability of AVP for business applications and human visual comfort remains uncertain.


What does it mean that guys like this and SadlyItsBradley were not invited to try the Vision Pro? Interesting to note that the experts in the state of the field were not given a chance to try it.


Not much, given that neither is press nor a developer, which are basically the only two groups that are usually invited.

Neither really does product reviews as their main gig either.


Fair enough, I did not consider this angle.


I'm sure they weren't invited because they know too much.

Apple loves keeping the "magic" on their products as long as they can. Once they are in consumers' hands they can't stop teardowns and detailed comparisons, but right now they won't want some wise guy ascertaining the FoV, PPD, distortion figures, etc.

They want their marketing to be viewed as in a class of its own, without direct specs to compare against their competitors. Once you bring it down to technical-fact territory, that "magic" they're going for is lost.

Right now they're only showing it to their most loyal followers that are bound to go "wow" but don't pick up too many technical details.


I don't think it really means anything. Neither is a major press outlet or developer, and inviting them only really has a downside for Apple.



