Apple Vision Pro: Why Monitor Replacement Is Ridiculous (kguttag.com)
92 points by PaulHoule 10 months ago | 140 comments



I find it kinda wild that we are seeing article after article of people dismissing this without actually experiencing it first hand.

We also can't ignore that in every image shown, it appears they are using a virtual screen that is far larger than what most people likely have, meaning the text will be larger. You could argue that this is to fix a flaw with the hardware, but it is also something that you are able to do when you are not dealing with a physical limitation.

That said, I am not going to pretend to understand a lot of what is in this article; it is way over my head. I feel like the thing that we keep ignoring here is that this is not meant to be a great success right out of the gate. Apple knows this with how they are pricing it and how they are talking about it, including it being a "pro" version first. This will be a niche device while the hardware catches up with what the software can do.

But eventually the hardware will be able to do this just fine, and not building the software now would be stupid on Apple's part.

I think it remains to be seen what exactly this hardware can and cannot do, but more importantly what we are looking at 2 or 3 versions down the road. Unlike basically every other company that has tried to make a product like this, Apple has a track record of not shutting down projects and having "hobby" projects that take a few generations to finally hit their potential.

Once the hardware manages to catch up, is smaller, etc., it feels kinda naive to think that we would remain stuck to monitors for computing. Is it guaranteed? No, of course not. But I would not dismiss it yet based on how this guy is talking in the comments about the future. We saw a lot of the same dismissal of smart phones for most people's computing needs.


You bring up a good point that covers why I'm most excited for the AVP: I'm really excited about what later versions will be like. This first AVP is kind of like the first iPhone for me. Obviously not perfect, but the UI will probably be far better than anything before it, and later versions are going to be really awesome.


It's really just math. You can't simulate a 3840 x 2160 screen with a 1920 x 1920 headset. So until the headsets get more resolution than monitors the monitors will look better. Especially for text.


You don't look at all of a 3840x2160 screen, with a need for pixel-precision all over it.

Doing a test right now, I'm focusing on the 13-inch computer screen where I'm typing this and holding up a large container of cashews directly to the right of the display. It's literally less than 30 degrees to the right of my focus, and the word "cashews" is in half-inch-high letters, and I can't remotely read the word without moving my eyes over to it.


> I can't remotely read the word without moving my eyes over to it

Plus the portion of your eye that sees in high resolution is ridiculously small—1% of your field of view. And your eye and brain just hide that fact by making your eye constantly jitter to see a larger area.


But, they aren't doing fovea tracking and shifting the display around in response to eye movements, are they? If not, you would only get more screen space if you articulate your neck and cause the head-tracking to update your viewing perspective.

I've monitored how I use 4K screens in the 28" size range. I would not normally move my head at all to look at different parts of the screen. I would only move my eyes. With two such screens side-by-side, I would turn my head just a little bit to focus on one screen or the other. Or, I found I would often hold one head position that is neutral between the two, and then use approximately 2/3 of each screen, with the far outer edges neglected or used to banish less relevant communications and status apps that I only check once in a while.

And, these screens are not filling my field of view by any means. So, I'd really need a far higher resolution headset if it is going to give me a good immersive field of view and be a reasonable monitor-replacement. I fear that fovea-tracking will remain scifi dreams in my lifetime, so the reality is we need to render full resolution throughout the field of view to be prepared for where the eye gaze might go in the next frame.


> I fear that fovea-tracking will remain scifi dreams in my lifetime, so the reality is we need to render full resolution throughout the field of view to be prepared for where the eye gaze might go in the next frame.

This is not at all true. The AVP, the Quest Pro, and the PSVR2 all do eye-tracking-based foveated rendering. They lower the clarity of the things you're not looking at. And reviewers say it works perfectly, like magic. They are unable to "catch" the screen adjusting for what they're looking at.


Hmm, interesting...

Are they actually doing some kind of optical shifting of a limited pixel display? Or you just mean they do some kind of low-res rendering mode to populate most of the framebuffer outside a smaller zone where they do full quality rendering?

In other words, are they just allocating rendering compute capacity or actual pixel storage and emission based on foveal tracking?


> Or you just mean they do some kind of low-res rendering mode to populate most of the framebuffer outside a smaller zone where they do full quality rendering?

This exactly. They don't reduce the resolution too much, but it's visible to an outside observer watching on a monitor.


That’s just to be able to have enough compute/bandwidth to drive the display. Other posters are correct that the DPI decreases away from the center and various optical aberrations increase. Foveated rendering won’t help with that.


Fovea-jitter is far from covering the width of your vision. They can overshoot by (maybe, I'm guessing) 2-3x the area of your precise vision and all jitter will be covered.


This isn't how any of this works. It doesn't matter how little of an area of the monitor you're focusing on. You still need to generate an equivalent DPI over that area in the headset to match the monitor. Which you can't do, since the headset has fewer physical pixels than the monitor does.


No you don't. The headset uses eye-tracking to enable foveated rendering. You only need to generate a high-dpi segment of the screen, where the eyes are actually focussed


Ok. Now let's wait for a display panel which can fluidly move physical pixels around to get the most of them where the user is looking!


You can't really "generate" a high-DPI segment.

The display either has a sufficient pixel density or it doesn't. Foveated rendering lets you cheat on the details you render outside the focal point, but since your eyes can move to any portion of the display, the entire display needs to have a high enough DPI that, once your eye moves to focus on that section, the content rendered in that location will be sharp.


The limit on DPI is from the display system. Foveated rendering doesn't help.


3840 x 2160 resolution per eye. 23 million pixels per eye, or, nearly 3x the pixels of a 4k TV with the equivalent resolution. That's roughly 3 physical sub-pixels per 1x1 "virtual pixel".

As well, given the oddities of human vision and how eyes focus, they can give more subpixels to the tiny area your eye is looking at to increase fidelity


The Vision Pro has 23 million pixels total, or 11.5 million per eye, or 3,391 x 3,391 if square


>That's roughly 3 physical sub-pixels per 1x1 "virtual pixel".

That is apples to oranges. The virtual pixels are mapped over your entire field of view where the physical pixels are mapped directly to the 2D surface itself.

>they can give more subpixels to the tiny area your eye is looking at to increase fidelity

Again, this is a property of the display system and cannot be improved by software. Additionally, only one part of the lens (usually near the center) will have the highest PPD. Looking anywhere but the center will have a lower PPD.


It does when your display has micro motion :)


I get that I can look at the upper-right corner of the vision-pro's display and then the precision is needed there -- or the lower-left, etc. But no one is going to do that for long; it's uncomfortable, and you're going to turn your head so your eyes can relax.


I am pretty sure the Vision Pro's resolution is higher than 1920x1920? Could be wrong though.

Regardless, the question isn't if it's better or the same. The question is if it's viable, considering the author is saying the idea is "Ridiculous".


Yes it’s higher. From what we know (11.5 megapixels per eye), the width roughly corresponds to a 4K resolution. However, due to the 3D resampling, I would estimate that to simulate a virtual monitor of a given resolution in high quality, you need at least 3-4 times the resolution on the physical headset display. Which would mean that the AVP couldn’t simulate even just an FHD virtual monitor in high quality.


I haven't seen official numbers. My point is that it needs higher resolution than a monitor to match the experience.


3840 x 2160 resolution per eye. 23 million pixels per eye, or, nearly 3x the pixels of a 4k TV with the equivalent resolution. That's roughly 3 physical sub-pixels per 1x1 "virtual pixel"


> 3840 x 2160 resolution per eye. 23 million pixels per eye, or, nearly 3x the pixels of a 4k TV with the equivalent resolution

So a 4k TV is 3840x2160. The equivalent resolution is... 3840x2160. The number of pixels in 3840x2160 is 1x the number of pixels in 3840x2160, not 3x.
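If anyone wants to check the arithmetic, here it is in a few lines of Python (the 23 million figure is Apple's marketing number for both displays combined; the exact per-eye panel resolution hasn't been published, so treat it as an estimate):

    uhd = 3840 * 2160              # one 4K ("UHD") TV or monitor
    avp_total = 23_000_000         # Apple's headline figure, both eyes combined
    avp_per_eye = avp_total / 2    # ~11.5 million
    print(f"{uhd:,}")              # 8,294,400
    print(avp_per_eye / uhd)       # ~1.39 -- about 1.4x a 4K panel per eye, not 3x
    print(avp_per_eye ** 0.5)      # ~3391 -- pixels per side if the panel were square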


The Vision Pro has 23 million pixels total, or 11.5 million per eye, or 3,391 x 3,391 if square


Is it _trying_ to simulate a 4k display? Or is it trying to provide something different- a large panorama of 1920x1920 windows?


Neither, really. The VP uses a pair of 4K displays to render a high-DPI section where it senses your eyes focusing. This is out of a much wider virtual display space (potentially 360 degrees). As your eyes and head move, it shifts the image through that high-resolution area. The space outside that area is rendered at a lower resolution to match your peripheral vision. This technique is called foveated rendering.

People who have tried the demos say that it is a seamless experience. Moving your eyes and head around, you see things rendered at a 4K resolution but it appears that the space around you is unbounded.
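For anyone who hasn't seen the technique spelled out, here is a toy sketch of eye-tracked foveated rendering in Python/NumPy. The render_scene/render_patch callables, the quarter-resolution periphery, and the 256-pixel fovea patch are all made-up placeholders; a real compositor blends several quality rings and reprojects every frame, but the core idea is roughly this:

    import numpy as np

    def foveated_composite(render_scene, render_patch, gaze_xy,
                           out_w=1920, out_h=1920, fovea=256, scale=4):
        """Toy eye-tracked foveated rendering.
        render_scene(w, h) -> (h, w) array: the whole view rendered at w x h.
        render_patch(x0, y0, size) -> (size, size) array: a full-res crop of the view.
        Both callables are placeholders standing in for a real renderer."""
        # Cheap pass: render the whole view at 1/scale resolution, then upsample it
        # (this becomes the blurry periphery).
        low = render_scene(out_w // scale, out_h // scale)
        frame = np.kron(low, np.ones((scale, scale)))[:out_h, :out_w]
        # Expensive pass: a small full-resolution patch centred on the tracked gaze point.
        gx, gy = gaze_xy
        x0 = int(np.clip(gx - fovea // 2, 0, out_w - fovea))
        y0 = int(np.clip(gy - fovea // 2, 0, out_h - fovea))
        frame[y0:y0 + fovea, x0:x0 + fovea] = render_patch(x0, y0, fovea)
        return frame

Note that this only changes how much gets computed per frame; it does not add physical pixels to the panel, which is the distinction being argued below.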


This is wrong. The eye tracking is only to reduce the computational demands, it doesn't change the DPI of the screen *at all*. It's purely an optimization trick.

I don't know why people keep repeating this myth. It's just not how it works. It's not moving around pixels in the physical display.


>I find it kinda wild that we are seeing article after article of people dismissing this without actually experiencing it first hand.

Current VR has fundamental problems that can't all be solved with just some clever software tricks. This article is stating some problems that the headset will have to face. He doesn't dismiss the headset in the article.

>Once the hardware manages to catch up, is smaller, etc., it feels kinda naive to think that we would remain stuck to monitors for computing.

The friction of having to wear something on your head, along with the weight of the device, will always make monitors a good option, as they are very good at being 2D displays in the world.

>We saw a lot of the same dismissal of smart phones for most people's computing needs.

Show me where people dismissed smartphones over them having bad text rendering.


People dismissed smartphones and tablets because they have bad text input, which isn't exactly the same, but the purpose of their statement is not to say "the literally exact same criticism was leveled at phones" but rather "people also had somewhat similar objections and look where we are now in spite of that".

If you are being honest with yourself, you will remember that when the iPhone was first unveiled, you, I, and everyone that currently browses Hackernews thought the idea of a phone that was only a touchscreen and a few buttons was absurd. The iPad was similarly seen as ridiculous. We now live in a world where tablets, smartphones, laptops, and desktop computers all have their own niches and where a standalone desktop with a dedicated monitor, keyboard and mouse is seen as a specialist device. A headset may become yet another entrant in the "yes, and" world of computing. The historical experience suggests that we should have some humility around predicting what people will latch on to.


> I find it kinda wild that we are seeing article after article of people dismissing this without actually experiencing it first hand

I suppose you haven’t been following Apple News for long. The cycle is always the same, everyone is explaining why it’s a worthless product, too expensive, no market, doesn’t work, whatever. And then later they’re all about how obvious of a product it is, they or their favorite manufacturer did it ten times better, Apple needs to give the technology away.

Remember Slashdot’s out-of-hand dismissal of the iPod: ‘No wireless? Less space than a Nomad? Lame’. It sounds so obvious why the product is bad, but it turns out it’s focusing on things that don’t matter.


> it feels kinda naive to think that we would remain stuck to monitors for computing.

One simple reason: “Hey Stephen, is this what you were looking for?” (points to a picture of device X on the monitor) Stephen (without even rising from his chair): “Yeah, but in red. Where did you find it?”


Whenever I read about the AVP being a dud, I am reminded of my own reaction to the first iphone: “it is a stupid idea and it will never catch on” whilst proudly clutching my Nokia communicator.

I think the AVP 2 or 3 is going to be great.


The article IMHO does a great job breaking down the pixel arrangement, troubles with rasterization, how modern monitors basically cheat to get sharper text, and so on. TL;DR: VR/AR suffers greatly with text.

The only way Apple will potentially mitigate this will be via clever rendering tricks. But he is very likely correct: this will not be a replacement for monitors.


So, one problem with this statement: People are working with virtual monitors today using far worse DPI equipment.

This is an iteration on something that already exists. Probably won't be perfect, but it can and will be a monitor replacement for some.


It's not far worse. My display at home has 60% of the linear resolution of the AVP, and the experience is terrible for productivity. I've tried displays that come very close at 80%, and it's still unusable for me. If the screens are reasonably sized, so that I don't have to move my neck all the time or suffer optical aberrations (i.e., sized like my real-life screens), I get an effective resolution of ~720p.

We can see that this will be similar: the AVP has ~35 pixels per degree and my screens span about 45 degrees, so I can expect 1575 pixels across and 885 vertically, just a tad better than 720p. In practice you'll lose significant resolution to optical aberrations unless you turn your neck to point right at it.

Who is using 24-inch 720p displays in 2023? You can get at least a 1080p panel for like $20 on classifieds and $100 new.

This is sitting as close to the display as you ever should, btw. A 24-inch screen will be 45 degrees wide if you're sitting about 25 inches (0.65 m) away from it or closer.
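If you want to plug in your own numbers, here is the back-of-the-envelope version of the above in Python (the screen dimensions, viewing distance, and the flat ~35 ppd are all assumptions; real headsets lose further resolution off-axis):

    import math

    def effective_pixels(ppd, width_in, height_in, distance_in):
        """Pixels a headset with the given pixels-per-degree can devote to a flat
        virtual screen of this physical size at this viewing distance."""
        h_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
        v_deg = 2 * math.degrees(math.atan(height_in / 2 / distance_in))
        return round(ppd * h_deg), round(ppd * v_deg)

    # A 24" 16:9 screen is ~20.9" x 11.8"; at ~25" away it spans ~45 degrees across.
    print(effective_pixels(35, 20.9, 11.8, 25))   # (1588, 930) -- same ballpark as above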


> it appears they are using a virtual screen that is far larger than what most people likely have, meaning the text will be larger

I mean it's just a concept marketing photo - the screen doesn't actually project out to be seen by anyone else. The projection is made for friggin' ants (kinda literally).


I realize that it doesn't project out, but for every reason that matters it does for the person that is wearing the headset.

https://i0.wp.com/kguttag.com/wp-content/uploads/2023/06/AVP... look at that image. Judging by the guy in front, that middle virtual monitor is probably 3 times the size of my current monitor, which is 32". If I want more than 1 monitor (I have 3), I can't imagine actually having anything larger than that in a physical space. But in virtual? Why not. And that is what Apple is showing with these images.


I feel like neck stress/pain will be a thing in VR/AR just as it is in real life, so monitor size and positioning will have to follow at least kind of similar principles as IRL ones do if you care about ergonomics at all.


One scenario I'm hoping for: I had a device about the size of a lighter (actually even smaller) that you would wear on a necklace under your shirt, between your shoulder blades. You'd calibrate it, and over time it would train your posture by vibrating gently when you slumped.

I'm not trying to comment on or debate whether this will ever be a mass-market product, but it would be fascinating if they built a posture corrector into it and AVP users ended up developing better posture than keyboard/mouse/screen users.


That is a valid point, and I am curious how that part will work out in the real world.

I do think about my setup: how much space is basically lost to speakers, and space for my keyboard and hands (not that I wouldn't use those in this setup, but I could kinda stop thinking about them). Or just non-traditional aspect ratios, as shown in that image.

But I agree, I think it remains to be seen how that works in the real world as far as proper sizing and ergonomics goes. But I think there is room for some expansion on what we would normally have now.


Again, it's just a concept photo. The graphics designer hasn't used the product, nor has the model. They made the image to look pretty - the model, the monitors: they're all laid out in the rule of thirds. We shouldn't try to tease out what it should be like using this thing.

To play devil's advocate, your 32" monitor is more than 3x as large as the 13" monitor I'm working on. People use interfaces on wall-mounted TVs that are 3x your 32" monitor just fine.


A meta point: kguttag clearly really knows their stuff, but I feel like every article is like: "Gotcha! Actually, because of this technical detail, this product and therefore the entire industry is doomed to fail".

The technical details seem correct to me, but even as a layperson I can tell that the author is missing the bigger picture. For example in this case, to replace monitors these headsets don't need to have as good resolution if they win on the overall tradeoffs.

It's a shame because it's otherwise good commentary, it just feels like I have to filter everything from this site through a lens of knowing the author will have a mostly unjustified negative take.


It reminds me of Blackberry's criticisms of the iPhone: completely correct yet ultimately irrelevant.


> completely correct yet ultimately irrelevant

Found my new .sig


Karl has always been, to the best of my knowledge, a very detailed but incredibly pessimistic author.

Thus, while I find his analysis incredible, I always have to ignore his actual opinions that arise from the analysis.


kguttag approaches this from first principles. I think you have a solid point though: I could use a multiple-monitor setup on the road, while flying, etc.


First principles are all well and good, but you do have to acknowledge the reality that has sprung up from them, or you'll find yourself arguing that power plants are impossible because of the first and second laws of thermodynamics.


This is a good approach and produces valuable content, but I think it either needs to skip the higher level commentary and conclusions, or it needs to be a much more considered commentary that takes into account the wider context.


The problem is Apple and the fanboys are claiming they've gone from the Newton to the iPhone without passing through the LG Prada and iPod Video in between. Hyperbolic claims promote skepticism.


"The original XGA monitor, considered “high resolution” at the time, had a 16” diagonal and 82ppi, which translated 36 to 45 pixels per degree (ppd) from 0.5 meters to 0.8 meters (typical monitor viewing distance), respectively. Factoring in the estimated FOV and resolutions, the Apple Vision Pro is between 35 and 40 ppd or about the same as a 1987 monitor."

Okay, now wait a minute. On the one hand, it's possible to scoff at technology that was state of the art in 1987 and make it seem like it's shitty, but 1024 by 768 with a diagonal of 16 inches seems awfully close to the monitor I'm currently typing this on right now: a 1920 by 1080 27 inch LCD screen, sitting three feet away.

I used https://qasimk.io/screen-ppd/ to compare some of the displays I have around to the IBM XGA monitor. Although I have one 1440 monitor, I also have a 1920x1080 that I use all the time, and when I'm actually in the office, those monitors are the same.

To hear this person describe it, the monitor that I and a ton of people currently use today has the resolution of an ancient relic. A 1024x768 monitor with a 16 inch diagonal has effectively the same pixels per inch as a 1920x1080 with a 27 inch diagonal. Would a 27 inch 1080p monitor really be worth characterizing as obsolete, insufficient technology today?
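The arithmetic behind that kind of comparison is short enough to write out by hand; the viewing distances below are my guesses, and ppd here is measured at the centre of a flat screen viewed head-on:

    import math

    def ppi(h_res, v_res, diag_in):
        return (h_res ** 2 + v_res ** 2) ** 0.5 / diag_in

    def ppd(h_res, v_res, diag_in, distance_in):
        # one degree spans roughly distance * tan(1 deg) inches at the screen centre
        return ppi(h_res, v_res, diag_in) * distance_in * math.tan(math.radians(1))

    print(ppi(1024, 768, 16),  ppd(1024, 768, 16, 27.5))   # XGA 16":   ~80 ppi, ~38 ppd at ~0.7 m
    print(ppi(1920, 1080, 27), ppd(1920, 1080, 27, 36))    # 27" 1080p: ~82 ppi, ~51 ppd at 3 ft

So the two panels really are about the same density, and the angular figure moves around mostly with how far away you sit, which is why the article quotes a range rather than a single number.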


24" is generally considered the upper limit of where 1080p is reasonable. 27" is getting to be a notably low PPI for text and UI, although it's generally fine for games and other motion content.

Normally at 27" you'd want at least 1440p these days, and that's been pretty standard for around 10 years now. Those Pixio PX276/PX277 monitors and similar have been around for a long time even as budget products - the panels they were using come from cast-offs from the premium panels.

Which isn't to say you won't find 25" 1080p monitors and similar junk in offices but... they're not buying you the good stuff either.

Not to say it's not fine for you! But generally today people would be looking for a higher res, either 1440p or 4K, either of which is under $250 these days for a 27", and often a pretty nice one (nano-IPS, 144 hz, wide color gamut, etc).


It just feels wild to me that using something which would be in offices today would be so ridiculous as to not even be usable for text. The article has to take for granted that any display with a resolution below 4K is not sufficient for reading text. That's unsupportable.

I'm not arguing that 1440 or 4k isn't better than 1080. I'm not saying that monitors aren't cheaper than the Apple Vision Pro. I know that text on a screen in 1440p looks better than 1080p.

I'm just saying it seems more ridiculous to be like "the monitors that people use in offices all around the world today cannot be used as monitors, welcome to 1987 if you think so" because there exist alternatives that people would prefer to use.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...

For goodness' sake, 61 percent of people running Steam have their resolution as 1080p.

I'm just saying it's uncharitable for the author to make it seem as if the resolution that a vast swath of people use in 2023 is so inadequate as to not be worth taking seriously. He's saying "Yes, they will “work” as a monitor if you are desperate and have nothing else, but having multiple terrible monitors is not a solution many people will want", and buried in that is the assumption that the majority of people are using "terrible" monitors that (sarcasm quotes) work. I don't think that's reasonable.


IMHO 1080p is unacceptable today, and if my employer provided me with a laptop at that resolution it would be bad for productivity and a slap in the face. In a full size monitor it's only worse. Retina screens have been a thing on laptops since 2012 and before that on mobile devices. Let's stop accepting fuzzy text as normal.


FWIW, I'm using a 1920x1080 14" screen when not on a 4K 28" screen. Both with almost identical pixels per degree. (They are identical dot-pitch, but I don't use exactly the same viewing distance for laptop vs desk...)

I would be absolutely aggravated to work at half the linear resolution. I would tolerate it for some fast-paced game as a practical tradeoff for framerate, but not for text or static image viewing.

I have seen a 16" 4K laptop and considered it overkill or even impractical, i.e. pixels are too small to resolve and naively scaled icons or text are unreadable without a magnifier. But my usual screens described up top are above that threshold for me, and I can resolve single pixel gaps much of the time.


I have no interest in seeing someone use a Meta Quest Pro to debunk the viability of an Apple Vision Pro.


Myke Hurley said that reading text on a simulated display with an Apple Vision Pro looked retina-quality to him, and I trust him enough that I'll give Apple's claims the benefit of the doubt for now.


Yup, he's also a person who's spent an hour talking (on Cortex) about how much he and CGPGrey enjoyed using the Meta Quest Pro for having work meetings together virtually. So he understands the comparisons fairly well.

His review probably made me more excited than anything else to give it a try.


I picked up an MQP out of curiosity about how the tech has improved since the original Vive, my last experience with VR.

My office gets very hot in the summer - as of writing, it is 84F in there. The rest of my house is 72. I've had AC people take a look, etc., and there's not an easy solution to fix it. I'd probably get a mini-split or something if I was intending to stay here for more than another year or so, but it's not worth the cost and trouble currently.

One thing I wasn't expecting was the MQP being enough for some basic monitor replacement use for if I just don't want to deal with the heat. It's not holding a candle to my 34" Alienware OLED or the other 3 4k panels I have in front of me, but... I get tired of the physical discomfort from wearing the headset before I get tired of the virtual monitors.

23m pixels total for the AVP vs. the MQP's 7m - a 3x increase in pixels is a pretty significant jump in resolution.

I will say the passthrough mode cameras on the MQP look like complete ass. I can see the world around me well enough that I can function and interact with the objects in the room if necessary, but if I was trying to use the virtual monitors against the backdrop of the world, it would distract me with how shitty the real world looks. I don't have much hope that the AVP cameras are gonna be a significant enough jump up to solve that.


Agreed. I can’t see the benefit of all this analysis at this point. Why not just wait until the v1 is out and you can try it yourself?

Secondly, what about the v2 or v3 of the Vision Pro?

Eventually the resolution of a VR device will get to the point of being good enough to replace a monitor.

Whether you’d want to be inside a big headset for 8 hours a day is another question.


Influencers need to grab attention


If you were around in the early to mid 00's, the Apple haters really came out in droves. "No wireless. Less space than a nomad. Lame." Seems they are back at it right before Apple wipes out the rest of the VR industry with the first headset that doesn't actually suck.


Completely correct criticisms. The iPod doesn't play music any better than one of those things. It doesn't respect the folder structure or drag-and-drop USB storage. If you need the storage and want to organize your library how you want, why pay more for less?


Especially when there are people right now using a Quest 2 for this purpose.


AVP is not released so using a Quest Pro is fine. Both headsets face similar challenges. It isn't like the AVP has a radically different architecture. Apple will run into the same problems.


I mean, it does have a radically different architecture for capturing and predicting eye movement and updating the foveated rendering profile on the fly to simulate depth planes and, of course, do "real" foveated rendering compared to the Quest Pro (one of the patents is below). As well, people I've talked to who actually used it said they were blown away by how crisp the text is.

https://www.patentlyapple.com/2023/07/a-new-apple-patent-cov...


This article is talking about the render quality at the headset's peak PPD. It is not concerned about foveated rendering.


same


I think the main thing he's missing is that Apple isn't aiming for the Quest Pro vision of "Three big desktop OS screens in front of you" and is aiming for people to port iOS apps, where each context/window is separate in the 3d world, dialogs pop out, etc. Applications will natively be aware of how big the text has to be for each situation instead of trying to squeeze the 2007-era design of MS Office onto your eyeballs.

Yeah they have the "project your Mac screen into this space" but that's obviously a bonus they threw in since they knew that there are not going to be enough productivity apps out there on day one.

He's also mistaken about a lot of the text stuff he writes. MacOS X has always ignored font hinting. Macs (and some iOS devices even) will already render on a larger canvas and scale to fit the screen when you choose a different scaling mode.


I'm currently reading this on a 27-inch 2560x1440 monitor [1]. At a typical viewing distance, it's around 35 ppd. This is perfectly fine for me, and I would imagine most other people as well (it has a 4.8-star rating on Amazon). You can see a little bit of pixellation around the edges of text, but I don't really notice unless I'm looking for it.

I don't think many people are going to care that the Apple Vision Pro is "only" 35 to 40 ppd. And those who do would probably be willing to concede that the ability to take as many monitors of whatever size you want with you everywhere is worth the tradeoff.

[1] https://www.amazon.com/P2720DC-27IN-WLED-25X14-HDMI/dp/B082F...


The main difference is that your monitor is stationary so fonts are rendered onto a grid of physical pixels in a fixed position. A monitor in VR is a flat plane that is not aligned with the physical pixels of your headset, so it has to be re-rendered constantly and shifts around on the pixel grid causing sparkles with the slightest head movement.
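A crude way to see the effect: take a row of 1-pixel stripes (roughly what small text strokes are) and resample it half a pixel off the panel grid, which is the situation a floating virtual monitor is in more or less constantly. The numbers are purely illustrative:

    import numpy as np

    src = np.tile([0.0, 1.0], 50)    # 100 virtual-monitor pixels of 1-px black/white stripes

    # Linearly resample at the same pitch but shifted half a pixel -- roughly what
    # happens every frame as the virtual screen drifts across the headset's pixel grid.
    xs = np.arange(len(src) - 1) + 0.5
    resampled = np.interp(xs, np.arange(len(src)), src)

    print(src[:6])          # [0. 1. 0. 1. 0. 1.]       -- full contrast
    print(resampled[:6])    # [0.5 0.5 0.5 0.5 0.5 0.5] -- contrast gone

A desk monitor keeps that pattern locked to its physical grid (and can layer subpixel font rendering on top of it); a headset has to refilter it every frame as the alignment changes, which is where the shimmer comes from.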


That's a lot of words for someone who hasn't even used the hardware.


"I've done the math."


I (and everyone else) have known that text in VR is really hard and technically demanding since I received my first Oculus from the first kickstarter.

Not sure we can judge how well Apple has solved this extremely hard problem until we actually can put our eyeballs in the product. I agree the MQP is not up to the task.

(also, judgements based on pointing a camera through the lens are extremely uninteresting since cameras aren't at all designed for that kind of use case and do not give a good idea of the subjective impression of the quality of the VR, which is ultimately all that matters)


Karl Guttag is awesome, but he is not a "product guy". The ultimate test will be if the market adopts it. All of our technology goes through cycles of higher quality and "worse is better" periods. People don't need perfect, but will the Vision Pro be "good enough" for enough use cases to gain niche adoption?


New products can be worse along most dimensions than the incumbent product except one. And if in that one dimension it's a 10x better solution to a new problem/job-to-be-done, then it is the beachhead from which the new product can stake its claim, become better, and survive. Then it remains to be seen whether this beachhead grows or stays niche.

He may be technically right in all the ways monitors are better than the Vision Pro as a monitor for all the current use cases, but I don't think it's a foregone conclusion that they're not going to be monitor replacements.


Title is misspelled, should be “ridiculous”.


<title>Apple Vision Pro (Part 5A) – Why Monitor Replacement is Ridiculous - KGOnTech</title>

I guess the poster made that mistake.


Indeed, the most ridiculous thing about this article.


It's spelled correctly in the article. Did you even read it?


The person that posted it here made the mistake; my bad and apologies on that part. I read the article, and it's still ridiculous on many levels, number one being that they haven't even used the fricken thing. It was all just speculation via using something "similar" and then extrapolating some meaningful result from it.


Kind of off-topic, but given the Vision Pro's weight—called out by many reviewers—it's worth considering adding magnetic connectors to the top of work-focused headsets. This way, a ceiling cable could bear most of the weight and charge the device. Also, users should be able to disconnect seamlessly. That said, these headsets might be lighter by the time they hit mass market. Additionally, this would open up the possibility of augmenting computational power through the cable, perhaps even facilitating a direct connection to your PC or similar devices.


Those who are looking forward to the virtual-monitor use case usually have mobile use in mind when traveling.


This is certainly some great creative problem solving, but I’m not sure it would be realistic in practice.


Yeah, my opinion is honestly that even though the Vision Pro is impressive on a technical level by virtue of probably being the best the XR industry can offer right now, it's still not nearly sufficient to reach the clarity of a 4K monitor that costs 50x less (if the $3,500 price estimate is at all accurate). It would drive me nuts trying to work with text on such a display, which is a problem based on how much code I write and how much time I spend chatting on platforms like Discord.


I purchased a Vive Pro 2 to try it as a monitor replacement for my CHG90, which is a 49-inch 3840 x 1080 display.

The vive 2 is 2448×2448 per eye, or 4896×4896 total.

It’s completely unusable for the purpose. Text is blurry and unreadable. The lenses are garbage. It’s sitting on a shelf now.

The Vision Pro is probably somewhere in the range of 3,391 x 3,391 per eye, although one guy calculated it to likely be 3660 x 3142.

That’s not that big of an improvement over the pro 2, and I doubt it would actually give enough clarity to use it as a high res display replacement.

I picked up the Pro 2 for $350, and I feel like I may have even overpaid a bit for what I got. Unless the Vision Pro is a quantum leap in display quality over it, it’s going to be a non-starter at the $3,500 price point.


> The vive 2 is 2448×2448 per eye, or 4896×4896 total

Typo there: 4896 x 2448 total

> That’s not that big of an improvement over the pro 2

If it's 3391x, that's almost 40% more pixels per axis than 2448x and almost double the number of pixels per eye. Bit more than "not that big", surely.
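Both framings are arithmetically consistent; they just count different things (and the 3,391 figure is still an estimate, per the comments above):

    vive_pro_2, avp_est = 2448, 3391
    print(avp_est / vive_pro_2)          # ~1.39x the pixels per axis
    print((avp_est / vive_pro_2) ** 2)   # ~1.92x the pixels per eye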


It's like pizza sizing: a large is substantially bigger than a medium, despite the diameter not seeming all that different, because area scales with the square of the diameter.

Coupled with the complexities of lensing within the device, I think people should wait to try Apple's headset before coming to conclusions based on other designs.


It's not like a pizza. You care about linear resolution, not number of total pixels.


Yes it is. Only relevant question is whether there is cheese in that crust. /hungry


Many people are still buying 1080p monitors because they don’t actually care about clarity that much. It’s hard to factor in VR because you need to make many assumptions around viewing angles and whatnot.

Ergonomically, I think they easily beat tablets or laptops when chilling on a couch which more than offsets minor resolution issues. Not that long ago I remember upgrading to 640x480 monitors and they were better than 320x200 but not by nearly as much as you might think. Downgrading in DPI while upgrading in utility is probably well worth it.


Some people don't need 4K, that's fine. It's just that I do, so my opinion is that the Vision Pro isn't enough for me yet.


Possibly, but you say that without any direct experience. There’s a long history of people trading form factor over everything else when it comes to computers.

Tablets < laptops < desktops by almost every single metric except that isn’t representative of actual usage. AR could easily displace all of them.


Oh, I definitely plan on getting a touchscreen 4K when budget permits, for (the option of) a more tablet-like experience (I already use a portable monitor like a laptop in bed). And I actually have switched to my current HMD (2160x2160 per eye) in exactly one extreme case: when I was on LSD and my neck was getting tired, so I wanted to lie down while still being able to look at art (images, not text).

I just, you know, wouldn't ever do that for daily usage or productivity like coding. Instead I'm considering looking for some way to suspend my 4K monitor above me... at least until some HMD can match its sharpness/clarity.

It's kind of ironic, Apple displays are what got me addicted to HiDPI in the first place...


The article confirms my suspicion that it's not going to be great for productivity. Limited FOV, hard to read things, limited/more difficult input methods, and it being tiring to wear for a stretch of time.

It also doesn't seem to be designed for gaming, not for games where you move a lot and require fast and precise input.

The part of the presentation where developers usually show their finest creations was rushed this time and showed low-poly toy apps like we've known from AR apps for a decade now. Sure, you can say it takes time for the ecosystem to develop, but that ignores a big problem.

It doesn't just take time. It's beyond the reach of most app developers entirely. It's extraordinarily expensive and difficult to build a compelling 3D world that is convincing enough for somebody to want this device as well as keep that headset on. It needs to be spectacular and convincing. Because ordinary consumer experiences, say the PS5, already deliver that fidelity and engagement. You need to up that significantly given the cost and discomfort of this device.

It seems to me that only the likes of Disney can pull that off, and not the small app developer. Even for Disney it will be a questionable ROI.

So what remains? A device for solitary passive entertainment? In a world already saturated with digital experiences and content?


Spending a month "on an article to quantify why it is ridiculous to think that a VR headset, even one from Apple, will be a replacement for a physical monitor", without actually using the headset sounds like an insane waste of time. Given other reports by people that have actually used and compared a Vision Pro, I'm willing to bet Karl changes his tune when he finally joins them.


I've been messing around with an Nreal Air, which, although not a true VR/AR display, is a pretty good HMD for consuming pancake content. Most of the time I use it with a Steam Deck, which it's perfect for. I have had to use it for work once due to a screw-up during a desk move that left me without a workstation for a day. The Nreal worked fine, but I had a ton of trouble with software. Even with DeX, Android was still pretty clunky. Windows, Linux, and DeX all leave me with the same hacky feeling that I get when I use a desktop OS on a large TV. Like the scaling feels all wrong, you know?

I just got the Nreal Beam, which allows for more flexible placement of display feeds in the glasses. It helps, but I still find the clipping problem with low-FOV displays crappy.

Apple is probably the best positioned to nail the software side of this, but I think it'll be a couple years before the hardware fully catches up.


Given the price of $3500 against competitors, and the external battery pack, I strongly suspect that Apple's hardware is very different compared to any other headset in the market.

From the limited info in the keynote, it seems they're doing "something" with foveated rendering, i.e., the display is a lot denser where the fovea sees.

From the demos, they clearly have the eye tracking dialed in to precision.

From these, I would make a bet that the headset contains galvo motors to move the optics such that the high-density portion of the display's area are always in alignment with the fovea. Probably not the Retina display kind, but something more like ~100 ppi equivalent - which is certainly usable for work.

Otherwise, what's the point of having denser pixels in one area?

If this is true, I can totally see this being a true monitor replacement. This would also make most of this blog post moot.


I'd love to know how they intend on reducing eye strain. Just using a monitor, eye strain develops after 45 minutes. With VR there's no distance of >20m to look at occasionally, to relax your eye muscles. Perhaps it can trick your eyes into relaxing those muscles.


I'll believe it when I see it. Until then I'm skeptical that enough people will want this until it's sub ~$1,500.

I'd love to be wrong on this, but my gut tells me something's not quite right with this product vertical (notwithstanding Apple's entrance to it).


I think that while the first revision won't be terribly practical as a full monitor replacement, it will probably be the biggest step towards that to date: something you might pull out when you want a canvas unconstrained by physical size limits that's entirely free of the distracting and fiddly scroll/pan/zoom interactions.

It could be pretty incredible for travel, where you're typically stuck with a small 10"-14" screen. For my day to day work in IDEs for example I'd take a virtual ~24" 4k screen over a real 13" one any day.


Hmm. If it is entirely free of scroll interactions, how will I read to the bottom of a typical web page?


That situation sounds like the user is seated, which is not really served well by a large screen.

I’m thinking more whiteboard/pinboard/etc sorts of use cases, where you’re typically standing and can walk around. Think Apple Freeform, Obsidian Canvas, or any other program where you have an infinite canvas (maps, etc). I could definitely see it being nice to be able to throw one of those up on a wall instead of having it boxed into a comparably small monitor.


Walking around your document is scrolling around using your legs instead of mouse finger, right?

And limited by the size of your wall.


I used a 65 inch TV about 2.5 meters away from my eyes as a monitor for my home PC for a couple of years. Trust me when I say that having a huge screen at a longer distance from your eyes gives much less eye strain. It is focusing on things close to your eyes for extended periods of time that causes eye strain

With VR you could basically have a monitor that would feel like watching a valley while sitting atop of a mountain

Unrelated, but the best thing for eye strain when working is to sit close to a window and focus your eyes on distant things every few minutes


Perhaps it could work for some lone developer guy but people don't want this for the same reason they don't want to wear a motorcycle helmet. It looks uncool and it messes up your hair.


Tbh I always laugh when I see someone without a motorcycle helmet. It doesn't look cool to not wear one, it just looks like you need to project to the world how tough you are. Also, wouldn't the wind also mess up your hair?


Ok, so the Apple Vision Pro is even more uncool than motorcycle helmets, because it messes your hair up more than if you didn't use it.


I dunno... This looks pretty cool in a geek-cred way. And, his hair looks fine.

https://twitter.com/BigscreenVR/status/1666617283832324097


It looks pretty cool in that I-just-had-cataract-surgery kind of way.


I've used the xReal Air on a plane. The Air has 1080p screens for each eye, so it's pretty much the worst possible option that kind of works. I would pay 2x what I paid for the Air (~$400) in a heartbeat for a 4k option to have when I travel. But for the kind of price Apple is talking about it would have to be good enough to beat my main workstation setup, not just be better than a tiny laptop screen for travel. And I agree with TFA that it's very unlikely to meet that higher bar.


I'll express my opinions after really trying it myself...


This ignores the fact that people are doing this right now with Quest 2s. They're early adopters, sure, but they're doing it.

I've done it, and for sure the limiting factor was the clumsiness of the interface and the discomfort of the headset after an hour -- not the sharpness of the text.


but i really want 3d vim.

...no seriously. i may have a problem...


How does one exit out of 3d vim?


Its author did recently, but it might be hard to reach him.


Carefully, so as to not accidentally exit the simulation


You can accomplish this with most VR hardware now but it's not very good. The issue is that text is hard to work with in VR (for the complex reasons the article describes). The eyestrain is also incredible.


I've tried working in VR on side projects. It sucks after an hour. Granted, I have a Samsung Odyssey+ from a few years ago.

The PPD talking point is absolutely a problem. If my particular headset had the ability for "many floating windows" arranged in arbitrary places, it might be better (but it doesn't - I think there's a desktop app for Meta Quests which supports this though).

So, aside from the eyestrain due to low PPD, it is legit amazing, and I could potentially have productive workdays in VR with the right system. I also purchased the counterweights which help quite a bit.


why are we restricted to windowed layouts of applications? I think we can get more creative than that. I would like a more fully immersive experience that doesn't try to mimic our constrained current experience. Getting to the pixel level and determining the outcome based on that is missing the forest for the trees.


> why are we restricted to windowed layouts of applications? I think we can get more creative than that.

For the same reason we're "restricted" to pages in books.

> I think we can get more creative than that.

Yup, if we discard utility.


For the price of the AVP I can buy 12 43-inch smart TVs to put on every wall in the house. Real augmented reality.


“which translated 36 to 45 pixels per degree (ppd) from 0.5 meters to 0.8 meters (typical monitor viewing distance), respectively. Factoring in the estimated FOV and resolutions, the Apple Vision Pro is between 35 and 40 ppd or about the same as a 1987 monitor.”

Doesn't ppd have to be combined with distance to make sense at all? You can't compare 40 ppd at 1 meter with 40 ppd at ??? m, can you?


No, pixels per degree are already corrected for distance.


I mean in terms of the physical resolution of the pixel plane, e.g., a low-res calculator far away has more ppd than a 4K monitor at 1".
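To make that concrete with made-up but plausible densities (ppd folds the viewing distance in, which is why the same formula covers both cases):

    import math

    def ppd(ppi, distance_in):
        # pixels per degree at the centre of a flat display viewed head-on
        return ppi * distance_in * math.tan(math.radians(1))

    print(ppd(100, 79))   # ~100 ppi calculator display at ~2 m (79 in): ~138 ppd
    print(ppd(163, 1))    # 27" 4K monitor (~163 ppi) held 1 inch away:  ~2.8 ppd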


Might seem counterintuitive but the productivity gain of having more screen real estate has diminishing returns very quickly beyond one screen and any perceived gains are mostly placebo.


Being able to shape applications to the size I would like for them to be without needing to buy yet another peripheral is greatness, though. If I want a really long code window, I can do that. If I want a really long terminal nearby because I'm grepping a huge codebase, I can do that. If I want to work on a large visual diagram while viewing the code and the terminal, I can view the entire thing and I don't have to worry about switching windows.


Why do you need a long code window? You should always be a couple key strokes away from anywhere in the code base. This makes having to see everything all at once irrelevant. Visually seeing all that code doesn’t increase productivity, jumping to specific lines does.


Maybe you're a magical gnome who has never had to edit anyone else's code before, and all the code you've touched is perfect and no function is longer than a few lines.

Most of us don't get to have that experience, and so we choose our tools accordingly.


To me it’s far more magical to set up your workspace like some panopticon of code and somehow observe everything all at once.


This might be shocking to learn, but different people work differently and what works for you might not be best for everyone else


You mean… Turning your head isn't faster than alt+tabbing? Next you're going to tell me visible, but unnecessary, application windows increase cognitive load! Sheesh, let me bake my face in several hundred watts of infrared heat 12 hours a day. I look so damn productive!


Back in the days of office work, having multiple monitors was mostly a power move, a sort of status symbol like having a corner office with windows. There’s really no need for anything more than a single screen. I don’t even work off anything more than a laptop unless I’m feeling gratuitous.


> Turning your head isn't faster than alt+tabbing?

Turning your head a few degrees, to the location you remember putting one specific window, and which is probably currently in your peripheral vision, is absolutely faster than tabbing through who knows how many windows to get the one you want.


No it’s not. I guarantee you I can jump damn near instantly to any tab or application window I want faster than you could turn your head. It’s like the difference between pushing your mouse a couple inches to carefully click some element, vs just using a shortcut. If people truly want to increase productivity, they would master these keyboard movements.

Also, the more monitors you have to look at, the slower your head movements will get. With a single monitor, your head and your eyes have very minimal movement. Consider how fast your eyes dart around on a small screen like a phone.


Eh, if you only have to switch between a few windows, alt+tab can get pretty fast, especially if you build up the "skill" to estimate how far back a certain window is in the tab list. But looking is definitely easier and less complicated.


> Turning your head ...

And tilting your head. And turning your body unless you want neck strain. Then there's tilting your body...


It is ridiculous how often people misspell 'ridiculous' as 'rediculous'.



