
Is it _trying_ to simulate a 4k display? Or is it trying to provide something different- a large panorama of 1920x1920 windows?



Neither really. The Vision Pro uses a pair of 4K-class displays and renders a high-DPI region where it senses your eyes focusing. This region sits within a much wider virtual display space (potentially 360°). As your eyes and head move, it shifts the image through that high-resolution area; the space outside it is rendered at lower resolution to match your peripheral vision. This technique is called foveated rendering.
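
The idea can be sketched in a few lines: pick a shading rate per screen tile based on its angular distance from the tracked gaze point. The thresholds and tile API here are hypothetical illustrations, not Apple's actual pipeline.

```python
import math

def shading_rate(tile_center, gaze_point, fovea_deg=5.0, mid_deg=15.0):
    """Pick a shading rate for a screen tile from its angular distance
    (eccentricity, in degrees) to the tracked gaze point.

    The thresholds are made-up illustrative values; a real headset tunes
    these per eye and per frame. Lower return value = finer shading.
    """
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    eccentricity = math.hypot(dx, dy)
    if eccentricity <= fovea_deg:
        return 1   # full resolution: shade every pixel
    elif eccentricity <= mid_deg:
        return 2   # half resolution: shade 1 pixel per 2x2 block
    else:
        return 4   # quarter resolution for the far periphery

# A tile under the gaze gets full detail; a peripheral one does not.
print(shading_rate((0.0, 0.0), (0.0, 0.0)))    # → 1
print(shading_rate((30.0, 10.0), (0.0, 0.0)))  # → 4
```

Note the display hardware never changes: the renderer just spends less work on tiles the fovea can't resolve, which is why the effect reads as seamless.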

People who have tried the demos say it is a seamless experience: as you move your eyes and head around, whatever you look at appears rendered at full 4K resolution, while the space around you seems unbounded.


This is wrong. The eye tracking is only there to reduce computational demands; it doesn't change the DPI of the screen *at all*. It's purely an optimization trick.

I don't know why people keep repeating this myth. It's just not how it works. It's not moving around pixels in the physical display.





