
One thing illustrated in the demos is that you can zoom into detail in the Photosynth images in a way you couldn't with video.

I imagine there could eventually be better interactivity with the underlying 3D model than video could provide. Certain surfaces could be links to more information or to another Photosynth, for example. It reminds me a bit of the VRML demos from the 90s, but without the plugins, and working backwards from photos instead of forwards from models.



