Hacker News | madhawav's comments

Oh yes. This could be a nice interface for providing other inputs we (Plan2Scene) need as well, such as photo-to-room assignments. Then, Plan2Scene can create 3D houses with appropriate materials. Thank you for sharing this link.


This is a very interesting video. The reflection effects look very realistic.


Yeah, it’s cool stuff! Currently, NeRF takes a very long time to optimize and render, but folks are working on improving that.
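As a toy illustration of why rendering is slow: each pixel's color comes from compositing many samples along a camera ray, and in NeRF every sample is a neural-network query. A minimal sketch of the volume-rendering accumulation (purely illustrative, not NeRF's actual code):

```python
import math

# Toy NeRF-style volume rendering along a single ray.
# In a real NeRF, each (density, color) sample comes from an MLP query,
# and there are dozens to hundreds of samples per ray -- hence the cost.
def render_ray(densities, colors, delta):
    transmittance, pixel = 1.0, 0.0
    for sigma, c in zip(densities, colors):
        alpha = 1.0 - math.exp(-sigma * delta)  # opacity of this segment
        pixel += transmittance * alpha * c      # composite front-to-back
        transmittance *= 1.0 - alpha            # light left after segment
    return pixel

# A single nearly-opaque sample dominates the pixel color:
print(render_ray([1000.0], [0.7], 1.0))  # ~0.7
```

Multiply that loop by every pixel in an image and you can see why naive NeRF rendering is expensive.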


Do you think it's possible to decouple lighting effects from the generated 3D scene? Currently, it seems to simply bake in the actual lighting conditions of the environment, which is a limitation in a way. It would be much better if it could provide the RGB color plus other material information (an SVBRDF decomposition). If that were possible, these meshes could be used in other environments.
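To illustrate why this decoupling is hard from images alone, here is a toy Lambertian model (purely illustrative, not tied to any particular method): the camera only observes the product of albedo and shading, so different material/lighting pairs explain the same pixel.

```python
# Toy Lambertian shading: observed intensity = albedo * light * cos(angle).
# Two different (albedo, lighting) pairs yield the same observation, which
# is why single-image material/lighting decomposition is ill-posed without
# extra priors or additional observations.
def observed_intensity(albedo, light_intensity, cos_angle):
    return albedo * light_intensity * cos_angle

dark_wall_bright_light = observed_intensity(0.4, 1.0, 1.0)
light_wall_dim_light = observed_intensity(0.8, 0.5, 1.0)
assert dark_wall_bright_light == light_wall_dim_light  # both 0.4
```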


I think it's definitely possible, though I don’t know if the algorithms exist yet. You would probably need to capture the scene under different lighting conditions, and probably from different angles. But assuming there is some theoretical way to go from photos to a mathematical representation of material characteristics, I’d think we’ll eventually have neural nets that can do the conversion.


There is some prior work that can digitize (vectorize) floorplans: https://github.com/art-programmer/FloorplanTransformation.

It can detect rooms as polygons, doors/windows as line segments, and bounding boxes around objects indicated on floorplans (e.g. cabinets, sinks, etc.).
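For a sense of what such a vectorized output might look like, here is a hypothetical representation (the names and fields are my own for illustration, not the repo's actual API), with rooms as polygons and openings as line segments:

```python
from dataclasses import dataclass

# Hypothetical vectorized-floorplan types; field names are illustrative,
# not the actual FloorplanTransformation output format.
@dataclass
class Room:
    label: str       # e.g. "kitchen"
    polygon: list    # [(x, y), ...] in floorplan pixel coordinates

@dataclass
class Opening:
    kind: str        # "door" or "window"
    segment: tuple   # ((x1, y1), (x2, y2))

def polygon_area(points):
    """Shoelace formula; useful e.g. for ranking rooms by size."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

kitchen = Room("kitchen", [(0, 0), (4, 0), (4, 3), (0, 3)])
print(polygon_area(kitchen.polygon))  # 12.0
```

Once floorplans are in a form like this, downstream tools can extrude walls, place doors, and texture surfaces.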


Thank you.


This is definitely a thought I have considered. It could potentially ease a lot of the hassle related to 3D scanning of houses by real-estate companies. Very often, those scans require expensive equipment, professional expertise to operate that equipment, and considerable time and effort to capture the entire residence. Even then, you will find many holes and floating debris in those scans. With our work, one only needs a sketch of the floorplan and photos taken from the house; Plan2Scene uses that data to infer the appearance of unseen surfaces.


For instance, look at this sample 3D-scanned house: https://matterport.com/en-gb/media/2486. Make sure to switch to the doll-house view from the toolbar. There are many holes in the scan.


Estimating the global camera pose using non-overlapping photos is a challenging problem. This is doable if the photos densely cover the house (e.g. video frames).

However, if the global camera poses are available, we can detect surfaces in photos and project them reasonably well into 3D. In the past, I have used this work to achieve something similar: https://github.com/NVlabs/planercnn
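The projection step can be sketched with a standard pinhole camera model (a generic sketch, not PlaneRCNN's actual code): invert the intrinsics to get a ray through the pixel, scale by depth, then apply the camera-to-world pose.

```python
import numpy as np

# Back-project a pixel (u, v) with known depth into world coordinates,
# given intrinsics K and a camera-to-world pose (R, t). Generic pinhole
# model for illustration only.
def backproject(u, v, depth, K, R, t):
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in camera frame
    cam_point = ray * depth                         # scale by metric depth
    return R @ cam_point + t                        # move to world frame

K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
# The principal-point pixel at depth 2 m lies on the optical axis:
print(backproject(320, 240, 2.0, K, np.eye(3), np.zeros(3)))  # [0. 0. 2.]
```

Doing this for every pixel of a detected planar surface gives a 3D patch that can be merged into the house model.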


Hi, this is because the Rent3D dataset we extend has a "non-commercial use only" term.


How do we get access to it? I filled out the form and haven't heard anything back.


I did get a delivery failure on one of the emails I sent out. Can you submit the form again, just to be sure?

Here is the form: https://docs.google.com/forms/d/e/1FAIpQLSfl4muDFf0qktqtWhj6...

Sorry for the trouble.


I'm the first author of this, happy to answer any questions. It's great to see this on HN.

PS. I'll be graduating this July with my masters.


Congrats!

As far as I can tell it's not directly related to yours, but are you aware of any software that uses, say, Android and a camera (or pre-existing photos) of the inside of a house/office/building to create a 3D map? Matterport seems like it might, but their Android app is not going so well.


Ben Claremont has done a lot of reviews of 3D virtual tour software, see https://youtu.be/uKkQQ0aHRSc

Ben’s reviews are the best I’ve seen, but he is far from the only reviewer in this space.

Unfortunately, most of the software in this space does not actually generate a floor plan, even if it does create a 3D model of the space (like Matterport). There is one program he highlighted that is bundled with a service from the company: you can use it to process your pictures and generate a floor plan, among other things, but some of that work is still done by human beings.

I’m still looking for a good consumer grade program that can do photogrammetry of a space and turn that into a floor plan. I’ve seen attempts to do this with phones and cameras, and the LIDAR scanner that Apple has recently introduced has improved the situation, but IMO it is still not there yet.

But please correct me if I’m wrong. I would love to be wrong.


Did a bit of googling and this came up: https://canvas.io/. But it's not for Android.


Ahh, yes — they used to support the Structure.io scanner that you could strap to an iPad or iPhone. I bought one of those. I could never get it to work.

Edit: the Canvas.io software really wants the latest iPhone or iPad devices with LIDAR scanners built-in. It will supposedly run on lesser hardware, but with greater error in the constructed 3D map. Fine for a sample to get an idea of whether or not it might work for you, but not really enough to actually use for anything useful.


I was reminded of a program called MagicPlan. See https://youtu.be/gEUk7YzfTcc for a video demo of using it to scan an apartment, using an iPhone 12 Pro with the LIDAR sensor.

But IMO, this still isn’t good enough. It’s a huge improvement over the corner mode, which is what you have to use if you aren’t using a device with a LIDAR sensor, but it’s still not enough.

I’m still looking.


Congratulations! Always great to see SFU students in the wild :)


Thank you!

