Building 3D with Ikea (cgsociety.org)
337 points by panarky on Aug 27, 2014 | 61 comments



In case anyone has about 30-60 minutes and is interested in getting a quick glimpse of how easy (I guess meaning free and accessible, at the least) it has become to do such graphics:

- Download and install Blender 2.71 (http://blender.org/download). On Linux (Ubuntu) I did not even have to install it; I just extracted the tarball and ran the blender binary.

- Go through this two part ceramic mug tutorial (30-60 minutes): http://youtu.be/y__uzGKmxt8 ... http://youtu.be/ChPle-aiJuA

As someone who does not have graphics training, I was blown away when I did this. Apparently there is this thing called 'path tracing'-based rendering, which takes care of accurate lighting as long as you give it a correct specification of the geometry and materials.
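
To give a flavour of what that means, here is a toy, diffuse-only path tracer sketch in Python. This is just my own illustration of the core Monte Carlo idea, not how Cycles or V-Ray actually work internally; the hard-coded sphere scene, camera and sample count are all made up:

    # Toy diffuse-only path tracer: a hard-coded scene of spheres, one of which emits light.
    # Real renderers add proper BRDFs, importance sampling, acceleration structures, etc.
    import math, random

    class Sphere:
        def __init__(self, center, radius, color, emission=(0, 0, 0)):
            self.c, self.r, self.color, self.emission = center, radius, color, emission

    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
    def mul(a, s): return (a[0]*s, a[1]*s, a[2]*s)
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    def norm(a):   return mul(a, 1.0 / math.sqrt(dot(a, a)))

    SCENE = [
        Sphere((0, -1000, 5), 999, (0.8, 0.8, 0.8)),             # floor
        Sphere((0, 0.5, 5), 1, (0.7, 0.2, 0.2)),                 # red ball
        Sphere((0, 5, 5), 1.5, (0, 0, 0), emission=(12, 12, 12)) # area light
    ]

    def intersect(origin, direction):
        """Return (closest sphere hit by the ray, distance), or (None, inf)."""
        best, best_t = None, float("inf")
        for s in SCENE:
            oc = sub(origin, s.c)
            b = dot(oc, direction)
            disc = b*b - dot(oc, oc) + s.r*s.r
            if disc < 0:
                continue
            t = -b - math.sqrt(disc)
            if 1e-4 < t < best_t:
                best, best_t = s, t
        return best, best_t

    def radiance(origin, direction, depth=0):
        """Estimate the light arriving along a ray by bouncing it around the scene."""
        hit, t = intersect(origin, direction)
        if hit is None or depth > 4:
            return (0, 0, 0)
        point = add(origin, mul(direction, t))
        normal = norm(sub(point, hit.c))
        # Cosine-weighted diffuse bounce: normal plus a uniform point on the unit sphere.
        while True:
            d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
            if 1e-6 < dot(d, d) <= 1.0:
                break
        d = norm(add(normal, norm(d)))
        bounced = radiance(point, d, depth + 1)
        return tuple(hit.emission[i] + hit.color[i] * bounced[i] for i in range(3))

    # Averaging many random paths per pixel converges to physically based lighting.
    samples = [radiance((0, 1, 0), norm((0, 0, 1))) for _ in range(64)]
    print(tuple(sum(s[i] for s in samples) / len(samples) for i in range(3)))

The whole trick is in radiance(): rays bounce randomly through the scene, and averaging many noisy samples per pixel converges towards plausible lighting. Production renderers do the same thing, just far faster and with much richer materials.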

Some interesting videos:

- Octane 2.0 renderer: http://youtu.be/gLyhma-kuAw

- Lightwave: http://youtu.be/TAZIvyAJfeM

- Brigade 3.0: http://youtu.be/BpT6MkCeP7Y

- Alex Roman, the third and the seventh: http://vimeo.com/7809605

Brigade is an effort towards real-time path tracing, and it's predicted that within 2-3 GPU generations, such graphics would be possible in games.


You know your stuff. :)

BTW, interactive path tracing-based rendering streaming into the browser: https://clara.io/view/1f7bd986-a232-4b42-8737-ce675093faa8/r... You can edit the scene too, like in Blender, if you click "Edit Online."


That's cool, but I think the real in-browser path tracing example using WebGL is even cooler: http://madebyevan.com/webgl-path-tracing/


Blender is an amazing piece of software; its history goes back 20 years. I had never worked with 3D before and was scared away by all of the claims that Blender was too difficult. However, recent versions (2.6 and 2.7) have a completely revamped interface that is much easier to understand. It's also skinnable and scriptable with Python.
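
As a quick taste of the scripting side, here is a minimal sketch using Blender's Python API (bpy). The output path and sample count are arbitrary, and you would run it inside Blender, e.g. with "blender --background --python render_cube.py":

    # Minimal sketch of Blender's Python API: add an object, switch to the Cycles
    # path tracer and render a still. A real scene would set up materials, a camera
    # rig, lights, etc.; the default startup scene already has a camera and lamp.
    import bpy

    # Add a cube at the origin.
    bpy.ops.mesh.primitive_cube_add(location=(0, 0, 0))

    # Use the Cycles path-tracing engine and render a still image.
    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'
    scene.cycles.samples = 128              # more samples -> less noise
    scene.render.filepath = '/tmp/cube.png' # arbitrary output path
    bpy.ops.render.render(write_still=True)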

There are so many great Blender tutorials on YouTube, and Lynda.com has an excellent Blender essential training course.


The person who made the ceramic mug tutorial has a tutorial on making a falling cloth onto a glass bowl ( https://www.youtube.com/watch?v=2zd1AI198I8&list=PLzmyR17f55... ) and making a teddy bear ( https://www.youtube.com/watch?v=LCghBIUZyuM ).

It's remarkable how easy it is to make something that looks amazing.


> Brigade is an effort towards real-time path tracing, and it's predicted that within 2-3 GPU generations, such graphics would be possible in games.

Who predicts this? Path tracing is fundamentally different from rasterization, and I doubt that GPU manufacturers can transition that fast.


I've tried to make a very rough inference based on John Carmack's statement along the lines of "at one order of magnitude improvement on today's GPUs we'll start seeing it in real things, and at two orders of magnitude it'll get competitive in games" (http://youtu.be/P6UKhR0T6cs?t=1h4m30s).
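
(Back-of-the-envelope: if GPU throughput roughly doubles each generation, one order of magnitude is log2(10) ≈ 3.3 doublings, so about three generations, and two orders of magnitude six to seven; bigger per-generation jumps would shorten that. That's the kind of arithmetic behind my guess, nothing official.)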

(some ninja edits)


Nvidia's OptiX is built on top of their general-purpose compute platform, CUDA, and it massively accelerates ray casts. People have been using this for GPU-accelerated path tracers.

http://heart-touching-graphics.blogspot.com/2012/02/bidirect...


Most of a modern GPU die is devoted to shader processing. Compute shaders are the way to utilise this power for alternative rendering methods.


This tut is another good one to go through: http://cgi.tutsplus.com/tutorials/create-and-render-a-still-...


There was more information given on this at Martin's talk at SIGGRAPH as part of V-Ray Days, but I cannot find out if it is online anywhere:

http://siggraph2014.chaosgroup.com/vray-days

The renderer used by Ikea is V-Ray, the same renderer we have integrated into our online 3D modeling & rendering tool: http://Clara.io :)

Here are two simple Ikea-like furniture scenes; if you click "Edit Online" you can edit them in your browser, both the geometry, the materials and the lighting setup, as well as render them photoreal via V-Ray:

https://clara.io/view/1d984b08-9711-4643-ae01-c3e53b174ace

https://clara.io/view/9fb1c2dd-0ff1-465e-bc64-fb8ac2cf7366


Even IKEA doesn't like putting their furniture together ...


I'll never be able to look at an IKEA catalogue the same way ever again.


Same here... it's incredibly well done, but I can't help seeing it as all fake now. Their showrooms are still real at least.


The showrooms are greenscreen/CG as well. Make sure to wear a green shirt next time.


The "real" photographs would have been heavily edited in photoshop anyway.



Maybe even sooner.

IKEA has a mobile catalog app which already has a bunch of interactive features like Augmented Reality furniture and a 3D shelf configurator. https://www.youtube.com/watch?v=uaxtLru4-Vw


There's no refrigerator in this apartment. Also, I really want those chandeliers. That's a beautiful rendering though.


The largest door in the kitchen is the refrigerator door. Kitchen equipment with front panels matching the furniture panels is quite popular (refrigerators can be bought without front panels; panels can then be bought together with the furniture to match it exactly).


So when do they take the obvious step of providing an easy-to-use 3D modeller where customers can model their homes using Ikea furniture?

They do have a tool sort of like this for kitchen design (developed by Configura in Linköping, Sweden). But I want something for the entire home!


You can use the IKEA catalog iOS app to show items in your home using augmented reality. It uses the physical catalog as a reference in the camera view to determine size and viewing angle (there was an option if you don't have the catalog, but I didn't try it to see the alternatives).

Works decently well, enough so that I used it to pick out a TV stand.


There's a very large selection of Ikea models in the SketchUp 3D Warehouse - I'm not sure who keeps things up to date there, but it pretty much fits the bill for what you're asking. It's actually how I designed my study; I ended up rearranging the furniture three times (virtually) before deciding on what I needed to buy, and I'm so glad that rearranging was done on my screen and not on my floor.


Not really having paid much attention to how they did their catalogs in the past, I just kind of presumed they were moving towards 3D instead of 'real' photography.. and I guess my assumption has been proved right (makes sense - it's a lot more flexible).

I can only wait for a well integrated 'select the furniture for your own house app/site/whatever'.. which.. makes me wonder if they're considering some of the opportunities presented by VR or - better - AR (such as Meta and others).

AR overlays of how furniture would look in your own home would be quite neat!


They actually provide a (somewhat limited) AR furniture placement iOS app as a supplement to the 2014 catalogue:

http://www.ikea.com/ca/en/about_ikea/newsitem/2014catalogue http://www.youtube.com/watch?v=vDNzTasuYEw


They actually have AR overlays already. What you do is lay the catalog down on the floor, and the iPad app uses it as a size reference to generate the overlay. It's a little finicky in practice but still pretty impressive.


Interesting! I did not know that. Thanks for the heads up, I shall investigate further :)


Or the other obvious step (once 3D goggles mature) of replacing the physical shop with an online shop. Or combine the two: instead of you visiting IKEA's virtual shop, the virtual items are sent to you and rendered in 3D and overlayed on your existing room, so you can walk around the item and see what it looks like in your own room. You then press the "buy" button and it is delivered.

Fun and games then ensue when people figure out how to dump the information from IKEA to their 3D printers.


> Or the other obvious step (once 3D goggles mature) of replacing the physical shop with an online shop.

I'm not sure they'd ever do that. They want you in their stores. Their stores are structured so that you have to go through everything and see everything and activate that "nesting instinct."

"Hmm, I want a chair, but that cutting board is really nice... and there's a knife block that matches it. And I guess I'll get some storage containers too. Might as well get lunch while I'm here."


True.

A virtual store could also deliver in that department though, in that knife blocks and storage containers could always be situated in the neighbouring department, no matter what the customer was actually looking for. The accessories and decorations in each in-virtual-store display could also be tailored on a per customer basis, depending on what Ikea knows about the customer. "Nice table, and I really like the placemats they have used on it...". One can imagine Ikea providing a "buy the lot" option in their payment process.


They have an online store in several countries. I've heard from people who work there that they're not expanding the trial because shipping doesn't work with their slim margins (paying for one person returning a sofa ruins the profitability).


Then they wouldn't have an online store either, and they do. I think an online 3D shopping experiment would work for them. Otherwise the competition will go that way and they would be leapfrogged.


They don't have an online store in most countries though, do they?

Usually you just order and then you have to go pick up everything you ordered.


But you pick it up on the ground floor without having to walk through their carefully arranged showroom.


They are adding new ones:

http://www.ikea.com/ms/en_US/rooms_ideas/splashplanners_new....

An interactive 'show your webcam your house and have it populate the empty shell with Ikea stuff' feature is likely still a few years away, though.


Check out the Autodesk HomeStyler app: http://www.homestyler.com/mobile


Here are some realistic renders from Unreal Engine 4: https://www.youtube.com/watch?v=iO7Bp4wmd_s

Epic is encouraging all kinds of applications such as architecture simulations and not just video games. I'm interested to see how the engine can be used to do something similar to what Ikea is doing.


I wonder how active the 3D-CG scene is these days. In the mid-2000s there was so much activity on CGsociety (then cgtalk.com). The kind of work people posted there was just out of this world. Absolutely impressive attention to detail. I was an enthusiast too so I would visit the site many times a day.

Of late, I haven't been in touch. Good to see stuff like this on Hacker News.


I wish they had given a bit more information about the actual workflow.

Specifically, I wonder if they leverage the original CAD models. And if so, how are they converted for 3D Studio Max, and is the process automated in any way?


I went to the Ikea Vray talk at Siggraph. They are reusing CAD models but the key to photorealistic rendering is not the model but the materials. They use a capture and calibration process to feed textures into their VRay based shaders in 3DS Max.


Awesome! Thanks!


Hey - 3D artist here. When you're working with CAD/SolidWorks drawings, you can import them directly into 3ds Max as paths (2D shapes) and work with them in the viewport with accurate dimensions. I would be very surprised if that wasn't the typical Ikea workflow.

Failing that, I've heard of some artists actually whipping out calipers to take measurements from real-world pieces, but it seems like that method would defeat the purpose in this case.


> We use every computer in the building to give power to rendering as soon as they are not being used. As soon as someone goes to a meeting their computer-power is used, and of course there is overnight when people go home.

I'm very curious how they manage the distribution of computation.


Path tracing is highly parallelizable: a ray doesn't need to know anything about its neighbors to be traced (though there are integrators that give better results if more information is available). In practice, each process just gets assigned a part of the picture and calculates it until it's done, then moves on to the next part.
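
A minimal sketch of that "assign each worker a part of the picture" idea, using Python's multiprocessing; the frame size, tile size and render_tile stand-in are made up, and real farms distribute across machines rather than local processes:

    # Tile-based work distribution for a renderer: the frame is cut into tiles and
    # each worker renders its tiles independently, since rays don't need to talk
    # to each other. render_tile() is a stand-in for the actual per-pixel tracing.
    from multiprocessing import Pool

    WIDTH, HEIGHT, TILE = 1920, 1080, 64

    def render_tile(tile_origin):
        x0, y0 = tile_origin
        # In a real renderer this loop would trace many rays per pixel.
        pixels = [(x, y, 0.5) for y in range(y0, min(y0 + TILE, HEIGHT))
                              for x in range(x0, min(x0 + TILE, WIDTH))]
        return tile_origin, pixels

    if __name__ == "__main__":
        tiles = [(x, y) for y in range(0, HEIGHT, TILE) for x in range(0, WIDTH, TILE)]
        with Pool() as pool:
            for (x0, y0), pixels in pool.imap_unordered(render_tile, tiles):
                # Stitch the finished tile back into the frame buffer here.
                print("tile", (x0, y0), "done,", len(pixels), "pixels")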


This is only partly true. Path tracing is an embarrassingly parallel problem but only under the assumption that the entire scene description can be accessed.

When a light ray strikes the ceiling it can bounce off towards a vase that is on a diffuse table which scatters the light in all directions. So the calculation for this light ray needs to know the shape and material (BRDF) of all the objects that interact with the ray.

Before sending a ray from the camera into the scene it is unknown which objects are going to be hit along the way, which, as you can imagine, is a difficult problem to optimize for. The usual solution is to just distribute the entire scene.

On a single computer there is no problem; the entire scene is usually present in memory. On multiple computers it is more difficult, since you will end up distributing large amounts of data (scenes can be multiple gigabytes).


It's really just a bandwidth issue - VFX studios do this all the time with their render farms. Textures are the main issue: product/archviz work like the Ikea stuff is generally really clean and doesn't have THAT many textures, whereas in VFX everything's dirty and generally very detailed, so you're typically pulling in >300GB of textures per medium-level scene.

And at least in VFX everything's generally done lazily, so you only read textures as and when you need them if they're not cached already. There's a bit of overhead to doing this (locking if it's a global cache, or duplicate memory if it's a per-thread cache, which is faster as there's no locking), but it solves the problem very nicely. On top of that, the textures are mipmapped, so for things like diffuse rays you only need to pull in very low-res approximations of the image instead of, say, 8K images and point-sampling them, which helps a lot too.
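
A rough sketch of that lazy, mip-aware cache idea; the loader callback and the ray-footprint heuristic are invented for illustration (production pipelines typically lean on libraries like OpenImageIO for this):

    # Lazy, per-mip-level texture cache behind a global lock: a texture level is only
    # read the first time some ray actually needs it, and wide/diffuse rays request
    # low-resolution mip levels so the big 8K originals are rarely touched.
    import threading

    class TextureCache:
        def __init__(self, load_level):
            self._load_level = load_level  # callable(path, level) -> image data (assumed)
            self._cache = {}
            self._lock = threading.Lock()  # a per-thread cache would avoid this lock

    def lookup(self, path, ray_footprint):
            # Crude heuristic: the wider the ray footprint, the lower-resolution the level.
            level = min(8, max(0, int(ray_footprint * 8)))
            key = (path, level)
            with self._lock:
                if key not in self._cache:
                    self._cache[key] = self._load_level(path, level)
                return self._cache[key]

    # Usage with a dummy loader: only the requested (path, mip level) pairs get "read".
    cache = TextureCache(lambda path, level: f"{path}@mip{level}")
    print(cache.lookup("wood_diffuse.tx", 0.05))  # sharp camera ray -> high-res level
    print(cache.lookup("wood_diffuse.tx", 0.9))   # diffuse bounce -> low-res level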


It would be great if Ikea sold some of these model libraries for use in videos etc.


Or you'd think they could give them away for free. It would be free product placement in thousands of games and movies, no?


You'd get a bunch of polygons and an Allen wrench :)


Ållen


'Insex' actually. But you were pretty close.


The entire concept is very interesting and is a logical extension of the product catalog business (think about the impact of 3D and CG on movies, architecture, etc.).

I've been experimenting with Blender and Sculptris lately, and 3D modelling is quite amazing. A wonderful mix of technical and artistic skills. I wonder if IKEA will ever rethink their large super-store model and move towards smaller stores where you virtually walk into and interact with rooms and furniture.


> move towards smaller stores where you virtually walk into and interact with rooms and furniture

I'm sure once Holodeck technology arrives, all stores will adopt it...


A lot of my furniture is from Ikea: It would be great to have this software for seeing how new room layouts look!


I was hoping this was about an app to build 3D things by mixing and matching Ikea parts. I know there's at least one community around that idea [1] (and I've done it myself :).

1: http://www.ikeahackers.net/


I am amazed at how this could be cheaper and faster than actually doing real-life photos. The scenery and lighting quality is amazing though. Can't do that in a warehouse full of Ikea products and fake housings either.


I'm also amazed. But if you've ever been on a photo-shoot set, it becomes a bit more believable. Sometimes the simplest shots can take forever to get right. Not to mention the number of people required - photographers, grips, directors, gofers, etc. Factor in the point they made about shipping all the physical stuff to a central location, plus all the different room setups required (e.g. American vs German vs Japanese kitchens), and it starts to make more sense.


The photograph in the office with people - are these real people in the office, or rendered models too?

http://www.cgsociety.org/static/images/feature/ikea-onComp.j...

A model of model rendering itself....

By the way, instead of selling home furnishings in different colors, for people with Google Glass or similar devices IKEA could just sell an app that colors a furnishing (only in the image projected onto the retina) in the "bought" color whenever the owner looks at the piece, Emerald City style.


This is becoming pervasive in the industry: http://www.wired.com/2013/03/luxion-keyshot/


They should use Sketchfab for that. It is lighter and better suited for a basic user to view the 3D model.


I am Jack's 3D hatred.




