- Download and install Blender 2.71 (http://blender.org/download). On Linux (Ubuntu) I did not even have to install it; I just extracted the tarball and ran the blender binary.
- Go through this two-part ceramic mug tutorial (30-60 minutes): http://youtu.be/y__uzGKmxt8 ... http://youtu.be/ChPle-aiJuA
As someone who does not have graphics training, I was blown away when I did this. Apparently there is this thing called 'path tracing'-based rendering that takes care of accurate lighting, as long as you give it a correct specification of the geometry and materials.
Some interesting videos:
- Octane 2.0 renderer: http://youtu.be/gLyhma-kuAw
- Lightwave: http://youtu.be/TAZIvyAJfeM
- Brigade 3.0: http://youtu.be/BpT6MkCeP7Y
- Alex Roman, the third and the seventh: http://vimeo.com/7809605
Brigade is an effort towards real-time path tracing, and it's predicted that within 2-3 GPU generations such graphics will be possible in games.
BTW, interactive path tracing-based rendering streaming into the browser: https://clara.io/view/1f7bd986-a232-4b42-8737-ce675093faa8/r... You can edit the scene too, like in Blender, if you click "Edit Online."
There are so many great Blender tutorials on YouTube, and Lynda.com has an excellent Blender essential training course.
It's amazing how easy it is to make something that looks amazing.
Who predicts this? Path tracing is fundamentally different from rasterization, and I doubt that GPU manufacturers can transition that fast.
(some ninja edits)
The renderer used by Ikea is V-Ray, the same renderer we have integrated into our online 3D modeling & rendering tool: http://Clara.io :)
Here are two simple IKEA-like furniture scenes; if you click "Edit Online" you can edit them in your browser (the geometry, the materials, and the lighting setup) as well as render them photorealistically via V-Ray:
IKEA has a mobile catalog app which already has a bunch of interactive features like Augmented Reality furniture and a 3D shelf configurator. https://www.youtube.com/watch?v=uaxtLru4-Vw
They do have a tool sort of like this for kitchen design (developed by Configura in Linköping, Sweden). But I want something for the entire home!
Works decently well, enough so that I used it to pick out a TV stand.
I can only wait for a well-integrated 'select the furniture for your own house' app/site/whatever... which makes me wonder if they're considering some of the opportunities presented by VR or, better, AR (such as Meta and others).
AR overlays of how furniture would look in your own home would be quite neat!
Fun and games then ensue when people figure out how to dump the information from IKEA to their 3D printers.
I'm not sure they'd ever do that. They want you in their stores. Their stores are structured so that you have to go through everything and see everything and activate that "nesting instinct."
"Hmm, I want a chair, but that cutting board is really nice... and there's a knife block that matches it. And I guess I'll get some storage containers too. Might as well get lunch while I'm here."
A virtual store could also deliver in that department, though: knife blocks and storage containers could always be situated in the neighbouring department, no matter what the customer was actually looking for. The accessories and decorations in each virtual-store display could also be tailored on a per-customer basis, depending on what IKEA knows about the customer. "Nice table, and I really like the placemats they have used on it..." One can imagine IKEA providing a "buy the lot" option in their payment process.
Usually you just order and then you have to go pick up everything you ordered.
It will take a few years, but an interactive 'show your webcam your house and then have it populate the empty shell with IKEA stuff' feature is likely on its way.
Epic is encouraging all kinds of applications such as architecture simulations and not just video games. I'm interested to see how the engine can be used to do something similar to what Ikea is doing.
Of late, I haven't been in touch. Good to see stuff like this on Hacker News.
Specifically, I wonder if they leverage the original CAD models? And if so, how are they converted to 3D Studio Max, and if the process is automated in any way?
Failing that, I've heard of some artists actually whipping out calipers to take measurements from real-world pieces, but it seems like that method would defeat the purpose in this case.
I'm very curious how they manage the distribution of computation.
When a light ray strikes the ceiling, it can bounce off towards a vase that is on a diffuse table, which scatters the light in all directions. So the calculation for this light ray needs to know the shape and material (BRDF) of every object that interacts with the ray.
Before sending a ray from the camera into the scene, it is unknown which objects will be hit along the way, which, as you can imagine, is a difficult problem to optimize for. The usual solution is to just distribute the entire scene.
On a single computer there is no problem, since the entire scene is usually present in memory. On multiple computers it is more difficult, since you will end up distributing large amounts of data (scenes can be multiple gigabytes).
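To make the point concrete, here is a toy grey-scale path tracer sketch. Everything in it (the sphere scene, the names, the numbers) is a hypothetical illustration, not any real renderer's code; the thing to notice is that `intersect` has to test the ray against every object at every bounce, which is why a naive distributed setup ends up replicating the whole scene:

```python
import math, random

# Toy scene: spheres as (center, radius, albedo, emission). Grey-scale
# radiance only, to keep the sketch short. All values are made up.
SCENE = [
    ((0.0, -100.5, -1.0), 100.0, 0.5, 0.0),  # big diffuse "floor"
    ((0.0, 0.0, -1.0), 0.5, 0.6, 0.0),       # diffuse ball
    ((0.0, 2.0, -1.0), 0.5, 0.0, 5.0),       # emissive "light" overhead
]

def intersect(origin, d):
    """Nearest hit: the ray must be tested against EVERY scene object."""
    best = None
    for center, radius, albedo, emission in SCENE:
        oc = tuple(origin[i] - center[i] for i in range(3))
        b = sum(oc[i] * d[i] for i in range(3))
        c = sum(x * x for x in oc) - radius * radius
        disc = b * b - c
        if disc > 0:
            t = -b - math.sqrt(disc)
            if t > 1e-4 and (best is None or t < best[0]):
                best = (t, center, radius, albedo, emission)
    return best

def random_dir():
    """Uniform random unit vector (rejection sampling in the unit ball)."""
    while True:
        v = tuple(random.uniform(-1, 1) for _ in range(3))
        n2 = sum(x * x for x in v)
        if 0 < n2 <= 1:
            n = math.sqrt(n2)
            return tuple(x / n for x in v)

def trace(origin, d, depth=0):
    if depth >= 4:
        return 0.0
    hit = intersect(origin, d)
    if hit is None:
        return 0.0  # black background
    t, center, radius, albedo, emission = hit
    p = tuple(origin[i] + t * d[i] for i in range(3))
    n = tuple((p[i] - center[i]) / radius for i in range(3))
    # Diffuse bounce: scatter into the hemisphere around the normal.
    # A real renderer would importance-sample the surface's BRDF here.
    nd = random_dir()
    if sum(nd[i] * n[i] for i in range(3)) < 0:
        nd = tuple(-x for x in nd)
    return emission + albedo * trace(p, nd, depth + 1)

random.seed(1)
# Monte Carlo estimate for one pixel: average many camera rays.
samples = [trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)) for _ in range(200)]
radiance = sum(samples) / len(samples)
print(f"estimated pixel radiance: {radiance:.3f}")
```

Because the bounce direction is random, there is no way to know up front which objects a ray will need, so partitioning the scene across machines means constant cross-machine geometry lookups.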
And at least in VFX, everything's generally done lazily, so you only read textures as and when you need them, if they're not cached already. There's a bit of overhead to doing this (locking for a global cache, or duplicate memory for a per-thread cache, which is faster since there's no locking), but it solves the problem very nicely. On top of that, the textures are mipmapped, so for things like diffuse rays you only need to pull in very low-res approximations of the image and point-sample them, instead of, say, 8K images, and this helps a lot too.
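The mip-selection idea above can be sketched in a few lines. This is an illustrative assumption of how a renderer might pick a level from a ray's texture-space footprint (the function name, the footprint numbers, and the level count are all hypothetical, not a real texture-system API):

```python
import math

def mip_level(footprint_texels, num_levels):
    """Pick the mip level whose texel size roughly matches the footprint.

    footprint_texels: how many texels of the base image the ray's
    footprint covers along one axis. Wide, blurry diffuse-bounce rays
    have large footprints and so land on coarse (low-res) levels.
    """
    if footprint_texels <= 1.0:
        return 0  # sharpest level for tight camera rays
    level = int(math.log2(footprint_texels))
    return min(level, num_levels - 1)

# An 8192x8192 texture has 14 mip levels (8192 = 2**13, down to 1x1).
levels = 14
print(mip_level(1.0, levels))    # tight camera ray -> level 0 (full res)
print(mip_level(512.0, levels))  # blurry diffuse bounce -> level 9
print(mip_level(1e6, levels))    # huge footprint -> coarsest level, 13
```

The payoff is exactly what the comment describes: a diffuse bounce never has to page in the 8K base image at all, just a few kilobytes of a coarse level.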
I've been experimenting with Blender and Sculptris lately, and 3D modelling is quite amazing. A wonderful mix of technical and artistic skills. I wonder if IKEA will ever rethink their large super-store model and move towards smaller stores where you virtually walk into and interact with rooms and furniture.
I'm sure once Holodeck technology arrives, all stores will adopt it...
A model of model rendering itself....
By the way, instead of selling home furnishings in different colors, IKEA could just sell an app for people with Google Glass or similar devices that colors a furnishing (only in the image projected onto the retina) in the "bought" color whenever the owner looks at the piece, Emerald City style.