

Trimensional: 3D Scanner for iPhone - GrantS
http://www.trimensional.com/

======
motters
This is a shape from shading approach, which will only work if you're in the
dark, with the phone as the only illumination source. If white dots are shown
in different places on the screen, and assuming the phone and subject don't
move during the process, surface normals can be computed from the resulting
images, and once you have the normals then the shape can be approximated.
Normals can be found by calculating the angle of maximum reflectance for each
pixel for a series of images under different illuminations.

See this Google video for a similar technique.
<http://www.youtube.com/watch?v=rxNg-tXPPWc>

~~~
dy
What are good resources for looking into converting photos into 3d object
models? I'm really impressed by Shapeways but would like to take current real-
world objects as a base rather than building them in a 3D tool etc.

I've seen the laser scanners but they're relatively expensive. Is there
anything like a mount that takes two iPhones, plus software that can combine
the two photos to create at least a projection?

~~~
greendestiny
There are a lot of approaches. Yes, there is software that takes two photos
and lets you reconstruct 3D. If you have a man-made object you probably don't
want a generic point-cloud-building approach (like, say,
<http://www.photosculpt.net/> ) - but rather something like Photomodeler that
lets you select vertices and build up surfaces.

There is also software that builds models with silhouette methods; these
usually require you to print out targets on which you place the model, sort of
like a manual turntable approach. I can't remember the name of the software
that does this at the moment, but it's out there.

Or you could work from video (just from a single camera) that you move around.
That can be effective; I'm not sure of the best software for doing this. It's
called structure from motion, and a quick Google shows some source at this
project site: <http://phototour.cs.washington.edu/>

There's some code out there for a do-it-yourself laser scanner - this isn't
what I was thinking of, but it seems kind of cool:
<http://www.youtube.com/watch?v=xK5eYhpBtQc>

I've also seen some do-it-yourself structured-light software (i.e. bring your
own projector and camera) that seems to work OK.

It kind of depends on what size and type of objects you want to scan - things
like the surface properties could be important - also how long you can keep it
still.

~~~
dy
Thanks for the information, it was very helpful.

------
antirez
Very cool and very uncool.

Cool: that it works in a decent way, and uses a neat trick.

Uncool: you can't export the result as a 3D file, making it 99% less useful
than it would be otherwise.

Suggestion: export it as VRML; it's a trivial format that you can generate
starting from your points. Use this snippet (from my own code, so use it as
you wish):

    
    
        fprintf(fp,
        "#VRML V2.0 utf8\n"
        "Shape {\n"
        "    appearance Appearance {\n"
        "        material Material {\n"
        "            diffuseColor .5 .5 .5\n"
        "        }\n"
        "    }\n"
        "    geometry ElevationGrid {\n"
        "        xDimension  %d\n"
        "        zDimension  %d\n"
        "        xSpacing 0.01\n"
        "        zSpacing 0.01\n"
        "        solid FALSE\n"
        "        creaseAngle 6.28\n"
        "        height [\n", hgt->width, hgt->height
        );
    
        for (y = 0; y < hgt->height; y++) {
            for (x = 0; x < hgt->width; x++) {
                h = getheight(x,y);
                h *= YOUR_3D_MULT_FACTOR;
                fprintf(fp, "%f", h);
                /* comma after every value except the very last one */
                if (y != hgt->height-1 || x != hgt->width-1) fprintf(fp,",");
                /* newline at the end of each row */
                if (x == hgt->width-1) fprintf(fp,"\n");
            }
        }
    
        fprintf(fp,
        "        ]\n"
        "    }\n"
        "}\n"
        );
    
    

3d studio and other programs will happily import this stuff.

~~~
zach
Wow, there's been a VRML sighting! As an old dude, VRML reminds me of the
worst of 90s web hype. Trying to leave that aside, I also don't think it has a
lot going for it to keep it around as a format.

I prefer the even simpler Wavefront OBJ format as a scratch format. It's about
as printf-compatible as it gets and is supported in many more places than
VRML.

<http://en.wikipedia.org/wiki/Wavefront_.obj_file>

~~~
antirez
Good point, but unfortunately Wavefront .obj has nothing like the "elevation
grid" object, so it's a bit more complex than this - but still not too hard.

Btw, while VRML as a plugin and the idea of a 3D web sound very 90s now, the
data format itself is not too bad.
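
A sketch of that "not too hard" part - emitting a height field as OBJ vertices
plus two triangles per grid cell (my own illustration; function and parameter
names are assumed):

```c
/* Write a w x h height field as a Wavefront OBJ: one vertex per sample,
   two triangular faces per grid cell. OBJ indexes vertices from 1. */
#include <stdio.h>

void write_obj_heightfield(FILE *fp, const float *height, int w, int h,
                           float spacing) {
    /* Vertices, row-major: x/z on the grid, y from the height map. */
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            fprintf(fp, "v %f %f %f\n",
                    x * spacing, height[y*w + x], y * spacing);
    /* Faces: split each cell into two triangles. */
    for (int y = 0; y < h-1; y++) {
        for (int x = 0; x < w-1; x++) {
            int a = y*w + x + 1;   /* +1 for OBJ's 1-based indexing */
            int b = a + 1;
            int c = a + w;
            int d = c + 1;
            fprintf(fp, "f %d %d %d\n", a, c, b);
            fprintf(fp, "f %d %d %d\n", b, c, d);
        }
    }
}
```

A w x h grid produces w*h `v` lines and 2*(w-1)*(h-1) `f` lines.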

------
tobtoh
Cool app. But I have a pet peeve with information pages that only provide a
video as a description. While I understand that, especially for an app like
this, the best 'information' is to demonstrate how it works via a video, it
doesn't help people who a) just want a quick one-line explanation of what your
app does, and/or b) can't watch the video at the time (slow connection, work
restrictions, etc.).

If I hadn't been at home, I would have just left the page and moved on to
something else - missed sale.

~~~
GrantS
Thanks for the feedback. I was expecting to have this weekend to pull
everything together but Apple approved the app in 2 days (which is great, but
5x faster than for my previous submissions).

~~~
closure
The first question that comes to mind for me is: Is it possible to export the
generated mesh that is shown in the stills?

~~~
GrantS
Good question. Version 1.0 only saves/emails images, but trading 3D scans and
exporting the raw mesh is certainly on the feature list for future versions.

~~~
magicseth
Might I suggest using the Bump API to share scans?

------
magicseth
I tried it this morning in the pitch black. It worked pretty well. The effect
was mostly comical with some distortion. The most interesting aspect is how
people are taking all these things that at first pass would be considered
"impossible," applying some ingenuity and hard work to them, and pushing the
limits of this technology.

------
lliiffee
Looks like it just uses the reflected intensity to estimate the depth, then
pastes the original colormap on top of that? Incredibly clever and simple
hack.

~~~
iwwr
Can the iphone modulate the intensity of its light via software?

~~~
lliiffee
It looks like it just displays a completely blank white screen while it takes
a picture. Sure, you can change the intensity - use a gray pixel value instead
of a white one.

~~~
anoved
Actually, it briefly displays a sequence of four illumination conditions - a
white semicircle at the top, right, bottom, and left edges of the screen.

------
ericb
The fact that this only works in the dark made me wonder, how much of what the
Kinect does could be possible on an iPhone, and what would be needed to get
there? 2 Cameras? What else?

~~~
sbierwagen
If your subject doesn't move, two cameras can be approximated by taking two
pictures a set distance apart.[1]

Also note that the method used by this app is very short ranged, and uses the
LCD's backlight while it's imaging, so there's zero feedback, making any kind
of interactive application impossible.

1: There are third party accessories to make this easy and repeatable, built
with varying levels of quality. Here's a fairly cheap one, circa 2003:
<http://www.dansdata.com/photo3d.htm>
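
The two-pictures trick reduces to the standard rectified-stereo relation,
depth = focal length x baseline / disparity. A tiny sketch with made-up
numbers (not iPhone specs):

```c
/* Rectified two-view stereo in one line: depth = f * B / d, with focal
   length f in pixels, baseline B in meters, disparity d in pixels. */
double stereo_depth_m(double focal_px, double baseline_m, double disparity_px) {
    return focal_px * baseline_m / disparity_px;
}
/* e.g. f = 800 px, B = 0.07 m, d = 20 px  ->  the feature is 2.8 m away. */
```

The catch, as noted, is that both shots need a known, fixed baseline and a
subject that holds still between them.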

~~~
anoved
Or what about using the built-in motion sensors to record the relative camera
location and orientation for a series of frames captured with, say, an iPhone
camera? I don't know exactly how the accelerometers and gyros work, or what
sort of data they provide (linear distance vs. just orientation changes?), but
imagine holding down a "scan" button as you simply swing the phone around a
subject to capture a series of images. I would think it would be possible to
reconstruct 3D surfaces (at least under suitable illumination conditions, I
guess) given a known camera location/orientation for each frame. Pushbroom
stereo, in remote-sensing parlance...

~~~
m_eiman
Tracking movement in 3D using dead reckoning is apparently very inaccurate;
with the iPhone's sensors I wouldn't expect it to stay accurate for more than
a few seconds at best. I visited a startup working on the problem a few years
ago, and they had trouble even with dedicated hardware.
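
That inaccuracy is easy to see on paper: a constant accelerometer bias b,
integrated twice, drifts by 0.5*b*t^2. A sketch with an assumed illustrative
bias (not a measured iPhone figure):

```c
/* Dead-reckoning drift from a constant accelerometer bias:
   integrating a(t) = b twice gives a position error of 0.5 * b * t^2. */
double drift_m(double bias_mps2, double seconds) {
    return 0.5 * bias_mps2 * seconds * seconds;
}
/* e.g. a modest 0.01 m/s^2 bias puts you half a meter off after just
   10 seconds - hopeless for centimeter-scale scanning. */
```

The quadratic growth is why even brief scans would need drift correction from
the images themselves, not the sensors alone.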

------
Samuel_Michon
HTTP Error 503. Video can be found here:

<http://www.youtube.com/watch?v=IEZtiDrxh-E>

------
jfeldstein2
Can't wait until someone pipes this into a reprap.

------
nickpinkston
Is this related to the webcam 3D scanner video that was in the news about a
year ago? It'd be interesting to check out.

------
kirpekar
What can one use this for?

