
Taichi – Physically based Computer Graphics Library - vt
http://taichi.graphics/
======
yuanminghu
Hi, I'm the author of Taichi. Thank you so much for your attention to this
project.

As you may have noticed, Taichi is still a work in progress and far from
perfect. I had planned to bring this project to you after a stable
interface and comprehensive documentation were finished. I didn't intend to
"promote" it today, but for some reason it has attracted some attention. I was
so surprised, and my website taichi.graphics went down for the first time due
to the increased traffic, so I upgraded the server on GoDaddy...

About the Chinese name 太極, as noted in the FAQ section at
[http://taichi.graphics/misc/](http://taichi.graphics/misc/), I chose this
word in the intersection of traditional Chinese (which is still used in some
parts of China nowadays) and modern Japanese, since a significant part of
Taichi was done during my internship at the University of Tokyo.

GPU support is absolutely meaningful, especially for the simulation part (like
the snow and smoke). Given more time (and access to GPUs), I would
definitely try GPUs someday. As for the rendering part, CPUs are still
widely used in the industry, though there are renderers with GPU support, like
LuxRender. Another consideration is that CPUs are friendlier to programmers
and researchers for prototyping.

I'm currently doing a full-time internship at Microsoft Research Asia and have
very limited time for this project. If you would like to contribute, that
would be truly appreciated.

Thank you again for your support and encouragement.

~~~
mistercow
> If I'm given more time (and access to GPUs)

How much of that can you do in the cloud? IIRC there are some pretty beefy
GPUs on EC2, for example.

~~~
liuliu
Someone has to pay for that. This is an open-source project.

~~~
mistercow
Of course. I was just suggesting that it might be less expensive than some of
the alternatives.

------
david-given
Hmm. I'm looking for a procedural renderer with C++ integration for my moon
rendering project ([http://cowlark.com/flooded-moon/](http://cowlark.com/flooded-moon/)).
Right now I'm using a customised version of Povray, but it's not very
satisfactory.

The trouble is, I need to render planets, which are very very big objects,
typically seen from very very close up; Povray's the only renderer I've ever
found which could cope, and even then it needs clever tricks to avoid floating
point errors. I'd like to switch to something else.

I do see that Taichi is based on Embree
([https://embree.github.io](https://embree.github.io)) for doing the raytracing heavy
lifting, which I did briefly look at; looking at it again I see there's a
standalone renderer now
([https://embree.github.io/renderer.html](https://embree.github.io/renderer.html)).

I'll certainly take a look. I wonder if Taichi does Rayleigh scattering? I
need it to do atmospheres...
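(For context, and not specific to Taichi: Rayleigh scattering has a λ⁻⁴ intensity dependence and a simple closed-form phase function, which is what an atmosphere model ultimately needs to evaluate per scattering event. A minimal illustrative sketch:)

```python
import math

# Rayleigh phase function: p(theta) = 3/(16*pi) * (1 + cos^2(theta)),
# normalized so it integrates to 1 over the sphere.
def rayleigh_phase(theta):
    return 3.0 / (16.0 * math.pi) * (1.0 + math.cos(theta) ** 2)

# The lambda^-4 dependence is why blue light dominates scattered skylight:
ratio = (650.0 / 450.0) ** 4  # red (650 nm) vs blue (450 nm)
print(f"blue scatters ~{ratio:.1f}x more strongly than red")
```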

~~~
radarsat1
I would imagine you could benefit from some thought about how to handle
your coordinate system to avoid floating-point precision problems. There are
ways to avoid issues caused by huge ratios in numerical precision. It should
only be a problem if you are simultaneously visualizing very large and very
small objects, and even then there are other methods (e.g. pre-rendering small
objects that won't change much with viewing angle). If you blindly throw huge
numerical ratios at your renderer, you are bound to have problems.

It's not ideal; of course it would be better to ignore such issues and just
find a system that can "handle it", but dealing with numerical issues and the
need for "clever tricks" is simply a reality of working with numerical
approximations of reality.
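(A quick way to see the precision problem, independent of any renderer: the spacing between adjacent representable floats grows with magnitude, so at planetary scales a 32-bit float, common inside renderers, only resolves about a metre.)

```python
import math
import numpy as np

# Spacing (ulp) between adjacent representable floats grows with magnitude.
# At ~1e7 (planetary radii in metres), float64 still resolves nanometres,
# but float32 only resolves about one metre.
for x in (1.0, 1e3, 1e7):
    print(f"x = {x:>10}: float64 ulp = {math.ulp(x):.1e}, "
          f"float32 ulp = {np.spacing(np.float32(x)):.1e}")
```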

~~~
david-given
Well, I _am_ rendering very large and very small objects, many of which
intersect. Planets are modelled as a single huge object (earlier versions of
the project used a Povray isosurface, but I'm now generating a mesh for the
area around the camera). That's pretty much unavoidable based on the goals of
the project.

What I'm currently doing is arranging things so that the camera's at (0, 0, 0)
in world space, which means that precision loss caused by arithmetic is
minimised. That solved the problem completely for me, by which I mean that I
haven't noticed any more artifacting. But it does mean that I have to be aware
of where each object is going to be in space rather than just creating them
using local coordinates and translating them later.
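(The camera-at-origin trick can be sketched generically: keep world positions in double precision and subtract the camera position *before* casting down to the renderer's working precision. The function name and shapes below are illustrative, not from any particular renderer.)

```python
import numpy as np

def recenter_on_camera(vertices_world, camera_pos):
    """Subtract the camera position in float64, then cast to float32,
    so the renderer only sees camera-relative coordinates near the origin."""
    v = np.asarray(vertices_world, dtype=np.float64)
    cam = np.asarray(camera_pos, dtype=np.float64)
    return (v - cam).astype(np.float32)

# A vertex 1 cm in front of a camera 6,371 km from the world origin:
cam = np.array([6.371e6, 0.0, 0.0])
vert = cam + np.array([0.01, 0.0, 0.0])
naive = np.float32(vert[0]) - np.float32(cam[0])  # cast first: the 1 cm is lost
relative = recenter_on_camera([vert], cam)[0, 0]  # subtract first: ~0.01 survives
print(naive, relative)
```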

Some interestingly broken pictures here:

[http://stackoverflow.com/questions/20307441/is-this-a-povray-precision-loss-artifact](http://stackoverflow.com/questions/20307441/is-this-a-povray-precision-loss-artifact)

~~~
radarsat1
Great, sounds like you found a solution then ;)

I guess my point was more that you can't really expect another renderer to do
much better, as it's a problem with what you were doing rather than a
problem with the renderer per se. Making things relative to the camera
coordinates probably makes sense overall. Still, if you run into more
trouble, you could also try rendering small and large objects separately and
compositing them afterwards. You'd still have to mess around with coordinate
systems, of course, but separating the large and small objects might make
that easier. (E.g. make the large objects smaller and the camera closer, and
render; then make the small objects bigger with the camera at the same
relative distance, and overlay the two images. The point is to force the
vector coordinates into ranges that are tractable for the rendering
algorithms. Annoying, but possibly necessary in your case.)

Nice pictures btw ;)
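(The final overlay step of that two-pass idea is just standard "over" compositing: render the near pass with an alpha channel and lay it over the far pass. A minimal numpy sketch, with an illustrative function name:)

```python
import numpy as np

def composite_over(near_rgba, far_rgb):
    """Standard 'over' operator: near_rgba is (H, W, 4), far_rgb is (H, W, 3),
    all values floats in [0, 1]. The near pass covers the far pass where
    its alpha is 1 and lets it show through where alpha is 0."""
    alpha = near_rgba[..., 3:4]
    return near_rgba[..., :3] * alpha + far_rgb * (1.0 - alpha)

far = np.zeros((2, 2, 3)); far[..., 2] = 1.0            # far pass: blue everywhere
near = np.zeros((2, 2, 4)); near[0, 0] = [1, 0, 0, 1]   # near pass: one opaque red pixel
out = composite_over(near, far)
```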

------
imaginenore
Looking at the code samples, it's quite similar to Blender, except, I assume,
it compiles to a standalone app. Very impressive stuff!

Needs GPU support though, and that took Blender a long time to cover even the
basic material properties.

~~~
slezyr
Blender is a GUI on top of a render engine, and there are multiple rendering
engines[1], including Mitsuba[2], a physically based renderer with GPU support.

[1] [https://www.blenderguru.com/articles/render-engine-comparison-cycles-vs-giants/](https://www.blenderguru.com/articles/render-engine-comparison-cycles-vs-giants/)

[2] [http://www.mitsuba-renderer.org/](http://www.mitsuba-renderer.org/)

~~~
slezyr
It seems the more popular LuxRender works in a similar way.

------
Udo
The site is clearly a work in progress. This looks potentially interesting,
but it's difficult to judge without documentation.

~~~
anewhnaccount
Looks like this might be a good entry point:
[https://github.com/IteratorAdvance/taichi/tree/master/python...](https://github.com/IteratorAdvance/taichi/tree/master/python/examples/rendering)

------
WhitneyLand
Beautiful stuff. I wonder why there's no GPU support.

------
chairmanwow
Interesting choice to use the traditional characters over simplified.

~~~
homarp
He explains why on his site: (
[http://taichi.graphics/misc/](http://taichi.graphics/misc/) ) and in another
comment (
[https://news.ycombinator.com/item?id=13327362](https://news.ycombinator.com/item?id=13327362)
)

> Then why “太極” instead of “太极”?
>
> “太極” is a word from not only traditional Chinese, but also modern Japanese. Considering the fact that a significant part of Taichi (10+ papers and the general software framework) was done during my internship at The University of Tokyo hosted by Prof. Seiichi Koshizuka and Prof. Toshiya Hachisuka, I thereby chose this word in the intersection of two languages in memory of my time at UTokyo.

~~~
grzm
Thank you for pointing this out. While I dug, I clearly didn't dig deep
enough.

------
xjia
[https://github.com/IteratorAdvance/taichi](https://github.com/IteratorAdvance/taichi)

------
jinmingjian
Service Unavailable... Hi, your site seems to have gone down. But I like the
name, although it has been used over and over again.

