
Filament: Physically-based rendering engine - corysama
https://google.github.io/filament/Filament.md.html
======
ArtWomb
Kudos to Romain Guy on the Android Graphics team. Stunning WebGL demo here:

[http://www.curious-creature.com/2017/08/14/physically-based-rendering-demo/](http://www.curious-creature.com/2017/08/14/physically-based-rendering-demo/)

A good example of the visual state of the art on Android is probably the ARK
dinosaur game. The Vulkan API is now included in the latest Android releases, and
with the compact glTF 2.0 format we can expect great experiences on mobile ;)

Vulkan glTF 2.0

[https://www.youtube.com/watch?v=sl7iN-vQCOs&list=PLy80eMh1-zPUz7y1JtFiS9I6H7_trBUAf](https://www.youtube.com/watch?v=sl7iN-vQCOs&list=PLy80eMh1-zPUz7y1JtFiS9I6H7_trBUAf)

~~~
naikrovek
glTF is pretty great, to be honest. At work I am writing a tool to convert
Creo View models to glTF 2.0 models and I am experiencing actual joy while
doing it.

I looked at the COLLADA and glTF formats (I wanted something open) and very
quickly chose glTF 2.

I don't have any 3D format implementation experience, and I implemented glTF 2
in Golang in a day, or less.

Anyway, it's a great format (AFAIK) and I wish more 3D modeling tools
supported it. I imagine more will come on board as time passes.
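For anyone wondering why it only took a day: the core of a glTF 2.0 reader really is just JSON decoding into structs. A minimal Go sketch of the idea (the `GLTF` struct covers only a sliver of the real schema, and `parseGLTF` and the sample document are my own illustrations, not from any library):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// GLTF models a small subset of glTF 2.0's top-level schema; the real
// format also has buffers, bufferViews, accessors, materials, and so on.
type GLTF struct {
	Asset struct {
		Version string `json:"version"` // required; "2.0" for glTF 2.0
	} `json:"asset"`
	Meshes []struct {
		Name string `json:"name"`
	} `json:"meshes"`
}

// parseGLTF decodes the JSON chunk of a .gltf file.
func parseGLTF(data []byte) (GLTF, error) {
	var g GLTF
	err := json.Unmarshal(data, &g)
	return g, err
}

func main() {
	doc := []byte(`{"asset":{"version":"2.0"},"meshes":[{"name":"dino"}]}`)
	g, err := parseGLTF(doc)
	if err != nil {
		panic(err)
	}
	fmt.Println(g.Asset.Version, g.Meshes[0].Name) // prints: 2.0 dino
}
```

The binary data (vertices, indices, animation curves) lives in separate buffers that the JSON merely describes, which is what keeps the format so easy to parse.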

~~~
timdorr
Any thoughts on these criticisms from this Wikipedia article?
[https://en.wikipedia.org/wiki/GlTF#Criticism](https://en.wikipedia.org/wiki/GlTF#Criticism)

(I'm not agreeing or disagreeing with them, I'm genuinely curious to hear from
someone working with the format at the implementation level)

~~~
tfigment
I'm a few years removed from 3D work, so take this with a grain of salt,
especially since today was the first time I've seen GITF. I've implemented
several 3ds Max game asset importer/exporter tools for modding over the past
decade. While I like the look of GITF 2.0 quite a bit, it does seem to lack a
number of extension points that I would probably need in a generic
importer/exporter for handling game asset round trips (import, edit, export).
I would love to handle import/export in a generic way, and have tried with FBX
and COLLADA to poor effect in the past. I really want to convert an asset to
GITF, import it into Max, edit it, export it from Max back to GITF, and
finally bake it back to the game format. The less software you need between
the game format and the GITF file the better, but it should allow for
automation/customization.

You can get 80-90% of the way there but miss quite a bit of the fine detail
needed in an interchange format (which is why this is a transmission format, I
guess). Extensions seem basic, and I question how well they work in 3D
editors, but I'm not really sure how that works; it's probably editor
specific. I've had a lot of issues getting a single file like COLLADA properly
supported in multiple editors unless it was a fairly basic model. That leads
to a lot of wasted time for modelers trying to get the edited model back into
the game and working correctly.

Back to GITF: collision seems to be missing. They admit that they don't have
vertex compression implemented, or animation for things like material colors,
or animation metadata (timelines, looping, ...). Is there a way to describe a
partial surface, sort of like skinning, for other reasons? (For example,
Fallout 4 needs to tag certain surfaces/vertices as part of the head or arm
for in-game decapitation.) Anyway, it looks like a nice start since it's all
JSON (or binary), unlike GEX, obj, 3ds, ..., which each seem to need their own
parsers, and they did only claim to be a transmission format. Again, I only
did a cursory read of the format, so I might have missed things.

~~~
IshKebab
It's glTF not GITF.

~~~
andybak
I've always assumed that was an "L". Whoever named this thing and made the
decision about capitalization needs to possibly think a bit harder next time.

~~~
badsectoracula
It is an "L", glTF means "GL Transmission Format".

~~~
andybak
In that case it's still the capitalisation that is remarkably ill considered.

------
buchanae
Wow. It's so hard to find expert-level documentation like this all in one page
these days. I'm usually stuck piecing together information from a couple dozen
(often poorly presented) websites. The topic is fantastic, but the
presentation deserves some applause too!

~~~
RomainGuy
Glad you like it. We found it hard to piece together all the information we
needed and after reading hundreds of papers and presentations we figured that
writing such a document would be helpful to beginners like us (in this field
that is).

~~~
buchanae
I think my render in Chrome is pretty broken. Here's a screenshot of the top
of the page:
[https://drive.google.com/open?id=1YVtcV-J0JrDyW2pJ6vwgEDsgjO...](https://drive.google.com/open?id=1YVtcV-J0JrDyW2pJ6vwgEDsgjOEusyEH)

~~~
RomainGuy
Looks like the page wasn't able to load all of its resources. Try to refresh
the page until it works. I should make an offline version that doesn't require
online processing.

~~~
mkl
It took multiple reloads for it to work for me on Android Chrome, and was
almost unreadable for a couple of minutes (MathJax processing? Tried KaTeX?)
due to an enormous right margin. The margin and font(?) changed multiple
times.

Is there a PDF version?

~~~
tripzilch
Sort of!

The document uses a JS tool called MarkDeep to convert extended MarkDown into
styled HTML. Just look at the source, it's 99% plaintext Markdown.

If you read the MarkDeep docs, you'll find that it has a feature to convert
the source into PDF instead. I admit I haven't tried this, though.

If it were up to me, I'd do the transform offline and just serve the static
HTML instead. And sure, a link to the PDF just to be nice :)

~~~
RomainGuy
We just serve straight from the source tree so it's not out of date :)

------
corysama
The actual code:
[https://github.com/google/filament](https://github.com/google/filament)

Additional docs on their material system:

[https://google.github.io/filament//Materials.md.html](https://google.github.io/filament//Materials.md.html)

[https://google.github.io/filament/Material%20Properties.pdf](https://google.github.io/filament/Material%20Properties.pdf)

Also, I'll plug [http://casual-effects.com/markdeep/](http://casual-effects.com/markdeep/) which was apparently used to format the docs.

------
valine
The PBR tutorial series on LearnOpenGL.com covers a lot of the concepts
implemented here like physically based BRDFs and HDR lighting, if anyone is
looking for more resources like this.

[https://learnopengl.com/PBR/Theory](https://learnopengl.com/PBR/Theory)

~~~
corysama
If like me you actually enjoy tech talk videos, "SIGGRAPH University -
Introduction to Physically Based Shading in Theory and Practice" is great. PDF
version is linked at the bottom of your learnopengl page.
[https://www.youtube.com/watch?v=j-A0mwsJRmk](https://www.youtube.com/watch?v=j-A0mwsJRmk)

------
ChuckMcM
Ok this is way fun and drags me back to the days I was diligently trying to
build a 3D rendering engine from first principles (sort of, I had Glide to put
stuff on the screen).

The renderer is perceptually better than the one that is included in my CAD
package (TurboCAD) for pretty much all materials. So I'm guessing they will
snarf it and dump the proprietary renderer and replace it with this stuff if
they can.

But the really interesting idea that popped into my head was this; could
Google offer 'rendering as a service'?

Specifically they have a zillion machines, many of which are doing nothing
important, and they have this rendering package, and they have a scheduler
that can put things on any machine. Imagine a service where you sent them a
suitably detailed model description, and a set of lights, could they send you
back a rendered image? Could you parameterize changes to the model description
over time so that they could send you images in time based on your models?
Could they do say 480 renders 'free' per month and then maybe $0.19/render
over 480 in a single month?

Could you create a studio of modellers who would design models, and animators
that would animate those models over time, and a director who would compose
those animations into scenes? This is basically Pixar without the expensive
renderfarm. Does that enable new studios to bring their own vision to life?
Does it offer a cost effective service to places like Pixar which allows
Google to make money on otherwise idle resources? Curious minds want to know
:-)

~~~
steren
Google Cloud Platform offers
[https://www.zyncrender.com/](https://www.zyncrender.com/)

~~~
ChuckMcM
Okay that is pretty close, and acquired by Google in 2014. So presumably it is
possible to be a small CGI shop and use this as your back end. Now I'm
wondering if the economics pencil out. Clearly there was something that
motivated Google to buy them.

~~~
manigandham
They work just fine; the product is one of many used by shops now. GCP even
recently launched an LA region and had an entire day dedicated to the local
VFX shops. Nothing stops you from just running a bunch of VMs, which many do,
and they have rolled out Filestore NFS to make shared disks easier too.

~~~
detaro
Is that a recent shift? I remember talking to someone from a VFX shop a few
years ago, and they made it sound like back then anything cloud was a no-go
due to their customers requiring material to stay in-house for fear of leaks.

~~~
manigandham
Yes, smaller studios were early but now the majors like Sony Imageworks are
all in. The cloud is a great fit for most of their rendering jobs and security
is no different (if not better handled by the cloud).

Here's a session from GCP Next 2018 for cloud render farms:
[https://www.youtube.com/watch?v=ODOJ3UbnV6Y](https://www.youtube.com/watch?v=ODOJ3UbnV6Y)

------
fallingfrog
I'm still waiting for someone to do a completely physically based renderer,
and simulate the whole EM spectrum, not just the 3 colors we can see. Then you
can model chromatic aberration, and the difference between fluorescent and
incandescent lighting, and prisms.. you could call it a really really
physically based rendering engine.

~~~
RomainGuy
There are several spectral renderers out there, such as Weta Digital's Manuka.
I don't know if they bother with parts of the EM spectrum that are outside of
the visible range though. I imagine UVs can be important to model in some
situations.

~~~
berkut
To model fluorescence accurately, it's necessary to handle the non-visible
spectrum in some way.

~~~
magicalhippo
Handling the non-visible spectrum isn't much of an issue; after all, the
wavelength used when path tracing can be whatever you need (some have used
path tracing for sound). Getting realistic data for the non-visible parts may
prove tricky depending on the material, though.

IIRC the issue is that if you can ignore fluorescence, then reflection is
simply an element-wise multiplication of the incoming light at the wavelengths
under consideration[1] with the reflection coefficient of the material at
those wavelengths. With fluorescence, that turns into a matrix multiplication,
with obvious speed implications.

If only a single wavelength is considered at a time, then the wavelength must
be able to change upon reflection, otherwise there's no way for the
fluorescence to occur. That can also have performance implications; for
example, the conversion coefficients to/from regular color spaces need to be
recalculated.

At least that's my understanding having worked on a physically-based renderer
which did do spectral rendering but not fluorescence.

[1]: using for example binned wavelengths or stratified wavelength clustering.
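To make the element-wise-vs-matrix point concrete, here's a toy Go sketch. The 4-bin spectrum and the reradiation-matrix layout are illustrative choices of mine, not taken from any particular renderer:

```go
package main

import "fmt"

// Spectrum holds light energy binned at a few wavelengths
// (here, say: UV, blue, green, red).
type Spectrum [4]float64

// reflect models a non-fluorescent material: an element-wise product
// of incoming light with the per-bin reflectance.
func reflect(in, r Spectrum) Spectrum {
	var out Spectrum
	for i := range in {
		out[i] = in[i] * r[i]
	}
	return out
}

// reflectFluorescent models fluorescence with a reradiation matrix m,
// where m[i][j] is the fraction of energy arriving in bin j that leaves
// in bin i. Off-diagonal entries move energy between wavelengths
// (e.g. absorbed UV re-emitted as visible blue).
func reflectFluorescent(in Spectrum, m [4][4]float64) Spectrum {
	var out Spectrum
	for i := range out {
		for j := range in {
			out[i] += m[i][j] * in[j]
		}
	}
	return out
}

func main() {
	light := Spectrum{1.0, 0.2, 0.2, 0.2} // mostly UV
	// Plain reflectance: the UV energy stays in the UV bin.
	plain := reflect(light, Spectrum{0.5, 0.5, 0.5, 0.5})
	// Fluorescent material: half the incoming UV is re-emitted as blue.
	var m [4][4]float64
	for i := range m {
		m[i][i] = 0.5
	}
	m[1][0] = 0.5 // UV -> blue reradiation
	fluo := reflectFluorescent(light, m)
	fmt.Println(plain, fluo) // prints: [0.5 0.1 0.1 0.1] [0.5 0.6 0.1 0.1]
}
```

A diagonal matrix degenerates to the element-wise case, which is why ignoring fluorescence turns an O(n²) operation into an O(n) one per bounce.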

------
Solar19
This looks very good.

Ironic aside: Notice the "Processing math" message on the lower left when you
first open the page? That's MathJax, a huge JS library, chugging away because
Chrome doesn't support MathML. Try going to the page in Firefox and you'll see
that it processes "the math" a hell of a lot faster than Chrome. Firefox
supports MathML, and MathJax probably generates MathML for it.

~~~
stepik777
No it doesn't, it renders it as a bunch of spans. MathML is basically a dead
standard for the web: browsers either don't support it or their support is
incomplete and buggy, which makes it useless.

[https://www.peterkrautzberger.org/0186/](https://www.peterkrautzberger.org/0186/)

------
yters
Really beautiful. Funny how high tech 3D engines get better at making things
that look low tech and run down.

~~~
taneq
It's kind of like Moravec's paradox but for graphics. :)

------
Solar19
Also, there should be a warning in case you're running tab heavy on a lower
end machine, or just out of courtesy. The web page is 37 MB (saved as an MHTML
in Chrome -- see your About:Chrome flags). It's yuge. MathJax alone is usually
>1 MB of JavaScript.

------
andybak
1\. What's the likelihood of seeing a Unity wrapper for this in my lifetime? I
imagine its uptake would increase massively if it plugged into Unity or
Unreal.

2\. How does it compare with what OTOY are working on for realtime? I imagine
they are focused on quality over performance and aren't even considering
mobile.

~~~
kowdermeister
PBR is already enabled in most game engines, it's even supported out of the
box by Three.js

~~~
andybak
I'm asking about this specific engine - not PBR in general.

~~~
Asgardr
Why would Unity use Filament? They have a PBR solution that works for them and
is in active development. OTOY's OctaneRender is an unbiased path tracer; it's
not comparable to Filament, they're two different use cases.

------
clankfan
Does anyone know how this document relates to the rendering done in the 2016
Ratchet and Clank game? That game has always astounded me and I've never been
able to find a deep and comprehensive explanation of how its rendering works.

~~~
corysama
They certainly use some of these techniques.

Here’s a bit of convo about the code and art of R&C

[https://youtube.com/watch?v=Y65h1aO-xps](https://youtube.com/watch?v=Y65h1aO-xps)

[https://youtube.com/watch?v=zpvXB4yWvyA](https://youtube.com/watch?v=zpvXB4yWvyA)

[http://advances.realtimerendering.com/s2014/index.html](http://advances.realtimerendering.com/s2014/index.html)

------
malkia
Had to do this:

CC=clang CXX=clang++ ./build.sh -j release

to get it to compile (it seems cc pointed at gcc, which did not understand
some apparently clang-specific flags).

------
skavi
Will this replace Escher? Or do they serve different purposes? (UI vs 3D)

------
hi41
Such beautiful documentation!

------
shawn
There is no such thing as physically based rendering, and the sooner the world
learns this the better. It’s a marketing term, nothing more.

Here’s food for thought: what does it mean to multiply two colors? Nothing.
It’s meaningless. It’s an approximation that happens to look good. But if
you’re going to claim your engine is physically based, you can’t use it,
because multiplying two colors is not based in physics.

~~~
pandaman
"Physically based" in context of rendering means just that the light
calculations preserve energy of light. So there is such thing actually.

~~~
shawn
There is no clear definition, further adding to the confusion.

[https://en.wikipedia.org/wiki/Physically_based_rendering](https://en.wikipedia.org/wiki/Physically_based_rendering)

[https://marmoset.co/posts/basic-theory-of-physically-based-rendering/](https://marmoset.co/posts/basic-theory-of-physically-based-rendering/)

------
gerdesj
Ahhhhhhhhhhh - I will be DVd to death for this but:

I have a screen (it happens to be about 18" wide in this case). Why on earth
is the text in a stupidly thin column in a small font? It looks like TeX's
daemonic alter thingie got into the render process somewhere or perhaps
someone forgot the other two columns or wanted to torture a webby reader with
multiple columns (mmmm ArXiv scrolly uppy n downy pdf) and lost interest.

Anyway ... it looks crap in my browser unless I hit CTRL(num)+ a few times -
200% works.

~~~
RomainGuy
Would you mind sharing a screenshot somewhere? I'm just using Markdeep's
default CSS and the font size seems reasonable on all my screens. I'd be happy
to try and tweak it though.

~~~
gerdesj
I'm not sure what has gone wrong (if anything) - I'm a sysadmin not
devops/webby minded.

This is what I see:

[https://nextcloud.roseandjon.gerdes.co.uk/nextcloud/index.ph...](https://nextcloud.roseandjon.gerdes.co.uk/nextcloud/index.php/s/ttqRiyJK7zQrTC3)
\- that should yield two images.

Fonts/typefaces - lovely. Layout - a bit limited.

~~~
RomainGuy
Thank you! This is exactly the same render that I get. I could try and widen
the column. I've grown to like the narrower width when reading graphics papers
like this but I understand it's not for everybody.

~~~
Fifer82
I am not 100% sure why people are discussing their individual styling
preference. Keep what you have. Thank you for sharing.

