
Real-time, in-camera background compositing in The Mandalorian - ashin
https://ascmag.com/articles/the-mandalorian
======
mitchelhall
Hi all, really cool that you have taken an interest in this project, a lot of
your comments below are very insightful and interesting. I work on the team
that deploys this tech on set. We focus on how the video signals get from the
video output servers to the LED surfaces, coordinating how we deal with
tracking methods, photogrammetry, signal delivery, operator controls and the
infrastructure that supports all of this. As noted in some of the comments,
the VFX industry solved tracking with mocap suits a long time ago for the
post-production process. What we are exploring now is how we can leverage new
technology, hardware, and workflows to move some of the post-production
processes into the pre-production and production (main/secondary unit
photography) stages.

If you are interested in checking out another LED volume setup, my team also
worked on First Man last year. This clip shows a little more of how we
assemble these LED surfaces as well as a bit of how we use custom integrations
to interface with mechanical automation systems.
[https://vimeo.com/309382367](https://vimeo.com/309382367)

~~~
kragen
Thanks! This work is really inspiring! Are these OLED screens, matrices of
separate LEDs of the usual AlInGaP and InGaN type, or LCDs backlit with LEDs?
The 2.84 mm pixel pitch makes it sound like it's separate inorganic LEDs.

Are there times short of sunshine where you need more directionality to the
lighting than the screens can provide? Because the screens can emit _from_ any
direction but not _toward_ any direction, being quasi-Lambertian emitters.

~~~
marcan_42
Discrete LED screens go all the way down to 2mm or even 1.9mm pitch (I have a
few 2mm tiles at home). These are certainly discrete LED panels, not OLEDs.
I'm not aware of OLEDs being used in "wall" applications like this.

When you hear LED walls, think millions of discrete LEDs mounted on PCBs (and
thank China for making this low cost enough to be viable!)

~~~
jweather
LED video walls are down in the 0.7mm dot pitch vicinity. Not quite the ~0.3mm
of a 4K 55" LCD, but we're getting there. And the brightness and contrast
ratio can't be beat.
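For context, the ~0.3mm figure above falls straight out of the panel geometry. A quick sketch (the 55" 4K numbers are just the example from the comment; the formula assumes square pixels):

```python
import math

def lcd_pitch_mm(diagonal_in, h_pixels, v_pixels):
    """Dot pitch of a flat panel from its diagonal size and resolution."""
    aspect = h_pixels / v_pixels
    # Width from the diagonal via Pythagoras, then inches -> mm per pixel.
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    return width_in * 25.4 / h_pixels

print(round(lcd_pitch_mm(55, 3840, 2160), 2))  # ~0.32 mm

# And at the wall's quoted 2.84 mm pitch, pixels per metre of LED surface:
print(round(1000 / 2.84))  # ~352 px per metre
```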

------
oseibonsu
Here is the Unreal Engine tech they are using:
[https://www.unrealengine.com/en-US/spotlights/unreal-engine-in-camera-vfx-a-behind-the-scenes-look](https://www.unrealengine.com/en-US/spotlights/unreal-engine-in-camera-vfx-a-behind-the-scenes-look)

This is a video of it in action:
[https://www.youtube.com/watch?v=bErPsq5kPzE&feature=emb_logo](https://www.youtube.com/watch?v=bErPsq5kPzE&feature=emb_logo)

~~~
jahlove
Here's a video of it in action on The Mandalorian set:

[https://www.youtube.com/watch?v=gUnxzVOs3rk](https://www.youtube.com/watch?v=gUnxzVOs3rk)

~~~
sgc
That explains a lot to me. The actors need those visual cues, and it shows
through in the final product. It was a great result. I look forward to
improved acting as this tech makes its way into other works.

------
devindotcom
This is super interesting stuff and I've been following it for some time. I
just wrote it up with a bit more context:

[https://techcrunch.com/2020/02/20/how-the-mandalorian-and-ilm-invisibly-reinvented-film-and-tv-production/](https://techcrunch.com/2020/02/20/how-the-mandalorian-and-ilm-invisibly-reinvented-film-and-tv-production/)

It's not just ILM and Disney either, this is going to be _everywhere_. It's
complex to set up and run in some ways but the benefits are enormous for
pretty much everyone involved. I doubt there will be a major TV or film
production that doesn't use LED walls 5 years from now.

~~~
dd36
I wonder how much this reduces the environmental footprint. The explosion of
shows and movies chasing raw nature or awesome settings has me wondering how
much destruction it causes, and how much waste it produces.

~~~
BubRoss
Environmental footprint? A local trade show, movie theater or small office
building probably uses more electricity. Let's try to keep things in
perspective.

~~~
folli
I think he meant flying large crews, including actors, caterers etc., to
Iceland to film a movie vs a small photogrammetry team.

------
cbhl
"The solution was ... a dynamic, real-time, photo-real background played back
on a massive LED video wall and ceiling ... rendered with correct camera
positional data."

Gee, that sounds a lot like a holodeck. We've come a long way from using Wii
Sensor Bars[0] for position tracking.

[0]
[https://www.youtube.com/watch?v=LC_KKxAuLQw](https://www.youtube.com/watch?v=LC_KKxAuLQw)

~~~
ragebol
For the VFX industry, the tracking had already been solved for ages, with
those reflective little balls on suits etc. in a mocap system. The Wii sensor
bar's thing was that it was really cheap.

But yes, damn close to a holodeck. But you can't see depth in this setup,
right?

~~~
nocut12
If it's perspective-corrected for the camera, it would probably look very
distorted for anyone else on set -- whether there's depth or not.

And that's certainly not the goal with this. Something along those lines has
been around for a while
([https://en.wikipedia.org/wiki/Cave_automatic_virtual_environ...](https://en.wikipedia.org/wiki/Cave_automatic_virtual_environment)).
This system seems specifically targeted for solving challenges for film
production, as it probably should be.

I am pretty impressed that real time rendering has gotten good enough to use
for these purposes. I certainly wouldn't have expected that those backgrounds
in the show were coming out of a video game engine.
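For the curious: the perspective correction described above is typically done with an off-axis (asymmetric) projection, the same trick CAVE systems use. A minimal sketch of the idea, with made-up screen and camera numbers (this is an illustration of the general technique, not ILM's actual code):

```python
def offaxis_frustum(eye, screen_w, screen_h, near):
    """Near-plane extents (left, right, bottom, top) for a camera at
    `eye` = (x, y, z), measured from the centre of a screen lying in
    the z=0 plane. The frustum skews as the eye moves off-centre."""
    ex, ey, ez = eye            # ez = distance from the screen plane
    scale = near / ez           # similar triangles: near plane vs screen
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top

# Camera 1 m off-centre in front of a 6 m x 3 m wall, 4 m away:
print(offaxis_frustum((1.0, 0.0, 4.0), 6.0, 3.0, 0.1))
# (-0.1, 0.05, -0.0375, 0.0375) -- asymmetric left/right, as expected
```

Feed those extents to something like OpenGL's `glFrustum` and the flat wall looks geometrically correct from that one tracked eye point, and distorted from everywhere else.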

~~~
miohtama
They mention they cannot push enough GPU juice to the screens, so they only
render the area around the camera's view at full resolution. There is also a
12-frame lag, which prevents moving the camera too fast.
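Back-of-envelope, that lag is substantial. Assuming the usual 24 fps film frame rate (an assumption; the article doesn't state the capture rate):

```python
# How far behind a camera move the rendered background would trail.
frames_of_lag = 12
fps = 24  # assumed; standard cinema frame rate
print(frames_of_lag / fps)  # 0.5 seconds
```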

~~~
Aeolun
I don’t fully get this. They could just employ different computers to render
different parts of their cave. It seems more like a cost-saving thing than a
technical limitation.

And I’m not sure why you’d skimp on a few PCs if you’re already building a
humongous LED wall, so maybe there’s something I’m missing.

~~~
GonzaloQuero
I worked on a somewhat similar project in 2015, though not as complex as this,
building background videos for DotParty, using UE4 for panoramas and then
stitching them. One of the biggest issues we found was that, because this is a
game engine, a lot of things are not deterministic, so if we used multiple
cards or computers, particles and other environmental effects would not be in
sync, and the stitches were glaring.

~~~
Aeolun
Yeah, that’s a good point. And taking out the particles and doing those
separately is probably near impossible.

For the non-visible screens it wouldn’t matter that much, but they’d still end
up with the moving frustum for the main engine.

~~~
GonzaloQuero
I believe it's been improved in later versions, as they've focused on these
use cases, and there might even be deterministic particles now, but I'm not
sure, because I've been out of the VFX market for quite a while.

------
flashman
I wonder how the photogrammetry aspect will intersect with intellectual
property laws. The example used - scanning in a Santa Monica bar so that you
can do reshoots without revisiting the location - would be an obvious example
that might raise someone's hackles ("because it's our bar you're using to make
your money" for instance). If you add that bar to your digital library, do you
have to pay them royalties each time you use it? Is it any different to
constructing a practical replica of a real-life location?

Can someone wearing a few cameras walk through a building and digitise it
completely without getting the owner's permission? Here in Australia,
"copyright in a building or a model of a building is not infringed by the
making of a painting, drawing, engraving or photograph of the building or
model or by the inclusion of the building or model in a cinematograph film or
in a television broadcast," for instance. (Copyright Act 1968 §66)

~~~
paulmd
A potential analogy might be something like using Carrie Fisher's image in the
new Star Wars movies. I would assume the estate got paid for that. Or
holo-Tupac.

Practically speaking I think it will come down to what you negotiate. If you
negotiate usage of the bar for your series then you can use it, otherwise not.
If you negotiate resale of that model then that's legal, otherwise not. Most
large productions will probably want to stay far on the right side of the law
and get a written/financial agreement until things are hammered out, then
you'll have amateur filmmakers who have to do vigilante shoots.

And again, probably something that will have to be legislated out for the long
term.

In France, the appearance of buildings can be copyrighted; famously, the
Eiffel Tower is very aggressive about suing photographers.

~~~
icebraining
It's not the Eiffel Tower that is copyrighted, but the light installation that
is turned on at night. Taking pictures during the day is free from such
restrictions:
[https://www.toureiffel.paris/en/business/use-image-of-eiffel-tower](https://www.toureiffel.paris/en/business/use-image-of-eiffel-tower)

------
huebomont
Fascinating, but this article needs a proofread, damn...

"The virtual world on the LED screen is fantastic for many uses, but obviously
an actor cannot walk through the screen, so an open doorway doesn't work when
it's virtual. Doors are an aspect of production design that have to be
physical. If a character walks through a door, it can’t be virtual, it must be
real as the actor can’t walk through the LED screen."

Not to mention the multiple paragraphs that are basically re-stated
immediately afterwards. It's like they hit publish in the middle of editing.

------
rebuilder
The Mandalorian was a natural candidate for this kind of approach, since it's
essentially a western, meaning a lot of wide landscape shots.

The LED screen approach works nicely for fairly uncomplicated background
geometry, like a plain. Try shooting Spiderman climbing up walls on that, and
things will get tricky fast.

As the article notes, slow camera moves are a plus as well. The reason given
is technical, but I also wonder how far you could really push the camera
motion even if tracking lag wasn't an issue. The background is calculated to
match the camera's viewpoint, so I expect it would be very disorienting for
the actors if the camera was moving at high speeds.

~~~
wbl
Spiderman climbing up a wall can be done via forced perspective. It's also an
action scene, reducing the need for a background to help the actor. And some
brave souls will Harold Lloyd it.

------
DonHopkins
"Once Upon a Time" (2011-2018, with Robert Carlyle as Rumplestiltskin!) was
shot on virtual chroma-keyed sets with real time integrated pipeline tools to
preview how it would look.

[https://en.wikipedia.org/wiki/Once_Upon_a_Time_(TV_series)](https://en.wikipedia.org/wiki/Once_Upon_a_Time_\(TV_series\))

The tech behind Once Upon a Time’s Frozen adventures

[https://www.fxguide.com/fxfeatured/the-tech-behind-once-upon-a-times-frozen-adventures/](https://www.fxguide.com/fxfeatured/the-tech-behind-once-upon-a-times-frozen-adventures/)

“Once Upon a Time” TV Series VFX Breakdown

[https://web.archive.org/web/20180623020817/http://www.animationboss.net:80/once-upon-a-time-tv-series-vfx-breakdown/](https://web.archive.org/web/20180623020817/http://www.animationboss.net:80/once-upon-a-time-tv-series-vfx-breakdown/)

------
mdorazio
For those wondering, this appears to be not nearly as expensive as I thought.
The 20" square panels used are available for around $1000 each if you buy a
lot of them used. Compared to a typical production budget for a high-quality
franchise, it's surprisingly cheap to build one of these walls. The equipment
to run it, on the other hand, is likely not cheap at all.

~~~
ishtanbul
If The Mandalorian was filmed entirely on location with VFX in post, it would
have cost hundreds and hundreds of millions. The sets were incredibly
detailed. So I think they saved a ton of money for the output quality. I also
doubt they bought second-hand gear.

~~~
BubRoss
Multiple hundreds of millions to shoot in the desert with red cameras and add
buildings behind armored people with their hair hidden? Plenty of movies have
been done like that and they didn't become the most expensive movies ever made
(which is actually Pirates 3).

------
russellbeattie
Sony had a big demo of this in their CES booth a few weeks ago. They showed
the camera moving around the Ghostbusters' Ecto-1 car and the background
moving as well. You could see from the overhead screens what the final
composite looked like. It was pretty awesome, given it was all set up in a
booth at a trade fair. [1]

As expensive as all this is now, I think this is really going to make an
impact in lower-budget movies. Not having to fly all over the world or having
massive sets to film convincing scenes might be a really good thing.

1\. [https://www.youtube.com/watch?v=kh2Q9pCxuJw](https://www.youtube.com/watch?v=kh2Q9pCxuJw)

------
severak_cz
Funny that this is practically the same concept as shooting in a studio with
the exterior background painted on the walls, as was done in old movies. The
progress is only in the technology: now it's created by a game engine and
displayed on giant LEDs; back in the 1930s it was done by hand by painters.

~~~
estebank
I think the innovation is the perspective correction of the background
depending on the camera. That could have been accomplished with rear
projection in film if it had been necessary by having the camera follow a
preset path, but I don't think even BTTF attempted that.

------
csours
This reminds me of The Mill BLACKBIRD -
[https://www.youtube.com/watch?v=OnBC5bwV5y0](https://www.youtube.com/watch?v=OnBC5bwV5y0)

------
treblig
"Postproduction was mostly refining creative choices that we were not able to
finalize on the set in a way that we deemed photo-real."

Does anyone know how they were able to swap out the in-camera version of the
background originally shown on the LED wall with something more convincing
later? Seems like it'd be tough since it's not a green screen!

~~~
janekm
While currently they are using "green screen" in those instances, given that
the camera positions are already being tracked, and the image displayed on the
screens is known, it would be possible to re-render what image the camera
should have seen if the foreground elements hadn't been present and use the
difference with the recorded image as a mask for further post-processing.

(which would be very cool as it would also allow using a low-resolution
version of the background during production that could then be re-rendered
with a higher resolution image after the fact)
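The difference-mask idea sketched above fits in a few lines. This is a toy illustration of the general technique, not anything from the actual pipeline; the pixel values and threshold are made up:

```python
def difference_matte(recorded, clean_plate, threshold=0.05):
    """Per-pixel mask: True where the recorded frame differs from a
    re-render of what the tracked camera *should* have seen with no
    foreground present, i.e. where an actor or prop was."""
    return [
        [abs(r - c) > threshold for r, c in zip(rrow, crow)]
        for rrow, crow in zip(recorded, clean_plate)
    ]

# Tiny grayscale "images": the re-rendered clean plate vs the frame.
clean = [[0.2, 0.2, 0.2],
         [0.2, 0.2, 0.2]]
frame = [[0.2, 0.9, 0.2],   # actor occupies the middle column
         [0.2, 0.8, 0.2]]
print(difference_matte(frame, clean))
# [[False, True, False], [False, True, False]]
```

In practice the hard part is that the wall's moiré, colour response, and lens characteristics mean the recorded background never exactly matches the re-render, so a simple threshold like this wouldn't survive contact with real footage.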

~~~
estebank
It seems to me that making the system perform the masking itself (instead of
projecting a green screen) would reduce the ability to completely replace the
scene easily, but the reflections already give you that problem. The advantage
you would have is that the real elements in the scene would no longer need any
feathering around the edges (which makes them look hazy), because now you can
take a few pixels of the projected background as a transition to the
higher-resolution render.

------
en4bz
I think the demise of Lytro was a huge missed opportunity for the film
industry. They had this and a number of other features in their cinema camera
before they became defunct a few years ago.

[https://www.youtube.com/watch?v=4qXE4sA-hLQ](https://www.youtube.com/watch?v=4qXE4sA-hLQ)

~~~
anchpop
I watched that video; it doesn't really seem like the same thing as in this
article (although it's very cool). This is a real screen behind the actors,
rendering the scene from the perspective of the camera.

------
asmosoinio
Why is this one person always referred to with "ASC, ACS" after their name?

> Greig Fraser, ASC, ACS

~~~
emmsii
It means they are a member of the American Society of Cinematographers (ASC)
and I believe the Australian Cinematographers Society (ACS).

------
jpmattia
I've had a peripheral interest in virtual sets and real-time compositing by
way of a colleague from grad school.

A quick visual summary of this tech:
[http://www.lightcrafttech.com/portfolio/beauty-beast/](http://www.lightcrafttech.com/portfolio/beauty-beast/)

This video was from a pilot several years ago, and it didn't make it to air,
but it was visually wonderful.

~~~
russellbeattie
With F. Murray Abraham as well! Nice. He's looked the same since 1984's
Amadeus. Crazy!

------
resoluteteeth
This seems slightly limited in its current form in that they have to choose
either to shoot the rendered background as-is (making it harder to modify in
post-production) or to specifically turn the areas around the actors into
green screens (sort of defeating the purpose).

I wonder if they could use some sort of trick like projectors synced with high
fps cameras to make the real time rendering invisible to the cameras instead?

------
pupdogg
The highest-resolution LED panel pixel pitch I've seen to date is 0.7mm...
wouldn't this result in a lower-resolution capture of the projected
background, specifically when they're trying to shoot movies at or above the
4K range? Also, how do they sync the scan rate of the background video being
played back with the camera recording the footage?

~~~
thegoleffect
In some photos, you can see that from the camera's POV, the area around the
actors is displayed on LED as green screen so the actors can be masked out.
Then, a higher resolution background is composited in. Thus, the original LED
serves to accurately light the scene to reflect the background but not always
to actually be the background.

~~~
pupdogg
This makes more sense!

------
petermcneeley
This technique will produce potentially significant rendering artifacts in the
final image. The backdrop is correct only from the position of the camera. A
reflection from any surface will not be geometrically correct (as seen in the
image from the article). I think that even ambient lighting would contain
noticeable deformations.

~~~
virtualritz
It's much better than the reflection of a green/blue screen or an empty studio
with some cameras and people.

Glossy surfaces are usually not a problem unless they're (near) perfect
mirrors. Even then -- lights are usually what you see in most reflections
because they're orders of magnitude brighter than the rest of the set.

If reflections are a problem with this new technique in certain settings, they
would be even more so with the current state of the art.

In those cases you replace them digitally. There's no way around this either
way.

Related trivia: The chrome starships in EP1 were actually rendered with
environment mapping and reflection occlusion[1]. Even most games do better
stuff today. Did you notice? :]

[1] [http://www.spherevfx.com/downloads/ProductionReadyGI.pdf](http://www.spherevfx.com/downloads/ProductionReadyGI.pdf),
5.3 Reflection Occlusion, p. 89

~~~
petermcneeley
Right, but the artifacts here will still show up on glossy surfaces. Here is
how games now fix these kinds of artifacts:
[https://seblagarde.wordpress.com/2012/09/29/image-based-lighting-approaches-and-parallax-corrected-cubemap/](https://seblagarde.wordpress.com/2012/09/29/image-based-lighting-approaches-and-parallax-corrected-cubemap/)

However the technology in the article cannot be corrected in this manner.
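For reference, the parallax correction in the linked article boils down to intersecting the reflection ray with a proxy volume around the scene instead of sampling the cubemap with the raw reflection direction. A rough sketch assuming an axis-aligned box proxy (all positions and bounds here are illustrative):

```python
def parallax_corrected_dir(pos, refl, box_min, box_max, probe_pos):
    """Direction to sample a cubemap captured at `probe_pos`, corrected
    for a reflection ray starting at `pos` with direction `refl` inside
    an axis-aligned proxy box [box_min, box_max]."""
    # Nearest exit distance of the ray from inside the box: for each
    # axis, pick the slab the ray is heading toward, then take the min.
    t = min(
        ((box_max[i] if refl[i] > 0 else box_min[i]) - pos[i]) / refl[i]
        for i in range(3) if refl[i] != 0
    )
    hit = [pos[i] + t * refl[i] for i in range(3)]
    # Look up the cubemap from its capture point toward the hit point.
    return [hit[i] - probe_pos[i] for i in range(3)]

# Shading point off-centre in a 2x2x2 room, reflecting straight up:
print(parallax_corrected_dir(
    pos=[0.5, 0.0, 0.0], refl=[0.0, 1.0, 0.0],
    box_min=[-1, -1, -1], box_max=[1, 1, 1], probe_pos=[0, 0, 0]))
# [0.5, 1.0, 0.0] -- tilted toward the actual hit point, not straight up
```

As the comment says, this works because the renderer knows the shading point's position; an LED wall emitting one fixed image toward a physical set has no per-surface information to correct with.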

------
vsareto
Can you get a decent job just by knowing Unreal Engine well? Maybe by just
doing small POC projects?

~~~
mattigames
If by "well" you mean including Blueprints and physically based shaders, then
probably yes; although stuff like 3D rigging and modeling, which is done in
separate third-party tools, is a must for a lot of related jobs.

------
vagab0nd
This is so cool. How do they connect the virtual scene with the ground? Do
they have screens on the floor as well?

------
web-cowboy
So we'll be able to play video game adaptations of the locations in the
episodes really easily/soon, right? ;)

~~~
ghostbrainalpha
That has to be a consideration. But I don't know how much it would really
speed up production of a AAA Mandalorian game. Some... maybe a 6 month head
start on a 4-5 year game.

It would definitely help make the game environments higher quality and be a
cost saving to the studio.

~~~
marzell
It could be really cool to have in-game art using the exact same assets from
the actual film. Even whole scene cinematography could be taken from scene
data used in a movie.

I can imagine a consumer experience using high-end VR (say an attraction at
Disneyland)... take the assets and cinematography from a scene in a movie,
digitally recreate everything that WASN'T already digital from the scene, and
then you could relive a scene in VR, with the bonus that you can navigate it
in realtime and see it from different perspectives. This would be especially
adaptable for franchises like Avatar where everything in the film is already
composited in 3D.

On a darker note, you could combine all this with things like the Body
Transfer Illusion, take some kinda psychedelic and star in your own horror
flick where gruesome things are done to your own body in VR. Good times for
the whole family!

------
jedberg
How cool would it be to hook your Xbox up to the wall. Or a TV receiver.

------
ghostbrainalpha
So I guess the next step will just be movies made entirely inside Virtual
Reality environments, with an actor in a mocap suit playing his virtual
avatar, right?

------
fuzzfactor
The car, the man, DeLorean!

Oh wait, never mind . . .

------
lmilcin
I don't care how "groundbreaking" the graphics pipeline is. I watched a couple
of episodes and I had to force myself to keep watching to, I don't know, give
it a chance?

I wonder when The Industry will figure out that the story is more important
than the graphics. You don't buy books for the beautiful cover and
typesetting... at least not most of you, and not most of the time.

~~~
anigbrowl
I enjoyed the story and apparently many other people did too. It's fanservice
for sure, hence all the callbacks to characters and aliens that had background
or very brief appearances in the original movies and left people wanting more.
Cheesy, perhaps, but the entire franchise is pineapple pizza in space.

------
tigershark
Please send me a link to someone who watched the original Star Wars, the later
trilogy, and finally the last “attempt”, and really appreciated it. I even
watched “Rogue One” on the biggest screen available around me, with high
expectations, and I’m feeling really sad because of that.

~~~
UI_at_80x24
I watched the original trilogy in the theaters when they were released.
(Admittedly I was a bit young for the first one.)

I've seen all the follow-ups/add-ons/sequels/rewrites that exist. Rogue One is
the movie I waited 30 years for: another story in the SW universe. It wasn't
perfect, but it was damn good enough.

The Mandalorian gives me hope that 'Grownups' are in charge and can create
something worth looking at.

I am holding out hope that a story and plot will emerge. I really hate "baby
yoda", but if that's what it takes to move a real story along I am willing to
tolerate it.

#1: It looks incredible. It must win the Emmy for best cinematography!

#2: It feels real. It feels right.

I'm sorry you didn't like Rogue One.

