
Neural supersampling for real-time rendering - tosh
https://research.fb.com/blog/2020/07/introducing-neural-supersampling-for-real-time-rendering/
======
emcq
Exciting to see new graphics research! This technique reminds me of old work
that used motion vectors to upsample not in pixel space but in time,
performing cheap interpolation. That technique got close to shipping with
Force Unleashed II back in 2010:
[https://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article](https://www.eurogamer.net/articles/digitalfoundry-force-unleashed-60fps-tech-article)

[https://dl.acm.org/doi/10.1145/1837026.1837047](https://dl.acm.org/doi/10.1145/1837026.1837047)
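The core idea of upsampling in time is that the renderer already knows per-pixel motion vectors, so an in-between frame can be approximated by warping the previous frame along those vectors instead of rendering from scratch. A minimal sketch of that reprojection step (function name and nearest-neighbour splatting are my own simplifications, not the paper's actual method, which also handles disocclusions and shading):

```python
import numpy as np

def reproject(prev_frame, motion, t=0.5):
    """Warp prev_frame forward by t * motion to approximate an in-between frame.

    prev_frame: (H, W, 3) float color buffer
    motion:     (H, W, 2) per-pixel motion vectors in pixels (dy, dx),
                from the previous frame toward the next one; a renderer
                can output these almost for free.
    t:          fraction of the frame interval to advance (0.5 = midpoint).
    """
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Advance each source pixel along its motion vector (nearest-neighbour splat).
    ty = np.clip(np.round(ys + t * motion[..., 0]).astype(int), 0, h - 1)
    tx = np.clip(np.round(xs + t * motion[..., 1]).astype(int), 0, w - 1)
    out = np.zeros_like(prev_frame)
    # Holes (disocclusions) stay black here; real systems fill them in.
    out[ty, tx] = prev_frame
    return out
```

This is why the interpolation is "cheap": it is one gather/scatter pass over the framebuffer rather than a full scene render, and the artifacts mentioned below (disocclusion holes, ghosting) are exactly where the warp has no valid source pixel.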

~~~
corysama
Slides with speaker notes:
[http://and.intercon.ru/rtfrucvg_html_slides/](http://and.intercon.ru/rtfrucvg_html_slides/)

Dmitri got it to work, but it was a bit too late. There were artifacts that
could have been worked around in the content, but the idea came up after the
content was far too solidified.

------
_Microft
Discussed here:
[https://news.ycombinator.com/item?id=23714977](https://news.ycombinator.com/item?id=23714977)

