

Antialiasing: To Splat or Not - mnem
http://www.reedbeta.com/blog/2014/11/15/antialiasing-to-splat-or-not/

======
eridius
For some reason I find the test image that was used to be quite fascinating on
its own. Were I given that image in isolation and told "write a program to
generate this" I wouldn't have any idea where to start. After consulting the
source code I now realize how it was created, and if anything that makes it
even neater, that such a simple approach generates such an interesting-looking
image.

~~~
tacos
Yeah; that's the most interesting part of the article. Everything else is
pretty much a dude screwing around with getpixel/setpixel without
understanding basic signal processing. Looks like he made the classic
log/linear error too.

~~~
vardump
> Everything else is pretty much a dude screwing around with getpixel/setpixel

That's not very constructive. Can you point out where he did that? For reference,
source code is here:
[https://gist.github.com/Reedbeta/893b63390160e33ddb3c](https://gist.github.com/Reedbeta/893b63390160e33ddb3c).

> without understanding basic signal processing.

I got the impression he approached it from a visual-pleasantness point of view,
which is perfectly valid when generating images _for people to look at_. In
that business, if it's fast to compute and looks visually good to human eyes,
it is _perfectly acceptable_ to do something slightly "wrong" from a signal
processing point of view. At least until we have infinite computing resources.

I didn't read the source code, but judging by the article and images, he did
appear to understand signal processing and the sampling theorem. He appeared
to be looking for a better sampler for a scan-line renderer (think Pixar
RenderMan) or a ray tracer (think POV-Ray).

My take would be a per-pixel adaptive sample count, driven by the standard
deviation within some sample radius larger than a pixel. Oversimplified: the
higher the deviation, the more samples should be taken, until each additional
sample's contribution falls below some adjustable threshold. In a real ray
tracer you'd probably want to consider other variables as well, such as the
computational cost per sample. Ultimately the problem in visual renderers is
how to get the best visual quality out of the computing resources available.
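Roughly, in Python (a toy sketch of the idea only; the function names, sample
counts, and threshold are made up for illustration, not taken from the
article):

```python
import random
import statistics

def adaptive_sample(shade, x, y, min_samples=4, max_samples=64, threshold=0.005):
    """Estimate a pixel's value by adding jittered samples until the
    standard error of the mean drops below a threshold (or we hit the cap).

    `shade(x, y)` is a hypothetical per-sample shading function returning a
    scalar intensity. min_samples must be >= 2 so stdev is defined.
    """
    values = [shade(x + random.random(), y + random.random())
              for _ in range(min_samples)]
    while len(values) < max_samples:
        # Standard error shrinks as 1/sqrt(n); stop once it's small enough.
        stderr = statistics.stdev(values) / len(values) ** 0.5
        if stderr < threshold:
            break
        values.append(shade(x + random.random(), y + random.random()))
    return sum(values) / len(values)
```

A flat region stops at `min_samples` immediately, while a noisy or high-contrast
region keeps sampling up to `max_samples` — which is the whole point of making
the count adaptive.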

> Looks like he made the classic log/linear error too.

I can't see any telltale sign of linear processing of log-space data in the
images themselves; they all look correct. Retina / high-DPI display? Make
sure your web browser is not resampling the images linearly in log space! Or
worse, your monitor or graphics adapter, in case you're running a non-native
resolution.
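For reference, this is what getting it right looks like when averaging pixels:
decode to linear light, average there, then re-encode (a minimal sketch using
the standard sRGB transfer functions; it isn't code from the article):

```python
def srgb_to_linear(c):
    """sRGB-encoded value in [0, 1] -> linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Linear light in [0, 1] -> sRGB-encoded value."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def average_pixels(encoded):
    """Gamma-correct average: decode, average in linear light, re-encode."""
    linear = [srgb_to_linear(c) for c in encoded]
    return linear_to_srgb(sum(linear) / len(linear))
```

Averaging pure black and pure white this way gives roughly 0.73 in sRGB, not
the 0.5 you'd get by naively averaging the encoded values — that visible
darkening of edges is the usual symptom of the "classic error".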

~~~
GFK_of_xmaspast
In general, things are not "wrong" for reasons of ideology; they are "wrong"
because they are "suboptimal" or "don't work."

------
Ono-Sendai
This is quite an interesting question. If you splat, you're effectively
sharing some information between neighboring pixels, which is efficient.
However, you do introduce some variance at each pixel, since you're not
perfectly importance-sampling the filter function. So it's a trade-off.
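A toy 1D sketch of the two strategies (my own illustration, not code from the
article; the tent filter and sample counts are arbitrary):

```python
import random

def tent(d, radius=1.0):
    """Tent (triangle) filter weight for a sample at distance d."""
    return max(0.0, 1.0 - abs(d) / radius)

def render_gather(f, n_pixels, spp):
    """Gather: each pixel draws its own box-filtered samples; none are shared."""
    img = []
    for p in range(n_pixels):
        total = sum(f(p + random.random()) for _ in range(spp))
        img.append(total / spp)
    return img

def render_splat(f, n_pixels, spp):
    """Splat: each sample contributes to every pixel whose tent-filter
    support covers it, weighted by the filter; weights are renormalized."""
    acc = [0.0] * n_pixels
    wsum = [0.0] * n_pixels
    for _ in range(n_pixels * spp):
        x = random.random() * n_pixels
        v = f(x)
        # A radius-1 tent around pixel center p + 0.5 can reach this sample
        # only for p in {int(x)-1, int(x), int(x)+1}.
        for p in range(max(0, int(x) - 1), min(n_pixels, int(x) + 2)):
            w = tent(x - (p + 0.5))
            acc[p] += w * v
            wsum[p] += w
    return [a / w if w > 0 else 0.0 for a, w in zip(acc, wsum)]
```

For a constant input both agree exactly. For a varying input, splatting reuses
each sample across neighboring pixels (cheaper per pixel), but the per-pixel
weights no longer come from importance sampling the filter — which is exactly
the variance trade-off described above.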

------
robert_tweed
Server is straining. Here's the Coral Cache mirror:

[http://www.reedbeta.com.nyud.net/blog/2014/11/15/antialiasin...](http://www.reedbeta.com.nyud.net/blog/2014/11/15/antialiasing-to-splat-or-not/)

