

The technology behind preview photos - anand-s
https://code.facebook.com/posts/991252547593574/the-technology-behind-preview-photos/

======
dharma1
Nice. If you are doing heavy blurring, it's a good idea to scale down before
applying gaussian blur anyway for a performance boost, as the blur processing
time grows with blur radius.

I.e. downsample 4x-8x, blur at a smaller px radius, and resize back to the
original size - much faster than blurring at the original resolution, and it
looks almost identical.
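The trick can be sketched in plain numpy (an illustrative toy using nearest-neighbour resampling; a real pipeline would use a proper resize filter, e.g. Pillow's):

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    # Normalized 1-D gaussian kernel of length 2*radius + 1.
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def blur_1d(img, kernel, axis):
    # Separable blur: edge-pad along `axis`, then convolve each line.
    pad = len(kernel) // 2
    padded = np.pad(img, [(pad, pad) if a == axis else (0, 0)
                          for a in range(img.ndim)], mode="edge")
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="valid"), axis, padded)

def fast_blur(img, sigma=16.0, factor=4):
    # Downsample by taking every `factor`-th pixel, blur with a
    # proportionally smaller sigma, then upsample back by repetition.
    small = img[::factor, ::factor]
    k = gaussian_kernel(sigma / factor, radius=int(3 * sigma / factor))
    blurred = blur_1d(blur_1d(small, k, 0), k, 1)
    return np.repeat(np.repeat(blurred, factor, axis=0), factor, axis=1)
```

The kernel radius shrinks by the same factor as the image, so the convolution cost drops roughly by factor³ for a 2-D separable blur.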

------
kylehotchkiss
As a photographer, I wish you guys would spend this brainpower on finding
better ways to compress images without introducing JPEG-style compression
artifacts.

~~~
vanni
Fortunately someone is doing what you wish. Unfortunately patents exist.

Better Portable Graphics (based on HEVC):

[http://bellard.org/bpg/](http://bellard.org/bpg/)

[https://news.ycombinator.com/item?id=8704629](https://news.ycombinator.com/item?id=8704629)

JPEG vs BPG:

[http://xooyoozoo.github.io/yolo-octo-bugfixes/#ballet-exerci...](http://xooyoozoo.github.io/yolo-octo-bugfixes/#ballet-exercise&jpg=t&bpg=t)

[https://news.ycombinator.com/item?id=8755521](https://news.ycombinator.com/item?id=8755521)

Lossy Compressed Image Formats Study:
[http://people.mozilla.org/~josh/lossy_compressed_image_study...](http://people.mozilla.org/~josh/lossy_compressed_image_study_july_2014/)

------
amit_m
Why go through the blurring, rescaling and JPEG compression? Simply take the
first few DCT coefficients of the image, then quantize and compress them. In
fact, one can make this scheme output the gaussian-blurred image by adding
gaussian-weighted coefficient decay.
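For what it's worth, the idea can be sketched in a few lines of numpy (an illustrative toy, not what the article describes; the quantize/entropy-code step is omitted):

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix (C @ C.T == I).
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    c = np.cos(np.pi * (m + 0.5) * k / n) * np.sqrt(2.0 / n)
    c[0] /= np.sqrt(2.0)
    return c

def truncated_dct(img, keep=6, sigma=None):
    h, w = img.shape
    ch, cw = dct_matrix(h), dct_matrix(w)
    coeffs = ch @ img @ cw.T              # full 2-D DCT
    if sigma is None:                     # hard truncation: keep low block
        mask = np.zeros_like(coeffs)
        mask[:keep, :keep] = 1.0
    else:                                 # gaussian-weighted coefficient decay
        ky, kx = np.ogrid[:h, :w]
        mask = np.exp(-(ky ** 2 + kx ** 2) / (2 * sigma ** 2))
    return ch.T @ (coeffs * mask) @ cw    # inverse DCT of kept coefficients
```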

~~~
ot
What you are suggesting is absolutely not equivalent to gaussian blurring. Do
low-quality JPEGs look nicely blurred to you?

Truncating the coefficients is a low-pass filter, which introduces nasty
artifacts such as ringing. The transform of a gaussian is another gaussian, so
you could in principle work in the frequency domain but only if the transform
is applied to the whole image, while JPEG works in 8x8 blocks (as sp332
pointed out), and in any case there would be no computational benefit over
applying the gaussian in the spatial domain.
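The ringing is easy to see numerically: hard-truncating the DCT coefficients of a step edge overshoots the original range, while gaussian coefficient weighting rolls off smoothly (illustrative 1-D numpy sketch):

```python
import numpy as np

n = 64
x = np.zeros(n)
x[n // 2:] = 1.0                       # a step edge in [0, 1]
k = np.arange(n)[:, None]
m = np.arange(n)[None, :]
C = np.cos(np.pi * (m + 0.5) * k / n) * np.sqrt(2.0 / n)
C[0] /= np.sqrt(2.0)                   # orthonormal DCT-II basis
X = C @ x                              # forward transform

# Brick-wall truncation: keep the first 8 coefficients, zero the rest.
hard = C.T @ np.where(np.arange(n) < 8, X, 0.0)
# Gaussian coefficient decay instead of truncation.
soft = C.T @ (X * np.exp(-np.arange(n) ** 2 / (2 * 4.0 ** 2)))

# `hard` oscillates past 0 and 1 near the edge (Gibbs ringing);
# `soft` stays within the original 0..1 range.
```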

------
maxst
42x42 seems odd for JPEG, why not something divisible by 8 or 16?

~~~
dgreensp
Oddly enough, according to the article, they specced the exact amount of blur
first, and then figured out the pixel resolution required to look exactly like
the original picture blurred by that amount, and then the JPEG quality factor
they could get away with.

If that's really what they did, it's weird for a few reasons, in an amusing
but not ultimately that consequential way. First, there's a complex interplay
between resolution and quality (Q) when it comes to how "good" a JPEG looks.
For certain combinations of resolution, Q, and size-on-the-screen, turning the
resolution up and the Q down actually makes the image look better, but if you
go farther, it looks worse -- and that's when you _aren't_ blurring the crap
out of the entire thing anyway. Second of all, why does it matter if the final
image has good "fidelity" to a blurry version of the photo that wasn't
downsampled? The user doesn't know what the blurry image is "supposed" to look
like, and yet the article implies they maxed out on information needed to
reconstruct the true blurry image exactly. Oh, and third, there's presumably a
range of acceptable blur radiuses from a UX perspective, while the 200 bytes
is a hard limit, so choosing an exact blur radius and then feeding it forward
through the rest of the design process doesn't completely make sense.

------
tegansnyder
Where does the request for the pre-defined "fixed" JPEG header come into
play? The client appends this to the JPEG body as stated in the post, but I
seem to be missing the piece on how the client receives the fixed header in
the first place.

~~~
jasonlotito
It's stored on the client side.

~~~
tegansnyder
How?

~~~
jon-wood
As far as I could tell it's compiled into the software, with a few
placeholders for dimensions. On receiving a thumbnail, the metadata is dropped
into place and the header prepended to make it a valid JPEG.
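A minimal sketch of that reassembly, assuming a baseline JPEG whose SOF0 segment carries the height/width (the header bytes, offsets, and function name below are made up for illustration; a real client bakes in the exact header its server's encoder emits):

```python
import struct

# Stand-in for the canonical header compiled into the app.
FIXED_HEADER = b"\xff\xd8" + b"\x00" * 4 + b"\xff\xc0\x00\x11\x08" + b"\x00" * 20

def assemble_preview(header: bytes, width: int, height: int,
                     payload: bytes) -> bytes:
    """Patch the SOF0 dimensions in a fixed JPEG header and append the body."""
    buf = bytearray(header)
    sof = buf.find(b"\xff\xc0")          # locate the SOF0 marker
    if sof != -1:
        # In baseline JPEG the 2-byte height and width follow the marker,
        # the 2-byte segment length, and the precision byte: marker + 5.
        struct.pack_into(">HH", buf, sof + 5, height, width)
    return bytes(buf) + payload + b"\xff\xd9"   # body + EOI marker
```

So the server only ever sends the ~200-byte payload; the dimensions and scan data are dropped into the client's one shared header.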

------
ljk
Why does it say JPEG compression isn't enough, but then later "JPEG to the
rescue"? Am I understanding correctly that they went after a different part of
the JPEG technology?

------
danjc
Wouldn't it be mainly mobile phones that only get a 2G connection in India?
The cover image served to a mobile phone would be much smaller than 100K.

------
niutech
Why not use WebP for this?

------
mahyarm
That blur step is the clever part.

------
miyuru
Saw a similar implementation on Wikimedia.

