
It does. Imagine a one-pixel-wide vertical white line on a black background. As you downscale it, by whatever reasonable means, the line becomes a darker and darker grey.

EDIT: This is in the context of the image shown in the article, which is a gradient image consisting of light lines on a black background. Of course, if you have thin black lines on a white background, they will become lighter upon downscaling. In practice, on a real image, some parts will become darker and others lighter, depending on their particular textures.
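
A minimal sketch of the line-darkening effect (assuming a simple 2x2 box-filter downscale in linear values; real resizers use fancier kernels, but the per-pixel effect is the same):

    import numpy as np

    # 8x8 black image with a one-pixel-wide vertical white line.
    img = np.zeros((8, 8))
    img[:, 3] = 1.0

    # Downscale by 2: average each non-overlapping 2x2 block.
    small = img.reshape(4, 2, 4, 2).mean(axis=(1, 3))

    print(img.max())                  # 1.0   -- the line is pure white
    print(small.max())                # 0.5   -- after one halving it is mid grey
    print(img.mean(), small.mean())   # 0.125, 0.125 -- total light per pixel is unchanged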



If you have an image consisting of a thin white line on a black background, then indeed as you scale it down the pixels covering that white line will become darker, because each pixel now corresponds to a small amount of white in a mostly-black square. (I am making some assumptions about what "scaling" means here, but the conclusion doesn't change much under other interpretations -- e.g., treating pixels as point samples of a band-limited function.)

But each of those pixels also covers a larger area of the original image, so the ratio (total amount of light in the image) : (number of pixels in the image) should not change.

The author of the OP is complaining that when you take an image and rescale it, its overall average brightness changes.
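
A minimal sketch of how the overall brightness can nonetheless shift in practice. This assumes the scaler averages gamma-encoded sRGB values directly (an assumption about the OP's scenario on my part, not a description of the author's exact pipeline):

    import numpy as np

    # Bright one-pixel lines on black, stored as gamma-encoded values (gamma ~2.2).
    coded = np.zeros((8, 8))
    coded[:, ::2] = 1.0                              # every other column is white

    def downscale2(x):
        """2x box-filter downscale: average non-overlapping 2x2 blocks."""
        return x.reshape(x.shape[0] // 2, 2, x.shape[1] // 2, 2).mean(axis=(1, 3))

    naive = downscale2(coded)                        # average the coded values
    linear = downscale2(coded ** 2.2) ** (1 / 2.2)   # average in linear light instead

    to_light = lambda c: (c ** 2.2).mean()           # approximate physical light on screen
    print(to_light(coded))    # ~0.5
    print(to_light(naive))    # ~0.22 -- noticeably darker overall
    print(to_light(linear))   # ~0.5  -- total light preserved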



