1. Scaling down in a linear colorspace is essential. One example is , where  is sRGB and  is linear. There are some canary images too .
2. Plain bicubic filtering is no longer good enough. EWA (Elliptical Weighted Averaging) filtering by Nicolas Robidoux produces much better results .
3. Using the default JPEG quantization tables at quality 75 is no longer good enough. That's what people refer to as horrible compression. MozJPEG  is a much better alternative. With edge detection and quality assessment, it's even better.
4. You have to realize that 8-bit wide-gamut photographs will show noticeable banding on sRGB devices. Here's my attempt  to reveal the issue using sRGB as the wider-gamut colorspace.
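Point 1 can be sketched in a few lines of plain Python, using the standard sRGB transfer curve (a real resizer like Pillow or ImageMagick applies the same idea across whole images):

```python
# Minimal sketch of gamma-correct downscaling. Averaging two sRGB-encoded
# pixels directly darkens the result; converting to linear light first
# gives the physically correct average.

def srgb_to_linear(c):
    """Decode one sRGB channel value (0..1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Encode one linear-light channel value (0..1) back to sRGB."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

naive = (black + white) / 2  # average the gamma-encoded values directly
correct = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(round(naive, 3))    # 0.5  -> noticeably too dark on screen
print(round(correct, 3))  # 0.735
```

A 50/50 black-and-white checkerboard scaled down naively comes out mid-gray (0.5) instead of the perceptually correct ~0.735; that is exactly what the linear-vs-sRGB canary images test.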
This is pretty astonishing to me. I always thought applying smartly designed color filters to pictures was basically Instagram's entire value proposition. In particular, I would have thought that designing filters to emulate the look and feel of various old films would have taken some fairly involved imaging knowledge. How did Instagram get so far without color experts?
Working in a color space other than the standard one requires a bit of familiarity and finesse that modifying 8-bit JPEGs for consumption on the internet did not.
Many photographers and printers are familiar with this dilemma in a variety of circumstances: cameras capture images in a different color space and at a higher bit depth than any display technology, or the human eye, can fully resolve.
As an analogy, think of the value of music theory (e.g. https://en.wikipedia.org/wiki/Scale_(music)#Harmonic_content) for composition.
(I'm a maintainer of Pillow.)
Vertical Image: 1080px in width by 1350px in height
Horizontal Image: 1080px in width by 566px in height
And you probably overestimate how large Instagram, the app, is.
Edited to add: some others are doing something even less fancy, which is to ignore the color profile entirely, assume sRGB, and display the image incorrectly: taking, for example, what would have been the maximum red point in Display P3 and making it the maximum red point in sRGB.
(Not an android user, I just want to figure out how a company of your size prioritises between bugs and features.)
And regarding android compression issues, although resources are always finite, I imagine in this case the android team is fairly separate, so they may very well be working on that compression issue while iOS is pushing forward into new terrain.
This is likely it, right here. So many people forget that larger companies have different teams working on different things. I bet a lot of their "iOS people" that are working on this project have no clue how the Android app works, and Instagram likely has a separate team working on the compression issues.
Not of Instagram users. Not of app users.
I've seen a number of SV companies release ugly and buggy Android apps, then use their shrinking Android user base as proof that Android users don't like their services.
To be honest, things could be worse. You could be a Tinder user on Windows Phone...
By the way, can anyone who can display it (all my nice monitors aren't with me at the moment) see it on this imgur link? http://i.imgur.com/qCna54M.png
imgur shows straight red on my 2015 Macbook Pro, but my iPhone 7 shows the logo.
The dropbox thumbnail link (with ?dl=0) shows the logo. Opening the link with ?dl=1 in Preview shows straight red. I think Dropbox is doing some weird thumbnail processing.
I wonder if it has to do with imgur compressing the image, or maybe approximating/ignoring the colorspace as was mentioned above.
This comment explains it (at least for Webkit's test image, which is probably similar to Instagram's).
No, it is a bad test.
It is an image with an ICC tag indicating that it uses a color space larger than sRGB. The image data draws the logo in colors that fall outside the sRGB gamut, but it still uses 8 or 16 bits per channel to store that data.
Android doesn't have color management. Android basically assumes all images are sRGB, so you see the logo.
iOS does have color management. iOS sees the ICC profile and interprets the image data so that if you do not have a display that could show you the different reds in the image, it doesn't display them.
So we have everyone in this thread on Android thinking they have a wide color display. Most of their displays aren't even 100% sRGB. My Nexus 4 shows the logo. It is very much not a wide-color display.
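The canary logic is easy to check numerically. The sketch below uses the published linear-RGB-to-XYZ matrices for Display P3 and sRGB (D65, values rounded to four digits) to show where P3's maximum red lands when correctly converted to sRGB:

```python
# Rough numeric sketch of why the canary image works. The matrices are the
# standard linear-RGB <-> CIE XYZ (D65) conversions; values are rounded.

def matvec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

P3_TO_XYZ = [[0.4866, 0.2657, 0.1982],
             [0.2290, 0.6917, 0.0793],
             [0.0000, 0.0451, 1.0439]]
XYZ_TO_SRGB = [[ 3.2406, -1.5372, -0.4986],
               [-0.9689,  1.8758,  0.0415],
               [ 0.0557, -0.2040,  1.0570]]

p3_max_red = [1.0, 0.0, 0.0]  # the "logo" color, in linear Display P3
srgb_equiv = matvec(XYZ_TO_SRGB, matvec(P3_TO_XYZ, p3_max_red))
print([round(c, 3) for c in srgb_equiv])
# -> roughly [1.225, -0.042, -0.020]: outside the 0..1 sRGB range, so a
# color-managed sRGB display clips it to plain red, the same red as the
# background, and the logo disappears. A viewer that ignores the profile
# shows the raw (1, 0, 0) tuple, which differs from the background's raw
# tuple, so the logo stays visible.
```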
But before that, could they not convert images to horribly encoded JPEGs? I get it, bandwidth and costs, but it's an image service that still drowns in its own... when it gets an image with strong reds and blues.
If you have some sample images where the current image pipeline is going wrong, let me know and we can look into improving it.
They are typically pushed by people who use the latest chrome, so they have an excuse not to care about other browsers.
Their performance and usability are almost invariably terrible.
(Indeed, I can see a faint Instagram logo on my iPhone 7, but not on my older rMBP.)
Here's the link from the article with the download flag removed:
If I right click and save the Dropbox preview, I get a 14 KB image, but the downloaded image is 29 KB. As far as I can tell they're both PNGs with the same bit depth.
Changing Chrome/OSX to report its User-Agent as iOS9 MobileSafari does not help get a different image from IMGUR.
You can see the logo because almost every computer system on the planet handles color spaces incorrectly. Apple's devices are actually better than most, though third party drivers such as those for printers can sabotage their color handling.
The canary image will appear as plain red without a logo on a computer with an sRGB display if that computer correctly handles color spaces throughout the whole imaging pipeline. That's a lot of ifs.
If your system ignores color spaces, you will see the logo because the Display P3 (DP3) color space gets compressed into sRGB. When you look at real world DP3 images on this system, you will see the reds as being more muted. The same thing happens if you use an Adobe RGB camera (there are lots of these) and display it in sRGB, except with the green channel, because AdobeRGB has a wider green range.
No matter which color space you use, an image will contain RGB tuples. The color space is additional meta-info which says how to interpret those tuples. Lots of software will ignore the metadata and simply assume the RGB tuples are used in the same way as it expects.
I guess you could think of it somewhat like the difference between clipping an image larger than the monitor's resolution or scaling it to fit. In the former case you preserve the accuracy of individual pixels within the area that fits, but discard the information outside; and in the latter, you lose accuracy of individual pixels but preserve being able to see (an approximation of) the whole image. Applying this to colour spaces, "clipping" DP3 to sRGB preserves the "absolute" colour information but discards the "relative" differences (hence not being able to see the logo), while scaling discards the absolute colour (I think this is what you mean by "reds as being more muted") but preserves the differences (being able to see the logo).
Since a user looking at a monitor derives most of his/her information from the contrast between pixels' colours, I'd say discarding that contrast is the real "incorrect" choice most of the time. DP3 images scaled onto an sRGB monitor certainly won't look as good as on a DP3 one, but at least the user will still be able to resolve the fine detail that relies on differences in pixel values. Besides, getting absolute colour accuracy on a monitor has always been nearly impossible in a non-specialised context, since it depends so much on things like external lighting.
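Here's a toy numeric version of that trade-off. The "scaling" here is a made-up desaturate-toward-gray mapping for illustration, not what a real CMS gamut-mapping algorithm does:

```python
# Clip vs "scale" for out-of-gamut colors: clipping keeps absolute values
# but destroys differences; desaturating keeps differences but dulls colors.

def clip(rgb):
    """Hard-clip each channel into [0, 1]."""
    return [min(1.0, max(0.0, c)) for c in rgb]

def desaturate_into_gamut(rgb):
    """Blend toward the gray average just enough to fit every channel
    into [0, 1] (a toy gamut-mapping strategy)."""
    gray = sum(rgb) / 3
    t = 1.0  # fraction of the original saturation we can keep
    for c in rgb:
        if c > 1.0:
            t = min(t, (1.0 - gray) / (c - gray))
        elif c < 0.0:
            t = min(t, (0.0 - gray) / (c - gray))
    return [gray + (c - gray) * t for c in rgb]

# Two distinct out-of-gamut "reds" in linear sRGB (made-up values, in the
# spirit of the canary's logo vs background):
a = [1.22, -0.04, -0.02]
b = [1.05, -0.01, -0.01]

print(clip(a), clip(b))  # both collapse to [1.0, 0.0, 0.0]: detail lost
print(desaturate_into_gamut(a))
print(desaturate_into_gamut(b))  # duller, but still distinguishable reds
```

With clipping, both tuples land on the exact same sRGB red (the logo vanishes); with the desaturating map, the two stay apart, at the cost of less saturated color overall.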
Seems sort of silly to me as most designers will be on sRGB displays and most people will be used to how images look in the sRGB space, but I guess it's one more way for Apple to sell more new Apple stuff by pretending these extremes in color are more important than precision in other parts of the spectrum.
I can definitely understand going to 10-bit color, this, not so much.
By improving the color gamut you can actually see a difference on the display: areas where the image contained color differences that were invisible on the old display now show an actual difference. It's slight, but it's there.
Seems like a good move to me. I imagine moving to 10 or 12 bit color will be the next step.
To make the change: System Preferences -> Displays -> Color
How do I get Chrome to do proper rendering?
That being said, the logo is visible in Chrome for (using a Dell display).
Compared Safari, Firefox and Chrome on a 2016 MacBook Pro w/ Touch Bar running macOS 10.12.
Another thing worth mentioning is that lots of professional photo & graphics people have been using the Adobe RGB color space, which is "wider" than sRGB, for almost 20 years.
Not trolling; I care an incredible amount about color spaces, and had expected support for something like the UHDTV Rec.2020 color space with D65 (cool/blue white), not an obscure Adobe standard with D50 (warm/yellow white). It's wider than sRGB though, so if this is what you're talking about: awesome, please update your article so people can find out more about this color space and the rest of the world can catch up!
> The color space that Apple chose for its devices going forward is Display P3.