Bringing Wide Color to Instagram (engineering.instagram.com)
375 points by natthub on Jan 9, 2017 | 117 comments



Here if anyone has questions on the implementation or such.


It's great you listen. So I'll try.

1. Scaling down in a linear colorspace is essential. One example is [1], where [2] is sRGB and [3] is linear. There are some canary images too [4]. (A minimal code sketch follows the references below.)

2. Plain bicubic filtering is not good anymore. EWA (Elliptical Weighted Averaging) filtering by Nicolas Robidoux produces much better results [5].

3. Using the default JPEG quantization tables at quality 75 is not good anymore. That's what people are referring to as horrible compression. MozJPEG [6] is a much better alternative, and with edge detection and quality assessment it's even better.

4. You have to realize that 8-bit wide-gamut photographs will show noticeable banding on sRGB devices. Here's my attempt [7] to reveal the issue using sRGB as a wider gamut colorspace.

[1] https://unsplash.com/photos/UyUvM0xcqMA

[2] https://cloud.githubusercontent.com/assets/107935/13997633/a...

[3] https://cloud.githubusercontent.com/assets/107935/13997660/b...

[4] https://cloud.githubusercontent.com/assets/72159/11488537/3d...

[5] http://www.imagemagick.org/Usage/filter/nicolas/

[6] https://github.com/mozilla/mozjpeg

[7] https://twitter.com/vmdanilov/status/745321798309412865
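

For point 1, here's a minimal sketch of gamma-correct downscaling with Pillow, assuming an 8-bit sRGB input and using the common 2.2-gamma approximation rather than the exact piecewise sRGB transfer function (resize_gamma_correct is a hypothetical helper, not Pillow API):

    import numpy as np
    from PIL import Image

    def resize_gamma_correct(img, size, gamma=2.2):
        # Decode 8-bit sRGB to (approximately) linear light.
        rgb = np.asarray(img.convert("RGB"), dtype=np.float32) / 255.0
        linear = rgb ** gamma
        # Resize each channel as a 32-bit float ("F" mode) image so Pillow
        # doesn't quantize back to 8 bits mid-pipeline.
        channels = [
            np.asarray(Image.fromarray(linear[..., c], mode="F")
                       .resize(size, Image.LANCZOS))
            for c in range(3)
        ]
        # Re-encode to sRGB and quantize exactly once, at the end.
        out = np.clip(np.stack(channels, axis=-1), 0.0, 1.0) ** (1.0 / gamma)
        return Image.fromarray((out * 255.0 + 0.5).astype(np.uint8))

For point 3, the result could then be handed to MozJPEG's cjpeg for encoding instead of Pillow's default JPEG tables.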


I thought there was something familiar about that name Robidoux … I see he's the same one whose crowdfunded work produced GIMP's new scaling methods: http://libregraphicsworld.org/blog/entry/advanced-samplers-f...


"When we started this project, none of us at IG were deep experts in color."

This is pretty astonishing to me. I always thought applying smartly designed color filters to pictures was basically Instagram's entire value proposition. In particular, I would have thought that designing filters to emulate the look and feel of various old films would have taken some fairly involved imaging knowledge. How did Instagram get so far without color experts?


In the "How I Built This" podcast for Instagram, Kevin Systrom specifically says the filters were created to take the relatively low-quality photos early smartphone cameras were capable of and making them look like photographer-quality photos. Filters were about taking average photos and making them "pop" but not necessarily by virtue of having deep domain knowledge of color.


I was never under this impression. I always assumed the filters were just created by a designer playing around with various effects until they looked nice.


Because it isn't complicated or novel to make compressed 8-bit jpegs have color filters. There are tools for the job and they've been around for a long time.

Working in a different color space than standard requires a little bit of familiarity and finesse that modifying 8-bit jpegs for consumption on the internet did not require.

Many photographers and printers are familiar with this dilemma in a variety of circumstances, where cameras capture images in a color space and at a bit depth beyond what any display technology, or the human eye, can fully reproduce.


I'm sure the comment you're replying to wasn't thinking of the algorithm that applies a filter to a JPEG, but of the process by which that filter is created in the first place. The assumption being that there's some sort of theory of colour that allows you to systematically improve the aesthetic qualities of images.

As an analogy, think of the value of music theory (e.g. https://en.wikipedia.org/wiki/Scale_(music)#Harmonic_content) for composition.


The creative process isn't novel. Most mobile apps, including Instagram, don't even support layer masking, unlike pre-existing, more robust desktop tools (and some other mobile apps), which severely limits the 'technical interestingness' to begin with.


Instagram's value proposition is that other, mostly young, people use it.


Instagram's value prop was, and is, mobile.


Bit of a jump in topic, but I'm kind of curious: I don't use Instagram myself, but I'm sure it resizes images for online viewing, saving bandwidth and such. Does it do so with correct gamma [0]? That's something many image-related applications get wrong.

[0] http://blog.johnnovak.net/2016/09/21/what-every-coder-should...


If it's using Pillow for the resizing, probably not. I've not looked at Pillow specifically, but PIL certainly wasn't very good.


Pillow doesn't do anything with gamma by default, nor does it require that color management be compiled in.

(I'm a maintainer of Pillow.)


That would be a bit sad, given that they went through all the trouble of getting wide colour.


They don't store full-resolution images:

Square image: 1080 x 1080 px

Vertical image: 1080 x 1350 px

Horizontal image: 1080 x 566 px


Not necessarily on topic - but given your code samples are written in Objective-C, how much of your codebase is still in Objective-C and what's your opinion on porting it over to Swift?


Good q. All of it is still Objective-C and C++; there are a few blockers to starting a move to Swift, including the relatively large amount of custom build and development tooling that we and FB have in place.


Hi Mike, since Instagram is listed in the React Native Showcase, could you tell us where you're using it? Thanks in advance.

https://facebook.github.io/react-native/showcase.html


It's getting used more and more in our app--a few recent examples are the "Promote Post" UI (if you're a business account and want to promote a post from inside Instagram), the Saved Posts feature, and the comment moderation tools we now provide around comment filtering.


I'd like to know this, too.


The Swift compiler is probably too slow for projects as large as Instagram.


Peculiar that this is being downvoted--compiler speed, reliability, and support for incremental builds are all issues with large Swift projects. I've even seen people go as far as splitting their project into frameworks in order to bring build times down from as long as 10 minutes and restore incremental compilation.


The Swift compiler is used in many large projects.

And you probably overestimate how large Instagram, the app, is.


I know there's not much love for the API these days, but is there / will there be a way to access wide color images from the API? iPads and MacBook Pros also have wide color screens nowadays, so it would make sense to display them for specific use cases in third-party clients.


Great article. Question not directly related to your writeup: in your casual usage of a phone/photo app with the new, wider color space, do you notice a difference in your experience of using the app/looking at images? Or, in other words, does the wider color space feel at all different?


Is Wide Color preserved when processing video?


Photos only. Apple's APIs only capture in Wide Color when shooting in photo mode, and their documentation only recommends using wide color/Display P3 for images.


Our iOS dev hunkered down today and may have come up with a decent solution to your EAGLView issue. You can find the write-up on Medium: https://medium.com/imgly/bringing-wide-color-to-photoeditor-...


Your link somehow got truncated. Here’s the full thing: https://medium.com/imgly/bringing-wide-color-to-photoeditor-...


Is this why iPhone photos of red roses, or other vibrant flowers, look poor? I love the idea of a photo room. Is it hand-painted?


Yes, they look poor if you view them in non-color-managed software environments (which shouldn't happen on Apple devices). See this sample image: https://code.google.com/p/android/issues/detail?id=225281


Do you test with Adobe RGB (which many prosumer cameras can output) or only P3 (which AFAIK only Apple devices output)?


Just P3, though the wider gamut available in our graphics operations should benefit photos brought in using Adobe RGB too since iOS is fully color managed.


What is the pixel format of your OpenGL surfaces?


Maybe a stupid question, but how do you save an image as a Display P3 JPEG from Photoshop? I want to play with this color standard.


Note that the red test image is only valid on iOS devices: on Android you can see the Instagram logo even though Android doesn't support wide color.


Good to know--I didn't run it through my Pixel. Some devices will do a relative projection from Display P3 to sRGB, which means that it will look "relatively" like it would in Display P3 but projected onto the sRGB color space.

Edited to add: some other devices do something even less fancy, which is to ignore the color profile entirely, assume sRGB, and display the image incorrectly--taking, for example, what would have been the maximum red point in Display P3 and making it the maximum red point in sRGB.
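
The difference is easy to see numerically. A small sketch using the standard Display P3->XYZ and XYZ->sRGB matrices (D65 white point, linear light, values rounded to four places):

    import numpy as np

    P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                          [0.2290, 0.6917, 0.0793],
                          [0.0000, 0.0451, 1.0439]])
    XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                            [-0.9689,  1.8758,  0.0415],
                            [ 0.0557, -0.2040,  1.0570]])

    p3_max_red = np.array([1.0, 0.0, 0.0])           # linear Display P3
    srgb = XYZ_TO_SRGB @ P3_TO_XYZ @ p3_max_red
    print(srgb)                 # ~[1.22, -0.04, -0.02]: outside sRGB's gamut
    print(np.clip(srgb, 0, 1))  # [1, 0, 0]: what a color-managed sRGB display shows
    # A viewer that ignores the profile skips the conversion entirely and just
    # displays the stored (1, 0, 0) tuple as sRGB red -- the "assume sRGB" case.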


Since you brought up your Pixel: what is the point of adding something maybe 1% of your customers can see instead of fixing the horrible, horrible compression that makes uploads from Android (80% worldwide market share) look like crap?

(Not an android user, I just want to figure out how a company of your size prioritises between bugs and features.)


I can't speak for OP, but say this does affect 1% of users today: what percentage does it affect in 6 months, or a year? Not bad to be proactive.

And regarding the Android compression issues: although resources are always finite, I imagine the Android team is fairly separate, so they may very well be working on that compression issue while iOS pushes forward into new terrain.


Instagram should hire some faster Android developers if that is the case, it's been an issue since 2012:

https://www.xda-developers.com/a-look-into-instagrams-compre...


> I imagine in this case the android team is fairly separate

This is likely it, right here. So many people forget that larger companies have different teams working on different things. I bet a lot of their "iOS people" that are working on this project have no clue how the Android app works, and Instagram likely has a separate team working on the compression issues.


"Working on" since 2012?


I don't use IG, so I wasn't aware of that problem or how long it had been around. That said, the general sentiment stands: one of their teams working on one thing doesn't show that they don't have another team working on something unrelated.


> (80% world wide market share)

Not of Instagram users. Not of app users.


Well, given how Instagram has treated its Android users, can you blame them?

I've seen a number of SV companies release ugly, buggy Android apps and then use their shrinking Android user base as proof that Android users don't like their services.

To be honest, things could be worse. You could be a Tinder user on Windows Phone...


Yes, I was a bit confused when I opened the image on my ancient work computer (Windows, old Dell monitor) and could see the logo clearly. Interestingly, Chrome refused to display the image despite it nominally being a PNG, and I had to open it via an external program. Dragging the window to my second monitor (a different model) causes the logo to vanish (though I do feel like I see it for a split second on the second monitor... optical illusion?).


Chrome should be able to display it fine. The dl=1 param on the Dropbox URL means the request is served with a response header indicating it's a download. Change the link to dl=0 and it won't force the download.


link for the lazy: https://www.dropbox.com/s/tgarynpj65ouafd/insta-logo.png?dl=...

By the way, anyone who can display it (all my nice monitors aren't with me at the moment) can you see it on this imgur link? http://i.imgur.com/qCna54M.png


That doesn't work - that gets you Dropbox's resized and recompressed version, which breaks the effect (it likely strips the ICC profile).


There's something weird happening.

imgur shows straight red on my 2015 Macbook Pro, but my iPhone 7 shows the logo.

The dropbox thumbnail link (with ?dl=0) shows the logo. Opening the link with ?dl=1 in Preview shows straight red. I think Dropbox is doing some weird thumbnail processing.


I can see the logo just fine on the imgur link, but I can't see the original in firefox or in GIMP.

I wonder if it has to do with imgur compressing the image, or maybe approximating/ignoring the colorspace as was mentioned above.


I can see the logo in the downloaded PNG on every screen I have, including an old 2008 Acer. Newer screens display it better; older screens need to be tilted.


I can see the logo on the imgur link on my TouchBar MBP


You are probably seeing the color management of monitor 1 in effect on monitor 2 while the window is less than halfway across. Once it goes past halfway, monitor 2's color management takes over fully and the logo disappears.


Why can you see the Instagram logo on Android if it doesn't support wide color?


Can't see the logo on OS X Sierra on my 2014 MacBook Pro Retina 13in.


I don't think that computer has a wide-color display. The 2016 MBP should (at least, the one with the touchbar, not sure about the one without), and the retina iMac supports it as well.


Can see it on my 2016 MBP Touchbar 15" w/Sierra.


I couldn't either on my late-2013 MBP until I changed the display profile to "Display P3". There are other profiles that will show the logo too, like "Adobe RGB (1998)".


If you change your display profile like this you're going to have incorrect color rendering, so don't forget to change it back after you've had your fun looking at the (inaccurately-rendered) image.


I have a 2016 MBP w/ touchbar, and depending on which color display profile I select in System Preferences, sometimes I can see it, and other times I can't.


On Windows, I can see the logo on one of my screens, but not the other.


I can't see the Instagram logo in the red test image on Windows 7


On OS X you can adjust your color display profile settings to make it magically appear and disappear.


In the wrong color, though.


On Windows I can see it in XnView, but not in the built-in image viewers. It's an older computer, so most likely it's XnView converting to sRGB.


Do you know why that is the case? Do Android devices project the Display P3 color space into the sRGB color space?


https://www.reddit.com/r/Android/comments/52ultq/does_you_an...

This comment explains it (at least for WebKit's test image, which is probably similar to Instagram's).

> No, it is a bad test. It is an image with an ICC tag that indicates it uses a color space larger than sRGB. The image data has the logo using color that should be outside the sRGB color space, but it still uses 8 or 16 bits to store that data. Android doesn't have color management. Android basically assumes all images are sRGB, so you see the logo. iOS does have color management. iOS sees the ICC profile and interprets the image data so that if you do not have a display that could show you the different reds in the image, it doesn't display them. So we have everyone in this thread on Android thinking they have a wide color display. Most of their displays aren't even 100% sRGB. My Nexus 4 shows the logo. It is very much not a wide-color display.
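
To make that concrete, here's roughly how such a canary image could be built with Pillow, assuming a Display P3 ICC profile at the hypothetical path DisplayP3.icc, and using (234, 51, 35) as the approximate Display P3 encoding of sRGB max red:

    from PIL import Image, ImageDraw

    img = Image.new("RGB", (256, 256), (255, 0, 0))   # Display P3 max red
    draw = ImageDraw.Draw(img)
    # sRGB max red, re-expressed in Display P3 coordinates.
    draw.rectangle([64, 64, 192, 192], fill=(234, 51, 35))
    with open("DisplayP3.icc", "rb") as f:            # hypothetical profile file
        icc = f.read()
    img.save("canary.png", icc_profile=icc)
    # Color-managed sRGB viewers clip both reds to nearly the same color (no
    # visible box); viewers that ignore the profile show the box, because the
    # raw 8-bit tuples differ.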


I used the same approach as the WebKit image, so the same applies here too (it's also why we only serve Display P3 photos to iOS clients with wide color screens; most Android devices would treat them incorrectly).


Thanks for the confirmation!


No. There is absolutely no color management in Android. https://code.google.com/p/android/issues/detail?id=225281


Ubuntu checking in - I too see a red square


I'm not sure if this is different on the iPhone 7, but the 6S is pretty terrible at color representation in photos, specifically when dealing with neon lights. The iPhone tends to overexpose the neon light to make up for the surroundings being darker, so to get a 'decent' shot I have to turn the exposure down by about 2/3 in the normal Camera app.

But in general, it has a hard time capturing color as well as a better camera would. For example, I've taken some photos of museum paintings during a visit, and the colors tend to be a little darker and not truly representative (yellows appear like mustard rather than a brighter yellow, for example). I'd love to be able to take more color-accurate photos, and it would be more worth it to get the iPhone 7 if that's the case.


That's more a problem of the camera and ISP and not color management and display.


That's what I figured. I hope that the sensor becomes better and better, so that one day I can take great neon light photos (as well as see the correct color of red).


Meaning DSP (easily confused).


I did not know that Instagram was using OpenGL for processing. That's pretty neat, given the capabilities of OpenGL. I'm looking forward to seeing more filters with more lifelike Polaroid effects.

But before that, could they stop converting images to horribly encoded JPEGs? I get it, bandwidth and costs, but it's an image service that still drowns in its own... when it gets an image with strong reds and blues.


We started using OpenGL in 2011. Our CPU-based image filters used to take 4+ seconds with pixel-by-pixel manipulation, and now can render 30+ times per second.

If you have some sample images where the current image pipeline is going wrong let me know and we can look into improving.


I think that's back when Gotham died :/

http://randsinrepose.com/archives/rip-gotham/


I also went through that process with my app Picfx. Using OpenGL for filters is much quicker; the only downside I've found is being limited by the texture size. I did set up a way to process images in tiles, but ultimately decided to just limit images to the texture size. Great info on the colour space; I'm sure it will be useful.


Instead of fixing the horrible JPEG encoding, can you please add support for WebP? It's quite a bit smaller and well supported with polyfills, since it's just a single VP8 frame.


Polyfills in general are a really awful user experience.

They are typically pushed by people who use the latest Chrome, so they have an excuse not to care about other browsers.

Their performance and usability are almost invariably terrible.


You don't need a polyfill to deploy WebP. Chrome automatically includes image/webp in the Accept header for image requests, so at the CDN level you could implement logic to seamlessly swap in WebP images for browsers that support it. Imgix does this, for instance.
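
A minimal sketch of that negotiation, written as a hypothetical Flask handler rather than real CDN config (the route and file paths are made up):

    from flask import Flask, request, send_file

    app = Flask(__name__)

    @app.route("/photos/<name>.jpg")
    def photo(name):
        # Chrome advertises support by including "image/webp" in the Accept
        # header of image requests.
        wants_webp = "image/webp" in request.headers.get("Accept", "")
        if wants_webp:
            resp = send_file("photos/%s.webp" % name, mimetype="image/webp")
        else:
            resp = send_file("photos/%s.jpg" % name, mimetype="image/jpeg")
        resp.headers["Vary"] = "Accept"  # keep caches from mixing the variants
        return resp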


Link for viewing 'canary image' in browser, without triggering download/dropbox:

https://imgur.com/a/v82GI

(Indeed, I can see a faint Instagram logo on my iPhone 7, but not my older rMBP.)


Imgur seems to be changing the profile or some such, because I can see the logo in the downloaded image but not in the imgur image here.


Same.

Here's the link from the article with the download flag removed:

https://www.dropbox.com/s/tgarynpj65ouafd/insta-logo.png


It also changes the profile; I can see the Instagram logo here but not in the download.


Really strange! I get the same behavior. I thought it was maybe Chrome vs. my image viewer, but if I open the downloaded image with Chrome it's just pure red, while in the Dropbox preview I definitely (faintly) see the logo.

If I right click and save the dropbox preview, I get a 14kb image, but the downloaded image is 29kb. As far as I can tell they're both PNGs with the same bit depth.


Interesting! From this link, I also see the logo on my rMBP (in Chrome or Safari) -- but not in the imgur link I posted above, even though that imgur link reveals the logo on an iPhone 7 running iOS 10 Mobile Safari.

Changing Chrome on OS X to report its User-Agent as iOS 9 Mobile Safari does not get a different image from imgur.


This is interesting. It shows up in Firefox here, but not in the Windows built-in picture viewer. My display is certainly only sRGB, so I guess Firefox is doing some kind of correction.


I've written lots of graphics and image processing code.

You can see the logo because almost every computer system on the planet handles color spaces incorrectly. Apple's devices are actually better than most, though third party drivers such as those for printers can sabotage their color handling.

The canary image will appear as red without a logo on a computer with an sRGB display if that computer correctly handles color spaces throughout the whole imaging pipeline. That's a lot of ifs.

If your system ignores color spaces, you will see the logo because the Display P3 (DP3) color space gets compressed into sRGB. When you look at real world DP3 images on this system, you will see the reds as being more muted. The same thing happens if you use an Adobe RGB camera (there are lots of these) and display it in sRGB, except with the green channel, because AdobeRGB has a wider green range.

No matter which color space you use, an image will contain RGB tuples. The color space is additional meta-info which says how to interpret those tuples. Lots of software will ignore the metadata and simply assume the RGB tuples are used in the same way as it expects.
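
In file terms, that meta-info is usually an embedded ICC profile. With Pillow, for example, it's just an opaque bytes blob that a viewer is free to read or ignore (photo.jpg is a hypothetical file):

    from PIL import Image

    img = Image.open("photo.jpg")          # hypothetical file
    icc = img.info.get("icc_profile")      # missing entirely if untagged
    if icc:
        print("%d-byte ICC profile attached" % len(icc))
    else:
        print("no profile: most software will assume the tuples are sRGB")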


I think "incorrectly" is a strong assertion to make, when it's behaviour that most users are actually accustomed to and expect.

I guess you could think of it somewhat like the difference between clipping an image larger than the monitor's resolution or scaling it to fit. In the former case you preserve the accuracy of individual pixels within the area that fits, but discard the information outside; and in the latter, you lose accuracy of individual pixels but preserve being able to see (an approximation of) the whole image. Applying this to colour spaces, "clipping" DP3 to sRGB preserves the "absolute" colour information but discards the "relative" differences (hence not being able to see the logo), while scaling discards the absolute colour (I think this is what you mean by "reds as being more muted") but preserves the differences (being able to see the logo).

Since a user looking at a monitor derives most of his/her information from the contrast between pixels' colours, I'd say discarding that contrast is the real "incorrect" choice most of the time. DP3 images scaled onto an sRGB monitor certainly won't look as good as on a DP3 one, but at least the user will still be able to resolve the fine detail that relies on differences in pixel values. Besides, getting absolute color accuracy on a monitor has always been nearly impossible outside specialised contexts, since it depends so much on things like external lighting.


So... correct me if I'm wrong, but this DP3 color space they're using isn't increasing the bit depth of the color: it's still 8-bit, they're just using a different color space to get a wider range of color with less precision?

Seems sort of silly to me, as most designers will be on sRGB displays and most people are used to how images look in sRGB, but I guess it's one more way for Apple to sell more new Apple stuff by pretending these extremes in color are more important than precision in other parts of the spectrum.

I can definitely understand going to 10-bit color, this, not so much.


They're not mutually exclusive. Right now basically every display is 8 bits per channel; even if Apple went to 16-bit displays, most people wouldn't see anything special in normal applications, because the source data is all still based on eight bits per channel.

By improving the color gamut you can actually see a difference on the display: areas where there were differences in color before, but which were invisible because of the display, now show an actual difference. It's slight, but it's there.

Seems like a good move to me. I imagine moving to 10 or 12 bit color will be the next step.
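
Back-of-the-envelope numbers for the 8-bit tradeoff, assuming ~234 as the approximate 8-bit Display P3 code value of sRGB max red (see the canary discussion above):

    # The sRGB portion of Display P3's red axis gets ~234 of the 255 steps,
    # so quantization inside the sRGB range is slightly coarser -- the source
    # of the banding concern raised upthread.
    srgb_red_in_p3 = 234
    steps_outside = 255 - srgb_red_in_p3
    print("%d of 255 steps (%.1f%%) encode reds outside sRGB"
          % (steps_outside, steps_outside / 255.0 * 100))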


Last I checked, browsers vary in their support for color management. Try a quick test at a page like this before drawing any conclusions from the canary image:

http://cameratico.com/tools/web-browser-color-management-tes...


If you have a Macbook and can't see the logo on the canary image, try changing the color profile for your display. On my mid-2014 rMBP, the default color profile was "Color LCD", but I could change it to "Display P3" and see the logo.

To make the change: System Preferences -> Displays -> Color


But unless the display actually supports wide colour gamuts that will just make everything look worse.


Thanks for the hint. On my 2015 Retina MBP, when I open the IG image in Preview, I see the IG logo. However, when I drag the image into Chrome, I do not see it.

How do I get Chrome to do proper rendering?


Chrome does something strange with colors on macOS; colors always render visibly differently than in Safari.

That being said, the logo is visible in Chrome for me (using a Dell display).


Have you tried restarting Chrome since you changed the display profile?


"What every coder should know about gamma" may be of interest here. http://blog.johnnovak.net/2016/09/21/what-every-coder-should...


Are browsers generally color-correct? I was going through the WebKit wide color examples and was getting different visual results in different browsers [1]:

https://webkit.org/blog-files/color-gamut/

[1]: Compared Safari, Firefox and Chrome on a 2016 MacBook Pro w/ Touchbar running macOS 10.12.


Browsers are all over the place, unfortunately. It's part of why sRGB became the only reasonable color profile for Web use. I think we'll see wide color become common in apps before the Web.


All over the place in what way? Support for different color profiles? Actually handling color spaces at all? The fact that there's no consistency when it comes to untagged images? The mess which is plugins? The ability to specify CSS colors in a specific color space?


When on Android? I also opened a bug on the Android bug tracker: https://code.google.com/p/android/issues/detail?id=225281 and it is marked as a small issue. On Android we lack color management and are restricted to sRGB; original wide-gamut and P3 images look dull on Android devices. I also want good wide P3 colors and color management without buying an iPhone.


The canary image in the article displays fine on my S7


Cool article.

Another thing worth mentioning is that lots of professional photo and graphics people have been using the Adobe RGB color space, which is "wider" than sRGB, for almost 20 years.


Awesome. It would be great to see in-app support for 2x optical zoom on the iPhone 7 Plus next.


We built this in already! We don't have a "1x" or "2x" indicator, but the dual lens camera is fully used in Instagram now and will do the smart transition between 1x>2x optical and 2x+ digital zoom.


Oh no way! That's awesome, apologies for not knowing this and thanks!


They mentioned this in the post but it was fairly well hidden, can't blame you for missing it.


Ok, so now that's done. How about the ability to zoom?


Update the app? I can zoom in Instagram, on both iOS and Android. Apparently since 18 weeks ago: https://www.instagram.com/p/BJxv5WthYas/?hl=en


So... what is "wide color"? Are we talking https://en.wikipedia.org/wiki/Wide-gamut_RGB_color_space or something else that has no documentation anywhere on the internet?

Not trolling, I care an incredible amount about color spaces, and had expected support for something like the UHDTV Rec.2020 color space with D65 (cool/blue white), not an obscure Adobe standard with D50 (warm/yellow white). It's wider than Rec.2020 though, so if this is what you're talking about: awesome, please update your article so people can find out more about this color model and the rest of the world can catch up!


It was in the fourth paragraph:

> The color space that Apple chose for its devices going forward is Display P3.

https://en.wikipedia.org/wiki/DCI-P3


How extremely disappointing.



