Having related emails batched and then providing easy ways to pin, snooze, or mark done is so useful. Clicking the "sweep" button (Mark All As Read) after prioritizing the emails you care about is such a satisfying feeling. I'm not quite at inbox zero yet, but I definitely feel like it is now actually possible with a bit more work on my end.
Hi everyone, Chris, the founder of imgix here. We are just beginning to talk about what we will be rolling out over the next few months, so I wrote this piece about how and why we are moving in the directions we are, with a focus on explaining things from the developer perspective. There will be plenty more to follow, so stay tuned. For the moment, I am happy to fill in any details that I can.
Keynote is actually a surprisingly good tool for building diagrams once you learn the interface and features. I use it for all of our architecture diagrams and for generating printable posters of them to hang in our office. I have tried nearly everything else, but unless you need semantics native to your diagram, Keynote is (oddly, I know) the best I have found.
You should check out the Whitebox Software Defined Radio Project. It is a hacker-friendly software-defined radio started by my friend Chris, a former lead engineer at Google/YouTube and the former VP of technology at Adly. http://radio.testa.co
We would be happy to help at imgix. We currently process tens of millions of images per day, including large images like yours. Feel free to contact me at chris (at) imgix (dot) com. Link: http://www.imgix.com
We used the existing image requests as a baseline. So the numbers we are reporting are specifically for the images that are being served in established formats (e.g. PNG and JPEG), measured against their equivalent requested WebP and JPEG XR variants. We then measured the percentiles of savings for these images: 98% of images in our test customer base saw an 18-74% improvement in compression, with an average of 41% savings per image. The data itself is a deduplicated set of hundreds of millions of image requests across several test customers. The one piece of data we should also report is the aggregate total savings per customer across all of their image requests. I will look into calculating that data point.
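If it helps make the methodology concrete, the per-image calculation is roughly the following (a toy sketch with made-up numbers, not our actual pipeline):

    import statistics

    def savings_percentiles(pairs):
        # per-image savings: 1 - negotiated/baseline, as a percentage
        savings = sorted(100 * (1 - new / old) for old, new in pairs if old > 0)
        pct = lambda p: savings[int(p / 100 * (len(savings) - 1))]
        return {"p1": pct(1), "p99": pct(99), "mean": statistics.mean(savings)}

    # (baseline JPEG/PNG bytes, negotiated WebP/JPEG XR bytes) per deduped image
    print(savings_percentiles([(120000, 70000), (800000, 300000), (50000, 45000)]))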
We deliver images out of a CDN where we already have handled the proper request normalization. There is no support cost to implementing content negotiation in this case unless you want to put us behind a proxy. At that point, we can work with you to vary correctly without incurring the complexity you are focusing on.
Normalizing the values used in Vary at the CDN level is definitely the right way to go. However, that still leaves problems with transparent proxies at large companies, ISPs, mobile carriers, etc., unless you also send something like Cache-Control: private, which those proxies do handle correctly.
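Concretely, the combination I mean is something like this (purely illustrative; the header values are just examples):

    def edge_headers(behind_untrusted_proxy):
        # vary on a normalized signal, not raw User-Agent strings,
        # so caches fragment into a few buckets instead of thousands
        headers = {"Vary": "Accept"}
        if behind_untrusted_proxy:
            # proxies that ignore Vary can serve the wrong variant;
            # "private" restricts caching to the browser itself
            headers["Cache-Control"] = "private, max-age=31536000"
        else:
            headers["Cache-Control"] = "public, max-age=31536000"
        return headers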
Founder of imgix here. The comparison image is just meant to be moderately informative. We will be following up with a detailed performance outline as more data comes in. You can compare the progressive JPEG image against the WebP or JPEG XR to get a sample of what the ratios might be for a standard JPEG. Besides, a surprising number of websites still serve wildly uncompressed and unoptimized imagery. We help websites that are serving uncompressed JPEGs and PNGs all of the time.
Furthermore, what should be understood (and I will clarify in the post) is that apart from the example image, which is designed to show the comparative compression ratios of the file formats, the data we report is based on the images that these companies are ALREADY serving. We analyzed our logs for the sizes of the images they are currently serving as JPEGs and PNGs, and simply enabled content negotiation for those same images at the same sizes.
With regards to the Vary header: you are right that the cache fragmentation of varying by Accept or User-Agent would be extreme. This is why we do not do this. Instead, we perform normalization on the request and generate a specific "content ID" that takes into account any number of input signals that should be normalized and varied on. We can expose this as a separate header for anyone who wants to Vary on it. Finally, if you are serving directly out of us, none of this really matters. There is no web server involved. We handle all of the hard work on our end.
I disagree that content negotiation is the wrong approach, though. The fragmentation we see across all of the input signals (e.g. format support, device pixel ratio, user agent, etc.) is already so extreme that the best answer is to serve dynamic responses for images. If you want to stay within the browser prefetch stage, which is critical for front-end performance, you have to make decisions about the content you want to serve at the server. This means potentially serving variant content under a single URL. We serve targeted responses for text content all the time across the web. It is not clear from your argument why the same treatment should not apply to imagery.
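To make the normalize-then-vary idea from my previous comment concrete, here is a toy sketch; the buckets and the "X-Content-ID"-style scheme are made up for illustration and are not our actual signals:

    import hashlib

    def normalize_format(accept):
        # collapse the raw Accept header into a handful of buckets so caches
        # don't fragment across every user-agent string in the wild
        if "image/webp" in accept:
            return "webp"
        if "image/jxr" in accept or "image/vnd.ms-photo" in accept:
            return "jxr"
        return "jpeg"

    def content_id(url, accept, dpr):
        # one URL, several variants: the variant is keyed by normalized
        # signals, and the resulting ID is what a fronting proxy could Vary on
        key = "%s|%s|dpr=%d" % (url, normalize_format(accept), min(round(dpr), 3))
        return hashlib.sha1(key.encode()).hexdigest()[:16]

    print(content_id("/hero.jpg?w=800", "image/webp,*/*", 2.0))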
Having OS X in our stack allows us to tackle a lot of use cases that come out of the prototypical design process. Format support, color profiles, color space conversion, typography, etc. are all mastered in OS X, but lacking in other operating systems. Apple has had the best imaging scientists in the world working for the last 30 years on getting these features right. We want to be able to leverage that expertise, whenever it makes sense, to produce the highest quality image. In this case, the consequence of that decision is racking Macs.
ImageMagick clearly grew up in environments with short-lived processes: the command line, PHP, etc. Run it in a long-lived process at your peril, and watch all your memory leak away. I also don't think it makes very good use of available GPUs at all.
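The usual workaround, if you're stuck with it, is to keep ImageMagick in short-lived child processes so any leaked memory dies with the child. Something like this (assuming the `convert` CLI is on PATH; paths are placeholders):

    import subprocess

    def resize(src, dst, width, timeout=30):
        # each call pays process-spawn overhead, but the parent's memory
        # footprint stays flat no matter how many images it handles
        subprocess.run(
            ["convert", src, "-resize", "%dx" % width, dst],
            check=True,
            timeout=timeout,
        )

    resize("in.png", "out.jpg", 1024)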
Interesting. I work in VFX, and we only use OS X when we absolutely have to (read: ProRes).
Interestingly, I've found the color handling to be "odd". I'd recommend looking at Nuke from The Foundry.
It's designed to be fast, scriptable, and to handle comically large images. One of the nice things is that it's multi-platform. You won't need the GUI, but you'll love the color tools.
The math is well known, and many libraries exist. I bet you could replace the OS X and Mac hardware with more cost-effective hardware and software with an engineering effort in the low man-years, if not less. That would presumably pay back pretty quickly.
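For example, ICC-correct color conversion, one of the things attributed to OS X above, is available in open libraries. A quick sketch with Pillow's ImageCms (a littlecms binding); the file names are placeholders:

    import io
    from PIL import Image, ImageCms

    def to_srgb(path):
        im = Image.open(path)
        icc = im.info.get("icc_profile")
        if not icc:
            return im  # no embedded profile; assume it's already sRGB
        src = ImageCms.ImageCmsProfile(io.BytesIO(icc))
        dst = ImageCms.createProfile("sRGB")
        # perceptual-intent conversion from the embedded profile to sRGB
        return ImageCms.profileToProfile(im, src, dst, outputMode="RGB")

    to_srgb("photo_adobe_rgb.jpg").save("photo_srgb.jpg")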
If it's just OS X you need, and not the overpriced Apple hardware, why not just buy a bunch of OS X licenses and install it on your normal servers (a la Hackintosh)?
It would cost much, much less, and you would be able to have completely uniform server hardware with faster networking.
Just because it's against their ToS doesn't mean that their ToS is enforceable.
If you purchase a legitimate license to OS X and only use it on personally owned hardware that you keep in your possession (unlike that company that tried to sell Hackintoshes a couple years ago), it's very unlikely that Apple could ever have a legal case against you.
Your interpretation of the law is wrong. "Click-wrap" licenses have been held enforceable for over 15 years. See, e.g., ProCD v. Zeidenberg, 86 F.3d 1447 (7th Cir. 1996).