The last I heard, Google was being sued because its use of Courgette allegedly infringed a patent, and as a result they had stopped using Courgette to distribute Chrome updates. Does anyone know the result of that case? Are they now using Courgette again?
Red Bend's attempt at a preliminary injunction was quashed recently, with the court stating that "with insufficient evidence that Courgette infringes on at least one patent claim, Red Bend cannot prove a likelihood of success".
I'm no lawyer, but this sounds pretty bad for Red Bend.
Call me stupid, but why do you have to disassemble the program to get the symbol pointers when you just finished compiling the thing in the first place?
Couldn't this intermediate step of pointer collection be part of the prelink process and skip all this guesswork?
If the source is in a single language, compiled by a single tool, with one single version, and entirely linked by a single linker strictly at the end of the process, then that might be feasible. But that's probably not the case. If there is much heterogeneity at all, getting proper symbol information may be a lot more work than analysing the binary for control and data flow. For example, intermediate linking steps may resolve symbol fixups internally and just leave a list of relocations behind, or perhaps use strictly relative addresses within what looks to the final linker like a monolithic blob.
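To make the pointer problem concrete, here's a toy Python sketch (nothing to do with Courgette's actual code, and the "instructions" are made up) of why rewriting internal addresses as labels before diffing helps so much:

```python
# Toy illustration of the Courgette idea (not the real implementation):
# absolute addresses change en masse when code shifts, so a naive diff
# balloons; rewriting internal pointers as symbolic labels first makes
# the two versions nearly identical again.
import difflib

def normalize(program):
    """Replace absolute addresses ("addr:N") with stable labels in order
    of first appearance, which is roughly what disassembly buys you."""
    labels, out = {}, []
    for token in program:
        if token.startswith("addr:"):
            out.append(labels.setdefault(token, f"L{len(labels)}"))
        else:
            out.append(token)
    return out

def diff_size(old, new):
    """Count the tokens a sequence diff would have to ship for the new version."""
    sm = difflib.SequenceMatcher(a=old, b=new)
    return sum(j2 - j1 for op, i1, i2, j1, j2 in sm.get_opcodes() if op != "equal")

# v2 inserts one instruction near the top, which shifts every address after it.
v1 = ["push", "call", "addr:100", "jmp", "addr:104", "ret"]
v2 = ["push", "nop", "call", "addr:101", "jmp", "addr:105", "ret"]

print("raw diff tokens:       ", diff_size(v1, v2))                        # 3
print("normalized diff tokens:", diff_size(normalize(v1), normalize(v2)))  # 1
```

In the toy example a single inserted instruction shifts every address after it, so the raw diff touches three tokens while the normalized one only touches the inserted "nop". Courgette's disassembly step wins for the same reason, just at the scale of a real binary.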
If we're talking about a way to generate an upgrade image for a single architecture, taking an executable from one known version to another, built on a production build machine, I would think that would cover the case easily.
That kind of led to my second question: this makes really small images, but only from one known version to another, right? What happens if the target to be upgraded is 80 revisions behind?
I can't speak to what Chrome does, but usually there would be rollup patches to take advantage of, cutting down the number of patches to apply dramatically. E.g., to get from 23 to 76 you might only need to apply 23->25->75->76. This would be slightly less efficient than going directly, both in patching time and patch size, but not terribly so (one can assume most patches are reasonably disjoint). Most importantly, this keeps the total number of supported patches manageable.
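A hypothetical sketch of how a client that is many revisions behind might pick a chain, assuming the server publishes a fixed set of version-to-version (including rollup) patches; the versions and patch set here are made up, and this is not Chrome's actual logic:

```python
# Pick the chain needing the fewest patch applications from a published set.
from collections import deque

# (from_version, to_version) pairs the server is assumed to offer
AVAILABLE_PATCHES = {(23, 24), (24, 25), (23, 25), (25, 75), (75, 76), (76, 77)}

def patch_chain(current, target):
    """Breadth-first search over available patches; returns a version chain."""
    edges = {}
    for src, dst in AVAILABLE_PATCHES:
        edges.setdefault(src, []).append(dst)
    queue, seen = deque([[current]]), {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in edges.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain available; fall back to a full download

print(patch_chain(23, 76))  # [23, 25, 75, 76]
```

BFS minimizes the number of patch applications; a real updater would presumably also weight the choice by total patch size.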
We want smaller updates because they narrow the window of vulnerability. If the update is a tenth of the size, we can push ten times as many per unit of bandwidth. We have enough users that this means more users will be protected earlier.
They're trying to get the vulnerability window down to minutes! If they succeed, this is going to have an impact on the economic viability of running malware.
We're also giving Google implicit permission to push code to our machines any time they want. We should have conflicting feelings about this.
You have a point. But I think the answer to that is in the market.
Through Chrome, the browser is evolving into a new kind of platform. It will be a platform that encompasses local execution and memory along with all of the flexibility and availability of the cloud, and it will all just work.
There are sure to be competitors. If one competitor fails, we can always take our business elsewhere. We only have to worry if the government legislates us out of a means for oversight.
This is exactly how updates should be done. Usually, I'm not that big of a fan of things that Google does, but this is a great achievement. Hats off to those who contributed to this. I hope all update systems eventually work this way.
Well, most binary updates do go through bsdiff, which was designed for exactly this. I'm surprised they were able to beat it by such a margin. Shit, the guy who made bsdiff did his doctoral thesis on the subject (actually, the algorithm used in bsdiff generates patches about 25% larger than the one in his thesis, but with a huge performance increase; put the other way, the thesis algorithm produces patches about 20% smaller).
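For anyone who hasn't used it, the bsdiff/bspatch round trip looks like this; a minimal sketch assuming the stock command-line tools are installed, with made-up file names:

```python
# Produce a binary patch with bsdiff and apply it with bspatch.
import subprocess

OLD, NEW, PATCH, REBUILT = "app-1.0.bin", "app-1.1.bin", "app.bsdiff", "app-1.1.rebuilt"

# Server side: produce a patch taking the old binary to the new one.
subprocess.run(["bsdiff", OLD, NEW, PATCH], check=True)

# Client side: reconstruct the new binary from the old one plus the patch.
subprocess.run(["bspatch", OLD, REBUILT, PATCH], check=True)

# The reconstruction should be byte-for-byte identical to the new binary.
assert open(NEW, "rb").read() == open(REBUILT, "rb").read()
```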
I think Google could handle spending a bit of extra time on a single diff, in order to save 25% of their update bandwidth. Maybe they should just implement the thesis.
Yeah, me neither. What projects are we talking about here? I'm on Linux, so I'm getting the full deb each time. I'm okay with that, considering it's managed for me. I'll take that any day.
Adobe? Apple? Those are the two vendors whose software I can imagine being installed on most, if not all, Windows computers. And I know for a fact they both distribute updates as full copies of their software, weighing in at a couple hundred to several hundred megabytes and requiring manual updates, pop-up windows, and restarts.
I've yet to see anyone do updates as fast or seamlessly as Google.
While Firefox updates in the past were not as quiet or seamless as Chrome updates, Mozilla is moving in that direction as part of the new "rapid release" process.
http://news.softpedia.com/news/Google-Sued-over-the-Courgett...