It just seems amazing to me that anyone with anything remotely resembling a degree in CS[0] would think that this kind of scanning could work for a Turing-complete language. If you restrict extensions to non-TC languages, then it could be made to work[1]. The argument that it stops lazy malware authors is also absurd -- it's as if they haven't heard of script kiddies. The situation here is exactly analogous: once someone figures out a way around their (extremely ad-hoc) checks, everybody else gets that workaround for free. They're only setting themselves up for a never-ending game of whack-a-mole which, given the resources involved, they can only lose.
[0] As one hopes at least some of the people working on this would have. Not that a degree confers magical powers, but any CS education will go over what's known to be impossible.
[1] ... by static checks: a type system, a termination checker, and restricted access to "unsafe" modules/libraries. Of course, it's doubtful how many extension authors would be able or willing to write their extensions in, e.g., Idris.
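To make the whack-a-mole point concrete, here's a minimal sketch of why ad-hoc scanning of a Turing-complete language can't hold up. The `naiveScan` function below is a hypothetical stand-in for a pattern-based check (it is not Mozilla's actual reviewer tooling); the payload reaches `eval` at runtime anyway by constructing the name as a string, and once one author publishes this trick, every copycat gets it for free:

```javascript
// Hypothetical scanner: flags any source containing a literal call to eval().
// This stands in for any fixed, syntactic check over the extension's code.
function naiveScan(src) {
  return /\beval\s*\(/.test(src);
}

// Innocuous-looking source that still invokes eval at runtime:
// the property name "eval" is assembled from fragments, so no literal
// "eval(" ever appears in the scanned text.
const payload = 'globalThis["ev" + "al"]("1 + 1")';

naiveScan('eval("1 + 1")'); // true  -- the lazy version gets caught
naiveScan(payload);         // false -- the one-line workaround sails through
```

Deciding what code *actually does* (rather than what strings it contains) runs into Rice's theorem: any non-trivial semantic property of programs in a Turing-complete language is undecidable in general, so the scanner can only ever chase patterns after the fact.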
...about halfway through that article I remembered how he'd said "lucky you," and understood it.
Wow.
I suddenly understand why Firefox isn't the dominant browser anymore. Mozilla hasn't adapted to being the competition. At all. They've, like, spazzed out and are pitifully flailing their arms around as they die. Not a particularly endearing way to go.
What a strange new world we now live in. Blink is now the future of the Web, Webkit is where you'll find occasional snippets of "oh, cute" when OS X updates, EdgeHTML takes over from Presto as the leading closed-source competitor, and Vivaldi trails everyone else as a source of "wat."
I wonder where Cargo will end up. It's a pretty weird sprawling mass as it is right now.
For anyone out there interested in the browser scene (and non-bureaucratic, non-toxic development environments!), I definitely have to recommend http://netsurf-browser.org/ - checking the site, I see they just got a JS engine properly integrated, and it looks like they've finally started major work getting the rendering engine up to scratch to handle dynamic page changes (once the page was in the canvas it used to be unmodifiable).
The thing that stands out to me in the discussions on the Mozilla mailing list is how Mozilla is changing the goals of extension signing after the fact. It was initially presented as a method of combating malware, but with the back and forth over Zotero they now seem to be expanding the scope to include add-ons with performance problems, add-ons with code that is hard to read, and add-ons that touch powerful APIs unnecessarily. That opens up a whole lot of grey area for their volunteer reviewers to wade into. In hindsight, I also wonder whether they shouldn't have just waited for XUL add-ons to be phased out, since the screening process for the more limited WebExtensions API should be easier.