Extensions is currently the only thing FF is better than the competition at, and they scrap it to copy Chrome APIs... I just can't understand why they would scrap XUL-based extensions before their new APIs reach feature parity. The day pentadactyl stops working is the day I stop using firefox.
Well, XUL-based extensions are not scrapped yet. They will be, at some point in 2017, but we're not there yet.
The reason to scrap them is simple: the total API surface of XUL-based extensions is pretty much all of the internal APIs of Firefox, which means that any change anywhere in the code of Firefox accidentally breaks some extension. That's a compatibility burden that Mozilla could afford when the only competitor was IE, but it makes the development of Firefox much less nimble than that of Chrome & co. So a new, less invasive and more future-proof API is sorely needed.
> is pretty much all of the internal APIs of Firefox,
But that's also what makes them more powerful than any other extension API.
If your box is airtight then there is no out-of-the-box thinking.
An almost trivial example: Internal APIs can inspect the current (as-enforced) sandboxing flags of an iframe. Which is not the same as the sandboxing HTML attributes on the frame, because those can be changed without effect after loading and also due to inheritance of sandboxing effects.
This tiny tiny tidbit is useful when making content-filtering decisions.
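To illustrate the distinction, here's a tiny runnable model in plain JavaScript (not a real Firefox API; the names are made up): the enforced flags are captured when the frame loads, so later edits to the attribute change what the DOM reports but not what is actually enforced until a reload.

```javascript
// Hypothetical model of "attribute vs. as-enforced sandbox flags".
// An empty sandbox attribute means "everything restricted"; each
// allow-* token lifts one restriction.
function parseSandbox(attr) {
  const allowed = new Set(attr.trim().split(/\s+/).filter(Boolean));
  return {
    allowScripts: allowed.has("allow-scripts"),
    allowForms: allowed.has("allow-forms"),
  };
}

// Flags as enforced at load time:
const enforced = parseSandbox("allow-scripts");

// The page later rewrites the attribute -- the DOM reflects the new value...
const attributeNow = parseSandbox("allow-scripts allow-forms");

// ...but the enforced flags stay as they were until the frame reloads.
console.log(enforced.allowForms);      // false
console.log(attributeNow.allowForms);  // true
```

A content filter that only reads the attribute would wrongly conclude forms are allowed; only the as-enforced flags (internal-API territory) tell the truth.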
Indeed, it's difficult to achieve a perfect balance here. That's one of the reasons the WebExtensions team has been very actively discussing with add-on developers to try and prioritize which APIs should be created/ported to WebExtensions.
I don't remember specific numbers, but I seem to remember that pretty much all the extension points that had been requested by add-on developers were at least somewhere on the TODO list of WebExtensions. I hope that by the end of 2017, they will all be available, with a much cleaner API than their XUL/XPCOM counterpart, better tests and much better survival chances.
Of course, some of the changes will require pretty drastic reimplementations by the developers. For instance, going from js-ctypes to the native bridge will require full reimplementation in a different programming language. On the upside, js-ctypes was really painful to use and the native bridge will let you use a much better language for the task. Say Rust :)
The native messaging API is not useful for addon developers. It only caters to application developers that ship a companion addon to their application.
js-ctypes allowed an addon to access native libs already present on the host machine or bundled with the addon itself. Messaging only works if you can get users to install some external application in addition to your addon, which is too tall a hurdle for small extensions.
From my perspective, the two are barely comparable.
> That's one of the reasons the WebExtensions team has been very actively discussing with add-on developers to try and prioritize which APIs should be created/ported to WebExtensions.
You are taking requests to keep existing uses supported, mostly for big, already established addons. My concern extends beyond that. Simply by having access to internals anyone can currently develop novel uses, even if they initially do not have a userbase. Those can only be prototyped by access to internals. Someone might not even have an idea if they don't know an API exists that could be leveraged to do it.
In other words, "I can't let you do that, Dave" API design can stifle creativity.
And it also introduces a bias in favor of already entrenched extensions, which is less open than the current approach.
> js-ctypes allowed an addon to access native libs already present on the host machine or bundled with the addon itself. Messaging only works if you can get users to install some external application in addition to your addon, which is too tall a hurdle for small extensions.
>
> From my perspective, the two are barely comparable.
I haven't checked, but I seem to remember that you can provide a platform-specific binary as part of an add-on, spawn it from WebExtensions and then communicate with it through the native bridge. Building/shipping one binary per platform is annoying, but the rest is infinitely better than using js-ctypes. I should know, I've been one of the first users of js-ctypes and I still shudder at the thought :)
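For what it's worth, the wire format of the native bridge is simple enough to sketch: each JSON message crosses the pipe prefixed with its byte length as a 32-bit integer in native byte order (little-endian on most machines). A minimal Node.js model of the framing (the helper names are mine, not part of any API):

```javascript
// Encode a message the way a native messaging host expects to read it:
// 4-byte length prefix, then UTF-8 JSON.
function encodeNativeMessage(obj) {
  const payload = Buffer.from(JSON.stringify(obj), "utf8");
  const header = Buffer.alloc(4);
  header.writeUInt32LE(payload.length, 0);
  return Buffer.concat([header, payload]);
}

// Decode one framed message back into an object.
function decodeNativeMessage(buf) {
  const len = buf.readUInt32LE(0);
  return JSON.parse(buf.slice(4, 4 + len).toString("utf8"));
}

const frame = encodeNativeMessage({ cmd: "download", url: "http://example.com" });
console.log(decodeNativeMessage(frame).cmd); // "download"
```

The browser side just calls `runtime.connectNative()` and sends plain objects; the framing above is what the companion binary has to speak on stdin/stdout.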
> In other words, "I can't let you do that, Dave" API design can stifle creativity.
True. On the other hand, this is the kind of paradigm shift that was implemented by operating systems ~20 years ago (more in the case of Unix) and so far, improving the safety and security of operating systems has generally played in favor of users and creativity, not against them.
This doesn't mean that there is no space for lower level, MS-DOS/Apple II-style hacking – I'm doing exactly that these days. But still, separating both sounds like a good idea to me. YMMV.
> And it also introduces a bias in favor of already entrenched extensions, which is less open than the current approach.
Good point. I seem to remember discussions about how to have an "unsafe" extension mechanism that would make it easier to experiment with non-standard APIs. I haven't checked myself, but I believe that this mechanism has been published, as an extension called "the WebExtension Extension" or something like that.
> On the other hand, this is the kind of paradigm shift that was implemented by operating systems ~20 years ago
No. Kernel modules can be installed at runtime. The user just has to grant the application the necessary privileges.
This means OS developers do not attempt to "strike a balance" between power and safety. They allow both; the unsafe stuff merely requires a fairly simple opt-in.
This is the whole crux of the issue. I do not mind security measures, as long as there is a bypass. Mozilla has taken the - in my opinion extremist - position that even bypasses cannot be allowed because they know what's best for users.
> I haven't checked, but I seem to remember that you can provide a platform-specific binary as part of an add-on, spawn it from WebExtensions and then communicate with it through the native bridge.
I am not aware of any such feature. This used to be true for bootstrapped extensions, but not for webextensions.
> I haven't checked myself, but I believe that this mechanism has been published, as an extension called "the WebExtension Extension" or something like that.
> On the other hand, this is the kind of paradigm shift that was implemented by operating systems ~20 years ago
What? No. Every OS out there allows the loading of user (administrator) supplied code to increase its functionality.
In other words, every OS out there allows extensions that touch their internal APIs (or a very permissive subset of them in the case of Windows, which is otherwise notorious for how restrictive it is).
So, will extensions like pentadactyl or Tree Style Tabs be able to exist after "some point in 2017"? Ignoring rewrite time etc., will there be APIs available to do what they do today?
The WebExtensions team has been very proactive, attempting to get in touch with add-on developers across the spectrum. If the add-on developers have responded, there's a 99% chance that new APIs will be available for their add-ons.
Sorry, but that is just wrong. Certain types of APIs are infeasible to provide, because mozilla is too time- and resource-constrained to invest major resources into spec'ing, implementing and maintaining "niche" APIs.
So far, Firefox did not even manage to reach chrome parity, let alone bug parity.
From Giorgio Maone, developer of one of the most complex and most popular extensions, NoScript:
> Developers and users are also concerned about add-ons being prevented from exploring radically new concepts which would require those "super powers" apparently taken away by the WebExtensions API.
>
> I'd like to reassure them: Mozilla is investing a lot of resources to ensure that complex and innovative extensions can prosper also in the new Web-centric ecosystem. In fact, as mentioned by Bill McCloskey, at this moment I'm working within Mozilla's Electrolysis team and with other add-on authors, involved in the design of mechanisms and processes helping developers experiment in directions not supported yet by the "official" WebExtensions API, which is going to be augmented and shaped around their needs and with their contributions.
From Nils Maier, developer of some of the most complex and most popular extensions, DownThemAll! (+ MinTrayR):
Read my comments
That comment by Giorgio (nice guy btw, shared a room with him at a couple of mozilla events) is over a year old by now and rather optimistic. So far, nothing of that has happened, nor will it ever happen at a scale that actually accommodates most add-on developers.
He wrote that it was already happening a year ago: "at this moment I'm working within Mozilla's Electrolysis team and with other add-on authors, involved in the design of mechanisms and processes helping developers experiment in directions not supported yet by the 'official' WebExtensions API"
I'm not an add-on developer but I've read about it happening in other places too, and I know for a fact that Giorgio is working on Firefox WebExtensions issues in Bugzilla.
> Sorry, but that is just wrong. Certain types of APIs are infeasible to provide, because mozilla is too time- and resource-constrained to invest major resources into spec'ing, implementing and maintaining "niche" APIs.
I don't know how you came to this conclusion. They've been able to maintain XUL up until now, and whatever they end up with in this new API, it'll be cheaper to maintain than the monstrosity that is XUL extensions.
The specification and implementation are done in cooperation with the add-on developers. They can draw from the community here.
And while they haven't reached Chrome parity yet, they have already implemented some additional APIs. You're writing as if one thing would block the other.
I get this conclusion from over a decade of add-on development, and many years of volunteering my time to/for mozilla in various capacities, incl. helping other add-on developers, reviewing add-ons for AMO and fixing bugs within Firefox itself, during which I became quite familiar with the code base and development process of Firefox.
PS: As to maintaining XUL vs WebExtensions API, they always maintained XUL/XPCOM themselves because that's what Firefox itself uses, meaning the entirety of Firefox developers "maintains" that "API". They regularly broke stuff for add-on developers, which was sometimes annoying, sometimes avoidable, and other times just necessary.
Some add-on developers learned to adapt to that; other add-on developers switched to the add-on SDK (which, like WebExtensions, is a limited API, just not Chrome compatible) if it was feasible, and a lot of developers will switch to the WebExtensions API if feasible in the future, or even now.
And what about extensions that have thousands upon thousands of users, still working fine, but haven't been updated in years because the author ran into real life? Is Mozilla going to dump those users on the street and then say, "Well, your author should have contacted us, we don't have time to implement your API now"?
The current Firefox UI is XUL and XBL, tons and heaps of it. Re-implementing it in e.g. HTML is not an easy task, in particular not when you also want the result to look at least somewhat OS-native (it took XUL itself ages to get there, btw). You also cannot do the OS-native look in HTML alone, you need support from the engine, which means servo has some work ahead on that front too.
So for the foreseeable future, servo is not an alternative to replace XUL/XBL anyway. mozilla said that themselves.
Using servo to render the actual web pages inside the browser, that's another matter (once servo is considered stable enough, of course).
To the add-on developer, this should not really make any difference, XUL add-on, SDK add-on or WebExtension alike, as most add-ons will for the most part use the regular DOM APIs, which servo has to provide anyway to interact with web stuff.
I don't know why the parent is voted down; it's a totally reasonable comment.
It's definitely the case that the Firefox UI makes it impossible to move away from the current layout implementation at this point, and moving away would be a massive job. At the same time, the Firefox UI cannot move away from XUL/XBL until extensions no longer have free access to it, so this is definitely a relevant first step.
As for using Servo for web content, it's worth pointing out that not only is it the direction Mozilla is taking (with Project Quantum, and progressively moving components from Servo into Gecko), but it would also be a technical challenge, because web extensions have access to extra APIs and have some security checks disabled.
Firefox always broke stuff, since the very beginning. Major stuff, sometimes for no good reason, sometimes for very good reasons. Add-on developers learned to deal with it. A lot of add-on developers now have an alternative in WebExtensions, which will ease that pain indeed.
But those developers that cannot or will not use WebExtensions (e.g. I will not/can not port most/any of my add-ons incl. DownThemAll!), but probably would manage breaking changes (I did that for more than a decade now), are left in the cold, rainy dark now.
As a Firefox developer, whenever I make a change (and I mean pretty much any change to an API accessible through platform JS), I have two choices:
1/ either try and locate all the add-ons that will be broken, get in touch with their developers, be ignored by most of them, start several weeks of negotiation with those who do answer, then eventually, several months after my code is ready, land the change, and notice that Chrome has landed that same change a few months ago;
2/ ignore the add-on developers, improve Firefox immediately, but certainly break some add-ons, hence breaking the user experience of millions of users for no understandable reason.
As you can imagine, neither solution is good and everybody suffers from either.
WebExtensions condense all the instances of 1/ into a single point in time, hoping that we never again need to go through this painful dance.
Is there a reason both approaches weren't pursued, with the current method marked unsafe but still available for use? A la Rust, mixing code but having to explicitly mark the unsafe bits.
For reasons listed in other comments above, the "we will expand WebExtensions API coverage based on current add-on needs" scares the shit out of me.
I'm having visions of why I quit using Chrome because there was some download UX that could not be altered, and the default Google "one true way" of doing it irked the hell out of me.
I'm pretty sure that there is some such mechanism. At least, I've seen patches related to such a mechanism. I don't remember the specifics and I can't find the documentation right now, but I seem to remember that there was something called the "WebExtensions Extension API" or some such that let developers access existing unstable APIs. It's probably somewhere.
Note that in this case, we're not discussing (only) "unsafe" APIs but also "unstable" APIs that can change from one day to the next without warning.
> 1/ either try and locate all the add-ons that will be broken, get in touch with their developers, be ignored by most of them, start several weeks of negotiation with those who do answer, then eventually, several months after my code is ready, land the change, and notice that Chrome has landed that same change a few months ago
(emphasis mine)
And responding to that emphasis: so what?! People who use Firefox instead of Chrome do so for a reason! You're failing in two ways: 1) "breaking the user experience of millions of users for no understandable reason," and 2) turning Firefox into Chrome.
If Firefox users wanted Chrome, they would already be using Chrome. They don't want Chrome, they just want Firefox to continue being Firefox, to keep working without breaking their add-ons that make their browser theirs!
Why is this so hard for Mozilla to understand?
The day Firefox stops supporting XUL extensions is the day I stop using Firefox. I've been using Firefox since it was Phoenix. I already have Chrome installed--I'm not using it, and there are reasons for that. You should meditate on this.
Mozilla is slowly committing suicide. It's painful to watch, because it doesn't have to be. One of the great forces in the Internet's history is not going to be around in a few years. Firefox will live on, because of the MPL. But what a shame to destroy the organization that brought it into being. Such a waste.
Oh well. If the Internet has taught us anything, it's taught us that, when it comes to software, developers are fungible. The phoenix will rise again.
Presumably he's talking about web features when he says "Chrome landed that change a few months ago". This isn't copying Chrome, this is web compatibility. You need that for websites to work.
> which means that any change anywhere in the code of Firefox breaks some extension accidentally
So to fix this they're going to break every extension.
I think it's a fair assumption, given how long the current extension API has been around, that more extensions will die from disinterest or inability to move to WebExtensions than have been broken by all Firefox releases from 4.0 to date.
I can see where they're coming from though. Today, developers need to create two versions of their extensions - one that works on Safari, Chrome and Opera (and Edge?) and one specifically for Firefox.
As the market share of Firefox is not that high, maybe extension developers eventually won't be bothered with supporting that extra version, which means Firefox extensions will be less up-to-date and there will be fewer of them.
That's probably a trade off they have to make, though I imagine they're aware that some add-ons won't work and that users will be annoyed by that.
I'm sorry, but I hear this all the time--it's the primary argument used for deprecating or removing any functionality in any software. And my response is: so what? Maintaining software is a burden, period.
Software exists to be useful. The APIs in question make it useful. The developers have been maintaining it for nearly 20 years. Their employer receives millions of dollars a year to do so.
They don't want to maintain this "burden" anymore because it's not fun to maintain old code. It's not glamorous. No one becomes a rock star by unloading the bus. But if you remove the baggage compartments, the bus ceases to be useful, and the show doesn't go on.
Chromium is such a cooler project. It was started from scratch (except for the WebKit part), and it's made by Google (which at least used to be cool), and it's got all these modern APIs. They're so great that it only took them 7 years to support resumable HTTP requests[1].
Firefox used to be about the users. Now it's about the developers. (This is not to say that individual developers are selfish, but that the organization as a whole is behaving in a way that disregards the needs of users and prioritizes the desires of the developers.)
1: https://bugs.chromium.org/p/chromium/issues/detail?id=7648. Just look at this cleverness: "In the absence of a crypto::SecureHash object, DownloadFile reads the partial file and calculates the partial hash state in a new crypto::SecureHash object. If a prefix hash value is available, then the hash of the partial file is matched against this prefix hash. A mismatch causes a FILE_HASH_MISMATCH error which in turn causes the download to abandon its partial state and restart." Any other software in the world would just restart the download. I mean, they already give it a special ".crdownload" extension, so it's not like any other program is going to mess with it. But no, they can't just resume the download, they have to make 15 hash checks and pass around 7 different objects and find every possible reason to start the download all over again. What could have been a 5-line patch turned into 7 years of waiting, a dozen revisions across 50 files...
With regard to resumable downloads -- that "cleverness" isn't for the sake of being clever; it's in there to avoid two very common cases where a naïve download resume will corrupt a file:
1. The remote file has changed since the previous download, so "resuming" the download will end up combining two different files.
2. The user's computer crashed during the first download, and some of the data in the partial download was not written to disk properly. (A particularly common case: the last few blocks of the file are zeroed out.)
It's not the same APIs as Chrome though. It's the same base. The Firefox APIs are planned to be a superset of the Chrome ones (so you have some compatibility there), but will also provide the functionality exposed by many of the older APIs.
pentadactyl is co-developed by a member of the Firefox Add-ons core team [1]. I don't know what the plan is to port it to web extensions, but at least you can be sure the team is aware of it ;)
I just migrated a complex extension and with Firefox 52 most features are there. It was a bit painful but the new API is much better IMO. Also, we now have one code base less to maintain.
Didn't Pentadactyl stop working a while ago? For a while I was trying to keep it working, making the necessary changes on every firefox update, but eventually it got too much for me and I switched over to VimFX. At first I didn't think it would be satisfactory, but it actually is a quite good 80% solution. What seemed to be a weakness - the fact that it doesn't aggressively change how the browser works - now seems like a good design decision, making it considerably more robust than Pentadactyl/Vimperator in the face of the changes Mozilla is making, even at the cost of some of Pentadactyl/Vimperator's more advanced functionality.
One of them answered here and IIRC what (s)he said was that they won't cut off the old APIs until every popular extension can be upgraded to the new api.
So it was less a matter of providing a poor API like Chrome's, and more one of creating a newer and better API that also happens to be a superset of Chrome's API.
And this is how Mozilla goes out. Their last remaining reason for existence is being destroyed. They've already gotten rid of Thunderbird, Tab Groups, full themes, and more, but left us with the things nobody wanted (Pocket, Hello, etc.). They've also slowly let performance go in many places, seemingly.
A few years ago, I switched to Firefox for precisely the set of features they've axed. It was also much faster than the alternatives for me, used much less RAM, and rarely locked up. Since then, I've seen performance decline, RAM usage ramp up, and watched the browser lock up nearly once every hour. At this point, I'm probably just gonna go to Chrome. What's the point if Firefox is just gonna be a shitty clone?
I have to disagree, Firefox is in a much better position than a few years ago. RAM usage is lower, e10s works nicely. Tab Groups was spun out into an addon; a sensible decision for a feature extremely few people used. Test Pilot is a better solution for testing features like this.
Very relevant to this thread, addons and plugins are a huge cause of poor performance - I ran into this a while back, where an addon was leaking memory via zombie tabs. Have you tried disabling addons?
Poorly written addons were precisely the reason that firefox got a bad rap in the past. I hope webextensions being more common and restricted will help this out.
If you have significant lock-ups once an hour, you should troubleshoot. Probably one of your add-ons causing the issue. Your RAM usage might very well go down with that as well, and if you disable a few of your add-ons during troubleshooting, chances are also that Electrolysis (multiprocess) will get enabled for you, which should significantly improve performance.
The decline of XUL started around 2009. I spent months building a very complex app around XULRunner, only to find out that Mozilla had no interest in supporting the technology in the future.
So I decided to leave the platform and put my efforts on native and web and frankly it was a good decision.
The WebExtensions API is no match for XUL and XPCOM, but I can only hope that now that we have Mozilla behind the technology, we will see further API improvements.
> I decided to leave the platform and put my efforts on native and web
For the sake of curiosity, did you create extensions as side projects/startup ideas? Do people pay for extensions, or are we talking about Patreon funds here (or startups like Pocket, whose extensions are ways to reach bigger crowds)?
It was a desktop app built on top of XULRunner, and yes, it was for profit and people paid for it. I have no experience with selling extensions, but I would imagine it is a very hard market to crack unless it is targeted towards enterprise customers.
Based on my experience, I can only advise not to base your entire business model on browser extensions as you will become dependent on platform moves (again unless enterprise).
I'd love some figures on whether the main popular extensions will continue working, as well as how many in the Firefox Add-ons site are already web extensions.
I'm curious about my mains:
- Classic theme restorer
- uBlock
- DownThemAll
- Markdown Here
As I'm already using Electrolysis, I know Markdown Here basically doesn't work. How can I check whether things will break or not?
You can consult https://www.arewee10syet.com/ ; it seems authors are mostly making some effort to migrate their extensions. CTR: seems OK; uBlock (uBlock Origin): seems OK; DTA: has a bug, but at least tries to be e10s compatible; Markdown Here: unknown (supposedly not compatible?)
You can try it yourself: either install https://nightly.mozilla.org/ and load it with your addons (it uses a different profile), or use your stable one and look at the about:support page for the "Multiprocess Windows" cell: if it states 1/1 and does not mention "(Disabled by add-ons)", you are covered. You can try some about:config alterations [0] to force it if you are adventurous. In this case, it might be nice to install the "Add-on Compatibility Reporter" [1] for tracking and sharing your findings.
Careful. The e10s transition is a much smaller step than the move to WebExtensions. To wit, notice how (higher up in this thread) the DTA developer once again said that there will be no DTA as a WebExtension.
Oh, thanks for pointing this out. I really mixed up multi-process and WebExtensions and underestimated the main XUL problem (which I had oversimplified for myself as "just a matter of settings pages being rewritten in HTML", which is really wrong [0]).
As I understand it, WebExtensions are inherently multi-process compatible, so (to answer the question below) the only extensions on arewee10syet.com that we know today will still work next year are those marked as "compatible-webextension".
[0] Most of the extensions I use alter the chrome visually or functionally: they call external applications, change hotkeys, etc., so almost none of them are XUL-free. But I still hope/believe Mozilla is not that insane and will reach a consensus that keeps the most important features available to extension authors in the end.
No, what this check means is that the recently introduced multiprocess features of Firefox won't be blocked by Tab Mix Plus. You'll get much better performance if you have no extensions which block that.
The guy seems to be confusing things here, too.
But while this doesn't confirm it, it doesn't confirm the opposite either. So it is very much still possible that Tab Mix Plus gets ported to WebExtensions before the old add-ons get shut off, i.e. you might very well still be able to use it then; there's just currently no way to know yet.
uBlock and Markdown Here also exist for Chrome, so they'll very likely work or be ported.
The DownThemAll devs have been very vocal about "WebExtensions being the likely end of DownThemAll", although I don't know if due to understandable unwillingness to redo a lot of their work or actual technical limitations.
Ironically, the long-term idea of an HTML-based GUI could enable heavy modification again, at least in modified "distributions" of FF? Doesn't solve the now/next year though :/
Can/Do you want to comment on Mozilla's efforts to work with extension developers to bring everything needed to WebExtensions? While the promise sounds great, I would expect there are issues that make it not as easy in reality.
They are trying, I guess, and may succeed in bringing what's required for a bunch of use cases. But certainly not all. Even the add-ons that can somewhat reasonably be ported will have to deal with limitations, and I think the quality of some of those ported add-ons will take a (major) hit.
To elaborate:
If you're doing "web stuff", toolbar buttons, and request stuff (adblocking etc), you'll probably be fine. If you're lucky enough that you only need a few additional new things and your name is e.g. Giorgio of NoScript, you also will be probably fine.
If your add-on does not have a sizable user base, and you do fancy things like modding the browser UI itself, or doing something else not entirely "webby", your outlook is a lot less rosy.
It's still not clear to me whether you actually tried to work with the Mozilla devs or if this is still fueled from your initial (in my opinion rather rushed) statement and some form of false pride.
But yeah, I'd just like to say that, in case you did not do your best to get to a solution with the Mozilla devs, I'd appreciate it if you could try again.
I don't personally use your add-on, but I know that a lot of people do and I think that this initial discourse to get some of the APIs sorted out is rather important.
Even if you don't end up writing the extension, it'd be good to have reasonable APIs at least drafted. So that maybe someone can offer a half-assed Down Them All, which maybe is already good enough for some people, and then maybe that add-on's developer can continue working with Mozilla to get a proper API fully sorted.
You should also consider that Mozilla started out with this "webby" state from Chrome. Anything modifying the UI and so on still needs to be defined. Once they have some of the other UI APIs worked out, it'll be easier for Mozilla to fit new capabilities in there, and then it'll also be easier for Mozilla to work with your API needs.
I have volunteered massive amounts of my time for mozilla for more than a decade now, not only developing. I do not consider my previous statement rushed, but a realistic prediction based on a decent foundation of experience and knowledge, and - rather unfortunately - so far I am right and there is no indication that that's about to change.
I'd really like to share your optimism here, but really, knowing what I know, I just cannot.
I think it's safe to say that many if not most popular extensions will be provided API support and converted over.
I think the real issue is that innovation in the extension space is effectively over - the Firefox extension system is so vibrant because anything was possible. Now Mozilla is reducing the possibility space to that which is provided by the much more limited WebExtensions and adding in support for existing popular extensions. But new extensions, doing things yet undreamed of, will no longer be possible.
I'm not sure what the distinguishing features of Firefox will be moving into the future. More and more, I think it's becoming Just Another Browser, with a better ideology behind it. Well, that's not a winning formula. I'm really disappointed and I don't know what to do other than stay on old versions of Firefox as long as I can.
That article is more than a year old. And Mozilla is going beyond the Chrome APIs; they are implementing shims and additional APIs to port almost everything over.
There is a reply on a review for the new release from September 1, 2016 which says:
"Just to clarify: We did not port it to the WebExtensions API, nor do we think it is possible to port it in any significant manner to the WebExtensions API..."
“After a rather long long time, with many new things to address such as Australis, e10s, limited time we could spend on the project, a little lack of motivation after mozilla required mostly pointless signatures and announced to kill XUL add-ons…” - Nils, September 2016
uBlock Origin was first developed as a Chromium extension, so converting to WebExtensions API is a non-issue. There is currently a dev build for the WebExtension version of uBlock Origin in the latest release.[1]
Mozilla have said they will adapt Web Extensions along with consulting add-on developers in order to bring along the maximum number of add-ons.
For me, if DownThemAll isn't one of them, I'll either stay with the last compatible version of Firefox or, more likely, switch to one of the alternatives that does support it and the others that have become such an integral part of my browsing behaviour.
I've been burned so often by the changes made in recent years that I have next to zero loyalty to them anymore. And I've been using Firefox since the browser was still part of the Mozilla suite.
There may be a lot of components and interests involved in this change.
As a user of, not developer for, Firefox with its extensions, it's this simple for me: The day my existing extensions stop working -- the day neither they nor an alternative under the new paradigm offer equivalent functionality -- is the day I no longer have any reason to stay with Firefox.
I'm not saying, "Stop, wait, a user is unhappy!"
I'm simply saying that, as a perhaps sophisticated user with some more advanced functionality, security, and privacy concerns, the set of extensions I currently use is my only compelling reason for using Firefox over Chrome.
Well, that and Google/Chrome's current trend towards its own version of Embrace, Extend, Extinguish wrapped within a panopticon.
I embrace the browser as the user client. I'm glad to have security issues addressed, but I also require that the browser do what I want, and not what -- a la Chrome, for example -- media interests want to dictate.
I'm watching 2017 with trepidation, and wonder whether and when something may fork. Not unrealistically, though; I'm skeptical any fork is going to have the resources and commitment to maintain a modern, full-featured browser in the face of current and future... err, developments.
Maybe such a fork will, both of desire and of necessity, look instead towards a bit of minimalism. Not doing "everything", but doing core functionality well and securely.
I guess then we'd have to see whether the content farms, with their JavaScript++ and DRM-delivered content, end up leaving such a fork too far out in the cold for it to maintain sufficient interest and/or usefulness to keep itself going.
Thanks for that pointer! I've been off Firefox for the past six months since they broke my private feedreader extension[1]. I'd been using Opera but I'm typing this out on qutebrowser.
You can also try Vivaldi when something doesn't work in qutebrowser, as it has support for single-key keyboard shortcuts. I recently voted for native vim mode in the Vivaldi reddit feature-request thread. Worth checking out.
As an addon developer, this annoys me. I've had to port my addons to different APIs twice in the last couple of years. Hopefully this is the end for a while.
Slightly OT: there is another problem that most people seem to miss. Add-ons in their current form in FF are so tightly coupled to the browser that they effectively are the browser itself.
In the modern web, the browser knows too much about you, and so does the add-on, and it's not unheard of for add-ons to exploit that[1]
How does that free webpage to PDF converter add-on make money? By selling your web history of course.
I hope the "end of 2017" deadline is flexible, depending on what is and isn't supported for current extensions. As people have said in the comments on the Mozilla blog post, Firefox will die without its popular extensions, of which there are many. Even today, Firefox extensions are far ahead of extensions on other browsers in what's available and how well they function.
If I can't use Ghostery, Blend In, or No Resource Leak (or they are watered down), I see no use for Firefox. Please Mozilla, don't kill some of your biggest attractions! You copied the UI from Chrome, now you are copying its add-ons, so why would anyone need to use your product? Aren't you shooting yourself in the foot? When are you going to pull an Opera and just skin Chrome differently?
"By the end of 2017, and with the release of Firefox 57, we’ll move to WebExtensions exclusively, and will stop loading any other extension types on desktop. To help ensure any new extensions work beyond the end of 2017, AMO will stop accepting any new extensions for signing that are not WebExtensions in Firefox 53."
Based on Mozilla's track record, that's too soon. It took about three years after the announcement of the Jetpack API before it worked right.
RSS reader, another RSS reader, dictionary, something for user scripts, domain whitelist, another dictionary, Gmail checker, uBlock Origin. Four of those extensions are my own. NIH much :(
Only accepting WebExtension-based add-ons while not yet fully implementing the Chrome APIs seems like it would lead to a large number of unsupported add-ons.
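To make the compatibility gap concrete, here's a hedged sketch (not from this thread) of how a cross-browser extension typically feature-detects which namespace it's running under: Firefox's WebExtensions expose a promise-based `browser.*` object, while Chrome exposes the callback-based `chrome.*` object. The function name `getExtensionAPI` and the sample objects are made up for illustration.

```javascript
// Pick whichever extension API namespace the host browser provides.
// Firefox: `browser` (promise-based); Chrome/Chromium: `chrome` (callbacks).
function getExtensionAPI(globalObj) {
  if (typeof globalObj.browser !== "undefined") {
    return globalObj.browser; // Firefox WebExtensions
  }
  if (typeof globalObj.chrome !== "undefined") {
    return globalObj.chrome; // Chrome-style API
  }
  return null; // not running inside an extension at all
}

// In a real extension this would be getExtensionAPI(window);
// a stand-in object is used here so the sketch is self-contained.
const api = getExtensionAPI({ chrome: { runtime: { id: "demo" } } });
```

An extension can fall short even after this check, of course: the namespace may exist while a specific API (say, one a download manager needs) is still unimplemented, which is exactly the gap being discussed above.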
There are constant references on forums to how they want to move away from XUL, for both Firefox and Thunderbird. XUL works, but I feel like very few people are able to improve the guts of XUL, and that can be a huge risk for the future of the project. Pale Moon seems to be the main contender willing to maintain XUL, but presumably not to improve it (changes to XUL can break extensions).
As someone working on another free software project with an aging code base, we see this debate all the time: limited funds, limited interest to maintain the old core, but users rely on it. However, users also want performance improvements, more reliability, new features... not to mention what do you do when security issues pop up. At some point you need to cut your losses and find a model easier to maintain.
TamperMonkey is not equivalent to GreaseMonkey in terms of development workflow, because TamperMonkey stores scripts in a database and pretty much requires the use of its built-in editor, which is of "built-in editor" quality. GreaseMonkey, on the other hand, stores scripts in files and therefore allows any editor/IDE to be used in the workflow.
Because scripts are stored in a database, TamperMonkey does not play swimmingly with Git either. Finally, TamperMonkey is a third-party tool that from time to time asks for money. GreaseMonkey is open-source.
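To illustrate the file-based workflow: a GreaseMonkey-style userscript is just a plain `.user.js` file with a metadata comment block, so any editor and Git handle it directly. Everything below (the `@name`, the match pattern, the helper) is a made-up example, not an add-on from this thread.

```javascript
// ==UserScript==
// @name        Example highlighter
// @namespace   example.local
// @include     https://example.com/*
// @version     1.0
// @grant       none
// ==/UserScript==

// Pure helper, kept separate from DOM code so it can be tested
// outside the browser.
function shouldHighlight(text) {
  return text.includes("TODO");
}

// GreaseMonkey injects this file into matching pages; guard the DOM
// work so the same file can also be loaded under Node for testing.
if (typeof document !== "undefined") {
  for (const el of document.querySelectorAll("p")) {
    if (shouldHighlight(el.textContent)) {
      el.style.background = "yellow";
    }
  }
}
```

Since the script lives on disk, editing it in an IDE and committing it to a repository works like any other source file, which is the workflow difference being described above.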
The TamperMonkey EULA:
Your right to use Tampermonkey continues until terminated by the Company, which may terminate this Agreement and your license to use Tampermonkey at any time, without cause and without notice. You may terminate this agreement at any time by uninstalling Tampermonkey. This Agreement will automatically terminate if you fail to comply with any of the terms of this EULA. Upon termination, you agree to stop using and to uninstall Tampermonkey.
Zotero will work the way it does with Chrome now: there will be a plugin that integrates with Zotero Standalone, which will continue to be developed using XULRunner.
I think this will be good in the long run--Zotero is complex enough to merit being its own application, and separating Zotero and browser plugins will make it easier to migrate Zotero off XULRunner at some point, if the developers choose to do so.