I find it sad that you say 'phew' to this. Surely there's a more resilient way to do disposable email that wouldn't be knocked over by one pull request?
I noticed this too. So I changed my payment method from credit card to PayPal. PayPal has an option to cancel subscriptions from its user dashboard, so I cancelled from there.
Actually, in normal usage the word "sideloading" means transferring files laterally. That is to say, not in a client-server relationship.
In the Android world, "sideloading" became parlance for installing applications because historically you used adb from a computer to push the app onto the device.
Did this change? Android-wise, I've always interpreted "sideload" as the adb method... just installing an APK in the UI isn't "sideloading", it's "install from APK" vs "install from Play Store".
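For what it's worth, the two flows look quite different in practice. A rough illustration (assuming a device with USB debugging enabled; the APK name is made up):

    # "sideloading" in the classic sense: pushing an APK from a computer over adb
    adb install ./example-app.apk

Versus simply opening an APK on the device and letting the package installer prompt for permission, with no computer involved. The latter is what most people seem to mean by "sideloading" on Android today.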
The original article only says that a specific, non-standard (to me, at least – it probably made more sense for IT departments) method that Mozilla is calling ‘sideloading’ will be disabled. Per the article, ‘sideloading’ is defined as putting an extension file in a special folder so that all instances of Firefox on the machine automatically load that extension.
The page yorwba linked shows how to generate a signed XPI file that can be distributed and installed locally in a more conventional manner (e.g. by dragging & dropping the file onto Firefox’s Add-ons page).
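For anyone who wants the short version, the flow on that page boils down to something like this (a sketch assuming Mozilla's web-ext tool and AMO API credentials; exact flag names may vary between versions):

    # request signing from AMO without listing the extension publicly;
    # the credentials come from the AMO developer hub
    web-ext sign --api-key="$AMO_JWT_ISSUER" --api-secret="$AMO_JWT_SECRET"

The signed .xpi ends up in ./web-ext-artifacts/ and can then be installed by dragging it onto about:addons.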
Yes, the older method of extension side-loading, supported at some point by Edge, Chrome, and Firefox, was for IT departments who created OS deployment images with software (incl. extensions) burned into them. Usually this was combined with Group Policy / Device Profile settings that made the extensions impossible to deactivate or remove ("force enabled") and potentially blocked any extensions other than those preinstalled.
I believe that all the browser makers have, at this point, reached a consensus that deploying extensions directly as on-disk files this way 1. makes it too easy for malware to set itself up as seemingly "deployed by Group Policy"; and 2. makes it too hard for these extensions to be updated as often as they might need to be, or disabled if the browser maker declares an API they use incompatible, etc.
What all the browser makers seem to favor nowadays, for IT departments who want to do OS-image deployment, is an approach where the burned-in Group Policy just lists a set of extension IDs to be force-installed and force-enabled; the browser itself then does the work of retrieving and installing them (but still treats them like any other extension as it goes through the install process, vetting them for compatibility, upgrading them through their registered update channels, etc.).
This means that every extension in these browsers now has to live in the browser's extension store—even if it's a private, just-for-your-own-company extension. (Which, honestly, there's not much to be said against; the "enterprise deployment" parts of app stores don't usually force developers to go through pre-vetting before new versions are published or anything. It's just cloud hosting—with the proviso that, because it has download logs, the app store can see if your "enterprise extension" has an install profile that looks more like that of a virus than of an enterprise deployment, and then blacklist it.)
Here's Google's documentation on the Group Policy settings that modern Chrome looks for, for comparison: https://support.google.com/chrome/a/answer/7532015?hl=en. Probably Firefox wants to move toward a similar model. And more power to them, honestly; right now, Firefox's extension ecosystem is far too easy a target for malware authors.
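Firefox already has the beginnings of that model in its enterprise policy engine. A sketch of what a force-install policy can look like in policies.json (the add-on ID and URL here are made up; the authoritative schema is Mozilla's policy-templates documentation):

    {
      "policies": {
        "ExtensionSettings": {
          "my-internal-addon@example.com": {
            "installation_mode": "force_installed",
            "install_url": "https://intranet.example.com/addons/my-internal-addon.xpi"
          }
        }
      }
    }

On Windows the same settings can be pushed through Group Policy instead of a distribution/policies.json file.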
From what I know (and I've got quite a few self-developed add-ons running), installing unsigned XPI files has not been allowed for a while. I've been uploading them to my profile in the add-ons developer portal and getting back the signed XPI file. I could be misreading this though, so apologies if I jumped the gun here.
Signing is only required on official release and beta builds -- users on Developer Edition, Nightly, and unbranded builds can opt out of the signing requirement by flipping a setting in about:config.
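Concretely, on those builds it comes down to one pref, set via about:config or a profile's user.js (release and beta builds simply ignore it):

    // only honored on the builds mentioned above
    user_pref("xpinstall.signatures.required", false);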
I worked around this with a script that patches the omni.ja zipfile to flip the setting that enforces signing.
It needs to be unpatched for updates. I don't allow my normal day-to-day Firefox instance to write to its own binary anyway. Firefox tells me when there is an update, and I then restart into my special script, which unpatches omni.ja, runs Firefox with permission to modify itself (under a profile I use only for updates, never for day-to-day browsing), and then patches it again.
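For anyone curious, the core of such a script is small. This is only a sketch: the exact file and flag name inside omni.ja are assumptions on my part and move around between Firefox versions, omni.ja is a slightly non-standard ZIP, and Firefox caches its contents, so keep a backup and expect to clear the startupCache.

    # Hypothetical sketch of the "patch omni.ja" approach described above.
    # Assumed (not verified for any particular version): the signing flag
    # lives in modules/AppConstants.jsm as "MOZ_REQUIRE_SIGNING: true".
    import shutil
    import zipfile

    OMNI = "/usr/lib/firefox/omni.ja"       # path varies by platform
    TARGET = "modules/AppConstants.jsm"     # assumed location of the flag

    shutil.copy2(OMNI, OMNI + ".bak")       # backup, also used to unpatch later

    with zipfile.ZipFile(OMNI + ".bak") as src, \
         zipfile.ZipFile(OMNI, "w", zipfile.ZIP_DEFLATED) as dst:
        for info in src.infolist():
            data = src.read(info.filename)
            if info.filename == TARGET:
                data = data.replace(b"MOZ_REQUIRE_SIGNING: true",
                                    b"MOZ_REQUIRE_SIGNING: false")
            dst.writestr(info, data)

Unpatching before an update is then just restoring the .bak, which is essentially the dance described above.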
Developer Edition gets updates but it's a beta. Unbranded builds aren't betas but don't get updates. There is no version that's just like normal Firefox except without signing enforcement.
Part of the problem is the very delineation between developer and end user.
How about Firefox just be a powerful open platform that anyone can develop for easily with as few roadblocks as possible? We should all be one step away from being developers.
I am so glad I was on the old Firefox 12 years ago. I'm glad that I was able to quickly whip up extensions and share them with my friends. Wanting to automate bypassing my school's wifi captive portal, or wanting to change how the bookmark menu was displayed encouraged me to experiment.
The fact that Mozilla now sees only an audience of consumers that need to be protected from themselves and marketed to is the problem. Mozilla is afraid of their precious precious brand being sullied.
The point of my post was that there should not be such a hard delineation between regular users and developers. A developer edition is not a helpful solution.
What’s the trade-off here? Browsers are trying to protect users against a metric ton of malware trying to exfiltrate login credentials, and the vast majority of users have no clue what an extension even is.
Last week my dad thought he had a virus, but really it was just a BS spam site that he had accidentally allowed to send him notifications. The screen in Chrome to revoke notification access was like 8 clicks deep.
Browser extensions are a powerful, beautiful, dangerous bit of tech. Is it asking too much to put some guard rails in place that really aren’t too much trouble to follow?
> Is it asking too much to put some guard rails in place that really aren’t too much trouble to follow?
No, but that's not what Mozilla is doing. A confirmation prompt is a guardrail. This is a fence.
> Last week my dad thought he had a virus, but really it was just a BS spam site that he had [...] allowed to send him notifications.
That's his own fault. Not an ideal outcome by any means but a private organization has no right to restrict people's freedom just to protect others from themselves.
This would not have protected your father from any of that. If hostile code can inject an extension into your Firefox profile, it can also install a keylogger or read your unencrypted Firefox password store. There is almost no protection against your credentials being exfiltrated. Neither would it protect you against unwanted notifications. It will however greatly reduce the functionality of Firefox.
This is security theater.
> Browser extensions are a powerful, beautiful, dangerous bit of tech. Is it asking too much to put some guard rails in place that really aren’t too much trouble to follow?
There are many layers of guard rails already. The problem is that now they want to also inspect every extension that I use, even if it is for completely private use and will never be available to the public. And Mozilla does not exactly have a good track record with trust.
Probably not. There's a clear conflict between what (some) power users want and what the best/safest choices are for the general public. And, in general, mainstream platforms like Firefox should default to what's best for the general public. (Yes, you could have bypass mechanisms but backdoors like that are more or less asking for trouble.) See also Apple.
Added: It's not clear to me how "hostile to power users" this particular change is. But my general statement still applies.
It is not clear what is best for the general public.
You can aim for the "safest" choice, which seems to be what you are advocating. If so, that needs to be said explicitly, and the implications of the reduced functionality (which increased safety always brings) openly discussed and debated.
The model of claiming "we will do what is best for users" with the freedom to replace "best" with "X" has been thoroughly compromised by the Googles and Facebooks of this world. We need to be explicit about goals and tradeoffs. Not attacking your post, just arguing for being clear and honest about the goals of non-commercial software.
They have those evil metrics about what users are doing.
If 1% of users are getting exposed to malicious software by a feature and 0.0001% of users are using it, it's a little more clear than you say. I've made both of those numbers up, but I think it's likely enough that very few end users are depending on the ability of other software to silently inject plugins.
With the caveat that power users are probably the ones to have turned off metrics. Mozilla is probably aware of it, but it's still hard to compensate for that.
I do agree with you to a point. I do think they have a difficult task trying to balance protecting people from their own ignorance and catering to power users who know exactly what they are doing and desire more freedom.
An option that just occurred to me is being able to start Firefox with a special flag that unlocks some extra options allowing actions that reduce security, such as sideloading.
To help prevent innocent users being coerced into starting Firefox with that flag it could be something like "Firefox.exe -pleasehackme".
Power users would know what the flag is for but even the most naive user might hesitate to start Firefox using a command inviting themselves to be hacked :-)
Probably a stupid idea, but just putting it out there.
I don't think there is any way to enable unsigned extensions on vanilla Firefox, and I don't think they will ever allow it.
Their justification is that if there were some command line option to allow it, then users could be tricked into doing that.
But couldn't the user also be tricked into simply downloading the developer edition? Couldn't the user be tricked into deleting their home directory?
Personally, I find these justifications dubious. There is a kernel of truth that in some edge cases it can offer some protection. But it feels far more like something Google or Apple would do, and Mozilla is either cargo culting them, or has been pressured into doing this.
> The hacker can probably still compile one on their own, but at least it will make it a pain in the ass for them.
I've compiled a branded build of Firefox myself, and it is as simple as setting a single flag at compile time. Almost trivial. The only protection that branding has is legal, not technical. If I tried redistributing the branded build then Mozilla might sue me. Do you think they will be able to sue malware authors when they do it?
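For context, the flag in question is a one-line mozconfig option (from memory, so double-check against the build docs):

    # mozconfig: build with the official Firefox name and artwork
    # instead of the Nightly/unbranded identity
    ac_add_options --enable-official-branding

Everything else about the build is identical; only the trademarked name and artwork change, which is exactly why the protection is legal rather than technical.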
There is absolutely nothing stopping a hacker from replacing or patching Firefox.exe with a branded version that will run their hostile extension. Even if they do not have write access to Firefox.exe, they can download it somewhere else and change where the shortcut points to. It would be almost impossible to tell the difference.
This is not a serious security measure.
But I think you are missing the bigger point here. If they can write to files on your computer, then it is far too late. They can encrypt and ransomware your documents, they can install a keylogger, and they may be able to extract all of your passwords and cookies from Chrome and Firefox.
It would be like if someone stole your car, but at least they don't have the keys to the glove compartment.
Not sure if that is what they do, but on many platforms there is also code signing. E.g., even if you could trick someone into downloading your patched/hacked version of Firefox, I believe they'd get a warning on Windows that the software is unsigned.
That is a fine idea. We can make Firefox as safe as they want at startup. Just keep it as a default -- something power users can turn off -- and don't make it a hardcoded choice "because the users turning it off may not know what they are doing". Inform, don't restrict. My 2c.
Yeah, exactly. Malware could also just delete vanilla Firefox and replace it with the developer edition. Or just overlay ads over the browser window itself. Or anything else really.
Trying to protect against hostile code already running on the same computer as the browser is futile. At best, it should warn the user if suspicious modifications were made.
And it comes at such a high cost for such a narrow measure of protection.
Yes, I'd argue "safest" is usually the best default for the general public. My reasoning is that a bit more fine-grained control, performance, choice, etc. doesn't have much upside for most people whereas getting compromised can be a very big downside.
Perhaps Firefox hasn't been very clear on its tradeoffs but Apple, for example, seems to have been IMO. (Yes, it tends to be wrapped in marketing rather than tech speak but I think they've been pretty consistent about their priorities.)
I personally have no objections (usually not even strong feelings) to setting defaults, as long as that's what they are -- default choices that users can change if they see fit.
But an option that cannot be changed is not a default choice -- it is a hard coded design decision. My 2c.
And so is, for example, Android. But due to some (shady) tactics, a successful fork of either is impossible for a small group. Saying to a hungry kid in a favela "you are free to get rich" is technically correct, but ignores practicalities.
You could argue that Mozilla is either a shepherd (making best decisions for most users and that's it; take it or leave it) or a partner (listen to users and try to implement features they want) but you cannot eat your cake and have it too.
What makes it nearly impossible to fork Firefox? Mozilla already distributes an unbranded "fork" that allows unsigned extensions. It's just a build flag away. What potentially shady tactics am I missing?
What, specifically, is the feature you think Firefox is removing the option to have, that you think people want?
I suspect that if we drill down to this point, you'll find the feature is still available with a trivial amount of work -- or, in the worst case, if it's the more specific thing they actually talked about, it's not a trivial amount of work, but it's simply no longer automated for enterprise installs.
I also think the discussion moved to a more generic "safe default with options for users to change" vs "choices hard-coded based on perceived safety for general users".
> Manually installing extensions from an XPI file.
Everything I see indicates that is still supported, especially if you read the comments on the post. XPI files seem to be supported; what they removed was the ability to have an executable installer install them automatically.
The change, which seems to have been communicated poorly, is that some action needs to take place within Firefox (the user opting in to the extension) before Firefox will use it.
This comment[1] from Caitlin Neiman, who I just looked up on Google and who appears to be the Firefox Add-ons Community Manager, states:
> Developers will still be able to self-distribute, and you will still be able to install extensions from self-distributed (non-AMO) sources.
> Going forward, developers won’t be able to distribute an extension through an executable application installer.
> I also think the discussion moved to a more generic "safe default with options for users to change" vs "choices hard-coded based on perceived safety for general users".
I'm pretty sure what they did is not hard-coding choices. There are multiple ways to get an extension installed locally; the hardest, which still works no matter what, is installing the developer version and loading it from source (XPI files can be unpacked).
What they actually did was lock down one installation method that is almost never used by individual users but is used by orgs and malware: dropping an extension into the extensions folder (named after the extension) and having it automatically added to Firefox. Now they require that it be added through the Firefox browser interface, so the user has to opt in to the extension.
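To make that concrete, the locked-down mechanism was essentially "drop a correctly named XPI into a magic directory". The paths below are from memory and vary by OS and Firefox version, so treat them as illustrative only:

    # file name must match the add-on ID
    C:\Program Files\Mozilla Firefox\distribution\extensions\myaddon@example.com.xpi
    /usr/lib/mozilla/extensions/{ec8030f7-c20a-464f-9b0e-13a3a9e97384}/myaddon@example.com.xpi

Firefox used to pick these up on its own at startup; after this change the user confirms the install inside the browser instead.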
Do you really consider being able to provision your own profile a backdoor? I do it all the time. Do you think that Apple is a good example of what to do?
Having a single entity control what extensions you get to install is not what is best for the general public.
Personally, I think that Firefox should be the more flexible extensible alternative to Chrome. It should be to Chrome as Linux is to Windows.
They need to stop chasing the demographic that is going to hurt themselves.
> what the best/safest choices are for the general public.
Sometimes I think tech people forget 'the general public' is made up of adults capable of learning and making their own choices if educated properly on matters.
Sure. Most adults can learn lots of things. Most of us only care enough and have time enough to educate ourselves on a very small subset of the tools we use and tasks we perform on a daily basis. By and large, I trust the tools I use out of the box to be safe (subject to what common sense typical functional adults would be expected to have).
It is often reasonable to provide the option to use tools in ways that are less safe. Sometimes that's unavoidable. You can certainly use a chainsaw unsafely although I also think it's perfectly reasonable for chainsaw manufacturers to not provide easy ways to defeat the various safety mechanisms that are built in.
So you'd only buy a chainsaw with a codepad entry where you have to contact the manufacturer and ask them 'can I cut this log'; ooh, so much safer! But where does that leave the people who have been buying that brand of chainsaw for a decade or more and don't want the restrictions?
"Just remove the codepad" -- well yes, so long as the manufacturer doesn't weld it on... oh, and they just decided to weld it on. Such safety, much wow.
How many topics do you have time and energy to be “educated properly” on? What are the odds that learning about the issue would lead to a different outcome than what Mozilla is doing? Firefox's openness means it attracts a lot of attention from people who have relatively uncommon preferences but believe they speak for a large group of users, and that group tends to portray every disagreement as a betrayal rather than a reasoned decision.
I can properly educate myself on many, many things, especially over time. A major issue is correctly utilizing available time. In developed countries today, many "adults" would rather direct their time at very low-value activities, like having children and / or pet slaves, watching disgusting amounts of Netflix, wasting countless hours on Facebook ranting about personal issues or envying others, taking unnecessary and unproductive trips, etc.
While lots of readers will dislike this comment, are not ready to digest it, and will claim that I am wasting my time writing this comment, I think as a society we should prioritize creating competent humans who can appropriately use technology, whereas this move by Mozilla is just more of the same, prioritizing mindless consumers who have no idea how any of the things that their lives increasingly rely on function.
I think such consumers should be left to fend for themselves, as that is the only way a majority of people actually learn. This is especially true for Firefox, which has never been a "consumer" browser, and has always been a browser for people who wish to understand how it works and be able to modify it in a straightforward manner. If someone desires a browser that does whatever it wants without asking, Chrome is a perfectly fine option.
In the end Firefox will anger power users, and the general public will still use Chrome because they can't resist it when it's advertised from every corner (and, honestly, it just works better).
Power users should probably use unbranded versions of the browser that don't have these issues. AIUI, such versions are widely available, even from Mozilla themselves.
Chrome allows installing local extensions from a source folder, and the installation persists across browser sessions. Firefox extensions installed from a local source folder in about:debugging are removed at the end of the browser session.
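For reference, the Chrome flow is chrome://extensions, enable Developer mode, then "Load unpacked" on a folder containing a manifest. Something as minimal as this made-up example is enough to load, and it persists across restarts:

    {
      "manifest_version": 3,
      "name": "Local test extension",
      "version": "0.1"
    }

Firefox's about:debugging "Load Temporary Add-on" works by pointing it at the manifest.json inside a similar folder (or at an XPI), but as noted it only lasts until the browser is closed.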
The chrome "toolbars" are a joke. Especially this one.
It makes all pages have a margin on the top (that flickers in)
and then the toolbar appears several seconds later (after the webpage loads)