Am I the only one annoyed by people pushing this "they flicked a switch" narrative? No. They provided shitty software that didn't work under certain conditions (in this case date related) and thus broke your shitty software.
A third party having remote control capability is something entirely different.
Some other people are also upset that the Normandy system has the extension-pushing capability it does, and that it is apparently on by default.
These both seem like reasonable concerns that are unrelated to the expiry but highlighted by it.
One or both would need to be true for a normal revocation to have any effect.
Mozilla, you've really been testing my patience for the last two years.
What they set up was essentially a dead man's switch, though.
> A third party having remote control capability is something entirely different.
They’re pushing fixes through a telemetry channel which can also be used to change your preferences, so...
"So you and your team just released a tested and reviewed build, did gradual rollout and numbers looked good - so out to general public it goes. All clients update over the following days. QA discovers, upon testing the next release, a latent bug that prevents the current version from connecting to the update service and running the updater. How do you resolve this?"
E.g., if it was just a typo in the domain name, just register the mistyped domain name or adjust the DNS record ;-)
1) Has a functioning update service that fixes the original problem
2) A quick re-skin/theme change to, e.g., a more "modern" version of its original design
3) Whatever performance/security improvements you can cobble together quickly or were in the pipe already
I would do this because I see this as much a business/finance/sales/marketing problem as a technical one. Assuming this bricked the core product, you have to save the business first, then save the quarter, then the week, then the day.
I think this would work because it is much easier to get a user to re-download a version 2.0 with performance/security/design improvements than it is to get them to download almost anything else (imo). Again, this is assuming no saner solution exists. Interested to hear your accepted answer, though.
Perhaps it's good from the company POV because that's how you make sure your users migrate to the new version even if it is still inferior/buggier.
Most applications have some data that comes from a remote server, whether it's on the home screen, any kind of announcements screen, server-sourced error messages, etc.
I would use that to communicate an update is available, with a handy link.
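A minimal sketch of the idea (the endpoint payload and field names here are hypothetical, not any real app's API): the client already parses an announcements feed, so the server can flag one entry as an out-of-band update notice.

```python
# Sketch: reuse an existing server-sourced announcements feed to tell users
# that an out-of-band fix is available. The JSON shape below is made up.

def find_update_notice(announcements):
    """Return (message, link) for the first announcement flagged as an update, or None."""
    for item in announcements:
        if item.get("type") == "update-available":
            return item.get("message", "An update is available."), item.get("link")
    return None

# Example payload as the broken client might receive it:
payload = [
    {"type": "promo", "message": "Spring sale!"},
    {"type": "update-available",
     "message": "The built-in updater is broken in this version. Please reinstall.",
     "link": "https://example.com/download"},
]

notice = find_update_notice(payload)
if notice:
    message, link = notice
    print(f"{message} -> {link}")
```

The point is that no new infrastructure is needed: any channel the app already trusts for display content can double as the "go download the fixed build manually" signal.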
So how'd I do in the interview? ;)
I don’t know if they found a fix in the end.
I will never buy another Samsung set again. Shipping a TV where HDMI access depends on a proprietary box is unconscionable.
Storytime: App devs built their app with the debug flag on. (Why would anyone use CI anyways?)
The debug flag is used for the local developer builds.
All API endpoints were pointing to dev/staging. In order to make this fuckup go unnoticed (otherwise customer retention would be awful, plus possible media coverage), the dev/staging endpoints were redirected to production, so that they would take the real orders from the broken builds. But we could only do this for our own APIs, not for our payment providers'. So we had to ship product without getting paid. :|
TBH it’s not hard for me to think of ways Mozilla could have preserved user control even in the face of these constraints though. For instance, I already use a master password to encrypt/decrypt the password database in Firefox. Why not allow the master password to also protect the settings and store them encrypted so a third party can’t modify them?
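A rough sketch of that idea using only stdlib primitives (Firefox's actual master-password machinery works differently; this is just the shape of it): derive a key from the master password with PBKDF2, then MAC the settings blob, so an outside process that rewrites the settings file can't produce a valid tag.

```python
import hashlib, hmac, json, os

def protect_settings(master_password: str, settings: dict, salt: bytes) -> bytes:
    """Serialize settings and append an HMAC tag keyed off the master password."""
    key = hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, 200_000)
    blob = json.dumps(settings, sort_keys=True).encode()
    tag = hmac.new(key, blob, hashlib.sha256).hexdigest().encode()
    return blob + b"\n" + tag

def load_settings(master_password: str, data: bytes, salt: bytes) -> dict:
    """Verify the tag before trusting the settings; raise if tampered with."""
    key = hashlib.pbkdf2_hmac("sha256", master_password.encode(), salt, 200_000)
    blob, tag = data.rsplit(b"\n", 1)
    expected = hmac.new(key, blob, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("settings were modified outside the browser")
    return json.loads(blob)

salt = os.urandom(16)
data = protect_settings("hunter2", {"xpinstall.signatures.required": True}, salt)
print(load_settings("hunter2", data, salt))
```

This only gives tamper-evidence, not secrecy, and it can't stop an attacker who patches the binary itself, but it would stop the "crapware silently flips a pref" case that motivated the lockdown.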
If the crapware uninstalls Firefox, installs Firefox Dev Edition, and installs the unsigned plugin, there's not much of a "gray" defence there.
People, including myself, were arguing against putting Mozilla in a position where they are a single point of failure. That critique was countered with the specific argument that giving users an override is not acceptable, because any override could also be used by adware, and thus Mozilla must be the sole arbiter of what add-ons can be installed.
Under the Unix security model, for example, that would be pretty likely: your profile is owned by your account, while the executable and its containing folder are owned by root with go-w (no group/other write), so code running as you can tamper with the former but not the latter.
Or just inject code into the browser at runtime.
Under the Unix security model, the UID is the permission boundary. Even if the binary is owned by root, it inherits the user's UID when they run it.
Regardless, as I said in my comment, I only mentioned the Unix model "for example" - not to "shift the goalposts", just because I'm more familiar with it than with whatever anti-binary-tampering measures Windows may have.
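The permission boundary being debated can be made concrete with a toy model of the classic Unix owner/group/other write check (deliberately ignoring the root override, ACLs, and capabilities that complicate the real kernel logic):

```python
import stat

def may_write(uid: int, gids: set, st_uid: int, st_gid: int, mode: int) -> bool:
    """Toy Unix write check: owner bits if you own it, else group, else other.
    Ignores root override, ACLs, and capabilities on purpose."""
    if uid == st_uid:
        return bool(mode & stat.S_IWUSR)
    if st_gid in gids:
        return bool(mode & stat.S_IWGRP)
    return bool(mode & stat.S_IWOTH)

# Your profile: owned by you (say uid 1000), mode 0700 -> your code can write it.
print(may_write(1000, {1000}, 1000, 1000, 0o700))  # True
# The browser binary: owned by root, mode 0755 (go-w) -> uid 1000 cannot write it.
print(may_write(1000, {1000}, 0, 0, 0o755))        # False
```

Both comments above are right about their half: the profile is writable by anything running as you, while the root-owned binary is not, but a same-UID process can still inject code into the running browser without touching the binary at all.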
> By baking the signing requirement into the executable these programs will either have to submit to our review process or take the blatant malware step of replacing or altering Firefox. We are sure some will take that step, but it won’t be an attractive option for a Fortune 500 plugin vendor, popular download sites, or the laptop vendor involved in distributing Superfish. For the ones who do, we hope that modifying another program’s executable code is blatant enough that security software vendors will take action and stop letting these programs hide behind terms buried in their user-hostile EULAs.
I haven’t heard of this actually happening.
In other words, it’s not a purely technical solution, it’s a political solution, and the success or failure of such a political solution can only be judged by real world results rather than technical possibility.
This feature uses SHA-1. 
> Why not allow the master password to also protect the settings and store them encrypted so a third party can’t modify them?
It cannot protect against that. It can protect against a household adversary such as a 4-year-old, and that's about it.
Of course if Mozilla had wanted to secure the config settings they could just fix the master password hashing functionality at the same time.
Ordinary users would have 5 layers of toolbars, installed by anti-virus apps, etc.!
Locking down can backfire, but don't think this wasn't done in the interest of most users.
1. Type about:config in the URL bar
2. Change xpinstall.signatures.required from true to false
3. Restart Firefox
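The same pref can be persisted across restarts via a `user.js` file in the profile directory, which Firefox reads at startup (note: this particular pref is reportedly honored only in Developer Edition, Nightly, and ESR, not in release builds). A small sketch, using a temp directory as a stand-in for a real profile:

```python
import tempfile
from pathlib import Path

def pref_line(name: str, value) -> str:
    """Render a pref in the user_pref() syntax Firefox expects in user.js."""
    js_value = "true" if value is True else "false" if value is False else repr(value)
    return f'user_pref("{name}", {js_value});\n'

def set_pref(profile_dir: Path, name: str, value) -> None:
    """Append the pref to user.js in the given profile directory."""
    with (profile_dir / "user.js").open("a") as f:
        f.write(pref_line(name, value))

profile = Path(tempfile.mkdtemp())  # stand-in for a real Firefox profile dir
set_pref(profile, "xpinstall.signatures.required", False)
print((profile / "user.js").read_text())
```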
...which is to say, it's a trade-off, and though you can disagree with the weight Mozilla assigned to the pros and cons, their reasoning was valid.
> remotely disable Tor anonymity protections [!]
Maybe a 'certificate is expiring soon' warning on the browser side?
Users would know and Mozilla would be forced to keep certs valid.
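A sketch of the client-side half (just the date math, via a stdlib helper; a real check would run against the signing chain Firefox actually ships, not a string literal):

```python
import ssl

WARN_DAYS = 30

def days_until_expiry(not_after: str, now: float) -> float:
    """not_after is a cert date in the GMT string format getpeercert() returns."""
    return (ssl.cert_time_to_seconds(not_after) - now) / 86400

def expiry_warning(not_after: str, now: float):
    """Return a warning string if the cert is expired or expiring soon, else None."""
    days = days_until_expiry(not_after, now)
    if days < 0:
        return f"Signing certificate EXPIRED {-days:.0f} days ago!"
    if days < WARN_DAYS:
        return f"Signing certificate expires in {days:.0f} days - update soon."
    return None

# An intermediate expiring May 4 (as in the 2019 incident), checked a week out:
now = ssl.cert_time_to_seconds("Apr 27 00:00:00 2019 GMT")
print(expiry_warning("May 04 00:00:00 2019 GMT", now))
```

A browser-side nag like this would have surfaced the problem weeks before the certificate actually lapsed, instead of at the moment every add-on got disabled.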
This whole thing is also yet another reason to use something like Tails or Whonix which are much safer to use with JS enabled.
Use curl -H "" and links --dump. Until we have something sane, that's really the only safe way to browse Tor if you actually have something to hide.
(There's dillo and other things, like the absolutely horrifying browser I've been writing, but that hardly even supports forms right now, heh. But I'm worried about connecting stuff like that to something like Tor.)
People with heightened privacy concerns can always use Whonix or Tails instead.
"Hey Mozilla - this is why people said that forcing addons to be signed with no way to disable was a bad idea. You didn't even make it a year without screwing it up.
Who could have seen this coming? Oh wait, pretty much everyone who argued against this policy."
This solved that issue.
This isn't a binary proposition: they could have solved that issue while respecting user choices, if it had been a priority. The solution they chose was inherently flawed. Hopefully this gives other providers pause before they build similar things.
You don't need to "buy" this idea - we're talking about free and open source software. If you passionately believe it's insecure, you would do everyone a great service by explicitly pointing out how it is so in the source code, rather than peddle FUD.
Tor is NOT good at keeping you private from a government. It is good at keeping you private from your ISP or employer, however.
Don't do anything illegal using Tor while thinking you are safe.
This is a bold claim. Would you mind backing it up?
The reason it's still a valid criticism is that people suggest that Tor has this resistance in threads like this one. :)
(Of course, you don't need to monitor the whole Internet if it's intra-country traffic that you're interested in.)
Tor is a beacon
This is not a list of the IP addresses of the onion sites, nor of the users. In addition, exit nodes are not used when you connect to an onion site.
In particular, the Silk Road shutdown, and various pedophile rings successfully de-anonymized with their operators arrested. There is good evidence that a powerful enough entity (especially one spanning multiple geographic regions) can successfully de-anonymize Tor traffic.
If you keep a low profile and just use Tor to browse the regular internet privately, you are fine. But if you engage in wide-scale illegal activity, you will most likely pay the price.
Why would it be better to connect directly to a site, rather than force them to expend resources to deanonymize my connection?
We already know that some governments use drug money to pay for clandestine / black ops.
It has an asymmetric architecture biased toward a 'home network' infiltrating a 'hostile network'. So Tor is perfect for military intelligence officers working from a 'safe network' (one that doesn't zero in on Tor protocol beacons) to pop up in some other network and do their job.