I struggle to see how they’re legally different from malvertising: perhaps there are some vague terms of service about data or advertising (which also exist for malvertising! you agreed to the ads!), but at no point was there a “meeting of the minds” in which Facebook was authorized to bypass security mechanisms to exfiltrate data from the system.
I don’t see how Facebook’s behavior is anything but classic “hacking”, and a willful abuse of the CFAA.
I think it’s time we started addressing gigacriminals like Zuckerberg: we need to stop pandering to people who commit millions of crimes a day, at “web scale”, and just use force.
If Facebook doesn’t want to get the memo that wanton criminality isn’t okay, then it’s time to use force to correct the behavior.
I use gigacriminal in the technical sense: I believe Mark Zuckerberg, as the leader of an organization, has ordered over 1 billion criminal acts to be committed for his profit.
Also, while I think Zuck is devious (based on the history of FB), I think Sandberg is the brains behind the worst of FB.
I would be genuinely curious to hear, via throwaway accounts if need be, about how FB staff rationalise things like this happening.
Do you shrug it off as not a big deal in the long run? As FB still doing a net amount of good versus what you perceive as isolated incidents like this? I'm just in good faith trying to figure out how people willingly work and continue to work for outfits that repeatedly engage in behaviour such as this. I know there are lots of speculative reasons we can put forward, but I think we have a great opportunity here in our community to have first-hand input.
I worked with a group at Facebook, and I almost refused to take the project on. I'm the type that has deleted my Facebook and uses a blocker to stop their tracking, and when I showed up to work with the team, they were surprised that I didn't have an account.
From what I could tell, the teams are fairly isolated and thus can't see the forest for the trees. When someone points to an article like this, they seem to just shrug it off and assume the author got it wrong because it doesn't seem that way from the inside (again, they only work in isolated teams, but still think they have enough of an insider's perspective to discount it). Even huge companies we now know were bad had a ton of employees, like Enron.
I would really like the perspective of someone "on the inside". Facebook is one of the companies I trust the least with my data, yet they have so much talent that I can't help but wonder how they convinced them to work there (is it just the money?).
First, obviously there is the money factor, you may choose to ignore it but it is a big factor for many.
Second, the tech is truly state of the art and it is good experience/skill to have/pickup.
Third, I'd say almost everyone passing judgement on people taking up such jobs also judges having such jobs on a resume as a positive. That drives one's value as a candidate up even after such a job.
If people care so much, why don't they provide an incentive? Would you hire someone who turned down such a job over someone who gained experience in it? No one has ever asked me, in any interview, which job offers I turned down and why.
Fourth, almost every big company has its scandals, so how does one decide which ideal is worth giving up a job offer for? Is big finance ok? Is big pharma ok? Is big tech ok? Is big anything ok? Is working on open source inside such a company ok, say open source software others use at their jobs? Is a startup using questionable practices to get to the next level ok? Define what's ok according to you, and explain why you expect it to be the same for everyone else.
I'd wager that for any area you pick - they probably have a lot of high-end technology relating to image/video storage, data replication, machine learning, and network-layer infrastructure - there are other more morally and ethically sound places where engineers could learn and apply that same knowledge.
We're already reaching the point where working for toxic companies is considered a negative during resume review; I won't provide any such examples here but the bay area tech scene is full of examples of environments where being a former employee at a company can at least warrant raised eyebrows.
Scandals may occur; what matters is how the organization responds to them. And yes it's certainly acceptable to leave an organization if you're not happy with the way it has handled such situations.
The only point I find difficult to disagree with in your comment is the monetary motivation.
Say data at scale, petabytes of data for example. I'd be curious to know if you can name all companies that have this scale of data and are morally acceptable to you. :) Google? Amazon?
> Scandals may occur; what matters is how the organization responds to them. And yes it's certainly acceptable to leave an organization if you're not happy with the way it has handled such situations.
While I can see your point of view, as an engineer you can find other opportunities that may not be as lucrative but are still comparably good. But I also find it hard to accept that it's the engineers who regularly get this judgement on HN while users and shareholders get a free pass. A scandal surfaces, repeatedly; users and shareholders don't care; nothing changes; and for some reason that's ok, while engineers are expected to be the moral compass. I wonder how many of those judging here use Instagram/WhatsApp/FB and/or own stock in such companies, perhaps even have family and friends who continue to use these services. But I guess it's easier to judge strangers and expect them to behave a certain way instead.
It'd be an interesting discussion to have with someone who feels like they really need to stay at the very peak of private data accumulation - because in my view those actions are potentially very detrimental to wider society, certainly depending on the culture. I'd extend more respect to Google than the others from what I've seen, although opinions may vary elsewhere.
Regarding scandals and reactions - users and shareholders can and do care, and they vote with their feet, or wallets, or ideally both.
The battlefield in these cases is over how much truth about the scandal and resolution are published. A good organization will generally tend towards more transparency in both, while perhaps keeping a few cards close so that they can react to any potential retaliation (such is the world of rapid fake news that we live in).
Edit: s/data accumulation/private data accumulation/
Yes, I would. At this point, willingly taking a job at Facebook is a bit of a red flag about the potential employee's ethics.
My impression was that many employees hold a self-contradictory view about the extent of their influence at the company. When asked about their jobs, they tell you that they're working hard on fixing the problem and making impact ("where better to fix it than from inside?"). But when confronted w/stories like Onavo, they get defensive because "it's a big company, I had no way of knowing." Which is fair, honestly; the problem is that they think they can fix anything in the first place. Part of the problem is that FB advertises itself internally as being super transparent but it isn't at all. (This applies mostly to product/data+ML people. The infra folks I worked with for the most part just want to make their money and go home.)
A lot of longtime employees joined when FB was good and amazing in the media, and it's hard for them to accept that it's really gone in a bad direction. A lot of younger ones join for the money, and/or because they're coming from FB's massive, culty college intern pipelines (especially if you come out of FBU) and confuse being dazzled by the perks with actually believing in the mission. The money is big for everyone (I was there for part of the long 2018 stretch where the stock price just fell and fell, and you could feel people getting antsy), and the defensiveness that comes from constantly seeing negative news is another part. Lots of blame thrown around internally (leakers, leadership, bad eng practices) but little responsibility; lots of sunk cost fallacy-ish thinking ("we all took jobs here for a reason, we can't just give up and leave").
Also, that twitter user is a former senior leader at the FTC, and claims something here to be "textbook @ftc deception": https://twitter.com/ashk4n/status/1099164648379580416
Is the root cause that they migrated off of email and did all their sensitive discussions in "internal tools" with no actual data retention enforcement? If so, that seems quite ironic.
"Most full-time employees receive RSUs (restricted stock units) which are shares that become sellable on a set schedule over four years. "
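For a rough sense of the mechanics described in that quote, here's a minimal sketch of a four-year RSU vesting schedule. The quarterly cadence, equal tranches, and grant size are assumptions for illustration; real grants vary in cliff terms and cadence.

```python
def vesting_schedule(total_units, years=4, tranches_per_year=4):
    """Return the number of units that become sellable at each vesting date.

    Assumes equal tranches over the whole period (hypothetical terms;
    real grants often include a one-year cliff or different cadences).
    """
    n = years * tranches_per_year
    base = total_units // n
    schedule = [base] * n
    # Put any rounding remainder in the final tranche so totals add up.
    schedule[-1] += total_units - base * n
    return schedule

schedule = vesting_schedule(4800)
print(len(schedule))   # 16 quarterly tranches
print(schedule[0])     # 300 units vest per tranche
print(sum(schedule))   # 4800 units in total
```

The point of the "set schedule" is exactly this drip: most of the grant's value is always still locked up in future tranches, which is part of the retention dynamic discussed in this thread.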
are we (engineers) really that indifferent to what organizations we support do ?
I probably won't ever work at Facebook (I have all of their services blocked in my hosts file, I've deleted my account, and they aren't working on anything interesting to me), but I'm just trying to play devil's advocate and paint a picture of why someone would choose to work there.
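For anyone curious, hosts-file blocking just means null-routing the domains locally. A minimal illustrative fragment might look like the following (the domain list here is an example, not an exhaustive list of Facebook-owned domains):

```
# /etc/hosts — illustrative entries only; not exhaustive
0.0.0.0  facebook.com
0.0.0.0  www.facebook.com
0.0.0.0  graph.facebook.com
0.0.0.0  connect.facebook.net
0.0.0.0  instagram.com
```

This stops the OS resolver from reaching those hosts, though it won't catch domains Facebook adds later, so people usually pull from a maintained blocklist instead of curating by hand.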
Disclaimer I work at Amazon.
Observationally, ethics does not even enter the equation. Career-driven students, especially those from backgrounds where name-brand prestige is important, just want such a name brand on their CV.
Not that faculty are any better given how many academics effectively sign away their lab to FB through 'collaborations'. All they see is resources to use, name-brand recognition, a big personal pay-check, and publicity for their work.
What I find most mystifying is that this is a literal non-topic. At best someone may forward a blogpost about a security leak and make a snarky comment, but I have never witnessed any political discussion.
From my small sample, there's a strong correlation between working in ethically questionable organisations and having come through academia.
It's the opposite for most of my self-trained peers.
Also, someone grinding through a rigorous CS degree might want the biggest ROI, versus a self-taught type who is happy to take what they can get and still gets paid well without the debt of a CS degree.
I don't think we can draw any conclusions and to state the obvious, correlation =/ causation.
Some time ago, a study was performed in my home country to look into possible correlations between academic performance and work performance among doctors. As it turned out, the best performing doctors were not those with top grades, at least not before entering medical school.
Those who did best had left high school with adequate grades, but not good enough to get into medical school. Rather, they had spent time and effort with supplementary studies to get their grades up to the level where they could apply for medical school.
The possible explanation that was presented was that some of those who had great grades straight out of school simply chose to become doctors because of the promise of prestige and remuneration. Those who didn't, but still fought their way into medical school, however, had a calling beyond money and status.
An undergrad tells me they are going to FB for an internship. I am not their tutor or mentor beyond teaching. They have gone through stress and trouble for their internship, and it's too late to get something else for the year.
The only thing I can do in this instance (where there is no chance at all they would give up their internship to do nothing) is make them feel bad about it. Uncomfortable situation.
Honestly, I think the department should step up and disinvite FB from campus events and not allow them to advertise. It should also hold courses on ethical impact of technology and discuss a few cases of misuse.
Also, people generally believe it's quite possible to work on good things in a company that also does bad things. (And many of them indeed will.) So it's not the most compelling argument that you shouldn't work with X/Y/Z because they did bad thing W.
Mentors, professors and friends can have a lot of influence, which (imo) could be a moral and ethical obligation one may like to exercise? If you were receiving advice or learning as a junior and looking for direction from a mentor (or any person you feel you can learn from and also trust), wouldn't you appreciate hearing their personal opinion on a subject, and how they arrived at their beliefs?
> Also, people generally believe it's quite possible to work on good things in a company that also does bad things. (And many of them indeed will.) So it's not the most compelling argument that you shouldn't work with X/Y/Z because they did bad thing W.
It's difficult if not impossible to convince somebody who just passed their first interview with a company to refuse the offer on ethical grounds. Especially if they've never had a job, they might say "I'll do it anyway and see for myself; I can still bail if it's that bad."
But employees already working there have more power to change things from within. I think this is why the point above is valid: everyone has mentors, so speaking up (without judgement) is key. Only by changing things from the inside is it possible to have a dialogue about impact on environment/society, and only by talking about it will we eventually abolish the practice of labeling any such discussion as anti-profit or social-justice seeking. It does affect how the company/brand will be perceived in the long run.
This is particularly important for young professionals, especially fresh graduates who enrolled in CS/CompEng out of sheer enthusiasm for technology. They need to hear this stuff!
Why? Because, while we're wondering whether or not ethics is even something that we should bring up in a discussion like this, Facebook has PR and recruiting departments full of smart people who are actively working on getting these folks on board.
And when you've spent the last four years of your life studying a highly-competitive field, in which there's barely any room for the study of philosophy, ethics and humanities, it's pretty hard to figure out this stuff on your own.
So damn right tell friends/students their next job is at an unethical company. I mentor interns every year, and whenever one of them asks me about companies like Facebook or Google, I absolutely tell them that I would never work there. I tell their recruiters the same thing. I tell my friends from outside the tech world the same thing.
> Also, people generally believe it's quite possible to work on good things in a company that also does bad things.
The problem isn't that you can't do good things; the problem is that these companies use them to whatabout the media away from the bad things they do. You think Facebook built that "Mark yourself as safe" feature out of the goodness of their heart? As if it brings them any kind of money? No -- they do it to capture a little bit more of users' attention, and to point out to any reporter questioning their morals that they're totally on the Light Side of the Force: just look at how many users rely on us to let their friends know they're fine.
It's good if one can take the opportunity to take a stand. But let's not pretend we are paragons because we do.
Ethics is reduced to meaningless posturing if, whenever it comes to action, people have a litany of excuses on hand. In the case of surveillance, whether it's government or private, it's individuals empowering themselves at the cost of others, not particularly burdened by the wider social ramifications.
It's a bit tragic that it's only when people are without power that they talk of ethics, and even with a little power, self-interest always seems to rule. Discussions become pointless as it becomes impossible to tell how many will actually do differently.
Not a condemnation of the young, but at that point they still don't realize what they don't realize.
> Leakers, please resign instead of sabotaging the company
> How fucking terrible that some irresponsible jerk decided he or she had some god complex that jeopardizes our inner culture and something that makes Facebook great?
In particular, this included comments about wanting a loyalty test "screen" in their hiring process:
> Although we all subconsciously look for signal on integrity in interviews, should we consider whether this needs to be formalized in the interview process?
> This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty.
Also, non-disclosure agreements really fuck people in the overall scheme of things. If you worked for Facebook and came out tomorrow showing some really shady shit that they were doing, it's conceivable that no other company would ever want to hire you. (I think the term is "blackballed", but I'm not sure if it's still in use today.)
(Whistleblower laws only protect government workers and do not affect the commercial industry. Even then, the whistleblower laws only go so far...)
So, you're damned if you do and damned if you don't.
If having integrity and an interest in the public good aren't strong enough arguments to avoid working for these companies, then perhaps this one is.
No surprise here, again. Facebook launches a feature that allows criminal behavior, reaps the profits, and only tries to roll it back after they are under intense scrutiny.
There is an easy way to fix these repeated problems: Hire people who care about ethics. Of course, the fact that they don't shows that they are only looking for a rubber stamp.
What kind of "privacy" team do they have that approved all the garbage listed here?
Philosophy may not equip you with a concrete answer, but it definitely equips you with the machinery to look at something and recognize there's more to the question than some manager saying it's legit.
I feel vindicated by news like this. It's pretty clear that privacy overreach and violations at Facebook are not "accidents". They're standard operating procedure.