We’re Updating Our Terms of Service to Better Explain How Facebook Works (fb.com)
50 points by tareqak 53 days ago | 26 comments



For those who do not want to visit Facebook but still want to see the unedited version of the changes Facebook has made, in their words:

> Here’s a summary of the information we’ve added to our terms:

> How we make money: We include more details on how we make money, including a new introduction explaining that we don’t charge you money to use our products because businesses and organizations pay us to show you ads.

> Content removals: We provide more information about what happens when we remove content that violates our terms or policies.

> Your intellectual property rights: We clarify that when you share your own content — like photos and videos — you continue to own the intellectual property rights in that content. You grant us permission to do things like display that content, and that permission ends when the content is deleted from Facebook. This is how many online services work and has always been the case on Facebook.

> What happens when you delete: We’re providing more detail about what happens when you delete content you’ve shared. For example, when you delete something you’ve posted, it’s no longer visible but it can take up to 90 days to be removed from our systems.


The "What happens when you delete" section muddies the IP clause.


Do you say that because they might still use your posted material?

Or is there some other reason?


Saying “The updates do not change any of our commitments or policies” while rewriting a contract is just a contradiction.

It’s literally impossible to rewrite a contract and claim that its meaning or interpretation hasn’t changed. The way a legal term is phrased is hugely important, especially when there is ambiguity.

I understand FB’s intent here, and I don’t think the line is meant to deceive. For example, adding terms that condition the user’s acceptance of how they are monetized on their continued usage (which has always been required but is now explicit) may have big ramifications for anyone wanting to sue FB for fraud or the like.

Easy-to-understand terms are a good thing, but something like “we have attempted to capture the same meaning as the prior policy while using easier-to-understand language” would have been more accurate.


> Saying "The updates do not change any of our commitments or policies" while rewriting a contract is just a contradiction.

To be scrupulously fair, that could well be interpreted as "We were already committed, as a matter of policy and/or applicable law[0], to upholding these additional terms not mentioned in the previous contract, and all we're doing now is formally documenting something we were already doing", but that would require assuming good faith on the part of Facebook of all entities, so no.

0: e.g. adding "We guarantee we will provide a replacement [widget] if the one you ordered arrives damaged" in a jurisdiction where that's legally required anyway.


I highly recommend checking out Mark Zuckerberg's new podcast, "Tech & Society". He discusses at length the approach they're looking to take at Facebook. It's genuinely a really entertaining listen.

Obviously don't take everything he says at face value, and realise that it is without a doubt a public face, but the topics he's approached and the guests he's had on to debate him have been fantastic and engaging thus far (including Jonathan Zittrain, a specialist in tech law at Harvard; Mathias Dopfner, the CEO of Europe's largest news publisher; Yuval Noah Harari, the author of Sapiens, Homo Deus, etc.; and Jenny Martinez, the dean of Stanford Law). It's made me see him in a much different light than that of his uncomfortable public-speaker persona.


As with most introverts, there’s a lot of substance that comes out over an hour of conversation rather than in a 10-second sound bite. Mass media has always optimized for the latter, but podcasts are starting to build up the former.


Good. Thank you. I legitimately appreciate this. Honesty is all anyone asked for.


[flagged]


You can just use > to indicate a quote. It doesn't add any formatting but everyone knows what it means on HN.

Using 4 spaces turns it into a code block and makes it much more difficult to read since paragraphs have no line breaks -- they end up running off the screen and require horizontal scrolling.


[flagged]


Wow, that last one is surprising. I understand that it can be a hard problem to fully delete data if you haven't built systems with that in mind from the start, but isn't that (not deleting backups) a flagrant violation of GDPR regulations?


Nope. Not even a little bit. The Right to Erasure is way overblown.

Even in situations where a data subject can request it (which isn't all of them), a Controller can choose not to delete any data where they have an Overriding Legitimate Interest. That's legalese for "a pretty good reason," as long as the privacy interests of the data subject are taken into account. Fraud detection is explicitly listed as an example of Overriding Legitimate Interest.

I'm not aware of any court action confirming this, but the general sentiment is that "altering back-ups is a terrible idea" is a valid Overriding Legitimate Interest, so long as you make sure that you can re-erase the data subject's data after restoring from back-up. The data subject's privacy interests are generally satisfied because the back-ups are not used for processing.
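
A minimal sketch of that re-erasure idea (all names here are hypothetical, not anything any particular Controller actually runs): keep an append-only log of erasure requests and replay it against any restored back-up before the data re-enters production.

    # Hypothetical sketch: replay recorded erasure requests after a back-up
    # restore, so previously erased data subjects never reappear.
    import json
    from pathlib import Path

    ERASURE_LOG = Path("erasure_log.jsonl")  # append-only log of erased subject IDs

    def record_erasure(subject_id):
        """Remember that this data subject asked to be erased."""
        with ERASURE_LOG.open("a") as f:
            f.write(json.dumps({"subject_id": subject_id}) + "\n")

    def erased_subjects():
        if not ERASURE_LOG.exists():
            return set()
        with ERASURE_LOG.open() as f:
            return {json.loads(line)["subject_id"] for line in f}

    def re_erase(restored_records):
        """Drop restored records belonging to subjects erased since the back-up was taken."""
        gone = erased_subjects()
        return [r for r in restored_records if r.get("subject_id") not in gone]

    # Usage: filter a restored back-up before it goes back into production.
    record_erasure("subject-123")
    restored = [{"subject_id": "subject-123"}, {"subject_id": "subject-456"}]
    print(re_erase(restored))  # only subject-456 survives the restore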

Other considerations like data minimization still apply, so even if you have Overriding Legitimate Interest for some data, that doesn't give you carte blanche to hoard all data. But things like logs (with a sane retention policy), back-ups (with a re-erasure process), and trained ML models are generally allowed to keep remnants of your data around.


That seems a very generous interpretation.

The wording about overriding legitimate interests is specifically in the case of erasure because a data subject objects to processing. It doesn't cover any of the other cases where erasure is required, any one of which is sufficient. One of those is "the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed" and another is "the data subject withdraws consent on which the processing is based according to [cross-references] and where there is no other legal ground for the processing".

There may be legitimate interests in things like maintaining logs and back-ups, but it would be quite a stretch to argue that an organisation with Facebook's resources, handling personal data of that volume and sensitivity, can't manage to delete the underlying data within as long as 3 months.


Is it even possible to delete something from the middle of a tape (assuming tape backups are used)? The only thing that can be done is to read the tape in, transfer all "good" data to another tape, then erase the original tape.
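
That's essentially copy-forward compaction. A toy sketch of the idea, with an in-memory list standing in for the tape (nothing here is tape-specific):

    # Toy sketch of copy-forward compaction: stream every record off the old
    # "tape", keep only those not marked for deletion, write them to a new
    # one; the old tape can then be erased or recycled wholesale.
    def compact(old_tape, deleted_ids):
        new_tape = []
        for record in old_tape:          # record is e.g. b"user42:payload"
            record_id = record.split(b":", 1)[0].decode()
            if record_id not in deleted_ids:
                new_tape.append(record)  # copy forward the "good" data
        return new_tape

    old = [b"user42:photo", b"user7:post"]
    print(compact(old, {"user42"}))  # [b'user7:post']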


I believe Facebook uses Blu-rays and regular HDDs for cold storage, not tape.

https://datacenterfrontier.com/inside-facebooks-blu-ray-cold...


In a conceptual rather than literal sense, yes it's possible to do this in certain cases. In particular, if the per-account bits of the backups are encrypted with per-account keys, deleting an account's key makes that account's bits of the backups unrecoverable.
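
A minimal sketch of that crypto-shredding approach, assuming the third-party `cryptography` package and a hypothetical in-memory key table (a real deployment would keep the keys in a KMS/HSM):

    # Hypothetical crypto-shredding sketch: each account's back-up bytes are
    # encrypted under that account's own key, so deleting the key is
    # effectively deleting the data, even on immutable back-up media.
    from cryptography.fernet import Fernet

    account_keys = {}  # account_id -> key; in practice this lives in a KMS/HSM

    def backup_account(account_id, data):
        key = account_keys.setdefault(account_id, Fernet.generate_key())
        return Fernet(key).encrypt(data)  # this ciphertext is what lands on back-up media

    def delete_account(account_id):
        account_keys.pop(account_id, None)  # "crypto-shred": old back-ups stay untouched

    def restore_account(account_id, blob):
        key = account_keys.get(account_id)
        return Fernet(key).decrypt(blob) if key else None  # erased accounts can't come back

    # Usage
    blob = backup_account("alice", b"alice's photos")
    delete_account("alice")
    print(restore_account("alice", blob))  # None: unrecoverable without the key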


That's the sort of question that makes the GDPR regrettably ambiguous. It uses a lot of words like "legitimate", "undue" and "reasonable", but there could be genuine uncertainty about how those would apply in quite a lot of practical, everyday situations.

In this situation, however, we are talking about an organisation whose entire business model is built on personal data given by literally billions of people over a period of many years, with tens of thousands of employees and tens of billions in revenues, which is operating on a scale where it is demonstrably capable of designing and building replacements to its own specifications for standard infrastructure and deploying them throughout its own data centres. I don't think it would have much credibility if it claimed it couldn't guarantee to comply sooner than three months after a request because it regularly backs up possibly the largest database in the world on magnetic tape.


The comment you're replying to is not a verbatim quote of the post; they've added the last sentence (among others). The original post states:

> What happens when you delete: We’re providing more detail about what happens when you delete content you’ve shared. For example, when you delete something you’ve posted, it’s no longer visible but it can take up to 90 days to be removed from our systems.

FB has previously published notes about backups, such as https://www.facebook.com/notes/facebook-engineering/under-th..., which include sentences like "Our job is to keep every piece of information you add safe, while ensuring that anything that's been deleted is purged in a timely manner".

I work for FB.


Just in case: you know this is not what's actually written by Facebook, right? The original text is in the current top comment, and it only says that it may take up to 90 days for the data to be removed from their systems; it says nothing about backups accidentally not being deleted.


I'm interested to hear the answer to this, as an American living in Europe who has deleted his Facebook account while living here.


Just in case: what the user posted here is not the wording Facebook used in their post; it was added as commentary. Granted, it may still be a true statement; we will never know. Facebook's wording is "We’re providing more detail about what happens when you delete content you’ve shared. For example, when you delete something you’ve posted, it’s no longer visible but it can take up to 90 days to be removed from our systems."


90 days for removal is not a violation.

Passing data on to third parties without explicit consent from the user is a violation.

However, you can’t even use FB without giving such consent, or, rather, without being forced to give it through dark patterns and vague wording. The way FB collects consent is a violation.


For a data processor, this is probably a violation.

Question is, who will bring the hammer down.


Not really. Having reliable backups is 1) a legitimate interest for the data processor, and 2) possibly a part of the service in question (that's more generic and not really tied to Facebook).

The data processor needs to make sure old backups are deleted in a timely fashion, and also needs to make sure that data that has been deleted will be deleted again, should the need to restore the data arise.


Store the backups encrypted under a separate key for each user, and then throw away the user's key.


In Facebook’s case, that means both backup and restore processes must have access to over a billion encryption keys. Doable? Maybe, but it wouldn’t surprise me if that made it impossible to make backups in a reasonable time frame.

And even that isn’t enough for the case where a user deletes part of their data.


Ironically, the GDPR also attempts to protect you from "accidental data destruction" (what folks would usually call data loss in storage, though the GDPR uses the expression for 'losing a USB drive that somebody else can find'). This means that, if nothing else, the GDPR makes it a legitimate business interest (compliance) to have IMMUTABLE backups.




