EDIT: I should point out, I do understand the other side. People shouldn't be punished forever for a crime they've already paid their ostensible debt for. However, I think this is a pretty nuanced issue, and a universal 'right to be forgotten' or a blanket "everything about everyone is accessible forever" is probably not the right solution.
Removed from Google != removed from the public record. If Google has become our only source of information, we are pretty much screwed. If Trump were to have his bankruptcies "forgotten" by Google, journalists would still report on them when relevant since he is a celebrity/(sigh)politician, and any entity considering doing business with him, or with anyone, should be doing proper due diligence.
Google CANNOT become our only source of truth.
Edit 2: My point is about the perception of Google as the only relevant source of information for society, not a value judgement on the concept of a "right to be forgotten" from any source or surfacer of information.
What about other search engines? Social media shares? What happens when a journalist publishes an article based on offline investigation, does the right to be forgotten still apply? Search engines need to fight libelous SEO attacks, but this is a deeply unsettling over-reaction.
Let's drop the Orwellian phrasing and call it what it is: "Right to hide the past". Forget the truth, security through obscurity.
Knowledge is power, and search engines are the map.
And hopefully it's obvious the issue of search results is separate from consumer data mining/ad profiling.
Embarrassing / reputation-affecting knowledge being available to the public forever is not the same as truth being available. Especially now that we see how platforms like YouTube go out of their way to offer extreme and provocative content, we should not leave it to search engines to be unregulated sources of information about individuals. Search engines will not tell your employer that you work diligently every day, but they will show an offensive tweet of yours from 2007 if that's what gets clicks and reactions.
* If someone faces consequences for an offensive 2007 tweet, they can just delete it. Platforms should be required to make it easy to delete content.
* If someone is scared of saying controversial things, they should avoid platforms that require real identities. There are plenty of anonymous platforms.
* Hold social media to the same standards as traditional media, requiring truth and propagation of redactions and corrections. Libel and slander are well-established concepts.
* Demand discretion from friends. In college, my group had a strict "no-camera" rule when it came to embarrassing or unlawful shenanigans. My parents' generation had the same rule.
Hiding search results does not address the problem. If someone posts a photoshop of "Eric driving drunk" on Twitter, I want that post promptly removed... the search results are just a symptom.
That's the nature of the internet. The great equalizer.
Removing it at the source removes it from google.
It's like saying the problem isn't that there is a negative story about you in The New York Times from 2007, the problem is that ISPs allow their customers to read that story in The New York Times. Obviously the "problem" is the story -- which you may have no legitimate right to prevent people from reading -- and if you have a legitimate complaint (i.e. libel) then you should have to take it up with The New York Times and not Comcast or Google.
The reason people want to go to Google instead is that they know Google doesn't have a strong enough incentive to stand up for the victim of the censorship. If you go to the source they may refuse to take it down and force you to adjudicate the matter in court where they can argue their side in front of a judge. If you go to Google, economically they have to take it down because nobody is paying them to hire lawyers to spend the hours it takes to make accurate legal determinations and they would go out of business taking on that role without compensation for seven billion people.
so let's say Eric was out drunk with his friends and they walked into a sex shop and picked up some dildos because hah hah this shit is funny we are so drunk, and then Eric took some silly photos cuddling the dildo or mimicking putting it in his ass, and someone posted it to the internet, and now Eric would like it gone.
Now Eric's pictures on the internet are actually sort of funny so they have gotten some exposure, and so Eric finds he can't keep his job at the local macho place of employment - I don't know why, maybe he's not good at talking shit to anyone who talks shit to him so everybody picks on him about his stupid funny photos.
So he gets fired or quits because he can't handle the harassment, and is sick of going out and hearing hey it's dildo guy. So he moves away. Nobody in his new town knows he's dildo guy, but then he goes out one night with this girl, and somebody exclaims omg it's that dildo guy, hey look at this everybody, I got me some google skills, I saw this hilarious picture one time.
Now on the one hand Eric has given the world a lot of (unpaid) entertainment as dildo guy, but on the other hand when he commits suicide, because he is emotionally unequipped to go through life being called out every now and then as dildo guy, he makes people feel bad. So in order to allow Eric to take his stupid drunk dildo-hugging photo and not ruin the rest of his life over something embarrassing but totally legal, let's just give him the right to remove the stuff from Google. That way, when he moves from his old town, where there are lots of people remembering the whole dildo picture situation, to a new town where people didn't know him when it was fresh, his stupidity is somehow 'forgotten', preserving the ability people have had throughout history to move away from an area where their reputation had become too problematic to continue. Without that right, Google makes sure the reputation follows.
The problem is the content showing up when searching for someone's name, not that there is an article on a website somewhere.
I guarantee Yandex and Baidu will not respect western delusions of security through obscurity.
This has nothing to do with security through obscurity, it has to do with rights of human beings. Anyway Google doesn't either, hence the article.
All of the "right to be forgotten" arguments seem to devolve to "people shouldn't be able to know any data points that would negatively affect my social standing".
Isn't that a point in the other direction? The existence of convincing fakes will teach people not to believe everything they see on the internet.
If that someone shared it on their own website, hosted out-of-country, on their own hardware... well, I can't do anything about that, but I also wouldn't care. It's clearly non-authoritative and won't get SEO traction by itself.
Does Europe not have the concept of a court-appointed lawyer?
> RTBF makes it possible for ordinary citizens who can't afford fancy lawyers to fight big corporations to seek redress.
No it makes it possible for criminals to hide their criminality. To be very blunt, if I were an employer, I don't want to hire a murderer. I don't care if he/she was convicted and served their time. Once you have committed murder, I will not hire you. I will not deal with you in any way. Hiding/obfuscating true information from me is wrong. Criminality is public record. It should be easily accessible.
Not sure you understand how court-appointed attorneys work. First off, you can't use one to sue someone else, since you can only get a court-appointed attorney when you're being charged in a criminal case.
Second, court-appointed attorneys suck. They are underpaid, overworked, and unable to properly do their jobs as is.
Requiring legal action would indeed create a scenario where only the rich bring such cases to trial.
RTBF does not protect murderers, this is a fundamental lack of knowledge of the law and its clearly marked boundaries.
I'm uneasy about some RTBF cases, but where you have a person making prolific publications across a variety of legal jurisdictions and ignoring the legal rulings (which happens in cases of stalking and harassment, for example), it's impossible for the victim (and they do suffer real harm) to get justice other than by asking search engines to de-index the attack pages.
Our legal system tends to attack actual bad actors, not convenient targets. Unless the Internet is involved. The way RTBF and DMCA work, search engines bear 100% of the cost, and people don't ever go after the actual bad actors.
By the way, in the US, if Tabloid X publishes Eric's photo, that's 100% legal thanks to the First Amendment, as long as the photographer agrees. Eric has no part. Attacking search engines on behalf of Eric, in the US, not so cool. Europe doesn't have free speech, so, no problem.
Europe definitely has free speech (all the countries I know of anyway, Europe has a lot of different countries). There is just a different definition of what exactly is free speech and what is something else (some racist things are not considered free speech).
And sure, I'm sure you'd call what you have free speech. But that's not the way the US defines it.
Nope it's not, it's an EU law and EU !== Europe. Europe has 51 countries while the EU only has 28. Just like Mexico is not in the US, a ton of European countries are not in the EU.
RTBF, maybe, but how is the DMCA this way? Certainly not the takedown notice/counternotice provision, which doesn't create new liability, only a special shield from any pre-existing liability.
The problem is that the shield has value to the search engine in excess of the cost to the search engine (but not the cost to the censorship victim) of removing the information. It reduces their risk, even when the risk is low because they would be likely to win, and removes the cost of having to litigate the issue even if they do win, so the search engine takes the deal and then executes ~all the notices even when they're bogus.
Which is a direct cost to the censorship victim compared to handling it the way Section 230 does it, and an indirect cost to the search engine because it has them paying to process the removal of legitimate information and reduces the value of their service to customers -- just not enough of an indirect cost to give them the incentive to refuse, because the brunt of the cost is on the third party being censored.
The search engine could ignore all takedowns and be in exactly the same situation as it would be without the DMCA; the safe harbor provision isn't a mandate on them, it is a benefit to them. They deal with takedowns because the cost of doing so is less than the cost of copyright liability they would have without the DMCA, which means the DMCA is saving them money, not imposing a cost.
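The incentive asymmetry being debated here can be made concrete with a toy cost model. Every number below is invented purely for illustration (nothing here comes from the thread or any real case); the point is only the shape of the comparison: contesting even a probably-bogus notice costs the intermediary far more than silently delisting, while the cost of lost information falls on a third party.

```python
# Hypothetical cost model for a search engine deciding whether to honor
# a takedown notice it believes is bogus. All figures are made up.

LITIGATION_COST = 50_000.0      # hypothetical cost of contesting one notice in court
WIN_PROBABILITY = 0.95          # hypothetical chance the engine wins if it contests
LIABILITY_IF_LOSS = 1_000_000.0 # hypothetical damages exposure if it loses
REMOVAL_COST = 5.0              # hypothetical cost of simply delisting the URL

def expected_cost_of_contesting() -> float:
    # Even with a high chance of winning, the engine still pays to
    # litigate and carries a small tail risk of large damages.
    return LITIGATION_COST + (1 - WIN_PROBABILITY) * LIABILITY_IF_LOSS

def expected_cost_of_complying() -> float:
    # Complying costs almost nothing *to the engine*; the cost of the
    # lost information lands on the third party being delisted.
    return REMOVAL_COST

print(expected_cost_of_contesting())
print(expected_cost_of_complying())
```

Under any remotely realistic assignment of these hypothetical values, complying dominates, which is the commenter's point: the notice-and-takedown condition is cheap for the intermediary and expensive for the person whose content is removed.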
You're treating the safe harbor and notice and takedown as indivisible when they obviously aren't. Conditioning the safe harbor on notice and takedown is a huge cost compared to the alternative used in CDA 230.
They obviously are both part of the DMCA, so you can't say that the DMCA imposes costs based on the notice and takedown condition for the safe harbor, because ignoring that condition leaves the host in the same position they would be in without the DMCA.
You can say that the notice and takedown requirement reduces the cost savings of the safe harbor compared to the hypothetical policy of an unconditional safe harbor, or one with alternative sets of conditions, but that's a very different claim than the DMCA imposing costs.
The safe harbor and the anti-circumvention rules are both part of the DMCA too, but it's silly to argue that the anti-circumvention rules don't impose net costs because if you average them together with the safe harbor it comes out somewhere near neutral. They don't cease to be divisible just because they were enacted at the same time. Otherwise you could justify anything by just finding something which is as good as the target thing is bad and lumping them together on the same side of the scale.
And if you've never gotten a DMCA takedown from Perfect 10, you probably don't understand the true terror of the DMCA process.
Just because hate speech isn't protected doesn't mean we don't have free speech.
Nowhere has the US's extremist version of freedom of speech. Europe has a different form of free speech, and in this example Eric's right to be forgotten probably doesn't trump Someone's right to publish true information.
I guess there'd be some judicial attempt to balance these two rights: Does Eric pose a continuing risk to the public from drunk driving? Is Eric a public figure who has claimed to never have driven drunk? Was this a one-off event that happened many years ago, never repeated? This is something courts are able to decide.
BTW, search engines don't "publish" information in the US sense of the word. The way Russia forces Yandex to self-censor is that Yandex is liable for everything they show to users. That's 'publishing' in the US sense. Newspapers have publishers, and the publisher is the person you sue if you think the newspaper has libeled you.
Meanwhile, Europe (mostly) doesn't consider RTBF to be censorship because it only involves censoring search engines and not newspapers. Except that people are filing RTBF against newspaper site search, too.
...and I read that tweet in 2007, and quoted it in a blog post on my obscure, low traffic blog. Google will still find it.
> * Hold social media to the same standards as traditional media, requiring truth and propagation of redactions and corrections. Libel and slander are well-established concepts.
What about the case where the damaging content is true? A key thing here is that what is acceptable changes over time. Something that can be a life ruining social faux pas today may have been pretty normal 20 years ago, and many people today won't accept the "oh, that was normal back then" explanation.
We used to be able to avoid these problems because it took effort to dig up records from 20 years ago, and from low circulation sources like local newspapers.
So, for example, if you did stupid things in your home town that ended up in your high school newspaper, and then 20 years later were applying for a job in a city in another state...the employer would probably not find that high school newspaper, even if you were applying for a fairly sensitive job in an industry like finance.
That's because to find things like that they would have to actually send someone to visit your high school library and comb through their archives of the high school newspaper. That's just too expensive to do routinely for job applicants, except for the most sensitive positions.
Nowadays, all that stuff ends up online from the start, and it is cheap and easy to find.
> * Demand discretion from friends. In college, my group had a strict "no-camera" rule when it came to embarrassing or unlawful shenanigans. My parents' generation had the same rule.
All it takes is one person in the group to slip up, or for you to overlook one third party who is not part of your agreement and who can see you. So really, the rule has to be don't undertake embarrassing or unlawful shenanigans. (And as I noted above, standards for what is embarrassing or unlawful can change over time, so it really needs to be don't do anything that could conceivably become embarrassing or unlawful in the next 50 or so years).
Essentially we used to balance privacy vs. public access kind of automatically, due to the limitations we had in information storage, indexing, and retrieval. We've removed most of those limitations, so the balance has been lost.
“Just delete it”.
Except have you ever personally been in the situation that you needed something removed? I have and:
1) You might not even have the password to every random account you created in the past, nor the e-mail addresses that you used when you created those profiles, nor maybe even remember what e-mail address you used for each of them.
2) Turns out that there are a lot of sites out there that copy and preserve a lot of random data from other sites. They do so without regards to the ToS of the site you originally posted to. They do not care about copyright. They do not respond to personal requests for removal of data. They do not respond to DMCA notices. They are outside of the jurisdiction of the country you live in and as are their hosting providers. And even if they are cooperative, there are so many of them that reaching out to all of them and following up on the removal will require much much more time and energy than what you have available.
So then the best you can do is delete what you can and submit the rest for removal from Google.
“Well you shouldn’t have posted it in the first place if you didn’t want it to be public”, right? No, it’s not that simple!
The things you post today can be taken out of context and misinterpreted by someone in the future in ways you would never have imagined today.
We keep posting comments, pictures, and videos, creating profiles, liking and sharing posts and information, but most of us rarely delete any of it. As the amount of data increases, so does the room for cherry-picking data about you to build up an image of you that, while true in the sense that it is all things you posted, wildly misrepresents what kind of person you are; and on top of this misrepresentation an even more inaccurate image can be painted.
If you had any idea what it feels like to have that happen to you, I think you would want to be able to have some of that information at the very least removed from search results.
Once it’s gone from search, it’s gone from the public eye. And if you are lucky you are able to erase the bits of information that ties the data to you so that even if the data resurfaces in the future it is no longer connected to you, or at least not as directly.
Furthermore, when you are working on having information removed you should first make a list of all of the information, then have it removed from Google ASAP so that 1) it gets harder to find as soon as possible and 2) so that the information is not retained in the publicly available caches of search engines after it’s been deleted from the source sites.
Beyond that, for the information that you could not get deleted but which you were able to have removed from search results, some of it will eventually disappear altogether on its own because of bitrot (hardware failures, data management errors, sites going out of business, etc.) and some of it will probably stick around forever.
But like I said you want as much of it removed as possible and you want the rest of it to be hard to find and you want as much of it as possible to lose connection to you. And achieving that requires the cooperation of the search engines in removing results.
But Google does cooperate with the DMCA. The difference there is that content deemed in violation of copyright is actually illegal for anyone to distribute; legal responsibility extends all the way to the website owner.
Unless content falling under "right to be forgotten" is ruled privileged and not legal for public distribution, any artificial roadblocks to their discovery will merely present a business opportunity for their circumvention.
Information you, yourself, post publicly to the internet is public. Just the same as if you got up in Times Square and shouted it using a megaphone.
Information that is factually accurate and posted publicly on the internet isn't under your domain to censor. This falls heavily in the camp of "freedom of my speech, not freedom of your speech" that seems so common here.
Information that is posted by others that isn't factual is already covered by libel and slander laws so doesn't fall under here.
The internet should be, and for the sake of truth has to be, immutable. The "right to be forgotten" is the right to break any concept of online reputation.
If you want to control your narrative, maybe don't post thoughtlessly and publicly.
I am not posting thoughtlessly. What I am saying is that there is just a million ways that anything can be interpreted in the future that you have no way of foreseeing.
This is not a tech problem (at all). This is a social problem that has existed since forever, but is now uncovered by technology's availability. And if the agreed solution to the "world's gone mad" is to grant someone the legal ability to alter others' memories, then the world's truly gone mad.
However, annotating content, or making it more clear that a different (Firstname Lastname) did something, might be a better response. For example, I have never created an account on facebook, linkedin, twitter, etc. I refuse to give any one company a de facto monopoly over social discourse and interaction; those tools belong on OPEN, FREE (libre+beer), well-defined minimum interoperable standard platforms. Currently that's email; it really sucks, but the standard is well defined, anyone /can/ implement it without barriers, and everyone is forced to federate to at least some degree.
It isn't and it has never been.
Given that different countries have different laws, yet all claim universal application of their laws, can you imagine how many people would be killed because of this?
Atheists taken off planes from United Arab Emirates flights because they posted about god. Homosexual activists getting assassinated by Russia. Europeans being arrested in the US because of differences in the age of consent.
The web is ephemeral and should be anonymous.
Immutable data structures have their place in programming, and immutable communication has its place in society, but neither is appropriate for all use-cases.
If I made an offensive tweet in 2007, I'd stand by it, and assert that it's my tweet but I'm also a different person. Personal integrity would demand no less. Hiding behind a "privacy" law to conceal something you shouted from the rooftops in 2007 is a disingenuous position to take.
It happens frequently, not only with rape, but with any false accusation of a major crime... Typically there are a lot of articles about it, but if you are proven innocent no major source reports it.
Now, would you want that information to remain online and to appear instantly anytime someone in HR looks up your name after reading it on your CV?
This model is used by other nations because regulating publications prevents both short and long-term harm while still preserving access to the truth.
My parents abused me; however, they were never convicted of a crime in a court of law because nobody ever reported them. The statute of limitations has almost certainly passed by now, so they are unable to be convicted. Are you suggesting it should be illegal for me to say that my parents abused me?
Same with Bill Cosby's accusers; the statute of limitations has passed on all of those. Should those women really not be legally allowed to have their voices heard?
I can't see this leading anywhere except an authoritarian regime where court proceedings are conducted in secret; after all, the suspect is being publicly accused of a crime in court.
We know employers can and have discriminated against individuals where convictions have been overturned.
However, I don't think the false conviction should be hidden. What if they were guilty and there are crimes down the road? That information could be useful to the public.
Nobody owes anyone an apology.
Faces, I can see that but journalists must be very explicit... "This is the suspect who allegedly did xyz".
From my point of view, Google eliminated itself long ago: it's already intentionally very biased, distorting reality by design, differently for each of its users: adjusting the "algorithm" to return to the seeker what the seeker would click the most, and also what Google's marketing would like to present, even to the point of intentionally ignoring the very words you entered in the search(!).
Just try to find somebody with whom you seriously disagree on some topic, then ask him/her kindly to "google" the same terms related to that topic and let you see what he/she sees. Compare.
That said, it would be nice if there were an easy way to turn off this search personalization on a search-by-search basis, similar to how one can filter results to a certain time period.
Not counting that even that "magnification of the confirmation" is additionally modified for some topics by the "editorial" policies of Google.
It is; it's called an incognito window.
When I open an incognito tab and search for “latex”, the entire first page of results is still all about LaTeX for me and I would bet my left hand that this is not universally true.
In other words, you are still seeing personalized results.
So yes, it is the right to hide the past, to the extent that it should not be super easy to get all available dirt on anyone any time anybody gets nosy or irritated. But if you are willing to put in weeks of work to find out that someone once smoked pot or burned an American flag or whatever, you can still put in those weeks of work.
No we should not. The law seems well intentioned; some kid may not have to worry about an embarrassing photo from university days following him/her around into adult life, etc. But I think it would be an overall net negative because of abuse in hiding truly relevant information.
You say that, but according to some reports, only 5% of requests to Google to remove personal information from search results come from high-profile figures (politicians and the like); the rest is just ordinary people. I know that there is a huge selection bias going on here, but I'm not sure you can discount the data point just based off of that.
Because ease of access makes all the difference, and it is disingenuous to argue otherwise.
This law won't protect Trump from having his bankruptcies reported, but it will protect someone who made a stupid mistake when they were younger from having it haunt them for the rest of their careers or lives because it would otherwise have been the top hit when you googled their name. It will still be there for anyone willing to put in the effort to find it, of course. Just like it was in the old days.
Google will happily ruin that person's life forever for a few cents in ad clicks. Think about that.
I don't agree that this will protect anyone, or that they need to be protected in the first place. This is just hiding the problem that someone would judge you for whatever the content is.
In my opinion the real solution here is to be more understanding to stupid things people have done in the past. Nobody is perfect, so why are you letting whatever that story on Google is affect your judgement of the person, assuming they learned from whatever it was.
You can't deny that the advent of social media has brought about a MUCH bigger problem of people getting fired, judged, or their lives ruined by stupid mistakes, or a single comment they posted online. These things aren't _new_ occurrences. People have always made stupid remarks; people have always done stupid things. The difference now is that it's trivially easy for me to go find some idiotic thing you did as a teenager that you've outgrown, and point to it as a way to ruin your life if I want to. All it would cost me is a little money to Google or Facebook. Back in the day, an attempt to truly ruin someone's life like this would require _digging_. If the person deserves it (which is almost never), then people will go through the effort. But for the casual assholes who think it's funny to SWAT someone or to call up embarrassing photos because that person called them a bad word, this barrier to entry would stop them.
I only heard about this anecdotally. Do you have any data on such a huge increase in firings etc.? That would be quite interesting.
Furthermore, context matters, but context is the first thing that gets lost online.
Something that was said or done in one context can be perfectly fine in that context while at the same time that thing can be completely unacceptable in another context.
We need not and we should not strive to keep a record of everything. It is important that we are able to forget.
The world at large is not capable of ensuring due process. Real lives of innocent people are ruined. People are driven to suicide. All because the public formed a narrative about someone based on incomplete or otherwise inaccurate information.
I wish for compassion and forgiveness.
We've seen many examples of one tweet taken out of context being enough to ignite an epic shitstorm. No. You're just proposing further destruction of public discourse and civility.
If you think the real solution is to change the way the world thinks about, and judges, everyone else then you're being remarkably naive.
This isn't a response, it's an insult. Two well-educated, equally experienced people can come to two separate conclusions without one being "naive". Try to argue always assuming that the person you're talking to is as intelligent as you.
Regardless, treating the symptom (Search engines displaying old news stories) of the problem (people judging others irrationally), without treating the problem itself will only cause equally bad, if not worse, issues to pop up elsewhere. The vast majority of times a state censors information, it ends up being harmful far past what was initially intended (the red scare/McCarthyism, DMCA, FOSTA, SOPA/PIPA, etc).
But every inconvenience presents a business opportunity that will inevitably be seized upon so long as the information being sought after is still considered part of the public record. The public record consists of all information that the public has a right to know. A more honest solution would actually declare certain information privileged and make it illegal for websites to publicize it.
The question of whether one uses Google or other sources is irrelevant here. Sure, journalists could still go find public records, but their work would effectively go unpublished because of the law (their article wouldn't be listed, etc.).
That's why people try to get things delisted from Google, because they can't get the original publisher to remove them.
That's the entire point of this. Right to be forgotten laws don't allow you to censor The Intercept (a publication), but do allow you to force aggregators to remove the publication.
If you can prevent a journalist's article from ever being found, you have effectively censored it. This would especially impact a journal outside the "top 100": someone could conceivably search the 50 or 100 top publications, but after that you need an aggregator. So essentially this would censor journalism.
And to my original point, it's not just about Google in particular since the alternative to Google is pretty much some other aggregator.
Oh, this I 100% agree with. Information isn't information if it can't be found. (And noting that I'm employed by Google) I don't at all find the argument that "we can't prevent you from publishing something, but we can prevent it from being accessed" compelling. Does it apply to newspaper anthologies or hard-copy aggregation?
"It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying 'Beware of the Leopard.'"
I'm imagining a subscription-based aggregator - something like LexisNexis, say - that isn't subject to the same right-to-be-forgotten laws that search engines are.
It at least seems possible that we'll end up with a situation analogous to the credit scoring industry - with data accessible to corporations, but not accessible to individuals.
Unfortunately we live in a world where if something is not listed in Google's SERP, we doubt it exists.
... would Google be allowed to link to stories that mention those bankruptcies?
It's not like this "right to be forgotten" will apply only to Google. If Google folds, all search engines will surely respect it.
Agreed. That said, should we give Trump the right for his bankruptcies to be deleted from Google?
I think even Trump should be entitled to accurate public reporting. For example, it should be noted that Trump has never personally filed for bankruptcy; rather, he has been the owner of companies (casinos) that have filed for bankruptcy.
Would you attribute the recent Toys R Us bankruptcy to individual owners? Or if your favorite startup goes under and has to declare bankruptcy to shield itself from creditors before the company dissolves would you start attributing bankruptcy to individual employees with equity/stock options?
I wouldn't attribute it to the owners of individual stores, but the CEO? Absolutely
> Or if your favorite startup goes under and has to declare bankruptcy to shield itself from creditors before the company dissolves would you start attributing bankruptcy to individual employees with equity/stock options?
Trump wasn't an individual employee with stock options.
Still, if you want to attribute the bankruptcies of companies to Trump personally because he was CEO... post a link, because I don't think Trump was an employee of these companies at all, much less responsible as CEO for the day-to-day operations of the casinos.
He was an owner for sure and likely sat on the board, not unlike many SV investors and VCs of startups that fail every day.
Yes, the casinos have his name on them, but that doesn't make him CEO, an employee, or the sole owner of these casinos, nor does it mean that when the casinos filed for bankruptcy they were his personal bankruptcies.
You are right they were not public companies, but neither are startups and we don’t go out of our way to attribute failures and bankruptcies of startups to investors/owners or members of the board.
>It’s not fair to put all the blame on Trump for the four bankruptcies because he’s acting as any investor would. Investors often own many non-integrated companies, which they fund by taking on debt, and some of them inevitably file for bankruptcy, said Adam Levitin, a law professor at Georgetown University.
And it basically supports what I suggest:
>He added that people typically wouldn’t personally blame former Republican presidential candidate Mitt Romney or investor Warren Buffett for individual failures within their investment companies, Bain Capital and Berkshire Hathaway, respectively.
Despite the fact that the guy pretends to be self-made (he isn't), there is something to be said for someone who took a $10M loan from his father and turned it into a $1B+ business empire. Sure, there have been some individual business failures along the way (which his ego barely allows him to acknowledge), but it's pretty funny that SV, with its ethos of not being afraid to fail and iterating, is so petty that it extends 4 business bankruptcies (which were all Chapter 11 reorganizations) onto the man personally, but would never dare apply that standard to SV investors (just as the article you linked suggests).
1) He funded the construction of the $1 billion Trump Taj Mahal casino in Atlantic City, which opened in 1990. By 1991, the casino was nearly $3 billion in debt, while Trump had racked up nearly $900 million in personal liabilities, so the business decided to file for Chapter 11 reorganization, according to the New York Times. As a result, Trump gave up half his personal stake in the casino and sold his yacht and airline, according to the Washington Post.
2) As a result of the bankruptcy, in exchange for easier terms on which to pay off the debts, Trump relinquished a 49 percent stake in the Plaza to a total of six lenders, according to ABC News. Trump remained the hotel's CEO, but it was merely a gesture -- he didn't earn a salary and had no say in the hotel's day-to-day operations.
He had a lot to do with those two.
After a certain period a crime is considered spent and you cannot be required to reveal it - except in some very limited circumstances. You'd have to reveal spent crimes if you want to join the police or become a magistrate, for instance. You might want senior politicians added to this list, perhaps.
If you got a fine or a short prison sentence, the period is relatively short. A long prison sentence for a serious offence is never considered spent. I think there is something of a sliding scale of how long a sentence takes to expire.
I would want essentially the same from the right to be forgotten. A citizen should not have to see Google reminding potential employers of a minor or youth offence 20 years ago when the legal system feels it expired ages ago.
Doesn't seem that complex to achieve either. So what would be the problem then?
Since you're familiar with it, I'm curious how the following situations are handled...
There was a famous rape case involving a Stanford student. His name is Brock Turner.
If you search directly on keywords "Brock Turner", the first results will be his rape conviction. I think it's logical to assume the Right To Be Forgotten wants these search results directly associated with his name to be removed.
But what about a level of indirection? What about searching for "The Stanford Rapist". Those results are also about Brock Turner. Are those removed as well?
What about another level of indirection? If one searches images for keywords "mugshot criminology textbook", Brock Turner's photos are the most prominent. Would those get removed too?
What if the victim writes a popular blogpost, "I was raped by Brock Turner and here's my story...etc...", would her article be removed from the google results? What takes precedence? The victim's free speech or the felon's censorship powers granted by Right To Be Forgotten?
(made possible because several reddit threads made it widely known that his photo was now in college textbooks: https://www.reddit.com/search?q=brock+turner)
Those would not be removed under the Right to be Forgotten as defined by the original verdict. The reasoning is that the individual is harmed most by people explicitly googling their name (i.e. recruiters, business partners, family, etc.). If you come to the article from the other side of the issue, it's far less likely that you're in a position of power vis-a-vis that person.
> What if the victim writes a popular blogpost,
Such an article would once again be subject to the Right to be Forgotten.
> What takes precedence? The victim's free speech or the felon's censorship powers granted by Right To Be Forgotten?
That's far too general a question to ask, and appears to be phrased with a certain outcome in mind.
Neither right is absolute. It all depends on the specific situation. It requires exactly the sort of balancing the judicial system is built for.
A sentence of over 4 years prison, including suspended, would never be spent. Prison for six months would be considered spent after 2 years. However, sex offences are an exception covered by the sex offence register, which gives a different time line. 10 years or life I think, again depending on severity of sentence.
Thus he'd not be rehabilitated and forgotten by UK law until 2025 I think, so should not really be getting the right to be forgotten - it's too current.
The rehabilitation law itself covers insurance, employment, housing, media and so forth and has no awareness of any new media as it's from long before them. So it would solve the indirection via obfuscation: An old, spent, offence would be buried in some newspaper and court archives and probably only surface if someone was dedicated to digging into their background.
The EU right to be forgotten seems like a valiant, if perhaps a little clumsy, attempt to achieve a similar effect.
Also, you might want to consider the barrier to entry for a new search engine.
I've no idea if they considered barrier to entry or relative size when drafting the law, or built in any limitations. Of course, until an engine reaches some level of popularity it's unlikely to receive many (or any) removal requests, so the load should remain manageable for them. If it were the individual sites instead, every site would need a policy and a takedown-request process, just like most have now added something for DMCA requests. Doesn't that just raise the barrier to entry for everyone publishing a site?
And yes, the DMCA is also a barrier to entry for search engines, but at least it's one that I understand well.
Thus, this is one of those times where the lawmakers actually looked at technology, and understood how it was used.
Also, making search engines responsible creates a far smaller surface area of contact than making every site that hosts the content responsible.
We provide protections for literal objects in the form of the DMCA and various other copyright systems. We provide legal enforcement of the right for copyrighted works to be forgotten on search engines. We threaten rulebreakers with massive fines and prison time.
But when it comes to REAL people - people whose lives can be completely changed by wrong or very negative information about them being accessible everywhere - we do nothing.
It's just fascinating.
That's exactly my point. If you have enough money you will be able to take down any information of yourself you don't want under the guise of DMCA or related copyright laws.
that's how world ruled by lawyers works in general
Should I be able to google my children's friends' parents before I let them spend the night over there? I use this case because it wouldn't provide the basis of legal background check and presumably google would be my only source of info. Or does the right to be forgotten not apply to child abuse crimes?
Just give me the information, and let me decide how to interpret it.
The Right To Be Forgotten not only denies me the information in the first place, but it allows someone to selectively curate the information out there about them to tell the story they want to tell about themselves. Sure, maybe partying too hard 25 years ago isn't relevant anymore, but being convicted of child abuse 10 years ago certainly is. If it's just as easy to erase references to the latter as it is for the former, that's a problem. And what perfectly objective and moral party should decide what bits of info are ok to delist, and what should remain? I don't think any of us would trust that such an entity exists.
(I could be wrong, but I think parking tickets aren't published in the US, while speeding tickets are.)
Either way, I don't see why paranoia should be used as a driver of policy. You can "think of the children" all day long, but most of us have been children and remember it well enough to know we didn't need to be thought of that much.
(Sexual abuse is almost always another story, though.)
I'm not specifically aware of whether this is the case, but I would hope that the results of deliberations by criminal courts are public information.
But there is a big difference between having information at your fingertips in a searchable database, and having to go to a library and do the searching manually.
Perhaps google could provide the information after the user solved 1000 captchas :)
Because a judicial process made that determination. Random people don't decide anything, they petition.
As an aside, there absolutely nothing journalistic about an advertising company like Google.
>This "right to be forgotten" seems incredibly dangerous to me.
It doesn't seem like that to me at all. The purpose is also to protect victims of revenge porn, or victims of slanderous propaganda, who have no other recourse.
Up until this point in history it hasn't been feasible to do the same thing for every minute detail of every person's lives. This lack of feasibility is probably why the debate regarding this subject is unfortunately so far behind reality.
> The claimants ... argue that since their crimes are decades old, they should be able to move on from their past. .... Google counters that the information is in the public interest, and still relevant given that the nature of the claimants’ business has not changed dramatically in the ensuing years.
Google is being picky about what to remove and, in this case, goes to great lengths to back up its judgment. What is motivating them? Do they want to preserve information for its own sake, or for their own relevance as a search engine? Or both?
> [Google] reported that it rejects more than half of the requests it receives, but did not provide insight as to why.
The statistics of who wants things removed, and why, are surely very interesting. Here's a report from Google (linked from the article):
Edit: Oops I should have read further!
> England’s Information Commissioner ... warned that if the court sided with Google, it would completely defang the original ruling. “In effect,” she wrote, Google “would be able to operate the right to be forgotten regime almost entirely free from regulatory oversight and control.”
Is Google looking for a blanket exemption?
The right to be forgotten only extends to people who are not public personalities, and only to facts that either are invalid (with proof), or irrelevant but potentially harmful (e.g. you might argue that a ban you received on a Minecraft server when you were 14, which shows up as the first result for your name on Google, would make a bad impression on employers without having factual relevance).
The right to be forgotten has specific exemptions for all the cases you mentioned.
It should be in the form of broad categories, minimizing the risk of abuse.
> too harmful to allow the public to know
Totally different topic.
So, yes, they should. Without this, no rehabilitation is ever possible, every childhood mistake will affect you in your job.
The right to erasure (right to be forgotten) does not protect this.
It refers to a very specific right that does have real limitations.
The ICO has written some specific guidance about this that might be helpful in your understanding.
The justice system has a perfectly fine system for storing and expunging criminal records over time, and you can still find the article if you specifically google the event.
Child molester is probably the worst example, since they have specific lists, exclusion zones, and warning systems attached to their presence; nobody should rely on Google to find out if their new teacher is a child molester.
You might want to think about what the basic definition of "journalism" is, then. And how maintaining a relentlessly detailed, universally accessible dossier on the 1 percent of the population that qualifies as truly notable persons (such as politicians or notorious criminals) -- i.e. what traditional "journalists" do -- is, in fact, quite different from ...
maintaining such a dossier on everyone, which is what Google does.
Just because something falls under a legal classification, it doesn't mean we'd start calling Google search a journalist in colloquial usage.
Google wants it both ways, in whichever way is momentarily convenient. Almost everyone's comments on Right To Be Forgotten in this thread miss that key issue.
But at the same time, perhaps there needs to be a way to de-emphasize older truths? For example if Alice did something 10-20 years ago, should that appear in the first page of the search results? Perhaps it should be ranked lower due to age.
Perhaps search engines could have a memory system similar to humans. Older stuff is gradually forgotten or de-emphasized or turn blurry.
A humane memory system.
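A minimal sketch of what such age-based de-emphasis could look like, purely illustrative (no real search engine is claimed to work this way): multiply each result's relevance score by an exponential decay in its age, with a chosen half-life. The function name, half-life value, and scoring model here are all assumptions for the example.

```python
import math
from datetime import datetime, timezone
from typing import Optional

# Assumed half-life: a result's score is halved every 5 years of age.
HALF_LIFE_DAYS = 365.0 * 5

def decayed_score(relevance: float,
                  published: datetime,
                  now: Optional[datetime] = None,
                  half_life_days: float = HALF_LIFE_DAYS) -> float:
    """Down-weight a relevance score by the result's age,
    halving it once per half-life elapsed since publication."""
    now = now or datetime.now(timezone.utc)
    age_days = max((now - published).total_seconds() / 86400.0, 0.0)
    return relevance * 0.5 ** (age_days / half_life_days)

# A 10-year-old story (two half-lives) keeps roughly 25% of its score,
# so it would naturally rank below fresher results of similar relevance.
old = decayed_score(1.0,
                    datetime(2008, 1, 1, tzinfo=timezone.utc),
                    now=datetime(2018, 1, 1, tzinfo=timezone.utc))
```

Tuning the half-life is where the policy question hides: a short half-life "blurs" the past quickly, a long one barely at all, and nothing in the math says where minor embarrassments end and matters of lasting public interest begin.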
I think this is what is in mind in terms of a "right to be forgotten". Not the large things, but the small bullet points from 10+ years ago that, in a sane world, would be completely ludicrous to compare with your current self.
As others have pointed out, Trump hasn't actually declared bankruptcy. There is an interesting debate about that in other threads, read them there.
If someone searches for your name, Google has no way of determining whether they are looking for your LinkedIn profile or for reports of alleged criminal activity by you, nor does it have the journalistic resources to determine whether those reports are accurate or to edit them in light of new data.
It's already possible to game Google into indexing fake testimonials for scam purposes, as well as to push old results off with a barrage of recent activity. Bad actors can always find ways to misdirect, and deleting is only one way. Isn't it fine that delisting just remains astronomically more laborious to do as the volume of reports of badness increases?
Siding with google seems more like a passive aggressive way of getting an ego massage out of "punishing" things one does not like, but it kinda throws out the baby with the bath water.
Those records are all available in a publicly accessible registry maintained by the government.
> Do we give child molesters the right for their crimes to be forgotten?
You have the sexual offender registry for those things.
> Why should people have the right to have things they don't like wiped from the historical record?
No, it's about people's minds not being immutable. I used to believe in Santa ~25 years ago; if I had made a viral tweet about it back then, should I be barred from joining a scientific community today?
Even Eric Schmidt said that young people should get a reset button once they become of age.
My rule of thumb is that: the authorities and law enforcement should have monopoly over maintaining information that should stick with you forever, not some private company that is even more unaccountable than the government.
They can't have it both ways I'm afraid.
However, it's pretty ridiculous really. If they claim to be journalists, then it should be possible to also sue them for defamation if they publish defamatory material. Currently you can't because, quite rightly, they use safe harbour provisions to prevent being sued. They are now sailing into waters where they are at risk of not being able to claim this.
The folks at Google haven't thought this through clearly.
Nobody in their right mind is arguing that a criminal can log onto some crime.gov and say "plz delete me", for example.
I'm sorry, are you arguing that not forgiving governments, elected officials, and people in positions of power and authority (or those seeking such power) is a problem in our society? Like, the US, the country that elected Trump, is having a problem with forgiving/forgetting past transgressions?
> but there are far more reasons to forget.
If there are that many, why didn't you name them?
> If this was the case everyone would hate everyone.
Hate isn't the issue. We're talking about real world problems with consequences. Evil doesn't just go away because you forgive it, people who are harmed are still harmed, and costs incurred need to be paid. If you just forgive and forget naively, that means the people who are hurt just keep paying, over and over. And the people who take advantage of it just keep winning forever.
One reason I desire to see more convict employment advocacy groups.
So no, Trump's bankruptcies won't be forgotten in this system, but a false accusation of child molestation can be.
> After a request is filled, their removals team reviews the request, weighing "the individual's right to privacy against the public's right to know", deciding if the website is "inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed".
I think I'm cautiously in favor of the standard you propose, though, if that is indeed the standard.
There used to be plenty of material contradicting his version of events (which I know to be factual) but, today, you can't find any of it.
You could say all this material was 'falsehood and lies' but, in the absence of any public record, who is the arbiter of truth here?
Seems to me, it's whoever has the money to do it?
This whole "right to be forgotten" thing just seems like a good way for wealthy people to hide inconvenient facts from public view. If it was really about falsehoods and lies, why wouldn't they go to court with the existing laws?
If the US based Google was responsible, it would be easier for people without that kind of money to protect themselves, wouldn't it?
Your criminal record is expunged after a time. This is about whether Google is above the law of the land.
But I still take some small amount of umbrage with people claiming that nothing is being erased, just people's ability to find it. I think that's a very blurry line to draw - the idea that it's OK for information to exist somewhere in the void, but making it too easy to access is not OK.
The intent behind rules like this is to make it much harder for people to access information. If Google isn't the Internet, and delisting won't suppress speech, then what value is the law? If Google is the Internet, and delisting from their site will hide information from the public, then how is this not censorship?
There's no point in getting myself delisted from Google unless I expect that this will prevent most people from being able to find my information. People search names on Google because they want that information. The only way I can see for GDPR to stay effective or valuable in hiding me from those people is if it becomes a blanket ban on indexing or curation in general.
I wonder what would happen if an explicitly archival service like the Wayback Machine got itself a state of the art search algorithm and became widely used in casual search. Would it also be required to delist?
Let us assume somebody commits a crime. He goes to jail, he pays for it, as society deems necessary.
A local newspaper reported it, and now every time you google their name the article pops up. This makes every person they might do business with withdraw.
It seems to me the debt to society has been paid by going to jail, and yet they keep paying by having the news of their mistake be very prominent.
What is the right course of action, in this case?
Note that the difference from traditional newspapers is general availability. Now the entire history is just a few clicks away, whereas with paper you need to go looking for it specifically, so it's a LOT less damaging.
What does HN think about this?
>This makes every person they might do business with withdraw.
Hiding information from people sounds like a bandaid for the issue of people being overly judgemental. Maybe we need to be more accepting of rehabilitation.
I mean, listen to yourself. Rather than an easy, technical solution, for a very limited and well defined problem, you're proposing we instead just wait until the billions upon billions of people on earth suddenly wise up, hit nirvana, sing Kum ba yah, and _be less judgemental_? So that, what, a giant, wealthy corporation isn't mildly bothered?
If we have to consider the reality of the situation for felons, we must also face the reality that they don't really seem to be the end goal, but just a story to placate worries about overregulation and censorship.
If I am a search engine and don't comply, I am going to jail. It's neither easy nor technical.
Put another way, if I put the name of my new neighbor into Google, the resulting webpage I would in fact most want to see as the result would definitely be the article on the murder they committed 20 years ago, even though they have a Facebook page that was updated 10 minutes ago. Facebook page can be result number 2.
If Google didn't show that result at all because it happened 20 years ago, and that person never did anything else notable more recently, I would consider Google to be a little broken.
People change, and your neighbor now has a family and lives quietly. It seems unfair that the world should know about his past mistakes.
Your approach is of little comfort to the people this actually affects.
A reduction in this privacy might precipitate a change in how people react to learning the private affairs of others. If everybody's legal history was laid bare, for instance, one might discover that getting in trouble is something quite a lot of people do--and in this light one might temper his reaction to learning of another's checkered past.
I think it is a bandaid, but the proper solution would (1) require massive social engineering and (2) be really hard.
But I live in a country (the US) which is savagely vindictive towards "criminals" (though only poor criminals) both by cultural tradition and as a means towards suppression of minorities.
I'm white, but my name is more common amongst black Americans. If you google me, some of the top image hits are mugshots of unfortunate black people. It doesn't affect me, but it reminds me of how awful it is that mugshots have become so easily aggregated and republished.
Journalists sometimes talk about the public's "right to know". This directly contradicts an individual's "right to privacy," so the journalist has to weigh the balancing interests and make a decision. (If they screw up, the courts will review it, though the damage is done.)
Years later, it's not unreasonable for some other person to weigh these balancing interests again. Someone has to make the call balancing the public's "right to history" versus the individual's "right to be forgotten".
This can't be automated. You're not going to make a good decision without some person (journalist or historian) evaluating the particular case.
(By the way, I don't think talking about "rights" is at all that helpful when thinking about balancing conflicting interests.)
These are primitive times. If someone commits a crime we should be figuring out why they did it and how we can prevent it from happening again. We shouldn't just be putting people away with moral debts. Our society's vindictive nature on these matters is exactly why they don't forgive and forget. They can't forget, because they never cured the 'root cause' of the crime, and thus must constantly live in fear of it recurring.