First, kids use sites and apps when they're below the ToS age (usually 18 or sometimes 13, at least in the US). How are you meant to know that as a service provider? Don't get me wrong: it doesn't excuse you from doing due diligence. There's such a thing as wanton disregard for the truth. But we all know under 18s use IG, Snapchat, Tinder, etc. I'm not sure fining a company billions of dollars and putting executives in jail is the outcome anyone really wants if a 17 year old uses Tinder.
Second, what is personal information, exactly? It's not as obvious as you might think; even the lawyers who work for these companies can't agree among themselves. There are obvious things (eg IP addresses, dates of birth, SSNs, addresses, names). But what about media? Photos, videos and sound files may all, technically, contain PII.
Third, some will say "well don't store anything". Well, if a photo can contain PII (and it can), then you can't operate a photo-sharing service at all, for fear of storing PII of children. This also ignores legal obligations: if the Feds show up with a warrant about a particular user's actions, the argument "we log/store nothing" won't get you as far as you think.
So I'm not saying Google didn't do wrong here, but knowing the complexities, I'm sympathetic to resisting broad legislative overreach: that's exactly how you end up with bad laws.
And if we're going to start singling out tech giants for lobbying Congress, my question is why them in particular? We currently have a Senate that is essentially beholden to the personal interests of one unelected rich guy: Charles Koch. Koch lobbies against health care, taxes on the rich and infrastructure in the Senate. Around the country, he has been instrumental in defeating public transit and many other initiatives in many cities.
A provider offering a service is responsible for ensuring it functions within the law. The "how" is the responsibility of the provider. If they can't do it, they can explicitly say so and work with the government to find constructive solutions. If they absolutely cannot, they have to bear the consequences of the law. If they can't accept the consequences, they can decide to shut shop.
In other words, doing the very best that’s possible doesn’t count for shit if it’s impossible to do what you have to do. Because it’s always possible to comply, just not always possible to do it digitally. Tough shit. Then don’t do it digitally.
This is what I like to call "engineer thinking" (and I say this as an engineer). By this I mean it's absolute.
The law doesn't work that way, and when it does, it tends to be terribly unjust and illogical (eg mandatory minimum sentences, three-strikes laws).
Here's how it actually works: because of the nature of your service users need to be able to give consent for legal reasons. It may be possible to provide such a service to under-18s but the legal compliance may be a significant amount of effort. Or it may be impossible.
Already we've run into a problem: what about emancipated minors? They can, by definition, give legal consent. You may choose to ignore this corner case because it's too hard. That may be fine, or it might not: you may be accused of unfairly denying services to certain segments of the population. Depending on your service, you may even run into legal trouble.
Assuming you're a good actor, you make a "best effort" to comply with the age restriction. You may ask for certain personal information or even run a credit check. Certain things can only be done in person, perhaps just the first time, perhaps every time.
If it ever comes up that you provided services to minors and there were legal consequences (civil or criminal), a court would ultimately look at the balance of evidence to decide if your actions were sufficient given the consequences of noncompliance, the harm you could've done and so on (note: the standards are different criminally and civilly too).
So if you provide a hook up app and just ask "Are you over 18? Y/N" and have lots of profile pictures of suspiciously young people, lots of mentions of high school and so forth then it can be decided that you had a reckless disregard for your legal obligations and be fined or charged criminally. A court may also decide your actions were "reasonable" (this word comes up a lot). For example: you tried to verify them with a Facebook profile (with efforts to make sure it wasn't fake), you monitored photos and profiles for certain keywords and so on. That could satisfy your legal requirements even if it wasn't perfect.
It's the difference between a bank that sometimes unknowingly facilitates criminal financial transactions and a bank that has flagrant disregard for the law. There are banks that will charge you 15-20% for a wire transfer, for example. No legitimate customer would ever pay that; it's clearly aimed at people who don't want any questions or scrutiny. The second will land you in jail. The first won't.
Ironically, you called my thinking absolute and then went ahead and did the very thing you labelled "Engineer Thinking". Nothing in my comment above contradicts yours. Your effort to explain HOW in detail is admirable. However, I wish you hadn't put a label on my way of thinking; it doesn't help a discussion.
Quite frankly: I want 17 year olds to use Tinder, figure out how it works, and understand the problems therein. I want kids to be able to dip their toes anonymously into adult activities. Any hard-line requirement creates a nice hard line - people who have no experience with X, and people who are expected to be experts with X - and that's how you create a feeding ground for predators. 17 year olds can't be expected to instantly recognize that a 40-year-old man who's hitting on them (despite the fact they lied about their age) is not a quality catch - they have to have support networks and enough opportunity to test the waters to figure that out before they get caught in a predator's web. One of the most important tools towards that is the ability to create anonymous accounts and explore, with understanding but without consequences.
 I won't iterate over the ways that is true, and I won't iterate over the ways that your perfect age verification is either a) broken or b) totalitarian and broken.
This is like a bar complaining that they shouldn’t have to check IDs because counterfeits exist. You can’t serve your customers At Scale if you have to comply with those pesky laws, so maybe don’t do it at scale.
How about 16 yr olds? 15? 13? 10? 7? Should we draw a line for "dipping their toe anonymously in adult activities" somewhere, or should that line simply not exist?
Do you believe that every 13 yr old is in fact being appropriately monitored by parents? If not, should there be some safeguards to mitigate the fact that in practice this is not reliable?
Age of consent differs across countries. So yes, different countries and jurisdictions already draw a line for various adult activities. They also disagree what those lines should be and where they should be drawn.
Reality is messy.
I'm in the unenviable situation of having had one of my big projects shut down for precisely this reason: it's too expensive to moderate sites that kids may use. The reason for that is actually very simple, proved by data, and verboten to say.
Yet, that might not stop a 17 year old without a license from stealing my keys and taking it for a joy ride. Should we then outlaw cars, since evidently they don't function within the law?
These kids are violating the ToS.
First of all, the TOS click through flow is more like leaving a car running, unlocked, and adding a piece of masking tape that says "please don't drive me" on the door handle. TOS operate entirely on the honor system.
Second, you have no incentive for random children to come drive your car. For the tech giants, more users are great! So their incentives here are all in favor of letting kids sign up and doing the bare minimum required to appease regulators.
It's still unlawful to enter it and drive it, at least in America.
"Your honor, the car was asking for it".
Jail would be disproportionate, but those giants are perfectly aware that kids under a certain age have absolutely no idea of the value of money: how much it costs to bring it home, and how to spend it carefully so there will be some left for less important stuff. So you show them ads for the latest cellphone or a pair of famous-rapper-sponsored shoes that cost half of what their parents bring home in a month, and they'll want that crap no matter the cost. This is not that different from targeting very old people with phone calls, some of whom can be lured into buying unnecessary stuff, signing fraudulent contracts, or worse, just because a well dressed drone rang their door and spent the evening talking with them to win their trust.
Quite frankly I'm disgusted by those manipulative people, and tech giants dwelling on the dark side to maximize their profits are no different.
I don't see how companies not properly filtering out minors from their platform equate to them being similar to fraudsters who target the elderly. One is an example of (potentially) willful ignorance of the age of users on their platform, and the other is willful malice targeted at a vulnerable group.
Honestly, why not? If society deems that illegal, and the enablers are responsible, then so be it.
If said company can’t figure out how to profit while preventing this then why do I care? Lots of businesses aren’t viable without illegal practices.
I’m doubtful this will be the actual fallout, but it wouldn’t bother me at all if it was.
If your business's success relies on scale, but that scale in turn presents insurmountable issues, then congratulations, you found an unsustainable business model. No big deal.
Why not? I can think of a couple different situations where a given democratic society deemed X to be illegal, but that does not make it right.
Because they're the criminals. By their own admission.
The real question is why it took Congress so long.
If systems cannot be designed to comply with regulatory and legal requirements, then perhaps they should not be built.
Alternatively, reasonable alternatives and methods should be defined in the law.
And you should disclose your employer affiliation in comments such as this.
We're way beyond blatant disregard.
You aren't. You're supposed to stop spying on your users.
> I'm not sure fining a company billions of dollars and putting executives in jail is the outcome anyone really wants if a 17 year old uses Tinder.
If Tinder is spying on its users then I am sure of that, since I want that outcome. (And now you're sure of it too.)
You think major providers don't know who is a kid? Please.
They have profiles for every person living and dead. They have 1000s of data points for every person.
They only pretend to not know. CYA.
A fiction that stops when they target ads towards those same kids.
How would that work in practice? A lot of us here operate websites. Your website gets a request. How are you supposed to tell the age of the person that request came from? Maybe Google / Facebook have enough data to identify the person and their age, but ironically, they would have to collect information about them to determine that it's a person they shouldn't collect information about.
Certainly no child in the history of the Internet has ever lied to a website when asked their age.
I run some (low-traffic) websites. I don't collect any information about anyone. It's not essential.
If you want to have a newsletter, or offer a service where people log in and want to recover their password, you'll need at least an email address. Can you verify that every email address is exclusively used by people 13+?
re: collecting IP addresses: I run my web sites behind Nginx, which does keep rolling logs, but I ignore IP addresses.
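For what it's worth, "ignoring IP addresses" can be done at the logging layer itself rather than after the fact. A minimal sketch of an Nginx config along those lines (the log format name `no_ip` and the paths are arbitrary):

```nginx
# Custom access log format that omits $remote_addr entirely,
# so client IPs never reach the rolling logs in the first place.
log_format no_ip '[$time_local] "$request" $status $body_bytes_sent "$http_referer"';

access_log /var/log/nginx/access.log no_ip;
```

This keeps enough to debug traffic patterns while never writing the IP to disk, which is a stronger position than logging it and promising to ignore it.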
For your e-mail use case, I think a reasonable compromise is asking whether the user is older than 13; what alternative is there but to trust them? Personally, I think collecting e-mail addresses for people signing up for a newsletter is a great example of when it is perfectly fine to collect private data.
We all see how incompetent the government is when it comes to understanding technology and how it works (see the Missouri governor clamoring to prosecute someone for “hacking” because they used view source), so remember when you’re asking for greater regulation who it is that’s going to be doing the regulating.
That said, I am not suggesting that businesses should or could do this. Some businesses are required through regulations to capture IP addresses of their customers at a minimum.
Children can only participate with these companies because the product is 'free'. Same with p*rn and free2play games: eliminate the 'free' component to effectively block them from children (at least without parental consent).
Idea: browsers could implement parental controls. When you set up a child account on your kid's device, the OS stores the user's age and the browser reads it. Then, in the request (maybe depending on region or some protocol), the browser can tell you whether the user's age is <13 or in some other interval.
This means the browsers need to collaborate with the OS, and the parent has the responsibility to set up the correct user account in the OS; on older OSes, the parent would have to configure this setting directly in the browser, putting the browser in a "child" mode.
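A sketch of the server side of the idea above: the browser, having read the age from an OS-level child account, attaches an age *bracket* (not a birthdate) to each request, and the site gates content on it. The header name `Sec-Age-Bracket` and its values are hypothetical; no such standard exists today.

```python
# Hypothetical server-side check for a browser-supplied age bracket.
# "Sec-Age-Bracket" is an assumed header name, not a real standard.

ALL_AGES = "all-ages"
TEEN_OK = "teen-ok"
ADULT_ONLY = "adult-only"

def allowed(headers: dict, content_rating: str) -> bool:
    """Decide whether a request may see content with the given rating.

    A missing or unrecognized header falls back to the most restrictive
    treatment, since an absent signal tells you nothing about age.
    """
    bracket = headers.get("Sec-Age-Bracket", "unknown")
    if content_rating == ALL_AGES:
        return True
    if content_rating == TEEN_OK:
        return bracket in ("13-17", "18+")
    if content_rating == ADULT_ONLY:
        return bracket == "18+"
    return False
```

One design note: sending only a coarse bracket keeps the signal useful for compliance while leaking far less about the user than an exact age would.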
I have a YouTube playlist of videos I'd like to show my kids. It turns out that YouTube will not allow videos marked as "intended for children" to be added to a playlist, even by an account that belongs to an adult. So the only videos I can put in that playlist for my kids are ones YouTube thinks are NOT for kids.
You could ask everyone to confirm their age with eID.
- "can maybe connect sensitive data to some persons"
- "can absolutely connect sensitive data to every user unless they really go out of their way to avoid it"
This sounds like a starting point for further investigation.
Perhaps the litigation will reveal more about what is going on inside Big Tech.
They want less privacy protections for the humans they're exploiting? Add more. Add so many protections they probably won't be able to make any money anymore.
See also: revolving door policy making.
Lobbying lawmakers is a basic practice that happens all the time. And a law is not right in its own right; there are always supporters and detractors.
Just that we agree as a society to abide by the laws we currently have.
If anything we should invest in more public awareness to make sure strong privacy requirements are signed into the law.
On the other hand, strong privacy laws are a sizable bureaucratic burden on society. So we should be cautious not to over-regulate.
Laws are difficult to change or revoke even if they are found ineffective or too stringent later on.
A high-trust environment, where most of the players do the right thing without the bureaucracy and enforcement, is far more efficient.
I knew FB / Google Ads was a hive of scum and villainy (but would, nonetheless, accept the paycheck generated by all that), but did not think it was quite this bad. Much like when the varnish started cracking on both Google and Apple when Eric Schmidt fired the recruiter due to Steve Jobs' displeasure.
If not, then I don't know if these mega-corporations are perhaps to be considered untouchable, which would cast serious doubt on our government; or rather, erase most doubt remaining about the degree of corruption thereof (at least from my perspective.)
Google hasn't been shy about their own cash for kids(' data) schemes. It's also pretty wild how similar their market creation tactics are (and how beneficial they would be) to those of groomers and other child exploiters.
What are advertising and privacy protection laws regarding children like in other countries? Could Google have happened somewhere with stronger consumer and child protection laws?
There are at least 4 similar stories on the front page right now all stemming from the same amended Texas court filing.