Protecting Children's Safety Requires End-to-End Encryption (simplex.chat)
116 points by Sami_Lehtinen 3 months ago | 81 comments



I'm happy to have a discussion about this, but I don't buy that end-to-end encryption will, overall, increase children's safety. I'm happy to see strong evidence one way or the other, but this article is very one-sided.

Child grooming on the internet is a huge problem, and I suspect (again, happy to see evidence against it) it's better to allow scanning of children's communications, rather than ensuring their communication with whoever they are talking to is end-to-end encrypted.

I don't think it's reasonable to make this the parent's responsibility, when basically the whole world is internet connected, and those parents are fighting against multi-billion dollar international corporations, who dedicate huge teams of highly paid people to ensuring children get addicted to their platforms. Looking at the children in my life, it's very hard to keep them off social media; even if you don't give them devices, they will just use their friends' devices, or ones at school.


> it's better to allow scanning of children's communications, rather than ensuring their communication with whoever they are talking to is end-to-end encrypted.

This is not purely a technical matter. Children and adolescents want confidential communication just like older people. You might argue that it is in the best interest of minors to have someone/something look at their private communication to intervene if necessary, but there are much better and more versatile ways to protect children: educate them and their parents. Children need to learn how to manage trust with strangers in the offline and online world anyway, and the internet doesn't change that. In fact, sexual abuse usually has an offline component that you can only effectively protect against by educating children.


>there are much better and more versatile ways to protect the children: Educate them and their parents.

That's exactly it: we should address the root cause of the problem, not its consequences through punishment and threats of punishment. It's much more effective.


> Educate them and their parents. Children need to learn how to manage trust with strangers in the offline and online world anyway and the internet doesn't change that.

I'm not sure this is even possible. I understand children are smart, but I don't think they are this smart. We can educate children to be cautious, but I don't think we should expect them not to get outwitted by an adult.


Children are smart enough to understand that no stranger except a doctor should ask to see their private parts. Children are smart enough to learn that they should not share names, ages and locations when asked for them on the internet.

Also: Children are a lot smarter than AI, which is allegedly the super-magical protection against child grooming.


> Children are smart enough to learn that they should not share names, ages and locations with strangers on the internet.

I'm old enough to remember when that was the norm.

But it's been a long time now since Facebook etc. came along, and they have spent the entire time since then insisting on a real-name policy and asking us to give details of our age, our location, and which school we go to, and to post a continuous stream of photos of ourselves.

> Also: Children are a lot smarter than AI, which is allegedly the super-magical protection against child grooming.

By what measure are children smarter?


Any measure? ChatGPT still can’t solve a sudoku, let alone do any form of reasoning. It’s more of a search engine than “intelligence”, while humans obviously can reason and solve novel tasks.


> ChatGPT still can’t solve a sudoku,

Now you've got me wondering, if there's even an appropriate test set to find out.

When I do a sudoku, it's either printed and then I don't know if I made a mistake until the very end; or it's on a computer which tells me about mistakes as soon as I make them.

I do know I make mistakes when doing them, but I don't know how often — more or less frequently than ChatGPT.

(Also, at least one LLM has been demonstrated to do Turing-machine-like operations[0], and there were Sudoku-solving AI well before LLMs).

> let alone do any form of reasoning. It’s more of a search engine than “intelligence”, while humans obviously can reason and solve novel tasks.

I am aware of Clever Hans and that I may be making the same mistake, but it sure looks like it can do those things as well as humans.

Humans aren't that great either when it comes to genuinely novel tasks, which is why we have to be trained to do science and use the method of falsification.

[0] I'm not sure if I need the "-like" caveat? https://manifold.markets/DanMan314/will-anyone-win-victortae...


I have never seen an example where it didn’t basically just rehash some answer already found in its training data. Where it excels is language tasks, and retrieving knowledge can be looked at as a language task - it has some internal representation of some textual form of the repeating parts of its training data set, and it can translate it into an answer appropriate for a question.

It fundamentally can’t come up with a novel reply, unlike a human. It breaks spectacularly the instant you go off its training set, which is very easy to notice if you ask it about a topic you are knowledgeable about.


It has to generalise just to be able to fill in the gaps within-domain.

If it was not so, I could copy-paste lines from this into google and find out where it was quoting from: https://chatgpt.com/c/3105db2c-b88d-42c6-83bb-ea2b9d6d65a1

> unlike a human

You know the phrase "thinking outside the box"? It's a thing people look for specifically because it's rare.

I can't tell how good or bad LLMs are at this compared to a human due to LLMs having a ridiculous advantage when it comes to breadth of knowledge.

My guess is they are indeed worse than us at generalising.

But they do generalise. They have to generalise even to be at this level.


I feel like you are just asserting things because it feels right. I don't have children myself, but I have a gaggle of nieces and nephews, and I just can't bring myself to be convinced that they can be trained to thwart the plans of an adult. Add to that all the cases I've heard about: no parent said they didn't try to train their child not to talk to strangers. The predators are using tools designed by billion-dollar companies and very refined grooming tactics. It will take some evidence before I'm convinced children can combat all that with training from overworked and underpaid parents.


It requires no evidence, however, to suggest that having children without sufficient time and resources to support them will lead to poor outcomes.

If you want to protect your children, you have to be able to take those steps, not outsource the responsibility to someone else with time and money.

Predators are bad, but most of them are known to the child before the abuse occurs -- that suggests parents are way more responsible for child abuse than randos on the internet; it's just that randos have more opportunities.


> It requires no evidence, however, to suggest that having children without sufficient time and resources to support them will lead to poor outcomes.

This is such an infuriatingly bad victim blaming comment that I can't believe it's on this forum.


Nobody's blaming the victim - that was your assertion of the problem.

If you have children you can't support, don't blame society that you're having trouble supporting them. It's callous maybe, but remember the victims are the children, not the parents. The parents created the problem.


That's not even remotely what I asserted. I'm asserting that even if I had all the time in the world, I couldn't fight the forces of billionaires creating ever more sophisticated tools of communication by training my children.


> ... children can combat all that with training from overworked and underpaid parents

Parenting rule 1? Show up. The thing overworked parents don't have? Time.

Hence my entire point- if you don't have the time for kids, its not on society to fix absenteeism. If you're not spending time with your kids, they're exposed to others, so the overworked or inattentive? Their children are the victims. Those are the ones who should have parents in their lives paying attention to them and reinforcing good values.

Society can't be forced to babysit everyone who had kids but couldn't commit the time to it. It takes a village - and we will pick you and your kids up when you trip -- but if you spend 20 years not raising your kid, it ain't our fault.


Did you not read anything else I wrote and only focused on one thing?


Do you understand that what you're writing is repeatedly saying it's everyone else's fault?

The people screaming think of the children are almost always the ones I want nowhere near children.


I find your implication pretty offensive, but moving on. I'm advocating for giving parents tools to "show up", as you say. I don't know wtf you are advocating for.


Children. Always the children - you're advocating for parents to delegate their responsibilities. I'm advocating for children. There's no implication, I'm being quite explicit - protect the children -- especially from bad parents.


I'm sure you don't mean this, but it sounds a lot like you are saying that it is a child's fault when they get sexually abused and don't tell anyone about it.

That happens all the time to children, they are groomed both offline and online. It's a real problem, and it isn't going to be solved by saying we just need smarter children.


I'm guessing this mismatch is a question of differing life views/philosophies; as a general rule, for me, it is absolutely my responsibility as a parent to educate my daughter to make sure she can evaluate risk "properly", and then it is on her* to decide how to manage that, always knowing that there are people out there who will attempt to harm you.

This clearly means that we sometimes will get it wrong but, in my mind, you will not reduce those risks with a technical solution as this applies to all facets of your life, online or otherwise, and you will be opening doors that can backfire too much.

*age matters a lot but that is a different discussion by itself


Looking at an extreme case, do you think it would be reasonable to make guns available to all children, at all ages, and trust children and parents to ensure that everyone is safe?

If not (and I'm assuming not, let me know if you do think that's reasonable!), there is some space for laws which restrict children, in return for increasing their safety -- we just have to discuss where that line should be.


> I'm sure you don't mean this, but it sounds a lot like you are saying that it is a child's fault when they get sexually abused and don't tell anyone about it.

That is valuable feedback. Thank you. I'll try to improve my choice of words.

> it isn't going to be solved by saying we just need smarter children.

Agree. We need better education, which does not require smarter children. And to achieve a better education, we as a society need to talk more about how abuse can happen and where red lines that our children can understand have to be drawn.


You are right: while we might disagree on some things (do children need end-to-end encryption?), we do, I think, all want to figure out how to make sure all children end up happy and safe, and I don't think we know the best way to achieve that.


There are many ways we don't give children the same rights as adults. They might want confidential communication, but I'm not sure they should have it.

Why are you so sure that it is better and more versatile to educate them and their parents? We've been educating children, and parents, basically forever (and I've done a lot of educating of children myself). While they are young and still learning, I really believe they do need a mixture of freedom and guard-rails, to ensure they can't do themselves too much harm while still developing.


> They might want confidential communication, but I'm not sure they should have it.

Why not? Assuming you're using a Western metric of what a child is, why does a 17 year old not have the right to private communication, but an 18 year old does?

Furthermore, why do you think corporations are going to be like "Oh, this person is 18 now. We're going to respect their privacy now!"?


Well, for the exact same reason why in many western countries we decide that 17 year olds are not capable of consenting to sex, but 18 year olds are, same with driving a car, drinking alcohol or smoking cigarettes - the cut off is somewhat arbitrary but there has to be one. If it's not 18 then what other age is appropriate? Or do you believe that a child of any age should have access to completely confidential communication outside of a medical setting?


> Or do you believe that a child of any age should have access to completely confidential communication outside of a medical setting?

Confidential from governments/corporations/people that aren't their parents? Yes, absolutely.


And who exactly is a child? Because the infantilization of teenagers is a grave issue in the US, where in several states they can buy guns and die for their country, but can't drink a beer, and their having sex with their peers is somehow viewed as wrong.

A ~15-year-old absolutely should have every right to private communication.


It is 100% the parents' responsibility to have an active enough role in their child's life to know who they're interacting with and to provide appropriate supervision. If your child is getting addicted to something, it's because they've been enabled, perhaps through their friends' devices or what have you.

You cannot push the burden of being a responsible parent onto people who do not wish to be your children's parent. Society owes you places they can go where they are safe, not padded walls on every sharp corner they might bump into -- that's baby-proofing, and it has to be done at home.


The entire point of being a teenager is to get away from your parents and find new peer groups.

Unfortunately, what we evolved for is 160 people in a village with a peer group of about a dozen within a few years of us, not even a school with hundreds to thousands, let alone the internet with two people for every heartbeat in a human lifetime of which 440 million (~= the population of the EU) is within a few years of us.

And given the power imbalance between manipulative corporations and normal people is sufficient for the former to convince the latter to watch adverts while conversing with their friends — the entirety of Facebook's business model — it's not even clear to me that the internet in its current state is a good thing for adults either: to many of these companies, even the ones that are legal rather than criminal scams, we are the trees upon which money grows.


Once we're at teenagers, then it's still - did you raise them to be aware of the world around them? Did you give them the tools to evaluate safe and unsafe online activity?

A kid doesn't accidentally end up addicted to heroin unless they have exposure to heroin. Child grooming predates encryption and the internet, and relied primarily on trusted people convincing children not to talk.

No amount of decrypted internet traffic will protect your children from unsupervised in-person access by strangers - worrying about what they do online is a black box of whataboutism -- and again, it's the parents, not society as a whole, who have that responsibility.

Are there evil people? Yes. Are there corporations that are interested in you being addicted to their products (Oxy, Alcohol, Nicotine, Meta) -- yes.

Do you need the right to inspect what I send to my wife to protect your children? No.

I'm sorry that teenagers are hard, they always have been - that's a hard part about parenting. The hardest part though is that you raised them to make their own choices - and this is the final exam: did your kids make good choices? You don't get to put that responsibility on anyone else; you raised them, you gave them the tools - and at some point, they will make a choice without you. If you believe the only way for them to make a good choice is to be able to surveil every aspect of their lives, then I hate to tell you what will happen when they stop being children.


I'm not suggesting that we should break crypto to keep kids safe, I'm suggesting that we shouldn't let kids be on the internet at all. Society doesn't let kids vote, doesn't let kids buy alcohol or tobacco, doesn't let them drive or join the army, and doesn't let them watch 18-rated films.

I remember being a teenager.

I remember the valuable life-lesson we got about not trusting strangers in the form of the police creating a (safe) demonstration with one of their own plain-clothed officers pretending to be a villain, I remember subsequently making huge errors of judgement, and I remember others being far more trusting than me.

I remember the school anti-drugs campaign centred on Leah Betts, and that the way the teachers talked to us felt so patronising that I wanted to do drugs just to spite them. (Never acted on it, just wanted to).

I remember the art department computer, some very old attempt at security to stop kids messing with it, which I could trivially circumvent.

I remember when we got ISDN at school, and that I immediately started heavily using the internet whenever possible. They had content filters of course, but they were mediocre at best.


To be clear, I agree with you in principle - I think unsupervised children on the internet is a horrible idea, I just draw the line short of legislating that.

I'm not looking forward to when mine are old enough to realistically need to start using the internet, and wish that I could put them on dial-up text-mode internet like I grew up on instead of this ad-filled garbage we have today... But that's on me I suppose.


I hear you, but children aren't raised by parents alone.

They are also raised by society and environment. It has to go hand in hand.

Implementing structures in communications and tech that evolve around actual real-life material conditions and challenges, is what will make any innovation useful.

Society can no longer afford to separate well-being and technology as two separate concerns for the long run. It's either for humans and what their realities really look like, or it's for nothing, imo.

Whether the above is really a great step, I can't say, tho.


Society plays a role, but like raising a dog, it's on the parents to socialize them and teach them how to interface with it safely.

Society cannot progress if the adults have to live in a child safe box. There must be places safe for children, but it's the parents who have to keep them from sticking their fingers in plug sockets, not putting the burden on having a third party validate their access to electrons.

If you send your child to a bar, they'll see alcohol - if you send them to the internet, they'll find the internet. I'm happy to tell you you shouldn't do that with your kids, but if you do it with yours, it's not on me to stop drinking at bars.


Why does scanning of children's communications need to involve non-E2E-encrypted communication? The parents can just be the admins of their children's devices and tell those devices to capture and report communications before they're encrypted.


Putting aside that children as a group are brilliant and constantly jailbreaking every kind of device management they are subjected to, the vast majority of parents aren't technical enough to understand this problem much less set themselves up as "admins" and the vast majority of those who are technical enough to understand do not have the time. They're already trying to ensure their children are fed, educated, socialized, emotionally supported, and so on.

This is a wonderful example of engineers thinking everything is a technical problem when almost everything is instead an organizational or human problem.


Who said anything about a technical problem? Being an “admin on their phone” just means knowing their PIN code and passwords and grabbing their phone whenever you want. No technical know-how required.

Also, I object to this:

> They're already trying to ensure their children are fed, educated, socialized, emotionally supported, and so on.

In 2024, all of the above except “fed” occur at least partially over the Internet. Ignoring your child’s time spent online is like being the oblivious parents in an urban fantasy story who don’t realize their kids are slipping off to have adventures in the fae realm every night.


This:

> The parents can just be the admins of their children's devices and tell those devices to capture and report communications before they're encrypted.

requires more technical skill than:

> Who said anything about a technical problem? Being an “admin on their phone” just means knowing their PIN code and passwords and grabbing their phone whenever you want. No technical know-how required.

I don't even know how to get them to "capture and report communications before they're encrypted" (generically over all apps and websites), and I've been writing iPhone apps since iOS 4 and the first retina iPod touch.


In addition to ben_w's point, your proposal requires parents to understand every app that could possibly be used to communicate with others, how they can be used, and how chats can be hidden, to audit all of the above on a regular basis, and it still won't catch fairly basic things like a kid who deletes their messages as soon as they're read.


Or... it requires you to use the parental controls built into the phone OS (which it loudly insists you set up if you're setting up a phone for a child) to whitelist what apps the child can even be accessing, to only be ones you have the time to understand.

---

To be clear here — and maybe this is a miscommunication in this conversation — when I'm talking about monitoring the online activity of "children", I'm talking about e.g. six-year-olds. Kids who are at a young-enough age that they don't yet have a mental schema of the world that includes a term for "online sexual predators." (And yes, six-year-olds these days have smartphones. Or at least they do if they take the bus to school, do any organized sports, live back and forth between divorced parents, or one of any number of other things that create the possibility of an emergency where they're stranded somewhere and need to be able to reach you. Pay-phones don't exist any more, remember?)

I don't think monitoring the online activity of a six-year-old is a particularly difficult problem, because it's not adversarial in the way that monitoring older minors would be. Six-year-olds really just don't have that many things they want to do online; even the ones who are heavily absorbed into screens are mostly passively consuming content or playing games. They have no impulse to create social-media accounts and profiles describing themselves in the way teenagers do. (There's a clear through-line from going through puberty and unlocking the "hormone production" part of the human tech-tree, to wanting to be seen as socially impressive, to oversharing on social networks.)

Meanwhile, I don't think parents should bother trying to monitor their teenagers' devices at all. For two reasons:

1. If you have a healthy communication channel with them, and you explain what the risks of engaging with strangers online are — and if they also have no reason to actively seek out adult attention (i.e. if you aren't neglectful/abusive) — then they will do perfectly well protecting themselves. Their mental schema of the world does include a term for "online sexual predators" — trust me, it does, even if you have never personally explained the concept. They might have ill-considered reactions to such threats — e.g. deleting messages that gross them out instead of saving them as evidence, or capitulating to (what are to you obviously bluffed) threats of blackmail — but these are exactly the sort of practical micro-skills that make up the modern version of "educating and socializing" someone.

2. It's just a losing game to try. Unless you literally never let a teenager leave your sight, they can work around literally any restrictions you try to impose on them. And not just by nerding it up with jailbreaking or what-have-you. If you give a teenager a locked-up/monitored phone, their most likely response will be to treat that phone as a decoy, and to ask a friend at school — one without helicopter parents — for an old used phone of theirs (everybody has one) to secrete away and use as their real phone. Then, the first time they get a spare $20 in cash, they'll walk into a convenience store and get a prepaid SIM card for that phone, so that they can ensure it never touches your house's internet. There is literally nothing you can do to stop this (that wouldn't make their life worse than that of most maximum-security prisoners.)


A lot of parents are at the very least extremely unaware of what their children are doing on the internet.


Most of the abuse already happens on "responsible" platforms that provide zero expectation of privacy, like Discord.

Scanners are just not good at finding anyone but second-degree consumers, and that seems to happen more on web forums (and also, focusing attention here doesn't get rid of producers).


Didn't we have quite a few scandals involving LE saving and sharing media as well? Doesn't make it much better, you'll just have a bigger pool of people looking at the unencrypted chats.

I think a better solution would be filters which run only locally and can block problematic messages, as well as alert the platform provider/LE/parents. E2E still works, but you have some safety as well.


Who decides what is or isn't a problematic message?


Not a whole lot to discuss here, I fully agree. This goes beyond children's safety and into general privacy (which the vast majority of people are totally clueless about, contrary to their beliefs): no amount of encryption can compensate for bad privacy habits. People tend to think that the bottomless pit that is the internet and its billions of users will keep them hidden, which couldn't be further from the truth.

And speaking of the truth, all people are surprisingly unique in their own way and there is no such thing as "average". I've been fiddling with this idea for a while, putting it to the test in my spare time, and so far the results are astonishing: it's far easier to doxx someone online from a personal fact they've publicly shared than from some data leak by a private company or even a government institution. Leaks make matters worse, sure, but the pile grows exponentially, not linearly (which seems to be what most people assume).

People get freaked out about their number being posted somewhere online while not realizing that by saying something along the lines of "I studied in London in the early 2010s, then in Edinburgh until 2015" and "I lived in Paris between 2017 and 2019", they've willingly helped anyone who wants to find their identity narrow it down to a handful of people, likely in the very low double digits. Add one more random fact, such as "I have a Volvo XC70", and you've basically put a pin on yourself on the map.

Most people have probably never considered this. It may sound like a cliché, but it's far more important to educate people, as well as their children, from a very early age about these things than to assume that outsourcing the problem to technical solutions will somehow solve it. It won't.
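The narrowing effect described above can be sketched with a toy simulation. Everything below (the population size, the attribute probabilities, the car makes) is an invented assumption purely for illustration, not real demographic data:

```python
import random

random.seed(0)

# Toy model: each person in an invented million-person population has a
# few innocuous attributes, drawn with made-up probabilities.
CARS = ["Volvo XC70", "Golf", "Fiesta", "Corolla", "none"]

population = [
    {
        "studied_london": random.random() < 0.02,     # "studied in London"
        "studied_edinburgh": random.random() < 0.01,  # "...then Edinburgh"
        "lived_paris": random.random() < 0.01,        # "lived in Paris"
        "car": random.choices(CARS, weights=[1, 25, 25, 25, 24])[0],
    }
    for _ in range(1_000_000)
]

# Each publicly shared fact becomes a filter over the candidate pool.
facts = [
    ("studied in London", lambda p: p["studied_london"]),
    ("studied in Edinburgh", lambda p: p["studied_edinburgh"]),
    ("lived in Paris", lambda p: p["lived_paris"]),
    ("drives a Volvo XC70", lambda p: p["car"] == "Volvo XC70"),
]

candidates = population
counts = []
for label, matches in facts:
    candidates = [p for p in candidates if matches(p)]
    counts.append(len(candidates))
    print(f"after '{label}': {len(candidates)} candidates remain")
```

Each seemingly harmless fact multiplies down the pool: three or four roughly independent facts are enough to go from a million people to a handful, which is the exponential (multiplicative) growth of identifiability mentioned above.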


I don't follow your argument in the last paragraph. What does the fact that children get addicted to social networks have to do with encrypted, peer-to-peer communication? It seems to me like they are two different things.

While I agree it might be healthy for children to avoid social networks (its addictive features, like algorithmic feeds), I don't think it's healthy for them to avoid any Internet communication.

That's how humans are gonna do things in the future: you plug yourself into the collective consciousness, unmediated instant communication with many, many people (of your own choice). It's a positive thing, and children need to learn how to operate in that environment. Forbidding them from communicating (for fear of predators) is backwards. It's like forbidding computers to kids because there are violent video games.


> Child grooming on the internet is a huge problem, and I suspect (again, happy to see evidence against it) it's better to allow scanning of children's communications

Assuming the scanning always happens by a well-intentioned actor, that no data is ever leaked, and that the children consent to the scanning... lots of assumptions here.

ANY human communication has the right to be private.

Imagine a proposal to install microphones on kids to record anything they say just in case. It would be outrageous, right? We focus on electronic communications just because if we want, we can break their privacy.


> Child grooming on the internet is a huge problem, and I suspect (again, happy to see evidence against it) it's better to allow scanning of children's communications, rather than ensuring their communication with whoever they are talking to is end-to-end encrypted.

How can you do this without compromising anonymity? The inevitable conclusion of this argument is that an ID check will be mandatory in order to enable E2EE messaging, otherwise any child could be instructed to enable it by a bad actor.


You should understand that you cannot have your cake and eat it too.

The logic here is very simple:

1) Any scanning of the messages of children and their parents means providing access to them, effectively removing the protection provided by the encryption.

2) If these messages are accessible to any algorithm or system, however secure, there is no way to guarantee that bad actors will be prevented from accessing them - it's only a question of time until this information is leaked.

3) If these messages are accessible to bad actors, it will enable more grooming - because they will be used to train language models to more effectively manipulate children and other people.

So while politicians can engage in their wishful and magical thinking about some wonderful technology that will allow what they call "lawful access" but at the same time will somehow prevent unlawful access by bad actors, everybody with a bit of technical knowledge understands that it is simply impossible - either it is e2e encrypted and nobody other than the communicating parties can access it, or it is not - and then anybody with resources can access it, including bad actors.

That's why they want to scan messages of everybody other than themselves: https://www.eureporter.co/business/data/mass-surveillance-da...

It was never about protecting children - it is simply about eradicating privacy as we know it, preventing any dissent from forming, and criminalising private conversations in the same way free speech is being criminalised. These ideas are not new - it's just that now they try to weaponise the "let's protect the children" narrative to achieve it.

The only way to protect children and vulnerable people online is to reduce their discoverability on online platforms, not to expose their communications to enable more efficient grooming - which would be the effect of the scanning.


>I don't think it's reasonable to make this the parent's responsibility, when basically the whole world is internet connected...

I think it is, tbqh. It's a tool. They have to be taught to wield it responsibly.

>fighting against multi-billion dollar international corporations, who dedicate huge teams of highly paid people to ensuring children get addicted to their platforms.

I dare say... I see the problem here, and it isn't one that scanning everyone's communications effectively mitigates.


In the U.S., anyone under 18 has been and can easily be prosecuted and entered into the widely abused sex offender registry - for offences involving not just other minors, but themselves. The law is designed to be strict, in a world where teens are breaking it constantly. Puritan cultures like this are not hard to find, and they act solely to punish people and abuse government authority.


> Child grooming on the internet is a huge problem

I'm happy to see strong evidence on this.


> those parents are fighting against multi-billion dollar international corporations, who dedicate huge teams of highly paid people to ensuring children get addicted to their platforms

So why not put a stop to this then?

With any luck we'll come to realise that these platforms and the people working on them are the same scum that run the tobacco industry.


I completely agree with you, we will look back on the big social media companies exactly the same way we look at big tobacco, they knew all along the damage they were causing, and spent heroic efforts covering it up to make profits.


Do you think pedos don't read the news? Of course, if privacy is dismantled by law, they will move to other platforms. They are not stupid.

The ones who will suffer are dissidents, political opponents, people who commit victimless crimes, regular people the scanners got wrong, etc.

All it would take is for one socialist or fascist to come into power, and our dear politicians will have created the perfect platform for imprisoning dissidents.

This discussion is short-sighted, naive, and so forgetful of history and the benefits of privacy that I have truly come to believe that politicians at large are the fools they appear to be.


"They are not stupid."

Some are, and those are the ones who get caught.

But those who are not, learned to be very sneaky. They have to hide their dark side ALL the time. That is a pretty intense training in terms of operational security.

And the solution to this is probably not technical, but social. People could get to know their neighbors more.

My experience with those guys is, it still shows, if you pay attention. But if you are sure your neighbor is not a pedo, then it is one more person you can trust to keep a lookout for the dangerous types. You know, classic community building stuff.


Yay, Neighborhood Watch! Wannabe police harassing citizens, that's just what we all need!


Well, if a community works well and everyone knows and supports each other, then there is no need for police - the crazy ones simply get help early. But the way things develop, neighborhood watches will likely become more and more common. (Pedos are just one motivator.)


It could be done, though, via an in-channel system analysing communication and reporting to parties outside the black box, without otherwise monitoring the conversation. If the NNaNNy detects inappropriate conversations by underage parties, it calls the police.


> who dedicate huge teams of highly paid people to ensuring children get addicted to their platforms

One word: corruption. There's no solution for it, because it's a matter of character. I've spent quite some time in school and university with people who now work for the criminal police, the police, governments, and the media, and I know that some of them, if not most, shouldn't be trusted with the safety of other people's children. I also know that some of them would absolutely sell data collected by scanning children's messages. And there are plenty of people in psychology - again, I know some - who would absolutely use said data for therapeutic experiments, marketing campaigns, and so on, just to see what can be done with people. This shit has been going on for decades in different segments of the population.

So we absolutely must establish strong encryption practices wherever we can, especially for children, whose future is still in development, who are learning to be productive and constructive, however long that may take. Even more data points for those who already fucked up with what they collected in the past is a bad idea. And it's not a matter of competence - these guys are good - it's a matter of character: these guys need to tend to different desires, but they have been "set up" to be what they are now, and they are spreading and applying that shit more and more systematically.


Please, not the "we're protecting the children".

It always starts with "we're protecting the children".

It always ends with "we're censoring the Internet".


I find the argument a bit weak, because client-side scanning is not the same as creating a central store for all sent images. I do indeed find it very concerning that flagged content will eventually be looked at by a human, and said human may be a criminal, or said human and her computers may well be the target of hackers looking to extort.
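For context, client-side scanning in its simplest form is just matching local content against a blocklist of known digests before encryption. The sketch below is a deliberately simplified toy using exact SHA-256 matches; actual proposals use perceptual hashes (e.g. PhotoDNA or NeuralHash), which also match near-duplicates and can therefore misfire - which is where the human-review concern comes in:

```python
import hashlib

# Toy sketch of client-side scanning: exact-match lookup against a set of
# known digests. Real systems use perceptual hashes, not SHA-256, so this
# is an illustration of the architecture, not of any deployed scanner.

BLOCKLIST = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def flagged(content: bytes) -> bool:
    """Runs on the client BEFORE e2e encryption. The server never sees the
    content itself - but a match triggers a report for human review."""
    return hashlib.sha256(content).hexdigest() in BLOCKLIST

assert flagged(b"known-bad-image-bytes") is True
assert flagged(b"holiday photo") is False
```

The point of contention in this thread is what happens after `flagged` returns True: the reporting path, not the matching itself, is where content leaves the user's control.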

Would be nice if this scenario would first play out for a politician that helped establish Chat Control.


> Would be nice if this scenario would first play out for a politician that helped establish Chat Control.

Unfortunately French politicians already thought about it, and politicians will be exempted from Chat Control.


Not related to encryption, but "child safety" seems to be a really popular subject for ushering in all kinds of overreaching policies.

Encrypted communications? Think of the children.

Legalizing cannabis? Think of the children.

LGBTQ+ rights? Think of the children.

Non-western immigrants? Think of the children.

Non-western social media becoming too popular? Think of the children.

etc.

Sorry if this was political, but every time I see someone use "think of the children" as their main argument, I can't help but think there are ulterior motives, and that kids (who can't voice their opinion, or won't be heard / taken seriously if they could) are just being used as pawns - because who would want to argue against the safety of kids?


Nothing new.

https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalyp...

We quickly forget our history.


> Not related to encryption, but "child safety" seems to be a really popular subject for ushering in all kinds of overreaching policies.

Approximately the same age as pretending everything is a slippery slope the other way.

Who knew: extremists are evergreen and forever doomed to not be reasonable by definition.


> LGBTQ+ rights? Think of the children.

A lot of people prefer dead kids to queer kids, unfortunately.


Amusingly, for every single political example you gave, I've heard "think of the children" from both directions.


Can you elaborate? Can't say I've ever heard someone say we should legalize cannabis so the kids can smoke it too.


I have. Particularly from very libertarian people.


Really? Today in NZ we have a political stoush where a serving politician has been chatting with young teenagers on Snapchat. While there are lots of places where perfect secrecy is important, with predatory adults around, perhaps it's not the solution for protecting kids.


You cannot flat-out say that E2E alone is enough to be maintained; you must also make it harder for others to reach you - through something done in person/screen-only (a QR code, a Bluetooth'd contact vCard, ...), something that prevents drive-by spammers/perps/malicious actors from randomly attaching to an account.

The key thing is to make distribution of the account name harder:

1. Send a 256-char hash of your account.

2. They tag your hash with a name of their own choice.
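Assuming the commenter means something like a SHA-256 digest (which is 256 bits, or 64 hex characters), step 1 could be sketched as follows - a hypothetical illustration, not any real app's API:

```python
import hashlib

def account_hash(account_id: str) -> str:
    """Hypothetical: derive an opaque identifier to hand over in person
    instead of revealing the account name itself."""
    return hashlib.sha256(account_id.encode("utf-8")).hexdigest()

h = account_hash("alice@example.chat")  # "alice@example.chat" is a made-up ID
assert len(h) == 64  # 64 hex characters for a 256-bit digest
```

One caveat worth noting: an unsalted hash of a short, guessable identifier can be brute-forced offline, so a real scheme would need a salt or a random per-contact token rather than a bare digest of the account name.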

The dissociation of name from account is maintained when used with the Signal protocol, which is what the Signal app can do.

Again, weak sauce on claiming E2E alone is essential for a child's privacy.


How do you know what's running on Signal's server?

Sidenote: Signal is based in the USA. They do not have the benefit of the doubt.


Association is revealed by User ID to anyone with access to the Signal server, but they barely learn the name or phone number associated with either end, much less ever see any actual, readable message content.

Unless it is the end-device being "accessed"/compromised, which any nation-state can do.


Why would it be impossible for Signal to decrypt and re-encrypt on their server while pretending to be "peer-to-peer encrypted", if we don't know what's running on that server? What's more: we don't even have the source of the client binary running on our phones.

Also "which any nation-state can do" is an interesting angle.

Let me put that differently: why would Signal NOT be a honey-pot operated by the CIA/NSA/other? Sounds almost too easy to not be true.


You and GP bring Signal into this, but TFA is hosted on the simplex.chat blog - the messenger famous for not having user accounts at all, so it trivially fixes the problem GP mentioned.


SimpleX shies away from user IDs, which means you are giving up something in exchange (notably, messages are not encrypted at rest on their server) - and then the government can access message contents from the SimpleX server.

It says so on their website:

"Only client devices store user profiles, contacts and groups; the messages are sent with 2-layer End-to-end encryption."

What it does NOT say is that message content remains encrypted on their server, the way Signal's does.

It is like the old time/money/quality trade-off: pick only two. In secure communication, that would be personal anonymity, personal privacy, and personal security: you can only pick two.

https://simplex.chat/#how-simplex-works


Of course it stays encrypted - if it is sent e2e encrypted, it cannot be decrypted by the servers in the middle; that's the whole point of e2e encryption.

The privacy policy is very explicit about it.

