
NCMEC has essentially shown that they have zero regard for privacy, calling privacy activists the "screeching voices of the minority". At the same time, they're at the center of a highly opaque, entrenched (often legally mandated) censorship infrastructure that can and will get accounts shut down irrecoverably, and possibly people's homes raided, on questionable data:

In one of the previous discussions, I saw claims that the NCMEC database contains a lot of harmless pictures misclassified as CSAM. This post confirms this again (ctrl-f "macaque").

It also seems like the PhotoDNA hash algorithm is problematic (to the point where it may be possible to trigger false matches).
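
To make the false-match concern concrete: perceptual hashes are, by design, tolerant of small image changes, which is exactly the property an attacker can target. PhotoDNA itself isn't public, so as a stand-in, here is a minimal average-hash ("aHash") sketch in Python with Pillow; the filenames are made up and real attacks are more involved, but the principle carries over:

    from PIL import Image

    def average_hash(path, size=8):
        # Shrink to an 8x8 grayscale thumbnail: this throws away almost
        # all detail, keeping only the coarse brightness layout.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        # One bit per pixel: brighter or darker than the mean.
        bits = "".join("1" if p > avg else "0" for p in pixels)
        return int(bits, 2)

    # Two different-looking images collide whenever their coarse
    # brightness patterns agree, so a crafted innocuous image can be
    # made to match the hash of a blocklisted one (hypothetical files).
    print(average_hash("macaque.jpg") == average_hash("crafted.jpg"))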

Now NCMEC seems to be pushing for the development of a technology that would implant an informant in every single one of our devices (mandating the inclusion of this technology is the logical next step, and seems inevitable if Apple launches this).

I'm surprised, and honestly disappointed, that the author seems to still play nice, instead of releasing the whitepaper. The NCMEC seems to have decided to position itself directly alongside other Enemies of the Internet, and while I can imagine that they also do a lot of important and good work, at this point I don't think they're salvageable and would like to see them disbanded.

Really curious how this will play out. I expect attacks either sabotaging these scanning systems by flooding them with false positives, or exploiting them to get the accounts of your enemies shut down permanently by sending them a picture of a macaque.




> I'm surprised, and honestly disappointed, that the author seems to still play nice, instead of releasing the whitepaper.

I'm the author.

I've worked with different parts of NCMEC for years. (I built the initial FotoForensics service in a few days. Before I wrote the first line of code, I was in phone calls with NCMEC about my reporting requirements.) Over time, this relationship grew. Some years, I was in face-to-face development discussions; other times it has been remote communication. To me, there are different independent parts working inside NCMEC.

The CyberTipline and their internal case staff are absolutely incredible. They see the worst of people in the media and reports that they process. They deal with victims and families. And they remain the kindest and most sincere people I've ever encountered. When possible, I will do anything needed to make their job easier.

The IT group has gone through different iterations, but they are always friendly and responsive. When I can help them, I help them.

When I interact with their legal staff, they are very polite. But I rarely interact with them directly. On occasion, they have also given me some very bad advice. (It might be good for them. But, as my own attorney pointed out, it is generally over-reaching in the requested scope.)

The upper management that I have interacted with are a pain in the ass. If it wasn't for the CyberTipline, related investigators, and the IT staff, I would have walked away (or minimized my interactions) long ago.

Why haven't I made my whitepaper about PhotoDNA public? In my view, who would it help? It would help bad guys avoid detection and help malcontents manufacture false positives. The paper won't help NCMEC, ICACs, or related law enforcement. It won't help victims.

About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact. There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children. And they rarely victimize just one child. Nearly 1 in 10 children in the US will be sexually abused before the age of 18.


> About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact.

The problem is people use this perfectly legitimate problem to justify anything. They think it's okay to surveil the entire world because children are suffering. There are no limits they won't exceed, no lines they won't cross in the name of protecting children. If you take issue, you're a "screeching minority" that's in their way and should be silenced.

It's extremely tiresome seeing "children, terrorists, drug dealers" mentioned every single time the government wants to erode some fundamental human right. They are the bogeymen of the 21st century. Children in particular are the perfect political weapon to let you get away with anything. Anyone questions you, just destroy their reputation by calling them a pedophile.


This quote sums it up perfectly:

“Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies.

The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end, for they do so with the approval of their own conscience.

They may be more likely to go to Heaven yet at the same time likelier to make a Hell of earth. This very kindness stings with intolerable insult. To be "cured" against one's will and cured of states which we may not regard as disease is to be put on a level with those who have not yet reached the age of reason or those who never will; to be classed with infants, imbeciles, and domestic animals.” (C.S. Lewis)

There are so many problems with this feature:

- it can easily be used to identify other types of "dangerous" material as well, such as warnings about government activity or posters for organized protests

- some malicious actor can send you this kind of content in order to get you in trouble

- false positives

- moral busybodying (your employer not agreeing with you going drinking on a Sunday night)

Honestly, it feels like we're about to enter a new age of surveillance. Client-side surveillance.


The consequences seem very simple - once this system is in place, it's only a matter of time until a state actor, be it China or the US or Russia or pretty much anywhere else, goes to Apple and says "hey, this hash matching algorithm you have there? If you want to keep operating in our country, you need to match <those> hashes too. No, we won't tell you what they are and what they represent, but you have to report to our state agency who had those present on the device".

Once the technology exists it will be abused.


Completely agree. I have absolutely no doubt this technology will be abused. I bet it will be used to silence dissent, opposition and wrongthink more often than to protect children.


Why doesn't anyone ever question their intent to protect children? There is no such intent. When they do protect children, these are the sorts of observations of what happens when the state has 100% control over children:

https://www.kansascity.com/news/special-reports/article23820...

Why does the state, when it treats children like this, get to proceed with "more" child protection? Are we totally mad? Any effort to protect children is always a de facto effort to deliver more children into this system, and the state system is much worse than being abused in society. Never mind that someone found there's actually more abuse in foster care than with abusive parents! So even if you assume social services is always right ... they never protect kids against social services itself. You can therefore reasonably assume that a kid who is getting abused will be forced into a worse situation by state "help".

I mean, if you want to protect children, obviously the first step is to fix the child protection system that everybody knows is badly broken (you constantly hear stories about schools sabotaging investigations into abusive parents to protect the child against social services, covering for the child, lying about attendance or even wounds, ..., because they know social services will be much worse for the child).

There's states where the corrections department provides better care (and definitely better educational instruction) for children than social services do. And yes: that is most definitely NOT because the corrections department provides quality instruction. It's just MUCH worse with social services.

They cover themselves by showing the worst possible situations in society "demonstrating the need to intervene". And you won't hear them talk about how they treat children ... because frankly everyone knows.

Reality is that research keeps finding the same thing: a child that gets abused at home ... is being treated better (gets a better future, better schools, better treatment, yes really, more to read, ...) than children "protected" from abuse by the state:

https://www.aeaweb.org/articles?id=10.1257/aer.97.5.1583

(and that's ignoring that reasons for placement are almost never that the child is unsafe. The number 1 reason is non-cooperation with mental health and/or social care by either one of the parents or the child themselves. There is not even an allegation that the child is unsafe and needs protection. Proof is never provided, because for child protection there is no required standard of proof in law)

And we're to believe that people who refuse to fix child services ... want extra power "to protect children"? How is this reasonable at all? They have proven they have no interest whatsoever in protecting children, as the cost cutting proves time and time again.


To add to this: nothing else is to be expected from a country that—under protection of its legal system!—tortures mentally ill and disabled children.

https://en.wikipedia.org/wiki/Judge_Rotenberg_Educational_Ce...

https://www.independent.co.uk/news/world/americas/massachuse...


I am supportive of Apple’s new child protection policies. I spent two years helping to build some of the tools used for scanning and reporting content (EDIT: not at Apple). There are a few sentiments in this thread that I'd like to address.

> Yeah, and none of these assholes (pardon the language) is willing to spend a penny to provide for the millions of children suffering from poverty: free education, food, health care, parental support. Nothing, zero, zilch. And these millions are suffering right now, this second, with life-long physical and psychological traumas that will perpetuate this poverty spiral forever.

Many people in child safety and I, personally, strongly support policies that enhance the lives of children, including improved education, food, access to health services, and support for parents. To your point, though, it's also true that many political leaders who vocally address the issue of child sexual abuse on the internet also happen to be opposed to the policies I would support. Like most political alliances, it is an uncomfortable one.

> Anyone questions you, just destroy their reputation by calling them a pedophile.

I see this sentiment a lot. I have worked with people across law enforcement, survivors, survivor advocates, NGOs, social workers, private companies, etc. and I don't know anyone who responds this way to people raising privacy concerns. At worst, they might, rightly or wrongly, consider you alarmist or uninformed or privileged (in that you don't have images of your abuse being actively traded on the internet). But a pedophile? I just can't imagine anyone I've worked with in this space accusing you or even thinking of you as a potential pedophile just because you're opposed to content scanning or want E2EE enabled, etc. I suppose maybe someone far, far removed from actual front line workers would say something so ridiculous.

---

Separately, I want to suggest that there are paths forward here that include controls to reduce the risk that this technology gets extended beyond its initial purpose. Maybe NCMEC could provide verifiable digests of the lists used on your device, so you can verify that additional things haven't been added. Or there could be public, concrete, transparent criteria for how and when a lead is referred for law enforcement action. By designing this system so that the matching occurs on device against a list that can be made available to the user, Apple has made content scanning far more privacy-preserving and also created avenues by which it could be further regulated and made transparent. I'm very excited about it and, honestly, I think even the staunchest privacy advocates should be cautiously optimistic because it is, in my opinion, a step in the direction of less visibility into user data.
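
To sketch what such a verifiable digest could look like (purely hypothetical; neither Apple nor NCMEC has announced anything like this): the list provider publishes a single digest over the canonical hash list, and anyone holding the list shipped to devices can recompute it and detect silent additions. A Merkle tree would additionally allow membership proofs without shipping the whole list, but the core idea fits in a few lines:

    import hashlib

    def list_digest(image_hashes):
        # Canonical form: sort and join the entries so the digest does
        # not depend on delivery order.
        blob = b"\n".join(sorted(image_hashes))
        return hashlib.sha256(blob).hexdigest()

    # The provider publishes the digest of list L; a device or auditor
    # holding L recomputes and compares. Any silently added entry
    # changes the digest. (Example entries are made up.)
    published = list_digest([b"hash-a", b"hash-b"])
    on_device = list_digest([b"hash-b", b"hash-a"])
    assert on_device == published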

I think that privacy advocates are arguing in good faith for protecting society from a surveillance state. I think that advocates of scanning are arguing in good faith for protecting children. I also think that both sides are using language (screeching on the NCMEC side, comparisons to hostile regimes on the other) that makes it very hard to move forward. This isn't pedophile privacy advocates vs. surveillance-state NCMEC. Neither of those groups even exists. It's concerned citizens wanting freedom for all people vs. concerned citizens wanting freedom for all people.


HN is the only place I feel safe enough to post these pro-privacy opinions. In many other communities, I've seen people being accused of serious crimes for daring to take issue with whatever surveillance the authorities are trying to implement. I've seen people openly speculating about the CSAM stashes of encryption users. What else could they possibly be hiding, right?

I don't think I trust actual authorities either.


Given NCMEC's continuing attacks against consenting adult sex workers, that they never seem to regard adult sexual identities as valid under any circumstances, that they repeatedly retweet groups seeking to abolish adult sexual expression, their actions around Backpage, their lying to Congress about the level of abuse that occurs, and the recent statements by their staff, I find it kind of baffling that anyone would defend their leadership at this point.

NCMEC are willing to completely compromise their mission in order to chase other moral crusades that are unrelated to children, and seem to never care about the consequences of any of the things they call for.

I don't trust them, and they've earned that.


> Many people in child safety and I, personally, strongly support policies that enhance the lives of children, including improved education, food, access to health services, and support for parents. To your point, though, it's also true that ...

You work with people who, almost always against the will of children and parents, kidnap children (sorry, but search YouTube for footage of their actions: kidnap is the only correct term) ... then proceed to NOT care for those children and to destroy their lives. Obviously the only correct conclusion is that this is entirely immoral until the care system is fixed, because they are not improving the lives of children, and their complete lack of caring about this tells you what their goals aren't.

Watch "Short Term 12" for what facilities these children are provided with. Stop pretending that helping these people acquire children (because that's exactly what you're doing) helps a single child. It doesn't. Terrible, abusive, violent, parents take better care of children than these places do. The moral reaction should be to do what most of society, thankfully, does: sabotage the efforts of social workers.

And if you're unwilling to accept this, as soon as this whole covid thing dies down a bit, find the nearest children's facility and visit. Make sure to talk to the children. You will find you're making a big mistake.

Whatever you tell yourself, please don't believe that you're helping children. You're not. You're helping to destroy their lives.

> I see this sentiment a lot. I have worked with people across law enforcement, survivors, survivor advocates, NGOs, social workers, private companies, etc. and I don't know anyone who responds this way to people raising privacy concerns.

I'm adding this response to the other reply to your comment ... which makes two people who disagree with your assessment: you are likely to be threatened in many places. One might even say it's likely we're two people who have been threatened.

I would like to add that, either as a child or an adult, child protection authorities, the people you help, will threaten you, and I've never known a single exception, if they think (correctly or otherwise) that you're hiding something from them. That's if you're at their mercy. This is why every child in child services makes sure to commit a despicable/violent/minor criminal act or two every month and keep it secret: if you don't have something to confess that won't get you sent to a "secure facility", you will be horribly punished at some unexpected random time. That's how these people work.

And over time you may learn that, for a kid in the system, a whole host of places are traps: the hospital, child services itself, the police, homeless shelters, school. Every person in one of those places will be shown your "history" as soon as you mention your name, and if they report you, very bad things will happen. Some children explicitly make bad things happen (e.g. commit violent theft) because they'll get sent to "juvie" and the stress of suddenly getting punished out of nowhere finally disappears. Some also believe you get better schooling in juvie. So you hide, even if you're hurt or sick. This is why some of those kids respond angrily or even violently if someone suggests they should see a doctor for whatever reason; sometimes it's literally to make sure the option of suicide remains open (which is difficult in "secure" care). And why does this happen? NOT to protect children: to protect themselves, their "reputation", and their peace of mind against these children.

This, of course, you will never hear your new friends mention as needing fixing. They are in fact fighting with this side of the system over money, so that EVEN LESS money goes to the kids the system takes care of. That, too, you will never hear from them. They are doing the opposite of what someone who means well by disadvantaged children would do.

I have zero doubts they will use your work, not to convict offenders, because that's hard, years of very difficult work, but to bring more kids into a system that destroys children's lives, and MORE so than the worst of parents do. Because throwing children into this system (and destroying their lives) is very easy. I'm sure occasionally they will convict an offender (which, incidentally, doesn't help the kids) or get a good outcome for a child. It happens. It is absolutely not common.

And not to worry, they will put great emphasis on this statement: "I've never seen anyone in the system who didn't mean well." Most are idiots, by the way; if you keep digging at their motivations, they will reveal themselves very clearly quite quickly.

These "people in child safety" DO NOT mean well with children. They merely hate a portion of society (which includes those victimized children) and want to see them punished HARD. If you are a moral person, volunteer at a nearby children's shelter and show understanding when the inevitable happens and you get taken advantage of (do not worry, you will learn). DO NOT HELP THESE PEOPLE GET MORE CHILDREN.


Meta-question: is there a way to get rid of such "hot air" comments from throwaway accounts?

This comment does not add any argument in favor of its point, despite its length.


And I think if you're working on these systems, it's all too easy to only think about the happy scenario.

But time has shown that an initially good concept will transform into something worse.

We have a very recent example with Corona contact-tracing apps, which law enforcement in multiple democratic countries is now using for other purposes.

So no, we should not allow this client-side scanning to go through, it will not end well.


This is what will happen, it's obvious to anyone not emotionally invested in the propaganda theater driving all social interaction.


This is such a stupid fallacy, and it comes up every time something like this is discussed. You don't know if anything is or will be abused; you expect it because you expect your elected government to abuse it. The problem here is with you or your electors, not with the technology itself. You want to fix the symptoms but not the problem. As usual.


How so? The UK's Snooper's Charter, which compels ISPs to save your entire browsing history for a year, was only meant to be used to catch pedophiles and terrorists, and now 17 different agencies have been given completely warrantless access to the database, including the UK's Food Standards Agency(??!?!?!?). People have already been fired for abusing access, so that definitely happens too.

>>The problem here is with you or your electors, not with the technology itself

No offense, but this is such a shitty answer, and it's always made by apologists for these incredibly invasive technologies. Like, please explain to me, in simple terms, how can I, an immigrant who doesn't even have the ability to vote, vote to make an American corporation do or not do something? I'm super curious.


It's the exact same shit I was talking about: instead of fixing your governing body, you want to fix it with technology. All the crypto apologists are the same way: if the government YOU are electing does something stupid, you want to fix the symptom and not the governing body. You are just moving goalposts instead of working on the real problem.


>government YOU are electing does something stupid

Is America the entire world to you? What should a Chinese citizen do? What should a Saudi citizen do? What should a Russian citizen do? Even if you ignore the fact that the chance of fascism in America is not zero, why should Apple make it easier for totalitarian regimes to spy on their citizens? Or do you expect a Saudi person to "just move to America"?


I am talking to people on this website, which is banned in 3 of the 4 countries you are talking about. Stop moving your shitty goalposts; we are talking about the US, Europe, and other democracies. You know what would help people in those regimes? Governing bodies that stand up for them in other countries... guess who could change that.


I'm not moving the goalposts. One, I have empathy for people other than myself, and two, I am not deluded enough to think that fascism will never return to the West. If you don't care about citizens in other countries, that's on you.

>You know what would help people in regimes? Governing bodies that stand up for them in other countries... guess who could change that.

What would also help is if Apple didn't build tools for those regimes to suppress opposing political bodies.


>>if the government YOU are electing

Except like I clearly said above, I'm an immigrant without the right to vote, so I'm not electing shit. Again, how exactly am I supposed to vote my way out of this?


[flagged]


>> you can be political active in other ways than direct votes.

Which is what we're doing here: participating in protests, complaining about this technology to agencies and governing bodies, as well as to Apple itself. Or does this not meet your definition of "politically active"?

>>You can leave the country

I'm envious of your position in life where this is the first thing that springs to your mind, well done you.

>> Don't wiggle your way out of your extremist position that you can't do anything except building technology which would be totally obsolete if you fixed your political problem.

Uhm....are you sure you have the right argument there? Or maybe replying to the wrong person?


> you can be political active in other ways than direct votes

Absolutely. That's why we're posting our thoughts here and attempting to convince others.


Civil disobedience. If we think a law is unjust, it is our duty to disobey and undermine it. Technology is a tool that allows us to do exactly that.


> You don't know if anything is or will be abused

Actually, I do. Governments abuse their powers all the time. They have done it before, they are doing it right now, and they will continue to do it in the future. This is not a fallacy; it is a fact.

Here's an example:

https://en.wikipedia.org/wiki/LOVEINT

The only solution is to take their power away. No way to abuse power they don't have. We must make it technologically impossible for them to spy on us.


That's the exact same fallacy: you are proposing to fix a symptom and not the problem. You want to rein in the government YOU ARE ELECTING TO GOVERN you with technology instead of real political work. Either vote differently or get into politics and fix it yourself.

Your Wikipedia link doesn't show anything regarding abuse by the governing body; ALL of the examples are from private persons.


>>Your Wikipedia link doesn't show anything regarding abuse by the governing body; ALL of the examples are from private persons.

Have you even like....read the page they linked?

"Siobhan Gorman (2013-08-23). "NSA Officers Spy on Love Interests". Washington Wire. The Wallstreet Journal. "

Are NSA officers "private persons" now? They are government employees; they were abusing the power they were given while employed by the government. It doesn't matter in the slightest whether they were abusing the power for private or state gain: it's a state agency and its employees abusing the access, and that implicitly makes it the state abusing the power it has.


Wow, if you can't distinguish between rogue agents and institutional abuse of power, there is nothing left to argue.


If you really think that an NSA employee abusing their access is just a "private person" then yeah, I guess there is nothing left to argue. I guess it must be nice sleeping well at night not worrying about this stuff, right?


I didn't elect anyone. There is not a single politician in power right now who represents me. I'm not even American to begin with, so it's not like I have any influence over the American administration.

In any case, there's no reason why politics and democracy ought to be the only way to bring about change. We have a far more powerful tool: technology.

Governments make their laws. People make technology that neutralizes their laws. They make new laws. People make new technology. And so on. With every iteration, the government must become ever more tyrannical in order to maintain the same level of control over the population it previously enjoyed. If this loop continues long enough, we'll either end up with an uncontrollable population or with an oppressive totalitarian state. Hopefully limits will be found along the way.

> Your Wikipedia link doesn't show anything regarding abuse by the governing body; ALL of the examples are from private persons.

A government employee abused his access to the USA's warrantless surveillance apparatus in order to spy on his loved ones. If this isn't abuse of power, I don't know what is.

Honestly, it's just human nature. No person should ever be given such powers to begin with. I wouldn't trust myself with such power. It should be impossible to spy on everyone.


Any modern state implements separation of powers, trias politica in most cases. Your argument ignores that. You also haven't made clear what fallacy you mean.

Want to fix child abuse? Fund teachers and child care. Apple cannot help those kids, and I don't mean that as an indictment of the company.


The thing is tools like this will only make it harder to fix the problem.


It's not just idle speculation: history speaks for itself.


Yeah, and none of these assholes (pardon the language) is willing to spend a penny to provide for the millions of children suffering from poverty: free education, food, health care, parental support. Nothing, zero, zilch. And these millions are suffering right now, this second, with life-long physical and psychological traumas that will perpetuate this poverty spiral forever.


Thank you! This is a really important observation about anything controversial that is "child"-related. Be it abortion or abuse, you can tell what people's real priorities are by looking at how much they are willing to spend on the welfare of kids outside their immediate point of concern.

All of these types of problems are likely better solved in some other ways. Why not have general mental health coaching in schools freely available to increase the chance of early detection of abusive behaviors? Why not improve the financial situation of parents so that the child is not perceived to be another burden in a brutal life? Why not offer low-friction free mental health coaching to all individuals?

The way the world is organized now is ABSURD to the highest degrees! We pretend to care about a thing but only look at issues narrowly without considering the big picture.

In the end, politics and the way power is distributed are broken. There are too many entrenched interests looking out for their narrow self-interest. There doesn't seem to be enough vision to unite enough power for substantial change. Even the SDGs don't seem to inspire the developed nations as something to mobilize for. We need the fervor of war efforts for positive change.

Doing the right thing and doing things right needs to become the very essence of what we as individuals stand for. Not consuming goods or furthering the ideological agenda of a select few. Each and every one of us should look into themselves and look into the world and work to build a monument of this collective existence. We were here. Let us explain to all following us how we tried our best to do the right things in the right way and where we fell short of our ambitions, and hope that others might improve upon our work.

Sorry, I think this turned into a rant but it comes from the heart.


> Be it abortion or abuse you can tell what people's real priorities are.

There is also (almost always) a matching unhealthy support for an enormous military with nuclear weapons whose sole purpose, when push comes to shove, is to indiscriminately murder those same babies once they have grown up.

It's a symptom of a deeper mental pathology.


The thing with war, and especially with nuclear weapons, is that it does not wait for the babies to grow up. It might have gotten more precise at avoiding direct hits on babies, as they make for bad PR, but that still happens a lot, and mothers with their babies on the run in a burning city are not much better off either.


That would require demurrage, which is never going to happen. I'll be honest: it's not demurrage itself that we need; it's just much easier to implement. What we need is a linkage between the lifetime of debt and money. Defaulting on debt should destroy money, i.e. money is only valid as long as the contract (the debt) that created it is valid. Given a sufficiently advanced electronic currency, you could track the expiry of every single dollar. The lender would then decide whether he wants to extend the expiry of the dollar and thereby extend the due date of the debt.

The fundamental problem with short-term thinking (positive time preference) is that people want to "flee" into fictional wealth. Rather than build a long-lasting monument (companies like Blue Origin count as monuments), they prefer to increase a number on a balance sheet in the form of a bank account. What makes fictional wealth so attractive? As mentioned above, money is just the other side of a debt contract. By withholding your money you extend the debt; in other words, a lot of people promise to work for you. Having idle servants is the entire reason behind accumulating money. If you truly wanted to achieve full employment, then all money earned must be spent eventually; to be more specific, all debts must be fulfilled. It would mean that your wealth cannot exist in the form of forced coercion of other people. Employees would still come and work in your companies, but they would leave each day with a fair share of the wealth they helped create. All your wealth would have to be long-lasting, and the environment, which is the longest-lasting form of prosperity, would be considered part of your wealth.

Ancient Egypt had something closer to a "grain standard", meaning that farmers deposited grain in a storehouse where they received something akin to a coupon which they could trade in for grain. Their money represented a claim to a spoiling good! The horror! The complete antithesis of the gold standard. Just imagine what a backwards society that must have been! Of course, the truth is everyone remembers ancient Egypt as an advanced civilization for its time.


No rant at all, you make very good points. Thanks for sharing


Thank you for this argument; I will put it in my canon of responses to people who are pro-spyware.


this so much this...


OK, I'll forfeit 100% of my rights to eradicate CP from my country. Then let's say these measures are effective and CP is removed in all but name from the country. Now what do we do with the thousands of people and billions of dollars being funneled into the effort? Well, history shows us the answer. None of those people or authorities are suddenly going to say, well, that's a wrap, lay us off and put the budget to a more meaningful use. Nope, instead they double down and start looking for other "infringements" to justify their funding and their jobs, ever increasing the pettiness of the trespass, until a powerful sociopath realizes they can repurpose this agency and its resources for their own gain, to remove opponents and silence opposition.

Perhaps even worse, they move on to some ML-based detection that uploads suspected CP that is not in the database. Most people have no idea how badly a false accusation can destroy their lives. Even if you are 100% innocent, there is already a formal government case against you. At the very least you will be compelled to hire a very expensive attorney to handle the matter for you; after all, you can't risk a budget attorney with a huge caseload handling this potential life-sentence case. Not to mention that during this time you may be locked out of accounts and finances, put on leave from employment, barred from being hired, banned from assistance programs. Perhaps even named in the media as a sex offender.


> someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact.

What you are saying here kind of sidetracks the actual problem those critics have though, doesn't it? The problem is not the acknowledgement of how bad child abuse is. The problem is whether we can trust the people who claim that everything they do, they do it for the children. The problem is damaged trust.

And I think your last paragraph illustrates why child abuse is such an effective excuse: if someone criticizes your plan, just deflect and go into how bad child abuse really is.

I'm not accusing you of anything here by the way, I like the article and your insight. I just see a huge danger in the critics and those who actually want to help with this problem being pitted against each other by people who see the topic of child abuse as nothing but a convenient carrier for their goals, the ones that have actually heavily damaged a lot of trust.


What are the main instances where these claims were made fraudulently?


One example, quoted from EFF's post "Apple's Plan to 'Think Different' About Encryption Opens a Backdoor to Your Private Life" [1] on the topic:

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content [2] that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT) [3], is troublingly without external oversight, despite calls from civil society [4]. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content [5] as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.

[1]: https://www.eff.org/deeplinks/2021/08/apples-plan-think-diff...

[2]: https://www.eff.org/deeplinks/2020/08/one-database-rule-them...

[3]: https://gifct.org/

[4]: https://cdt.org/insights/human-rights-ngos-in-coalition-lett...

[5]: https://www.eff.org/wp/caught-net-impact-extremist-speech-re...


Thanks, I think I see what you mean. Essentially another organisation has used file hashes to scan for extremist material, by their definition of extreme.

I agree that has potential for abuse, but it doesn't seem to explain what the actual link is to NCMEC. It just says "One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed..." but doesn't name this "technology". Is it talking about PhotoDNA?


I'm not the one you replied to, but I have some relevant thoughts on the process. I value my fundamental right to privacy. I understand that technology can be used to stop bad things from happening if we accept a little less privacy.

I am okay with giving up some privacy under certain conditions that are directly related to democracy. In essence, for a database and tech of any kind that systematically violates privacy, we need the power distributed and "fair" trials, i.e. the legislative, executive, and judiciary branches:

* Only the legislative branch, the politicians, should be able to enable violation of privacy, i.e. by law. Then and only then can companies be allowed to do it.

* The executive branch has oversight of the process and tech; they would in essence be responsible for saying go/no-go to a specific implementation. They would also be responsible for performing a yearly review, all according to the law. This also includes the police.

* The judiciary branch, the justice system, is responsible for looking at "positive" hits and granting the executive branch case-by-case powers to use the information.

If we miss any of those three, I am not okay with systematic violation of privacy.


Here in France, law enforcement & probably some intelligence agencies used to monitor P2P networks for child pornography & content terrorists like to share with one another.

Now we have the expensive and pretty much useless "HADOPI" paying a private company to do the same for copyright infringement.

Ironically enough, it seems the interior and defence ministries cried out when our lawmakers decided to expand it to copyright infringement at the request of copyright holders. They were afraid some geeks, either out of principle or simply to keep torrenting movies, would democratise already existing means of hiding oneself online, and create new ones.

Today, everyone knows to look for a VPN or a seedbox. Some even accept payments in crypto or gift cards.

¯\_(ツ)_/¯


Thanks for pointing that out, I have updated my comment to provide the links from the quote, which were hyperlinks in the EFF post.

> it doesn't seem to explain what the actual link is to NCMEC

The problem I see is the focus that is put onto the CSAM database. I quote from Apple's FAQ on the topic [1]:

Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM

and

Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations.

Which is already dishonest, in my opinion. Here lie the main problems, and my reasons for finding the implementation highly problematic, derived from how I personally understand things:

- Nothing would actually prevent Apple from adding different database sources. The only thing the "it's only CSAM" part hinges on is Apple choosing to only use image hashes provided by NCMEC. It's not a system built "specifically for CSAM images provided by NCMEC"; it's ultimately a system to scan for arbitrary image hashes, and Apple chooses to limit those to one specific source, with the promise to keep the usage limited to that.

- The second large attack vector comes from outside: what if countries decide to expand their official CSAM databases to additional uses? Let's say Apple actually means it and upholds its promise: there isn't anything Apple can do to guarantee that the scope stays limited to child abuse material, since they don't control the sources. I find it hard to believe that certain figures are not already rubbing their hands in a "just think about all the possibilities" kind of way.

In short: The limited scope only rests on two promises: That Apple won't expand it and that the source won't be merged with other topics (like terrorism) in the future.

The red flag for me here is that Apple acts as if there was some technological factor that ties this system only to CSAM material.
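
To illustrate that point: at its core, hash matching is just set membership, completely agnostic to what the hashes denote. This is a toy sketch, not Apple's actual design (which, per their technical summary, wraps the match in NeuralHash and private set intersection), but the policy point is the same:

    # Toy matcher: nothing in the mechanism knows or cares what the
    # hashes represent. Only the provenance of the list decides whether
    # this scans for CSAM, "terrorism", or protest posters.
    blocklist = set()

    def load_list(hashes):
        # Called with NCMEC's hashes today; nothing technical prevents
        # a second call with some government's hashes tomorrow.
        blocklist.update(hashes)

    def flag(image_hash):
        return image_hash in blocklist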

Oh, and of course there is the fact that the fine people at Apple think (or at least agree) that the Electronic Frontier Foundation, the Center for Democracy and Technology, the Open Privacy Research Center, Johns Hopkins, Harvard's Cyberlaw Clinic, and more are "the screeching voices of the minority". Doesn't quite inspire confidence.

[1]: https://www.apple.com/child-safety/pdf/Expanded_Protections_...



Snoopers Charter in the UK springs to mind.


The Snooper's Charter was always fairly broad, though. It was never about child abuse; it covers collecting phone records etc., which are used in most police investigations. Unless you are referring to some particular aspect of it I'm not aware of?

I understand very broad stuff might be sold as "helping to fight child abuse" while avoiding talking about the full scope, but that isn't what Apple/NCMEC are doing here. They are making quite explicit claims about the scope.

That's why I'm wondering if there is precedent for claims about "we will ONLY use this for CSAM" which were later backtracked.


>About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact. There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children. And they rarely victimize just one child. Nearly 1 in 10 children in the US will be sexually abused before the age of 18.

I think we have seen "think of the kids" used as an excuse for so many things over the years that the pendulum has now swung so far that some of the tech community has begun to think we should do absolutely nothing about this problem. I have even seen people on HN in the last week who are so upset by the privacy implications of this that they start arguing that these images of abuse should be legal, since cracking down on them is used as a motive to invade people's privacy.

I don't know what the way forward is here, but we really shouldn't lose sight that there are real kids being hurt in all this. That is incredibly motivating for a lot of people. Too often the tech community's response is that the intangible concept of privacy is more important than the tangible issue of child abuse. That isn't going to be a winning argument among mainstream audiences. We need something better or it is only a matter of time until these type of systems are implemented everywhere.


> Too often the tech community's response is that the intangible concept of privacy is more important than the tangible issue of child abuse.

I think the actual idea, the nuance of which is often lost (and let's be honest, some people aren't really aware of it and are just jumping on the bandwagon), is that privacy is a right, and erosion of rights is extremely important because it has been shown many times in the past to have far-reaching and poorly understood future effects.

It's unfortunate that some would choose to link their opposition to the idea with making the thing it's attempting to combat legal, when that thing is so abhorrent, but in general I think pushing back on any erosion of our rights is a good and useful way for people to dedicate time and effort.

While we shouldn't lose sight that there are real children in danger and that this should help, we also shouldn't lose sight that there is plenty of precedence that this could eventually lead to danger and problems for a great many other people, children included.


>I think the actual idea, the nuance of which is often lost (and let's be honest, some people aren't really aware of it and are just jumping on the bandwagon), is that privacy is a right, and erosion of rights is extremely important because it has been shown many times in the past to have far-reaching and poorly understood future effects.

The encryption everywhere mindset of a lot of the tech community is changing the nature of the right to privacy. 50 years ago all these images would have been physical objects. They could have been found by the person developing the photos or the person making copies. They could have been found with a warrant. Now they are all hidden behind E2EE and a secure enclave. To a certain extent the proliferation of technology means people can have an even stronger degree of privacy today than was practical in the past. It is only natural for people to wonder if that shift has changed where the line should be.


In the past these people would have been developing the pictures themselves, and handing them between each other either personally or in some hidden manner using public systems.

Not only has communication become easier, so has surveillance. The only difference now is that it's easier for people not to be aware when their very personal privacy is invaded, and that it can be done to the whole populace at once.


>Not only has communication become easier, so has surveillance.

It is a devil's advocate response, but can you explain why this is a bad thing? If communication is scaling and becoming easier, why shouldn't surveillance?


1. We judge these programs under the wrong assumption that they are run by the good guys. Any system that depends on the goodness of the people running it is dangerous, because it can be taken over by the bad guys.

2. I am a law-abiding, innocent citizen. Why should I have to face the same compromised privacy as a criminal? It used to be that only people under suspicion were surveilled, and a judge had to grant this based on preliminary evidence. Now everyone is surveilled. How long until people who sidestep surveillance (i.e. use open source systems that don't implement this stuff) fall under suspicion just because they are not surveilled? How long until it's "guilty until proven otherwise"?

In my opinion it is absolutely fair and necessary to scan images that are uploaded to the cloud in a way that makes them shareable. But never the stuff I have on my device. And the scanning system has to be transparent.


>We judge these programs under the wrong assumption that they are run by the good guys

Speaking as someone who as a boy was tortured and trafficked by operatives of the Central Intelligence Agency of the United States of America, I am surprised how little appreciation there is for the standard role of misdirection in the espionage playbook.

It is inevitable that all these ostensibly well-intended investigators will have their ranks and their leadership infiltrated by the intelligence agencies of major powers, all of whom invest in child trafficking. What better cover?


"Wont somebody think of the (white, wealthy, documented) chiiiiilren!"

:sigh:


> We judge these programs under the wrong assumption that they are run by the good guys. Any system that depends on the goodness of the people running it is dangerous, because it can be taken over by the bad guys.

And sometimes, the "good guys" in their attempts to be as good as they can imagine being, turn into the sort of person who'll look a supreme court judge or Congresspeople or oversight committees in the eye and claim "It's not 'surveillance' until a human looks at it." after having built PRISM "to gather and store enormous quantities of users’ communications held by internet companies such as Google, Apple, Microsoft, and Facebook."

https://www.hrw.org/news/2017/09/14/q-us-warrantless-surveil...

Sure, we copied and stored your email archive and contact lists and instant messenger history. But we weren't "surveilling you" unless we _looked_ at it!

Who are these "good guys"? The intelligence community? The executive branch? The judicial branch? Because they're _all_ complicit in that piece of deceit about your privacy and rights.


Surveillance does have its place, but a large part of the problem with the new technology of surveillance is the people passing the laws on surveillance don't understand the technology that surrounds it.

Take, for instance, the collection of metadata that is now so freely swept up by the American government without a warrant. This includes the people involved in a communication, the method of communication, the time and duration of the communication, and the locations of those involved. All of that metadata can be collected without a warrant by national agencies for "national security", on a massive scale, under PRISM.

Now, this is non-targeted national surveillance: fishing for a "bad guy" with enhanced surveillance capabilities. This doesn't necessarily seem like a good thing. It seems like a lazy thing, and a thing that was ruled constitutional by people who chose not to understand the technology because that was easier than thinking downstream about the implications.


> is the people passing the laws on surveillance don't understand the technology that surrounds it.

And that the people running the surveillance have a rich track record of lying to the people who are considering whether to pass the laws they're proposing.

"Oh no, we would _NEVER_ surveil American citizens using these capabilities!"

"Oh yeah, except for all the mistakes we make."

"No - that's not 'surveillance', it's only metadata, not data. All we did was bulk collect call records of every American, we didn't 'surveil" them."

"Yeah, PRISM collects well over 80% of all email sent by Americans, but we only _read_ it if it matches a search we do across it. It's not 'surveillance'."

But they've stopped doing all that, right? And they totally haven't just shared that same work out amongst their Five Eyes counterparts so that what each of them is doing is legal in their jurisdiction, even though there are strong laws preventing each of them from doing it domestically.

And how would we even know? Without Snowden we wouldn't know most of what we know about what they've been doing. And look at the thanks he got for that...


> It is a devil's advocate response, but can you explain why this is a bad thing?

I didn't state it as a bad thing, I stated it as a counter to your argument that encrypted communication is more common, and therefore maybe we should assess whether additional capabilities are warranted. Those capabilities already exist, and expanded before the increased encrypted communication (they happened during the analog phone line era). I would hazard that increasingly encrypted communication is really just people responding to that change in the status quo (or overreach, depending on how you view it) brought about by the major powers (whether they be governmental or corporate).


Why shouldn't every Neuralink come with a mandated FBI module preinstalled? Where, if anywhere, is private life to remain, where citizens think and communicate freely? Is Xinjiang just the start for the whole world?


Surely there is communication surrounding the child sexual abuse, making plans with other pedophiles and discussing strategies. Some of this may occur over text message. Maybe Apple’s next iteration of this technology can use my $1000 phone that I bought and that I own to surveil my E2EE private encrypted messages on my phone and generate a safety voucher for anything I say? The sky’s the limit!


Good point: neural nets are getting better at natural language every year.


Do you see how this underlines my original point? I brought up real child abuse that is happening today and your response is about brain implants being mandated by the government. Most people are going prioritize the tangible problem over worrying about your sci-fi dystopia.


A plausible story of how our rights are in the way is always ready to hand. If we can't at some point draw a clear line and say no, that's it, stop -- then we have no rights. It's chisel, chisel, chisel, year after year, decade after decade.

In America one of those lines is that your personal papers are private. Get a warrant. I don't have to justify this stand. I might choose to explain why it's a good stand, or I might not; it's on you to persuade us.


>In America one of those lines is that your personal papers are private. Get a warrant. I don't have to justify this stand. I might choose to explain why it's a good stand, or I might not; it's on you to persuade us.

Part of the problem is that these devices are encrypted so a warrant doesn't work on them. That is a big enough change that maybe people need to debate if the line is still in the right place.


That is a change worth considering, though it must be treated at the level of rights, not just a case-by-case utility calculus. At the same time, most other changes have been towards more surveillance and control: cameras everywhere, even in the sky; ubiquitous location tracking; rapidly improving AI to scale up these capabilities beyond what teams of humans could monitor; tracking of most payments; mass warrantless surveillance by spy agencies; God knows what else now, many years after the Snowden leaks. This talk you hear about the population "going dark" is... selective.

I think my vehemence last night might've obscured the point I wanted to make: what a right is supposed to be is a principle that overrides case-by-case utility analysis. I would agree that everything is open to questioning, including the right to privacy -- but as I see it, if you ask what's the object-level balance of utilities with respect to this particular proposal, explicitly dropping that larger context of privacy as a right (which was not arrived at for no reason) and denigrate that concern as science fiction, as a slippery-slope fallacy -- then that's a debate that should be rejected on its premise.


My memories and thoughts are also warrant-proof. We just accept it as a limitation.


Twenty years ago, a device inspecting the personal data of 60% of citizens 24/7 against samples remotely provided by the government was sci-fi dystopia.


To whom do you give the keys?


Post your passwords.


That's too short a thought, it detracts from the point.

It is bad to post passwords, not just because you lose privacy, but because you'd lose control of important accounts. Asking people to post their passwords is not reasonable.

I think you might have a point you're trying to make, but please spell it out fully.


No, actually, I kind of like it. It boils down the essence of the problem to its core point: trust.


Post your home address and phone number.


Right, 50 years ago you needed a warrant. Now with Apple’s change you don’t. That’s not progress.


> Too often the tech community's response is that the intangible concept of privacy is more important than the tangible issue of child abuse.

Is it intangible? 18% of the world lives in China alone. That's more people than the "1/10 who are victims of child abuse*", and I'm sure that 18% will only grow as other authoritarian countries get more technologically advanced.

I think "Think of the kids" applies very well to the CREATORS of pornography. Per wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.

* Per a Google search, "A Bureau of Justice Statistics report shows 1.6 % (sixteen out of one thousand) of children between the ages of 12-17 were victims of rape/sexual assault", which is a lot less than the 10% figure you're citing. Non-sexual abuse wouldn't really have any bearing here, right?


>Is it intangible? 18% of the world lives in China alone. That's more people than the "1/10 who are victims of child abuse*", and I'm sure that 18% will only grow as other authoritarian countries get more technologically advanced.

You didn't mention any tangible results here. How would this system by Apple make my life worse? Can you answer that without a slippery slope argument?

>I think "Think of the kids" applies very well to the CREATORS of pornography. Per wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.

Why does the causality matter? A correlation is enough that cracking down on this content will result in fewer abusers on the streets.

>* Per a google search "A Bureau of Justice Statistics report shows 1.6 % (sixteen out of one thousand) of children between the ages of 12-17 were victims of rape/sexual assault" which is a lot less than 10% figure you're citing. Non-sexual abuse wouldn't really have any bearing here, right?

I wasn't the one citing that, but you are also citing an incomplete number since it excludes younger children.


> Can you answer that without a slippery slope argument?

So far, any defense of this whole fiasco can be boiled down to what you are trying to imply in part: you say "The possibility of abusing this system is a slippery slope argument" as if identifying a (possible) slippery-slope element in an argument would somehow automatically make it invalid.

The other way around: if all that can be said in defense is that the dangers are part of slippery-slope thinking, then you are effectively saying that the only defense is "trust them, let's wait and see, they might not do something bad with it" or "it sure doesn't affect me" (sounds pretty similar to "I've got nothing to hide"). This might work for other areas, not so much when it comes to backdooring your device/backups for arbitrary database checks.

And since "oh the children" or "but the terrorists" has become the vanilla excuse for many many things I'm unsure why we are supposed to believe in a truly noble intent down the road here. "No no this time it's REALLY about the kids, the people at work here mean it" just doesn't cut it anymore. So no, I'm not convinced the people at Apple working on this actually do it because they care.

When "but the children" becomes a favourite excuse to push whatever, the problem are very much the people abusing this very excuse to this level, not the ones becoming wary of it.

> some of the tech community has begun to think we should do absolutely nothing about this problem

I don't believe that people think that, I believe that people rather think that the ones in power simply aren't actually mainly interested in this problem. The trust is (rightfully) heavily damaged.


>> You didn't mention any tangible results here. How would this system by Apple make my life worse? Can you answer that without a slippery slope argument?

That's a weird goal-post.

>> Why does the causality matter? A correlation is enough that cracking down on this content will result in less abusers on the streets.

Obviously, of the people who look at cp, a higher percentage will be actual child abusers. The question for everybody is: does giving those people a fantasy outlet increase or actually reduce the number of kids who get assaulted? At the end of the day, that's what matters.

>> I wasn't the one citing that, but you are also citing an incomplete number since it excludes younger children.

[EDIT: Mistake] Well you didn't cite anything at all, and were off by a shockingly large number. Please cite something or explain why you made up a number.


>Well you didn't cite anything at all, and were off by a shockingly large number.

Sorry, I am just baffled by your last point here. How can I be off by a shockingly large number when I didn't even cite a number?


My B, I see now that was in a section where you were quoting hackerfactor. I guess I should direct that question to him.


> How would this system by Apple make my life worse? Can you answer that without a slippery slope argument?

“With the first link, the chain is forged. The first speech censured, the first thought forbidden, the first freedom denied, chains us all irrevocably. The first time any man's freedom is trodden on we're all damaged...."


This is melodramatic high school Hamlet-ism. It’s also silly - there has obviously been a case where the first speech was censured. It happened before civilizations. Are we all still damaged and bondaged by that?

Look, speech is important. So is protecting the public good. But if one believes in absolutes, rather than trade-offs, they are IMO getting too high on their own supply.

Let’s talk about the trade-offs that we have already made.


Does the fact that the NSA can comb your personal files and look at people's nude photographs not concern you? That's a present day reality brought to light by Snowden. Showing a colleague a 'good find' was considered a perk of the job.

We're lying to ourselves if we think this couldn't be abused and can implicitly be trusted. We should generally be sceptical of closed source at the best of times, let alone when it's inherently designed to report on us.

To your point of 'as a layman end user what is the cost to me?': more code running on your computer doing potentially anything which you have no way to audit -> compromising the security of whatever is on your computer, and an uptick in cpu/disk/network utilisation (although it remains to be seen if it's anything other than negligible).

My defeated mentality is partly - 'well they're already spying on us anyway'...


> This is melodramatic high school Hamlet-ism. It’s also silly - there has obviously been a case where the first speech was censured. It happened before civilizations. Are we all still damaged and bondaged by that?

Frankly, yes.


That seems like a ‘no’ then.


You can’t seriously be suggesting that Apple not implementing these measures will somehow be good for privacy in China?


The implication -- and I think it's a valid one -- is that this client-side mechanism will be very quickly co-opted to also alert on non-CSAM material. For example, Winnie the Pooh memes in China.


I think it’s not valid to claim that it will be quickly used for that purpose.

However I absolutely agree that it could be used to detect non-CSAM images if Apple colludes with that use case.

My point is that this is immaterial to what is going on in China. China is already an authoritarian surveillance state. Even without this, the state has access to the iCloud photo servers in China, so who knows what they are doing, with or without Apple’s cooperation.


You can label it collusion, but when Apple does it, it's going to call it _complying with local regulation_.


It doesn’t matter what you label it. Hand wringing over making things worse in China is not a valid concern.


I'm old enough to remember when the revelation that NSA is spying on everyone was a shock.

Now people are seriously arguing that continuous searching through your entire life without any warrant or justification by opaque algorithms is fine.

It only took what, 10 years?


> Now people are seriously arguing that continuous searching through your entire life without any warrant or justification by opaque algorithms is fine.

Where is anyone arguing that?


Fine, then let’s talk about other countries.

Does anyone seriously doubt that Germany will use this mechanism to ban Nazi imagery?

Then from there, it’s not a big leap to talk about controlling far right (or far left) memes in France or the UK.

More insidiously, suppose some politician in a western liberal democracy was caught with underage kids, and there were blackmail photos that leaked. Do you think those hashes wouldn’t instantly make their way onto this ban list?


> Fine, then let’s talk about other countries.

I’ll let you change the subject, but let’s note that every time someone raises privacy in China as a concern, it’s just bullshit.

> Does anyone seriously doubt that Germany will use this mechanism to ban Nazi imagery?

Yes.

> Then from there, it’s not a big leap to talk about controlling far right (or far left) memes in France or the UK.

This one is harder for me to argue against. Those countries could order such a mechanism, whether Apple had built this or not. Because those countries have hate speech laws and no constitutional mechanism protecting freedom of speech.

This is a real problem, but banning certain kinds of speech is popular in these societies. It is disturbingly popular in the US too. That is not Apple’s doing.


It’s just sad how things have shifted.

During the Cold War, the West in general and the US in particular were proud of spreading freedom and democracy. Rock & roll and Levi’s played a big role in bringing down the USSR.

Then in the 90s, the internet had this same ethos. People fought against filters on computers in libraries and schools.

Now that rich westerners have their porn and their video games, apparently many are happy to let the rest of the world rot.

I guess I just expected more.


I actually feel the same way. I miss those earlier eras.

I just think that exaggerating scares is part of the problem, not the solution, regardless of which side of a debate is doing it.


Agreed in principle. But in this particular case, I think it's difficult to exaggerate the badness of this scare. This strikes me as one of the "Those who forget their history are doomed to repeat it" kind of things.

Like with the TSA and the no-fly list. Civil liberties groups said it was going to be abused, and they said so well before any actual abuse had occurred. But they weren't overreacting, and they weren't exaggerating. They were right. Senator Ted Kennedy even wound up on that list at one point.


I don’t think the scare is warranted.

This really is a narrowly targeted solution that only works with image collections, and requires two factors to verify, and two organizations to cooperate, one of which is Apple who has unequivocally staked their reputation on it not being used for other purposes, and the other is NCMEC which is a non-profit staffed with people dedicated to preventing child abuse.

People who are equating this with a general purpose hashing or file scanning mechanism are just wrong at best.

It’s not like the no-fly list at all.


What tangible impact in the life of an everyday Chinese citizen are you expecting if Apple offered an E2EE messaging and cloud backup service in China? And why do you think the Chinese government would not just ban it and criminalise anyone found using a device which connected to Apple's servers, rendering any benefits moot?

(And why do you think it's morally right, or the responsibility of a foreign private company to try and force anything into China against their laws? Another commenter in a previous thread said the idea was for Apple to refuse to do business there - but that still leads to the question, how would that help anyone?)


> I think "Think of the kids" applies very well to the CREATORS of pornography. Per wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.

“Think of the Kids” damn well applies to the consumers of this content - by definition, there is a kid (or baby in some instances) involved in the CP. As a society, the United States draws the line at age 18 as the age of consent [the line has to be drawn somewhere and this is a fairly settled argument]. So by definition, in the United States, the victims in these pictures cannot have consented.

Demand drives creation. Getting rid of it on one of the largest potential viewing and sharing platforms is a move in the right direction in addressing the problem.

What I haven’t seen from the tech community is the idea that this will be shut down if it goes too far or beyond this limited scope. Which I think it would be - people would get rid of iPhones if some of the other cases privacy advocates are talking about occur. And at that point they would have to scrap the program - so Apple is motivated to keep it limited in scope to something everyone can agree is abhorrent.


>Demand drives creation. Getting rid of it on one of the largest potential viewing and sharing platforms is a move in the right direction in addressing the problem.

Yeah, that focus has worked really well in the "war on some drugs," hasn't it?

I don't pretend to have all the answers (or any good ones, for that matter), but we know interdiction doesn't work.

Those who are going to engage in non-consensual behavior (with anyone, not just children) are going to do so whether or not they can view and share records of their abuse.

The current legal regime (in the US at least) creates a gaping hole where even if you don't know what you have (e.g., if someone sends you a child abuse photo without your knowledge or consent) you are guilty, as possession of child abuse images is a felony.

That's wrong. I don't know what the right way is, but adding software to millions of devices searching locally for such stuff creates an environment where literally anyone can be thrown in jail for receiving an unsolicited email or text message. That's not the kind of world in which I want to live.

Many years ago, I was visiting my brother and was taking photos of his two sons, at that time aged ~4.5 and ~2.

I took an entire roll of my brother, his wife and their kids. In one photo, the two boys are sitting on a staircase, and the younger one (none of us noticed, as he wasn't potty trained and hated pants) wasn't wearing any pants.

I took the film to a processor and got my prints in a couple of days. We all had a good laugh looking at the photos and realizing that my nephew wasn't wearing any pants.

There wasn't, when the photos were taken, nor when they were viewed, any abuse or sexual motives involved.

Were that to happen today, I would be sitting in a jail cell, looking at a lengthy prison sentence. And when done "repaying my debt to society" I'd be forced to register as a sex offender for the rest of my life.

Which is ridiculous on its face.

Unless and until we reform these insane and inane laws, I can't support such programs.

N.B.: I strongly believe that consent is never optional and those under the age of consent cannot do so. As such, there should absolutely be accountability and consequences for those who abuse others, including children.


> Were that to happen today, I would be sitting in a jail cell, looking at a lengthy prison sentence.

No you would not, I was ready to somewhat agree with you but this is just false and has nothing to do with what you were talking about before. The law does not say that naked photos of (your or anyone else's) kids are inherently illegal, they have to actually be sexual in nature. And while the line is certainly not all that clear cut, a simple picture like you're describing would never meet that line.

I mean, let's be clear here: do you believe the law considers too much stuff to be CSAM, and if so, why? How would you prefer we redefine it?


> The law does not say that naked photos of (your or anyone else's) kids are inherently illegal, they have to actually be sexual in nature.

But that depends on who looks at it.

People have been arrested and (at least temporarily) lost custody of their children because someone called the police over perfectly normal family photos. I remember one case a few years ago where someone had gotten into trouble because one photo included a toddler wearing nothing (even facing away from the camera, if my memory serves me correctly) playing at the beach. When police realized this wasn't an offense, instead of apologizing they got hung up on another photo where kids were playing with an empty beer can.

Recently this was also linked https://jonathanturley.org/2009/09/18/arizona-couple-sues-wa...

which further links to a couple of previous cases.

I'd say we should get police or health care to talk to the people who think perfectly normal images are sexual in nature; but until we get the laws changed, at least keep us safe in the meantime.

> I mean let's be clear here, do you believe the law considers to much stuff to be CSAM, and if so why? How would you prefer we redefine it?

Another thing that comes up is that a lot of things that are legal might be in that database, because criminals might have a somewhat varied history.

Personally I am a practicing conservative Christian so this doesn't bother me personally at the moment since for obvious reasons I don't collect these kinds of images.

The reason I care is that every such capability will be abused, and below I present, in two easy steps, how it will go from today's well-intended system to a powerful tool for oppression:

1. today it is pictures, but if getting around it is as simple as putting them in a pdf, then obviously pdfs must be screened too. Same with zip files. Because otherwise it is so simple to circumvent that it is worthless.

2. once you have such a system in place, it would be a shame not to use it for every other evil thing. Depending on where you live this might be anything: Muslim scriptures, Atheist books or videos, Christian scriptures, Winnie the Pooh drawings - you name it and someone wants to censor it.


As soon as it is used in a negative way beyond CSAM scanning, it will cause people to sell their phones and stop using Apple products.

If Apple starts using the tech to scan for religious material, there will be significant market and legal backlash. I think the fact that CSAM scanning will stop if they push it too far will keep them in check to only do CSAM scanning.

Everyone can agree on using the tech for CSAM, but beyond that I don’t see Apple doing it. The tech community is reacting as if they already have.


Problem one is Apple doesn't know what they are scanning for.

This is by design and actually a good thing.

It becomes a problem because of problem number 2:

No one is accountable if someone gets their life ruined over a mistake in this database.

I'd actually be somewhat less hostile to this idea if there was more regulatory oversight:

- laws that punish police/officials if innocent people are harmed in any way

- mandatory technical audits, as well as verification of what it is used for: Apple keeps logs of all signatures that "matched"/triggered as well as the raw files, and these are provided to the court as part of any case that comes up. This way we could hopefully prevent most fishing expeditions - both wide and personalized ones - and also avoid any follow-up parallel constructions.

I'm not saying I'd necessarily be OK with it but at that point there would be something to discuss.


It may be worth taking very seriously that you might be overestimating both how quickly regular people become aware of such events and how emphatically people will react.


> I'd say we get police or health care to talk to people who think perfectly normal images are sexual in nature, but until we get laws changed at least then keep us safe.

Personally I don't find anecdotes convincing compared to the very real amount of CSAM (and actual child abuse) we already know exists and circulates in the wild, but I do get your point. That said personally I don't think changing the laws would really achieve what you want anyway - I don't think a random Walmart employee is up-to-date on the legal definitions of CSAM, they're going to potentially report it regardless of what the law is (and the question of whether this is a wider trend is debatable, again this is an anecdote).

With that, they were eventually found innocent, so the law already agrees what they did was perfectly fine, which was my original point. No it should not have taken that long, but then again we don't know much about the background of those who took them, so I'm not entirely sure we can easily determine how appropriate the response was. I'm certainly not trying to claim our system is perfect, but I'm also not convinced rolling back protections for abused children is a great idea without some solid evidence that it really isn't working.

> Another thing that comes up is that a lot of things that are legal might be in that database because criminal might have a somewhat varied history.

That didn't really answer my question :P

I agree the database is suspect but I don't see how that has anything to do with the definition of CSAM. The legal definition of CSAM is not "anything in that database", and if we're already suggesting that there's stuff in there that's known to not be CSAM then how would changing the definition of CSAM help?


> Personally I don't find anecdotes convincing compared to the very real amount of CSAM (and actual child abuse) we already know exists

First: This is not hearsay or anecdotal evidence; this is multiple innocent, real people getting their lives trashed to some degree before getting acquitted.

> I don't think a random Walmart employee is up-to-date on the legal definitions of CSAM, they're going to potentially report it regardless of what the law is (and the question of whether this is a wider trend is debatable, again this is an anecdote).

Fine, I too report a number of things to the police that might or might not be crimes. (Eastern European car towing a Norwegian luxury car towards the border is one. Perfectly legal in one way but definitely something the police was happy to get told about so they could verify.)

> With that, they were eventually found innocent, so the law already agrees what they did was perfectly fine, which was my original point.

Remember the job of the police is more to keep law abiding citizens safe than to lock up offenders. If we could magically keep kids safe forever without catching would-be offenders I'd be happy with that.

Making innocent people's lives less safe for a marginally bigger chance to catch small fry (i.e. not producers) - does it matter?

The problem here and elsewhere is that police in many places don't have a good track record of throwing these cases out. Once you've been dragged through court for the most heinous crimes, you don't get your life completely back.

If we knew police would always throw out such cases I'd still be against this but then it wouldn't be so obviously bad.


> First: This is not hearsay or anecdotal evidence, this is multiple innocent real people getting their lives trashed to some degree before getting aquitted.

"multiple" is still anecdotal, unless we have actual numbers on the issue. The question is how many of these cases actually happen vs. the number of times these types of investigations actually reveal something bad. Unless you never want kids saved from abuse there has to be some acceptable number of investigations that eventually get dropped.

> Remember the job of the police is more to keep law abiding citizens safe than to lock up offenders.

Maybe that should be their purpose, but in reality they're law enforcement; their job has nothing to do with keeping people safe. The SCOTUS has confirmed as much: the police have no duty to protect people, only to enforce the law. However, I think we agree that's pretty problematic...

> Making innocent peoples lives less safe for a marginally bigger chance to catch small fry (i.e. not producers), does it matter?

I would point out that the children in this situation are law abiding citizens as well, and they also deserve protection. Whether their lives were made more or less safe in this situation is debatable, but the decision was made with their safety in mind. For the few cases of a mistake being made like the one you presented I could easily find similar cases where the kids were taken away and then it was found they were actually being abused. That's also why I pointed out your examples are only anecdotes, the big question is whether this is a one-off or a wider trend.

If reducing the police's ability to investigate these potential crimes would actually result in harm to more children, then you're really not achieving your goal of keeping people safer.

> The problem here and elsewhere is that police many places doesn't have a good track record of throwing it out. Once you've been dragged through court for the most heinous crimes you don't get your life completely back.

Now this I agree with. The "not having a good record of throwing it out" I'm a little iffy on but generally agree, and I definitely agree that public knowledge of being investigated for such a thing is damaging even if it turns out you're innocent, which isn't right. I can't really say I have much of a solution for that in a situation like this though; I don't think there's much of a way to not-publicly take the kids away - and maybe that should have a higher threshold, but I really don't know; as I mentioned earlier we'd really need to look at the numbers to know that. For cases that don't involve a public component like that, though, I think there should be a lot more anonymity involved.


A large(?) portion of “sexual predators” are men peeing on the side of the interstate[1]. So it’s not far-fetched to think that a random pic could also land you in jail.

[1]: I looked up the cases of sex offenders living around me several years ago.


Random pictures aren’t going to be in the CSAM database and trigger review. And to have multiple CSAM hash matches on your phone is incredibly unlikely.

An unsolicited email / having photos planted on your phone or equipment is a problem today as much as it will be then, but I think people over-estimate the probability of this and ignore the fact it could easily happen today, with an “anonymous tip” called into the authorities.

If they are scanning it through iMessage, they will have logs of when it arrived and where it came from as well - so in that case it might protect a victim being framed.


That is such a tortured, backwards argument, but the only one that has even a semblance of logic so it gets repeated ad nauseam. Why be so coy about the real reasons?


Any reach into privacy, even while "thinking of the kids" is an overreach. Police can't open your mail or search your house without a warrant. The same should apply to your packets and your devices.

Why not police these groups with.. you know.. regular policing? Infiltrate, gather evidence, arrest. This strategy has worked for centuries to combat all manner of organized crime. I don't see why it's any different here.


Devil's advocate: It may not be mostly big organized crime. It may be hard to fight, because it's not groups. It mostly comes from people close to the families, or the families themselves.

Here's a relevant extract sourced from Wikipedia:

"Most sexual abuse offenders are acquainted with their victims; approximately 30% are relatives of the child, most often brothers, sisters, fathers, mothers, uncles or cousins; around 60% are other acquaintances such as friends of the family, babysitters, or neighbours; strangers are the offenders in approximately 10% of child sexual abuse cases.[53] In over one-third of cases, the perpetrator is also a minor.[56]"

Content warning: The WP article contains pictures of abused children.


Sure, but the people sharing these images do so in organized groups, often referred to as "rings". I agree it would be very hard to catch a solitary perpetrator abusing children and not sharing the images. However since they would be creating novel images with new hashes, Apple's system wouldn't do much to help catch them would it?


The laws regarding CSAM/CSA are not the problem, they are fine. The problem is that we are expected to give up our rights in the vague notion of 'protecting children' while the same authorities actively ignore ongoing CSA. The Sophie Long case is an excellent example where the police has no interest in investigating allegations of CSA. Why is it that resources are spent policing CSAM but not CSA? It is because it is about control and eroding our right to privacy.


I agree that our current legal and law enforcement system isn't up to speed with the 21st-century internet. And it has to be updated, because this kind of makes the internet a de-facto lawless space, covering everything from stalking and harassment, through fraud, to contraband trafficking and CSAM.

I don't think full-scale surveillance is the way to go in free, democratic societies. It is the easiest way, though. Even more so if surveillance can be outsourced to private, global tech companies. It saves the pain of passing laws.

Talking about laws: those, along with legal proceedings, should be brought up to speed. If investigators, based on existing and potential new laws, convince a court to issue a warrant to surveil or search any of my devices, fine. Because then I have legal options to react to that. Having that surveillance incorporated into some opaque EULA from a company in the US, a company that can now enforce its standards on stuff I do, is nothing I would have thought would be acceptable. Not that I am shocked it is; I just wonder why that stuff still surprises me when it happens.

Going one step further, if FANG manages to block right-to-repair laws, it would be easy to block running non-approved OSes on their devices. Which would then be possible to enforce by law. Welcome to STASI's wet dream.


All of the 'bad things' you mention are already very illegal. Changing the laws in this case will only lead to tyranny. I cannot emphasize this enough, you cannot fix societal ills by simply making bad things illegal. Right to repair is of course crucial to free society.


By laws I mean the laws governing police work. And bringing police up to speed. Obviously CP and other things are already very much illegal, these laws are just hardly enforced online. That has to change.


The internet isn't the wild west, laws are very much enforced.


Stalking and harassment aren't, at least over here. Victims are constantly left out in the cold. Same goes for fraud: most cases are not prosecuted. Especially if these cases cross state, and in the EU, national borders. Because it becomes inconvenient, police aren't really bothering. And by the time they do, the fraud is done. The stalking went on for years. And nothing really improved.

Hell, do I miss the old internet. The one without social media.


International fraud is a really interesting problem. I've proposed a mandatory international money-transfer insurance that would pay out in cases of court-decided fraud. It would make countries that look the other way on fraud within their borders crack down, to preserve their international market access.


I have hands-on experience with what I'd call at least attempted fraud, with crypto. Back when Sweden thought about state-backed crypto, a lot of ads showed up where you could invest in that. I almost did; the call centers used Austrian numbers. Not sure if there was even any coin behind it. I reported it to the police, and got a letter after a couple of months that the investigation had led nowhere and was dropped; apparently Austrian authorities didn't find anything on the reported number.

A couple of hours online found

- the company behind it operated out of the UK

- the call center was not in Austria but used local numbers for a while

- the company was known for that shady business but never even got a slap on the wrist

I decided to never count on authorities for that kind of fraud. Or report it, because that's just a waste of time, unless you lost a shitload of money.


There's a lot of dumb fraud. Being international of course makes everything harder. Usually the amount of investigation is related to how many people were scammed. People need to learn how to do due diligence because the definition of fraud can get vague at times. I think your example is a good case of due diligence, it couldn't hurt to blog about their fraud though.


People did write about it, that's how I found out so much so quickly. I didn't invest, but came reasonably close. One could call it due diligence, but I can see how easy it is to fall for things like that. I got a lot more sceptical of these ads, even more so than before.


It’s not just thinking of the victims when they are kids - if they aren’t killed after the material is made, then they have issues for the rest of their life with a real cost to society.

We’re talking a life-long impact from being abused in that stuff…


> Why haven't I made my whitepaper about PhotoDNA public? In my view, who would it help? It would help bad guys avoid detection and it will help malcontents manufacture false-positives. The paper won't help NCMEC, ICACs, or related law enforcement. It won't help victims.

Making it public would allow the public to scrutinize it, attack it if you will, so that we can get to the bottom of how bad this technology is. Ultimately my sincere guess is that we’d end up with better technology to do this, not some crap system that essentially matches on blurry images. Our government is supposed to be open source; there’s really no reason we can’t, as a society, figure this out better than some anti-CP cabal with outdated, crufty image tech.


PhotoDNA is a very simple algorithm. Reproducing what OP did to "reverse engineer" it is not hard. If you really have the guts to try this, go ahead and you'll be surprised. I'm living in the US with a green card, so I'm not touching the problem even with a laser pointer.

The techno-legal framework is worse than just "reversing the hashes is possible". You could brute-force the creation of matching images using a cloud service or online chat as an "oracle".
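
To make the "oracle" point concrete: I don't know PhotoDNA's internals (they're under NDA), but the class of perceptual hashes it belongs to can be sketched in a few lines. Here is a toy average-hash in Python - purely illustrative, and emphatically not PhotoDNA itself:

    # Toy average-hash: a stand-in for the general class of perceptual
    # hashes, NOT PhotoDNA itself (whose details are not public).
    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to a tiny grayscale thumbnail, then set one bit per
        # pixel depending on whether it is brighter than the mean. The
        # 64-bit result survives resizing, blurring and recompression.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    def hamming(a, b):
        # A "match" is Hamming distance below a threshold, not equality.
        return bin(a ^ b).count("1")

Because matching is threshold-based rather than exact, anyone with upload access to a service that reveals match results can keep perturbing an innocuous image until the distance drops below the threshold - no knowledge of the real algorithm required, only the oracle's yes/no answers.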


> About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact.

The exploitation of children is a real and heartbreaking ongoing tragedy that forever harms millions of those we most owe protection.

It's because this is so tragic that we in the world of privacy have grown skeptical. The Four Horsemen, invoked consistently around the world to mount assaults against privacy, are child abuse, terrorism, drugs, and money laundering.

Once privacy is defeated, none of those real, compelling, pressing problems get better (well, except money laundering). Lives are not improved. People are not rescued. Well-meaning advocates just move on to the next thing the foes of privacy point them at. Surely it will work this time!

Your heart is in the right place. You have devoted no small part of your time and energy to a deep and genuine blooming of compassion and empathy. It's just perhaps worth considering that others of us might have good reason to be skeptical.


> Why haven't I made my whitepaper about PhotoDNA public? In my view, who would it help? It would help bad guys avoid detection and it will help malcontents manufacture false-positives. The paper won't help NCMEC, ICACs, or related law enforcement. It won't help victims.

I have to respectfully disagree with that statement and the train of thought, unfortunately. However, you should seek legal counsel before proceeding with anything related: that's the advice from a stranger on the web.

a) You are not in possession of a mystical secret for some magical curve or lattice. The "bad guys", if they have enough incentive to reverse a compression algorithm (effectively what this is), will easily do so if the money is good enough.

b) If we followed the same mentality in the cryptography community, we would still be using DES or have a broken AES. It is clear from your post that the area requires a serious boost from the community in terms of algorithms, implementations, and architectural solutions. By hiding the dirty laundry we are never going to advance.

Right now this area is not taken as seriously as it should be by the research community -- one of the reasons being the huge privacy and illegal-search-and-seizure concerns, and the disregard of other areas of law, that most of my peers have. Material such as yours can help attract the necessary attention to the problem and showcase how, without the help of the community, we end up with problematic and harmful measures such as what you imply.

c) I guess you have, but just in case: From what I have read so far and from the implications of the marketing material, I have to advise you to seek legal counsel if you come into possession of this PhotoDNA material they promised, or about your written work. [https://www.justice.gov/criminal-ceos/citizens-guide-us-fede...] Similarly for any trained ML model -- although here it is a disaster in progress still.


> Right now this area is not taken seriously enough as it should by the research community

Another reason is that the basic material needed to conduct this research (child porn) is legally toxic to possess. Sure, there are ways around that for sufficiently motivated researchers. But if you are a CS researcher, you have a lot of research opportunities that do not involve any of that risk or overhead.


I'd lament that using 'thinking of the children' as an excuse for genuine overreach, when that's what it is, puts us in a worse situation.

There is incredible second-order harm in overreach, because the reaction to it hurts the original cause, too.

If you try to overcorrect, people will overcorrect in response.

The sort of zeal that leads to thoughts like "screeching minority", I think shows carelessness and shortsightedness in the face of very important decisions.

I have no informed opinion on Apple's CSAM tool, beyond deferring to the FotoForensics expert.


Thanks for the write-up and putting all the work into this important issue.

> "There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children. And they rarely victimize just one child. Nearly 1 in 10 children in the US will be sexually abused before the age of 18."

One thing I wondered and have not seen brought up in the discussion so far is this: As far as I understand the perceptual hash solutions work based on existing corpora of abuse material. So, if we improve our detection ability of this existing content, doesn't that increase the pressure on the abusers to produce new content and in consequence hurt even more children? If so, an AI solution that also flags previously unknown abuse material and a lot of human review are probably our only chance. What is your take on this?


> doesn't that increase the pressure on the abusers to produce new content and in consequence hurt even more children?

Not really. Two points:

1) Many / all CP boards these days ask applicants to provide CSAM (as police in almost all jurisdictions - except for, IIRC, the US and Australia - are banned from bringing CSAM into circulation). And to keep police, where allowed, from simply re-uploading stuff, they (as well as the "client base") demand new, yet-unseen material, and so no matter what the police do, there will always be pressure for new content.

2) The CSAM detection on popular sites only hits people dumb enough to upload CSAM to Instagram, Facebook and the likes. Granted, the consumer masses are dumb and incompetent at basic data security, but ... uhh, for lack of a better word, experienced CSAM consumers know after decades of busts that they need to keep their stashes secure - aka encrypted disks.

> If so, an AI solution that also flags previously unknown abuse material and a lot of human review are probably our only chance. What is your take on this?

There are other chances to prevent CSA that don't risk damaging privacy, and all of them aim at the early stages - preventing abuse from happening:

1) teach children already in early school years about their body and about consent. This one is crucial - children who haven't learned that it is not normal that Uncle Billy touches their willy won't report it! - but unfortunately, conservatives and religious fundamentalists tend to blast any such efforts as "early sexualization", "gay propaganda" and whatnot.

2) provide resources for (potential) committers of abuse. In Germany, we have the "Kein Täter werden" network that provides help, but many other countries don't have anything even remotely similar.

3) Screening of staff and volunteers in trust / authority positions dealing with children (priests and other clergy, school and pre-school/kindergarten teachers, sports club trainers) against CSA convictions. Unfortunately, this is ... not followed thoroughly very often and in some cases (cough Catholic Church) the institutions actively attempt to cover up CSA cases, protect perps and simply shuffle staff around the country or in some cases across the world.


Thanks, that was very insightful.


> About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact. There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children. And they rarely victimize just one child. Nearly 1 in 10 children in the US will be sexually abused before the age of 18.

Maybe it's the fact that I don't have kids, or that I spend most of my life online with various devices and services. But I would much rather drop the NCMEC, drop any requirement to monitor private messages or photos, reinstate strong privacy guarantees, and instead massively step up monitoring requirements for families. This argument seems like we're using CSAM as a crutch to get at child abusers. If the relationship is really nearly 1:1, it seems more efficient to more closely monitor the groups most likely to be abusers instead.

It even seems to me that going after a database of existing CSAM is counterproductive. With that material, the damage is already done. In a perverse sense, we want as many pedos as possible to buy old CSAM, since this reduces the market for new abuse. It seems to me that initiatives like this do the opposite.

I am not defending CSAM here. But CSAM and child abuse are connected problems, and istm child abuse is the immensely greater one. We should confront child abuse as the first priority, even at the expense of CSAM enforcement, even at the expense of familial privacy. With a rate of 1 in 10, I don't see how not doing so can be ethically defended.


"and instead massively step up monitoring requirements for families"

Pls have a family first and then see if you still ask for more state governance in your life?

It is true, most abuse happens in the family, but there are also schools, churches, sports clubs, ...

But the thing is, if the schools, for example, had the staff, with enough time (and empathy) to care about the actual children and talk with them and interact with them (and not mainly the paperwork about them) - then you could easily spot abuse everywhere and take action.

But they usually don't, so you have traumatized children coming into another hostile and cold environment called public school, where they just stonewall again and learn to hide their wounds and scars.

Child abuse is a complex problem, with no simple solution. But I prefer a more human-focused solution and not just another surveillance step-up.

Child abusers also do not fall from the sky. If you pay more attention to them while they are young, you can spot it and help them while they need help, before they turn into creepy monsters.


Honestly, if we take the ratio of 1 in 10 seriously, I think the time for human focus and caution has passed. That's an epidemic. To be clear, I'm not letting schools, churches, sports clubs etc off here; all those places clearly need massively increased external oversight as well. But at a 10% rate, we cannot exclude the family; it must be considered an institution in a state of failure.


Well, there are still lots of people who take the Bible literally:

"Whoever spares the rod hates their children, but the one who loves their children is careful to discipline them."

https://www.bibleref.com/Proverbs/13/Proverbs-13-24.html

This is the ideological base for it, in my opinion: unchecked authoritative power together with physical violence. The thing is, the state institutions don't really have a clean record on abuse either.

And I did not say anything about excluding the family from monitoring of abuse. I said I see no reason to increase the monitoring. With the measures in place right now, you could spot plenty of abuse already, everywhere - if it were really about the children.

No easy problem, no easy solution.


> talk with them and interact with them (and not mainly the paperwork about them) - then you could easily spot abuse everywhere and take action

My impression is that the abuse is easily spotted, and paperwork done, but that often not much comes of it. We (USA) don't actually seem to have very good systems for handling things once they're discovered, partly (largely?) due to lack of resources.


I mean, there is improvement in some regards - for example, priest or teacher molesters no longer silently get moved to a different place in the same job - but yeah, we can easily spot the actual problems on the ground now. One more reason to reject dystopian technological solutions that also do not solve the real problems: children in need of a real, protected home.


> someone usually mocks "it's always about the kids, think about the kids."

Yes, let's think about the kids, please.

I certainly don't want my children to grow up in an authoritarian surveillance state...


Exactly, we can’t save the children by giving them a dystopia to grow up in.


Information is power and you can dethrone anyone with access to their communication. Just a quote out of context is enough, especially with many people out for blood to handle their doubts and fears.

A lot of people allegedly knew about Epstein, and he was completely untouched while connected to high-ranking politicians. You wouldn't have needed surveillance to identify child abuse, and if surveillance had turned up anything, I doubt anything would have happened. Even with that surveillance implemented, evidence would only be used if politically convenient.

If you are against child abuse, you should add funding to child care. People there will notice more abuse cases when they have the support and funding they need because that means more eyes on a potential problem. An image algorithm is no help.


> The paper won't help NCMEC, ICACs, or related law enforcement. It won't help victims.

But it will help all of society by preventing a backdoor from gaining a foothold. If the technology is shown to be ineffective it will help pressure Apple to remove this super dangerous tool.

Once a technical capability is there, governments will force Apple to use it for compliance. It won’t be long before pictures of Winnie the Pooh will be flagged in China.


OP knows that, and is acting accordingly.

They enjoy being “unquestionable”/above being audited.


> Why haven't I made my whitepaper about PhotoDNA public? In my view, who would it help?

Would it help activists push for more accurate technology and better management for NCMEC? Would it help technologists come up with better algorithms? I see all kinds of benefits to more openness and accountability here.


>> There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children

Can you give a source for this please? Also by "deal in" do you mean create or view?


Hey, first of all kudos to you for all your hard work and having the heart in the right place.

I don’t really want to take any specific position on this issue as I don’t have enough context to make a fair assessment of the situation. However, I do want to point out one thing:

By supporting a specific approach to solve a problem, you generally remove some incentives to solve the problem in some other way.

Applied to this situation, I think it would be interesting to ask what other potential solutions to the problem of child abuse there are, and how effective they may be compared to things like PhotoDNA. Is working on this the biggest net benefit you could have, or maybe even a net cost, toward solving the problem of child abuse?

I don’t have the answer but I think it‘s important to look at the really big picture once in a while. What is it that you want to achieve and is what you are doing really the most effective way of getting that or just something that is „convenient“ or „familiar“ given a set of predefined talking points you didn’t really question?

All the best to you :)


> Nearly 1 in 10 children in the US will be sexually abused before the age of 18.

Please let's not invent distorted statistics or quote mouthpieces who have it in their own interest to scare people as much as possible into rash actions, just like Apple has done.


Thank you for those insights.

I've long held a grudge against Microsoft and NCMEC for not providing this technology, because I live in a country where reporting CSAM is ill-advised if you're not a commercial entity and law enforcement seizes first and asks questions later (_months_ later), so you end up just closing down a service if it turns out to be a problem.

This puts it into perspective. PhotoDNA seems fundamentally broken as a hashing technology, but it works just well enough with a huge NDA to keep people from looking too closely at it.

NCMEC needs a new technology partner. It's a shame they picked Apple, who are likely not going to open up this tech.

Without it, it's only a matter of time until small indie web services (think of the Fediverse) just can't exist in a lot of places anymore.


> it's only a matter of time until small indie web services (think of the Fediverse) just can't exist in a lot of places anymore

I expect they will simply become P2P E2EE darknets eventually, meaning there won't really be a "service" anymore. Matrix already has E2EE and is actively working towards P2P.


> someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact.

This is making it worse: because people with an agenda use CP as an excuse to force immoral behavior on us, child suffering is now always associated with bullshit. Those actions are hurting the children by suppressing the social will to take it seriously.

Stop hurting the children!


>Why haven't I made my whitepaper about PhotoDNA public? In my view, who would it help? It would help bad guys avoid detection and it will help malcontents manufacture false-positives.

I would suggest that the people NCMEC are most enthusiastic to catch know better than to post CSAM in places using PhotoDNA, particularly in a manner that may implicate them. Perhaps I overestimate them.


1. Your claims about CP and child abuse need substantial evidence.

2. Assuming all that as true, opaque surveillance and the destruction of free, general computing is much worse than child abuse.


> About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact.

I'd say it's the other way around. If freedom dies this time, it might die for good, followed by an "eternity" of totalitarianism. All violence that occured in human history so far combined is nothing compared to what is at stake.

> Free thought requires free media. Free media requires free technology. We require ethical treatment when we go to read, to write, to listen and to watch. Those are the hallmarks of our politics. We need to keep those politics until we die. Because if we don’t, something else will die. Something so precious that many, many of our fathers and mothers gave their life for it. Something so precious, that we understood it to define what it meant to be human; it will die.

-- Eben Moglen


> About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact. There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children. And they rarely victimize just one child. Nearly 1 in 10 children in the US will be sexually abused before the age of 18.

Does CP being available create victims? I'd say that virtually everybody who suddenly saw CP would not have the inclination to abuse a child. I don't believe that availability of CP is the causal factor to child abuse.

But putting aside that and other extremely important slippery-slope arguments about this issue for a minute: have you considered that this project may create economic incentives that are the inverse of the ostensible goal of protecting more children from becoming victims?

Consider the following. If it becomes en vogue for cloud data operators to scan their customers' photos for known illegal CP images, then the economic incentives created heavily promote the creation of new, custom CP that isn't present in any database. Like many well-intentioned activists, there's a possibility that you may be contributing more to the problem you care about than actually solving it.


I have no idea whether this statistic is true or not, but "nearly 1 in 10 children in the US will be sexually abused" is an incredibly high number.

I have no doubt that some cases of sexually abused children must have also existed in the environment where I have grown up as a child, in Europe, but I am quite certain that such cases could not have been significantly more than 1 per thousand children.

If the numbers really are so high in the USA, then something should definitely be done to change this, but spying on all people is certainly not the right solution.


What makes you quite certain about your 1-1000 number? It’s really easy to mistake our experience for something generalizable.


It's hard to say without data, but the number is so astonishingly high that, really, the first reaction is to call it into question.

It also seems to be the kind of thing that is hard to measure.


I dunno, seems reasonable to me. Crimes of this nature are underreported, so I tend to assume I’m only aware of a very small percentage of these situations. I’m not saying you’re wrong - just that I had the reaction that it seemed within the general ballpark. For example, most of the apparently normal women I have known well have shared stories of abuse and trauma.

What makes you say it’s hard to measure? The definitions are pretty clear cut.


Of my female friends, I know at least 5 who are victims. Victims make up the majority of my closer female circle.

It is way more commonplace than you would like to think.


I'm glad that you at least were transparent about the shortcomings of these types of systems. Maybe some academics in forensic image analysis can take up the torch and shed light on how much of a failure this whole system is, especially when they're putting spyware on our personal devices. Anyway, I like your writings, keep up the good fight.


It's somewhat off-topic for the article, but I'm curious as to why you retain images uploaded for analysis after the analysis is completed?

After all, surely 99% of uploaders don't hold the copyright for the image they're uploading, so retaining them is on shaky legal grounds to begin with, if you're the kind of person who wants to be in strict compliance with the law - and at the same time, it forces you and your moderators to handle child porn several times a day (not a job I'd envy) and you say you risk a felony conviction.

Wouldn't it be far simpler, and less legally risky, if you didn't retain the images?


> Why haven't I made my whitepaper about PhotoDNA public? In my view, who would it help? It would help bad guys avoid detection and it will help malcontents manufacture false-positives. The paper won't help NCMEC, ICACs, or related law enforcement. It won't help victims.

It would help those victims who are falsely accused of having child porn on their phone because of a bug.

It would also help those people who are going to be detained when PhotoDNA is used by dictatorial states to find incriminating material on their phones, just as they used Pegasus to spy on journalists, political opponents, etc.


Hey - thanks for writing the OP. I may be missing something obvious, but how does FotoForensics actually identify all the CSAM content it reports to NCMEC?


You ask how they can know that their false positive rate is 1 in a trillion without testing trillions of images. Simple: they require more than one hit. If each hit has a false positive rate of 1 in 100, which is very testable, you can simply require six hits.
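
A minimal sketch of that arithmetic, using the 1-in-100 figure above (an illustrative assumption, not a measured rate) and assuming false positives are independent:

    # Illustrative only: assumes per-image false positives are independent.
    p_single = 1 / 100        # assumed per-image false positive rate
    hits_required = 6
    p_flagged = p_single ** hits_required
    print(p_flagged)          # ~1e-12, i.e. about 1 in a trillion

The catch is the independence assumption: adversarially crafted near-matches are correlated by construction, so a figure like this describes honest noise, not attacks.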


It doesn't matter how "nice" the people on the lower levels are.

An organization will act the way its management wants. If management consists of jerks, the organization they lead will always behave accordingly.

A fish rots from the head down…


> There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children. And they rarely victimize just one child. Nearly 1 in 10 children in the US will be sexually abused before the age of 18.

These are clearly propaganda statistics. There is absolutely no way 10% of the US child population is molested.

The fact that you not only believe this but repeat it publicly calls your judgement into question and suggests gullibility.

You’ve been recruited to undermine the privacy and security of hundreds of millions (billions?) of people and indoctrinated with utterly ridiculous statistics. Your argument is essentially “the ends justify the means”, and it could not be any more ethically hollow.


Not that I particularly care one way or the other, but doesn’t writing a ‘whitepaper’ (or calling your notes such) indicate an intention to release it?


How feasible is it to change the algorithm to flag police uniforms? The Tank Man?


No need to change anything. It's supported by design.


> I built the initial FotoForensics service in a few days.

Why did you specify "In a few days"?


My guess is to establish the approximate extent of the collaboration: he didn't work there full time for years; the collaboration has been small and limited.


It does make one wonder: careless work, a trivial problem, reuse of existing projects, a writer portraying himself as extremely productive, or some combination?


I think you're assuming bad faith from someone who has proven very competent, technical, and coherent.

It might simply be truthful. And if the author is proud of that, it is both irrelevant and perfectly okay. Let's focus on the facts and arguments relevant to the article, not personal attacks.


None of the options I proffered are in the "bad faith" category.


Ya, it's hard to interpret it differently, so I was curious. I don't know why I'm being downvoted for a question about rhetorical intention.


I didn’t downvote, but it comes across as a nitpick on a thoughtful and reasonable piece.


[flagged]


No - but this comment is abhorrent.


I think there is a trend across the world of power centers becoming more authoritarian and punitive. It is in vogue to see it in China and call them out for it.

It is harder to look in the mirror and see something similar happening in the US. In the US, we superficially see government doing it (Snowden disclosures) and then we see corporations doing it separately (Facebook, Google ad tracking, etc.). However, I think government and tech giants are working closely together to act like China. This feels like another outgrowth of that trend.

- Social credit system

- Loss of financial services (PayPal)

- Loss of access to social media (Facebook, Google, Apple, YouTube)

- Loss of access to travel related services (AirBnB, travel ban)

- Banning of material (Amazon)

- Control of media (owned by just a few major corporations who don't have to make money from the media... Comcast basically has a monopoly on land line Internet, AT&T has a very strong position on mobile, Disney has a strong position in films... they own CNN, NBC, and ABC)... EDIT: And let's not forget Fox

You can say all you want about freedom of association, but the effect is similar in China and the US. You are ostracized from the system.

Tech has lost its neutral carrier status and now is connected into a system that enforces consent. I wonder why? Are we being prepared for an economic war? Is this the natural evolution of power seeking power? Is this just a cycle of authoritarianism and liberalism?

P.S. I don't think a groundswell of outcry leading to cancellation means much. I am much more concerned when corporations do it; I just think they are much, much more powerful.


I agree. NCMEC is the root cause. It's very much like the war on drugs. There are a lot of horrendous crimes out there and this level of overreach isn't deemed necessary for any of them.

Child sex abuse seems to have become the excuse to create these heavy handed policies to terminate accounts with no recourse by Google and scanning locally stored photos by Apple. Even receiving a cartoon depicting child sex abuse can get you in trouble.

Hopefully this will be a catalyst to reform the law.


> I'm surprised, and honestly disappointed, that the author seems to still play nice

Stooping to that level is what they want. CNN: “‘privacy activists’ have released a tool to allow the spread of CP”


If that’s how they play then the only winning move is to respond in kind.

>shadowy government affiliated agency abuses its role of protecting children to install malware on a billion devices


That's a PR fight you will lose.


I'm not so sure. I think it may be a lose-lose scenario, with both the government and the companies whose products have been compromised taking a big hit.

The people merely avoiding the tainted products aren't on blast here.


The people on blast are the ones publicly advocating for privacy. They're the ones who would be damaged in that scenario, and I for one want a society that takes them seriously.


Well, it doesn't look that way so far. I haven't seen any credible mainstream positions ridiculing the people advocating for privacy. It seems like everyone's pretty well aware that technology sucks and systems like this are not to be trusted, even if they're implemented with good intentions.

We will see.


That's not even news at this point. Privacy invading laws have been pushed for years under the guise of either CP or terrorism.


Scary to admit this, but whatever. In my younger years, which coincidentally were also the younger years of the internet, I used to come across... let's say... bad things all the time.

Being a decent enough human, I made many reports to NCMEC. Human-verified (by me) reports.

Never once did I hear back. Not even some weird auto-reply. Needless to say, I have zero faith in that agency. Fitting they'd get others to do the work.


>I'm surprised, and honestly disappointed, that the author seems to still play nice, instead of releasing the whitepaper.

Would it help anything? Apple isn't using PhotoDNA, so proving PhotoDNA is bad would just be met with "we don't use that".


I believe it would, because other image-matching systems will probably have to make implementation choices and trade-offs similar to the ones PhotoDNA made.


>> Would it help anything? Apple isn't using PhotoDNA, so proving PhotoDNA is bad would just be met with "we don't use that".

> I believe it would, because other image-matching systems will probably have to make implementation choices and trade-offs similar to the ones PhotoDNA made.

Even if that's the case, that's too subtle of a point to be very effective, and you'll have all kinds of PR people (Apple and otherwise) selling the "we don't use that [so those criticisms don't apply to us]" narrative.


Why? It would be interesting to compare the two systems, but there is no reason to assume the trade-offs are the same.


There are trade-offs inherent to the space.

There are a handful of hashing methods in the literature, and each can have its parameters tuned, again trading off things like efficiency and accuracy.

Then, when it comes to efficiently searching through hashes for exact and fuzzy matches, there are common algorithms and data structures used across perceptual hashing systems, each with its own trade-offs, implementation details, and tunable parameters (see the sketch below).
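
As a toy illustration of the kind of matching involved (none of this is PhotoDNA's actual design; the 64-bit hash size, the Hamming-distance metric, the threshold, and the linear scan are all assumptions made up for the sketch):

    # Toy fuzzy matcher over 64-bit perceptual hashes (hypothetical values).
    def hamming(a: int, b: int) -> int:
        # Count the bits where the two hashes differ.
        return bin(a ^ b).count("1")

    def find_matches(query: int, database: list[int], threshold: int = 10) -> list[int]:
        # Linear scan for simplicity; production systems use structures
        # like BK-trees or multi-index hashing to avoid touching every entry.
        return [h for h in database if hamming(query, h) <= threshold]

Raising the threshold catches more edited copies of a known image, but also raises the odds of an unrelated image landing within range: exactly the accuracy trade-off being tuned.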

If there's an issue with PhotoDNA that doesn't come down to a poor implementation, then there's a good chance that other systems have run into the same pitfalls. And if it does come down to a poor implementation, it would be prudent for operators of other systems to make sure theirs don't repeat the same mistakes.


NCMEC is a private nonprofit. No government oversight. How are they allowed this much power over every photo on everyone's devices?


> It also seems like the PhotoDNA hash algorithm is problematic (to the point where it may be possible to trigger false matches)

A post here some days ago (since removed) linked to a Google Drive folder containing generated images (which displayed nonsense) whose hashes matched those of genuine problem images.


Isn’t that an alleged memo from NCMEC? I would sincerely doubt that an official memo from any agency would use the term “shrieking minority”; it sounds like imaginative fiction of what a government agency would actually say.


9to5Mac published the full message that NCMEC sent to Apple:

https://9to5mac.com/2021/08/06/apple-internal-memo-icloud-ph...

It does say "We know that the days to come will be filled with the screeching voices of the minority."


Wow. Thank you, looks like I was wrong.


The fact that they're willing to openly make such a brazen statement is part of why I think they're unsalvageable.


One way to interpret the macaque photo is that the database has already been subverted for uses other than catching CP.


Or it’s one of many test images, like EICAR, so people can test and validate the system without using abhorrent images.


Sure. Though it’s not of much use if it isn’t documented.



