
Facial Recognition Cameras Do Not Belong in Schools - edtechstrats
https://www.nyclu.org/en/news/facial-recognition-cameras-do-not-belong-schools
======
Someone1234
This proposal may be problematic because they seem to want to tie it into
other computer systems and databases (e.g., state and federal crime and
immigration records).

There's a lot of paranoia about new technology, but the biggest threat isn't
strictly from the technology itself, but our use of the resulting data and how
it is disseminated.

For example, using fingerprints for lunch payments or library books isn't
inherently problematic; it becomes problematic if the fingerprints are stored
beyond the student's time at the school or shared with others (who could store
them indefinitely or misuse them).

~~~
candu
Storing this data enables a wide range of easily-foreseeable negative
consequences, and it sends a clear message that school administrations do not
trust their students. In that light, we could very easily see the simple act
of storing it in the first place as problematic; we could even see the act of
_thinking seriously about storing it_ as problematic.

Taking your example, what message does that send to a kid if they need to be
fingerprinted just to eat lunch? Sending that message is, IMHO, extremely
problematic.

~~~
sharcerer
Another long-term negative consequence would be the kind of mindset/thought
process these kids will develop as they grow up. Right now, for example,
there's a lot of debate over the use of computer vision, AI, etc., and Amazon
and Google have already faced backlash. Google corrected its stance, at least
for the time being; no such action at Amazon, and I never really got the
impression that Amazon had an employee-driven culture. Google is much better
at that.

Coming back to the point: when these kids grow up to be 20-40-year-old adults,
they might not see the negative consequences of state-wide surveillance if
they never faced any negative consequence or disadvantage due to cams in
school. The reverse might also happen, i.e., they face a negative consequence
at school and so are cautious about the tech when they grow up.

Personally, as a 20-year-old, I am in a dilemma. On one side, cameras and
vision have a lot of security benefits, and they seem to be inevitable. Bad
actors are always there; it's about which safeguards are put in place. What
happens to the data, and who does what? That's the question. Maybe if there
were some large-scale audit system run by non-profits, citizen groups, etc.,
it would be better.

------
jacquesm
They don't really belong anywhere. Not just in schools.

~~~
monetus
I'm sad this view isn't more common.

~~~
ringbugger
Seriously. And then there are all the emotional security arguments that
cannot be verified by hard evidence.

Compare the probability of being a victim of terror with that of getting into
a traffic accident. No, please don't start waterboarding all bus drivers...

------
Atreus
Do not judge their actions by their stated intentions. Assume that their
actions satisfy their motives perfectly, and look for the set of motives that
best predicts those actions.

If you do that, of course they do belong in schools, and they will stay in
schools, because the US is a post-privacy state, and is staging itself to be
as authoritarian as the Chinese "social credit" system.

What universe are you living in that you think modern American children want
to learn? They don't. They didn't want to learn 40 years ago, or any year
since. They don't look at public schools as refuge, or as empowerment. They
look at it as a requirement, and sometimes a bit like being in jail for 8-14
hours a day.

~~~
throwawayjava
_> What universe are you living in that you think modern American children
want to learn? They don't._

That's sad. You should spend time with some different kids. The ones I spend
time with have an _insatiable_ appetite for learning new things. Each student
is passionate about something different, of course.

 _> They don't look at public schools as refuge, or as empowerment._

That seems mostly orthogonal to the question of whether they want to learn and
whether they realize that learning can be empowering.

~~~
LeftTurnSignal
> What universe are you living in that you think modern American children want
> to learn? They don't.

Woof...

I know that when I worked around the US (mostly the Midwest), at a different
school each week, there were always some kids who wanted to learn.

There were always some that didn't.

They may not have wanted to learn every single thing that the school taught,
but they did want to learn about stuff that could be applied to a future
career.

------
baldfat
As I work out what an appropriate classroom computer for three- to
five-year-old children looks like, I have thought of using facial recognition
in place of a login name and password or barcodes.

I would also like to see where the students spend their self-selected time, to
help improve the layout of the room and to collect data on what students do
during the 90 minutes of free choice.

Now I can see how everyone would hate both ideas because of these other
issues, and I feel frustrated.

~~~
woolvalley
Enabling Face ID-style login on student devices is a bit different from facial
recognition surveillance. Same with username-based app analytics.

Apple's Face ID defuses the privacy controversy by addressing up front how the
data is kept private, on-device, and secure while improving your life. Lead
with that and it might go over better.
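To make the on-device distinction concrete, here's a minimal sketch of the
pattern (purely illustrative; the embeddings, threshold, and function names are
made up, and real systems like Face ID use dedicated secure hardware rather
than application code): the enrolled template and every comparison stay local,
and only a yes/no decision is ever surfaced to the login flow.

```python
import math

MATCH_THRESHOLD = 0.8  # hypothetical similarity cutoff, not a vendor's real value

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def matches_local_template(embedding, stored_template):
    """On-device check: the raw embeddings never leave this function;
    only the boolean decision reaches the login flow."""
    return cosine_similarity(embedding, stored_template) >= MATCH_THRESHOLD

# Toy 4-dimensional "embeddings", made up for illustration:
template = [0.9, 0.1, 0.3, 0.2]          # enrolled at setup, stored locally
same_person = [0.85, 0.15, 0.28, 0.22]   # a later scan of the same face
stranger = [-0.5, 0.9, 0.1, 0.7]         # a different face

print(matches_local_template(same_person, template))  # True
print(matches_local_template(stranger, template))     # False
```

The privacy property is architectural, not algorithmic: nothing leaves the
device, so there is no central database to leak or repurpose, which is exactly
what distinguishes this from surveillance.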

------
gorpomon
Absolutely disgusting, there's really not much more nuance to this. I can't
fathom a situation where we want to automate what should be human
interactions. Talking with children, in person, where they can learn from
their mistakes, is a great way to create long lasting discipline. Sure it's
hard work, but if we offload this to computers we're actively removing
humanity from the situation.

~~~
darawk
> I can't fathom a situation where we want to automate what should be human
> interactions.

Really? You'd rather call the taxi company and make an appointment with a
person than use Uber? You'd rather ask a librarian to do your google searches
for you?

~~~
tylerhou
He said "we shouldn't automate what we shouldn't automate", not "we shouldn't
automate anything."

~~~
darawk
Yes, which is a pointless tautology.

~~~
tylerhou
Okay, but then your response is just as pointless as his tautology.

~~~
darawk
My point is that "should be human interactions" is a meaningless standard. The
only way it could be meaningful is if it meant "anything that used to involve
human interaction, and now doesn't". I was rebutting that interpretation of
it, which seemed to me to be what the OP was implying.

~~~
candu
They provided a pretty clear example: actually talking to children is
important to develop socialization and maturity (read: executive function).
This is one area where machines absolutely cannot do the job anywhere near
equally well - i.e., something that should involve human interaction.

~~~
darawk
> This is one area where machines absolutely cannot do the job anywhere near
> equally well

Citation needed. You're neglecting the fact that there are millions of
under-interacted kids in the world. Providing a cheap source of interaction
for them
could be extremely valuable. Just saying "their parents/teachers/whoever
should do it" does absolutely nothing to solve the problem. Building them a
robot does.

------
John_KZ
I've never been in the US and stories like this make me imagine it as a
bizarre authoritarian dystopia.

How are the parents not reacting to this? How is there no law protecting the
privacy of workers and children? Why is this not on the front page of a major
newspaper?

Please do something about this. By the time those technologies trickle down to
smaller countries there's very little we can do to counter the external
pressure to implement similar "solutions".

------
Havoc
Inclined to disagree with the headline (but not the article).

Schools are exactly where you want this tech - to keep kids, parents & teacher
in and the rest of the public out. Leveraging face tech for that is 100% OK.

...you just can't have that data leak out in any way, i.e., use it for access
control and true security only.

~~~
jMyles
> keep kids, parents & teacher in and the rest of the public out.

Well, is that really what schools are supposed to be?

It seems like schools have an institutional identity which is
indistinguishable from a prison or military base.

I'm not sure that I want schools to be hermetically sealed from the rest of
society. I think I prefer that they be much more open and intermingled with
their communities.

~~~
mschuster91
> Well, is that really what schools are supposed to be?

Guess that's the reality in a country with roughly one school shooting a week.
As usual for US politics, the fix doesn't even go anywhere near the sources (a
lack of gun control).

> I think I prefer that they be much more open and intermingled with their
> communities.

Hmm. Honestly, I'm not sure how this could be done in a productive way, and
that's without threats like shooters and pedophiles even being in scope.
Having random quacks peddle crap like "alternative medicine", anti-vaxx
ideology, creationism, or religious fundamentalism of any kind (no matter if
Scientology, hardcore evangelicals, or salafist/other radical Islamists) to
kids and their parents is already a huge problem; no need to enlarge it
further. And unlike with guns, it's more difficult here to fight the actual
problem, given the fundamental human right to free speech.

~~~
yellowapple
"As usual for US politics, the fix doesn't even go anywhere near the sources
(a lack of gun control)."

The lack of gun control is not the source. The source is people having a
desire to kill children in the first place.

We _can_ address the specific manifestation of that problem (by enacting
tighter controls on how guns may be acquired - something for which I'm very
much in favor), but these people can and will find other ways to kill on a
large scale (homemade explosives, for example) unless we actually address the
mental health issues that are the _actual_ sources.

~~~
jMyles
This is a beautiful comment. I think it's worth taking time to imagine (and
gasp, even empathize with) the incredible pain that must fester inside the
minds of people who do these horrific things.

It's so inhumane to say, "oh well, as long as they don't have guns, everything
is swell."

~~~
andai
If you can stomach it, you don't have to imagine, you can read their own
words.

[http://www.acolumbinesite.com/diary.php](http://www.acolumbinesite.com/diary.php)

[https://www.documentcloud.org/documents/1173808-elliot-rodger-manifesto.html](https://www.documentcloud.org/documents/1173808-elliot-rodger-manifesto.html)

------
throw2016
You will always find a use for surveillance, and in capitalism there is a
natural incentive for those developing these technologies to find all the
possible uses and push them aggressively and handwave objections. At this very
moment there are thousands of sales people pitching it.

But this is where civil society and principles like democracy, liberty and
privacy step in to temper the worst impulses and greed.

Unfortunately, that tempering mechanism is currently non-operational, and
anybody with a tag of 'business' and zero interest other than profit can sell
any malicious technology and lobby for it. Everything has consequences,
including unintended ones. Most of these principles are lessons learned over
hundreds of years, and it would be tragic to let petty self-interest continue
to dilute and diminish them.

------
bwang29
I wonder which technology provider was able to convince a school to buy a
$4MM system in the first place, and how the negotiation went.

------
gremlinsinc
How else is the NSA going to start profiling people before adulthood and build
up its facial recognition databases? /s

------
pavel_lishin
What's the, uh, selling point of this? How is this better than a regular
surveillance system?

------
rhizome
They don't belong anywhere, because they don't work.

~~~
quadrangle
Okay, but once they work near-flawlessly they're okay??

~~~
rhizome
No, but that's not likely to happen any time soon, barring revolutionary
developments.

~~~
quadrangle
Right, just please don't set people up to think that if only they fix them to
work better, it will be okay! If the main argument is how poorly they work, it
will be a short time before they improve, and we'll be behind on focusing on
the injustice of _accurate_ facial recognition systems in the world.

~~~
rhizome
It's not going to be a short time before they improve.

~~~
quadrangle
What's a non-short time here? A few years? If it's a decade or two, that's
still pretty soon. We're not talking about generations away.

------
paulie_a
Apparently my kids will be wearing hipster glasses with numerous IR LEDs
installed in them.

------
exabrial
I don't think kids belong in [government run] schools either actually.

------
Dowwie
Stop the contract. Split the four million dollars among the teachers.

------
poster123
The perpetrators of school shootings were often expelled before they went on
their sprees. Wouldn't you want technology to recognize the faces of expelled
students and other people who do not belong in a school?

~~~
gowld
Citation needed.

Also, how does identifying a shooter via a camera prevent them from
discharging their weapon? Are you going to install the cameras on a secure
wall at a 1000-foot radius around the school? Attach an automated or drone
gun turret to the camera, _Portal_-style?

~~~
andai
_There you are._

------
chapium
Budget priorities aside, why is this inappropriate? What problem does it
attempt to solve?

------
LexGray
I can think of a few use cases. Identify when non custodial parents are on
campus. When sex offenders are in the area. Identifying drug dealers that
target school children. Identifying arsonists, vandals, kids triggering
emergency systems. It might even be used to track bullying, tardiness, and
absenteeism which is less workload on understaffed teachers.

If it was all confined to local systems and access was limited to a few
authorized persons it may make parents a lot more comfortable and give them
solid data for parenting.

~~~
decebalus1
> Identify when non custodial parents are on campus

Security staff

> When sex offenders are in the area

Security staff

> Identifying drug dealers that target school children. Identifying arsonists,
> vandals, kids triggering emergency systems. It might even be used to track
> bullying, tardiness, and absenteeism which is less workload on understaffed
> teachers.

All of these can be solved by not understaffing teachers.

> If it was all confined to local systems and access was limited to a few
> authorized persons it may make parents a lot more comfortable and give them
> solid data for parenting.

What kind of solid data, and how exactly would it influence this data-driven
parenting? Are we talking A/B testing home-packed lunches, or getting a
god-view of their kids' friend circle? I also dream of a world in which I can
collect metrics about my kids and their whereabouts, to correlate them with
how much allowance they get and whether I put gluten in their food or not.
Because I didn't have kids, I birthed services/products which need to be
optimized for peak performance!

I can also think of a good use case. Prepare the future generation for
constant surveillance and indoctrinate people in schools that privacy is dead
and buried.

~~~
spike021
Not sure how security staff can solve either of those first two problems.

Sure, they might learn to recognize some faces, but if you have even 100
students in a small school, there could be at least 200 parents between
bio/non-biological parents.

As for sex offenders, how do you expect a random security officer to know a
random individual's face? It just isn't possible. Either way, they might need
to take a photo of the person to see if they can be found on any watch lists.

~~~
AlexandrB
> Sure, they might learn to recognize some faces, but if you have even 100
> students in a small school, there could be at least 200 parents between
> bio/non-biological parents.

Wait. So not only does the school have to buy these expensive camera systems,
but the parents also have to give up their privacy rights by feeding their
facial information into a database (a _very well maintained one_ I'm sure /s).

All for a system that has a significant false-positive/negative rate and
_might_ increase security _slightly_.
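The false-positive problem is easy to quantify with a back-of-the-envelope
base-rate calculation (all numbers here are illustrative assumptions, not any
real system's specs): even a matcher with 99% sensitivity and a 1%
false-positive rate, watching a population where almost nobody is actually on
a watchlist, produces overwhelmingly false alarms.

```python
def flagged_person_is_real_match(sensitivity, false_positive_rate, prevalence):
    """Bayes' rule: P(real match | system raised a flag)."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Illustrative assumptions: 99% sensitivity, 1% false-positive rate,
# and 1 person in 10,000 passing the camera is actually on the list.
precision = flagged_person_is_real_match(0.99, 0.01, 1 / 10_000)
print(f"{precision:.1%} of flags are real matches")  # about 1%
```

Under these assumed numbers, roughly 99 out of every 100 alerts point at an
innocent person, which is the arithmetic behind the "significant
false-positive rate" concern.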

That's not to mention the corner cases. Last year I picked up my niece from
school because my sister was busy. Should everyone in the child's extended
family have to go into the database?

What about changes in the child's circumstances (e.g., divorce)? Many Amber
Alerts are one parent kidnapping their own child as part of a custody dispute.
Should the system track visitation rights too?

Finally, who's going to be held accountable when any of this complexity fails
and some random parent is arrested because "the system" thought they were only
allowed to see Suzy on Tuesdays but it's Thursday. My guess is no one.

Somehow we, as a society, have come to accept that it's ok for tons of
innocent people to suffer humiliation and degradation at the hands of the
state if it increases our safety (or even just our _perception_ of safety) by
0.01%.

~~~
decebalus1
> Somehow we, as a society, have come to accept that it's ok for tons of
> innocent people to suffer humiliation and degradation at the hands of the
> state if it increases our safety (or even just our perception of safety) by
> 0.01%.

Easy. Fear and social pressure.

"Voice or no voice, the people can always be brought to the bidding of the
leaders. That is easy. All you have to do is tell them they are being attacked
and denounce the pacifists for lack of patriotism and exposing the country to
danger. It works the same way in any country."

– Hermann Goering

