
Bruce Schneier: Banning Facial Recognition Isn’t Enough - pseudolus
https://www.nytimes.com/2020/01/20/opinion/facial-recognition-ban-privacy.html
======
netcan
Very well put together article. I don't see why this is in "opinion,"
considering how factual all the points are.

Banning facial recognition doesn't even ban facial recognition. Anything
uploaded to FB/Google/Etc goes through software analysis including facial
recognition. Anything publicly posted is crawled and the same thing happens.
Even hipster polaroids can later be digitised and analysed. You need to
prevent photography, to prevent identification.

Whatever bans _are_ imposed will not be on facial recognition, they'll be on
specific use-cases: police, concert security, etc. That might be good, but it
isn't a ban.

My instinct is to say "tech illiterate politicians," but I suspect it's worse.

Our political culture just can't really deal with the nuance and ambiguity. If
it can't be sloganized to under 5 words, we can't even discuss it.

~~~
ieooe93jd
Throwaway to avoid stigma of what I’m about to say:

It’s worse. This site is a good example. Its rules limit the noise like you
get on Reddit, which makes it easy to see without automated counting.

Commentary here is often of the form “if you disagree with what I see as a
fundamental truth, and contextualize it, even though it’s a subjective
political opinion, well, tl;dr: downvote.”

The groupthink here is insane. The ideas that we believe prop up our
individual identities are not allowed to be challenged.

And I’ve not peddled conspiracy. Just that my work and agency shouldn’t be
bent to the whim of elites.

It’s not “tech illiterate politicians”. It’s society addicted to shallow
consumerism.

Being addicted to novelty and such is fine, there’s something to be said for
that versus war, right?

But this site really doesn’t defend that. It defends social norms that
denigrate the masses, that put tons of effort into crap.

That’s by design. Propaganda research went from military to university and was
integrated into advertising, marketing, and journalism. That’s not a “deep
state” idea. It’s plainly recorded in official records.

Every generation ends and we don’t plan for that. We keep fighting to cling to
their rules.

While we’re fetishizing their push for quarterly gains, we’re enabling the
continued march down the path we’re on.

Gather supplies, coordinate with neighbors, and organize online to pick a
Monday and stay home.

Even a visible enough sustained effort in that direction, I feel, would create
a ripple that lights a fire under politicians’ butts.

But let’s discuss fixing the tech debt problems the insane grind creates,
avoiding the burnout it creates, how to get one more todo app of ridiculously
shallow value out there, or fetishizing Paul Graham’s request for a startup to
build a reminder app that helps him send email? Like ffs, be creative, Paul.
Such low-effort amateur ideas from the anointed ones. It’s almost disgusting.

~~~
woodandsteel
I think the discussions here are far more open and intelligent than the vast
majority of the web.

------
dredmorbius
On the Engineer's Responsibility in Protecting Privacy by Paul Baran

 _A discussion of the engineer's responsibility to protect privacy in an age
of increasingly automated personal and business documentation. Computer
systems could be designed more carefully than they are at present, but
safeguards that provide the protection of privacy are expensive. As the
engineer has been trained to focus his attention on carrying out the task
assigned to him in as inexpensive a manner as possible, concern for privacy
has been too often ignored. In the absence of an organizational structure to
enforce a code of ethics, a restructuring of the profession at the engineering
school level is indicated. The engineering school curriculum must be modified
to cope with large systems in which the citizenry are an integral part of the
system, and a new curriculum be devised that would provide course material on
the behavior of individuals and of organizations to balance the weight given
to training in quantitative methods._

Published: 1968

[https://www.rand.org/pubs/papers/P3829.html](https://www.rand.org/pubs/papers/P3829.html)

[https://www.rand.org/content/dam/rand/pubs/papers/2018/P3829...](https://www.rand.org/content/dam/rand/pubs/papers/2018/P3829.pdf)
(PDF)

"There are many amongst us who would not hesitate to build equipment to
compromise the privacy of any given individual provided the price is right.
These are the whores of industry. They would not hesitate building systems and
devices contrary to the public interest; their only concern is the buck."

From May, 1968.

~~~
horsawlarway
Personally, I think the problem isn't in the supply but in the demand.

We've structured modern society in a way that forces demand for this
information:

- Companies must sell products to users

- The more saturated the market for a good, the more incentive there is to
game these sales (instill false desire, target vulnerable audiences, find and
target users who might be convinced to buy your product through any means)

- This kind of "gaming" requires knowing more and more information about your
potential customers, because the companies that engage most in those practices
are the most successful.

-----------

The problem is that eventually this starts to look like a nightmare Ouroboros
situation - We are all both employees of companies doing this targeting and
the targeted users.

The snake is literally eating its own tail. Users want privacy, Employees want
data. But Users ARE Employees, and Employees ARE Users.

In my opinion, the only question is whether the snake chokes to death on its
tail, or eventually spits it out.

~~~
dredmorbius
This is a situation where the traditional economic focus on distinct "supply"
and "demand" functions and processes is limiting, and a cybernetics/systems
approach strikes me as more useful.

Or, to put this back into economic terms for a moment, the issue _is_ supply,
as the demand function you've described is _induced_ demand, by way of the
Jevons paradox: the information and capabilities _would not exist_ if the
costs of data collection, aggregation, processing, and utilisation were
higher.

Since they are _not_ higher, the data environment exists, and the demand is
induced.

Back at uni, in one of my econ courses, the instructor, who'd returned to
teaching from administration for one final term before retirement, made the
following observation about computer systems planning from his experience.
Roughly:

Every time a new computer system (think enterprise / academic database
applications, 1980s era) is designed, you can go through the most careful
capacity planning and modeling process, but once you actually _deploy_ the
system, you find that there are a whole set of new uses which emerge _because
people wanted to do them previously but were unable to._ And as a consequence,
the usage levels are higher than anticipated.

It was a throwaway comment in class one day, but is among the handful of most
memorable moments from my college education.

The point is that _if it's not possible to do a thing_ then it isn't done. If
it _is_ possible, and cost-effective, then there's virtually no way (absent
draconian regulation of some form: legal, social, moral, political) to
_prevent_ it from occurring.

And once a process is possible, _if_ it provides some benefit, then it's not
possible to _not_ do it _when competing with those who do_ because you simply
cannot operate competitively.

Which is where we are now with data, and surveillance capitalism /
surveillance state.

From the systems view, what you have is a positive feedback loop, where
increased efficiency => decreased costs => increased capability => increased
benefit => further research => increased efficiency.
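That causal chain can be sketched as a toy model. Everything here is
illustrative: the quantities and the `research_rate` parameter are invented,
not measured; the point is only that the loop has no internal damping term.

```python
# Toy model of the reinforcing loop described above: greater efficiency
# lowers the cost of data collection, which raises capability and benefit,
# which in turn funds further efficiency research. All values are invented
# for illustration.

def simulate(steps, research_rate=0.1):
    efficiency = 1.0
    benefits = []
    for _ in range(steps):
        cost = 1.0 / efficiency       # increased efficiency => decreased costs
        capability = 1.0 / cost       # decreased costs => increased capability
        benefit = capability          # increased capability => increased benefit
        # increased benefit => further research => increased efficiency
        efficiency *= 1 + research_rate * benefit
        benefits.append(benefit)
    return benefits

history = simulate(5)
# Each round amplifies the last; nothing inside the loop ever damps it.
assert all(b2 > b1 for b1, b2 in zip(history, history[1:]))
```

Absent an external brake (regulation, a cost floor), nothing in the loop
itself stops the growth, which is the systems-view point.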

There are further elements to this.

As to the competitive market element, once production or provisioning of goods
and services moves beyond strictly local markets, the actor who can claim a
larger share of the market has an advantage in doing so. Being able to do
that requires mass production (or provisioning) capabilities, which themselves
form a feedback loop (increased capital utilisation, improved processes,
greater market share, greater brand awareness, familiarity with concepts,
UI/UX, tools, support and supply-chain dynamics, etc.) which further
increases benefit.

(And that's excluding unfair business practices, though those also provide
considerable advantage.)

An immediate consequence is that the battle for mindshare becomes increasingly
fierce, and with that a reliance on advertising -- simply having a good local
name, foot traffic, and a shingle on a busy high street isn't enough.

Hamilton Holt's 1909 monograph _Commercialism and Journalism_ describes the
impacts of this change over the previous half-century on the publishing trade,
both quantitatively and qualitatively. It comes at what we now consider to be
the _beginning_ of the mass media age, but does an excellent job of laying out
the dynamics of what's occurred since:

[https://www.worldcat.org/title/commercialism-and-journalism/...](https://www.worldcat.org/title/commercialism-and-journalism/oclc/639344712)

As to the employees vs. users dynamic, the error here is that employees are not
ultimately the control agents. Managers and investors are, and they seek,
increasingly, short-term direct profit with little concern to either overall
social welfare or long-term impacts. So that's a very red herring and
misleading analysis.

~~~
horsawlarway
>As to the employees vs. users dynamic, the error here is that employees are
not ultimately the control agents. Managers and investors are, and they seek,
increasingly, short-term direct profit with little concern to either overall
social welfare or long-term impacts. So that's a very red herring and
misleading analysis.

This is precisely why I used the snake eating its own tail as a metaphor here.
I believe that it's reaching the point where the managers ARE actually
employees stuck in the same trap - or at least a large percentage of them are.
(I'm with you that there is a class of investor that is still immune, you can
picture this as the head of the snake)

I've been in management, it's fairly clear that user data allows you
opportunities and a competitive edge. I am also very, very aware that my data
is included in that pool. I don't like it, but when asked to make a decision
that keeps my product profitable, I'm going to try to use any user data I can.

If I don't, I'm replaceable. If I'm not replaced, I don't have a company to
manage in 5 years, because it wasn't competitive.

Add on top that market consolidation is at (frankly scary) record levels, and
I'd argue that the number of humans who actually have any real agency and
control over how an organization behaves is dwindling precipitously.

So already I've been swallowed.

~~~
dredmorbius
Fair points.

Middle management is its own special hell. Sometimes even executive
management.

A key differentiator is that megacorps and (at least some) very-large-scale
investors are in a position to _shape the actual competitive landscape_,
through regulation and/or taxation. They're often not _trusted_ to do this
(another issue, and another current HN story -- Edelman's Trust Report), but
regardless.

The self-perpetuating and self-feeding nature of the process is disturbing.

------
seek3r00
“Focusing on one particular identification method misconstrues the nature of
the surveillance society we’re in the process of building.”

~~~
dangerface
I have watched as our privacy has been slowly eroded since 9/11 and the general
public just doesn't care.

I have complained about mass surveillance for years and everyone said it was
made up. Snowden proved it, and still the general public can't accept it /
doesn't care.

This is the first time I have seen the general public have any interest in
protecting their privacy.

~~~
Silhouette
I don't think it's quite as simple as that.

I have many friends in their 20s and even 30s who have grown up in an
Internet-enabled, mobile-connected, social-media-driven world. Many of them
don't remember 9/11; they were too young to understand when it happened. They
certainly don't remember a world without the pervasive surveillance we suffer
today.

Many of them are also in relatively privileged social groups, rather than the
minorities that tend to suffer the worst from authoritarian measures and one-
sided deals.

Even so, having sometimes discussed these issues, it's clear that many of
those friends don't approve of or condone such measures. They certainly don't
like or encourage them. It's more a combination of not seeing better
alternatives, and often for the non-techie crowd (which is most people), of
having other priorities when it comes to any advocacy or political movements
they want to support with whatever spare time and money they are willing to
spend.

------
noarchy
So what happens if a ban occurs, and it is inevitably discovered (via future
Snowden-type leak, or whatever) that [government agency] ignored it all along?
Nothing, as usual? Will legislatures once again retroactively protect those
who violated the ban?

~~~
dangerface
> So what happens if a ban occurs, and it is inevitably discovered (via future
> Snowden-type leak, or whatever) that [government agency] ignored it all
> along?

People find this stuff; it gets leaked eventually. With a law against it in
place, stopping it legally is a lot easier.

> Will legislatures once again retroactively protect those who violated the
> ban?

Most likely but it's a lot harder to do that once the law is in place.

~~~
frandroid
Noam Chomsky, commenting on 9/11 conspiracy theories, said that one of the
things that makes them implausible is that secrets are very difficult to keep:
there are always people who will leak something.

But one thing that really stuck in my craw is how the military intelligence
Chelsea Manning later had access to was accessible by thousands of people, and
yet there are very few who have followed her example. Similarly with Snowden,
except that far fewer people had his kind of access.

~~~
Cpoll
Or all the scary declassified stuff from the 50s onward (MKULTRA being
probably the most recognizable).

~~~
LargoLasskhyfv
Manhattan Project. Some workers discovered only decades later what exactly
they did there, when they saw their workplace in a documentary, or even saw
themselves in one. Compartmentalization, need-to-know only, a culture of fear
and mistrust, and _very strong_ disincentives to do anything about it.

Pictures:
[https://duckduckgo.com/?q=manhattan+project+signage&t=ffab&i...](https://duckduckgo.com/?q=manhattan+project+signage&t=ffab&iax=images&ia=images)

------
stjohnswarts
I really hope people snap out of it and realize that tracking of people in
public places (including faces) really should be completely banned. It does
nothing for the common good; it only helps advertisers and the government.
could be outlawed if there was enough demand for it. The world lived perfectly
fine before it. Let China continue on down the road of madness, but
surveillance is the opposite of freedom.

------
rmtech
This made me wonder, what are the benefits of this kind of technology?

Suppose you had a way to know all this stuff about people; what could you do
with it that would actually be good?

China has a social credit score system and apparently they justify it on the
grounds of preventing bad behaviour of various types (fraud, crime, etc). It
would be interesting to think more about this and how we might be able to
maximize the benefits and minimise the downsides.

~~~
EGreg
The State behaves like an organism. This is the immune system.

Would the immune system want better ways to identify threats to the state?

Humans are treated as cells. The values of Classical Liberalism and Liberal
Democracy don’t apply in this model. The body politic is the main focus, as it
was under Fascism and other such ideologies. Corporations together with the
State.

Nowhere do we see this more than in China today. The social credit system for
example. In the early stages, it seems rather benign, since people will be
behaving better and crime will go down. But this apparatus can go far beyond
simply preventing crime. Once it is in place and the screws are tightened,
you will see an AI-powered State, with fewer and fewer people at the top in
control, or maybe no one really in control, not even the leaders, perpetuating
itself against all competing organized thought, whether violent or not.

Already we see it with Falun Gong, Uyghurs, Tibetan nationalism, and now with
Christian churches. For example:

[https://amp.businessinsider.com/china-harvesting-organs-of-u...](https://amp.businessinsider.com/china-harvesting-organs-of-uighur-muslims-china-tribunal-tells-un-2019-9)

[https://www.theguardian.com/world/2019/jan/13/china-christia...](https://www.theguardian.com/world/2019/jan/13/china-christians-religious-persecution-translation-bible)

Falun Gong practitioners, for instance, may believe kooky things but don’t
behave violently. And yet, they have been arrested and incarcerated without
due process and held extrajudicially to re-educate them:

[https://en.m.wikipedia.org/wiki/Persecution_of_Falun_Gong](https://en.m.wikipedia.org/wiki/Persecution_of_Falun_Gong)

Some of the re-education camps can do positive things too, like teaching
farmers en masse new techniques for growing food. China has to quickly
innovate its way out of an environmental catastrophe and feed billions:

[https://amp.theguardian.com/global-development-professionals...](https://amp.theguardian.com/global-development-professionals-network/2017/jun/02/china-water-dangerous-pollution-greenpeace)

The One Child Policy potentially mitigated an even worse disaster.

So it’s hard to say, really, that everything an oppressive state apparatus
does is bad. For now. It has to take extreme measures to deal with population
growth, for instance.

By almost eradicating child mortality 100 years ago, we are now bumping up
against other limits — of sustainability, and of the ability of ecosystems not
to get totally polluted and exploited.

But in every other area, except population growth, this AI-powered state will
eventually look NOTHING LIKE the individual-rights-based societies we have
come to value so much in the West. And it will happen in the West, just more
gradually.

They say the only way evil can triumph is for good men to do nothing. Well,
when a large and powerful state with many people undertakes this kind of
program, it is tempting to just let it happen. Like when Nazi Germany started
treating its Jews in worse and worse ways, the world simply stood by, traded
with German companies etc.

Sanctions don’t work. Private trade is good; it provides a counterbalance to
the strength of the State apparatus. Sanctions just help radicalize a
population into more nationalism and support for, e.g., the State developing
atomic bombs to “defend itself”. States should not become autarkies. So what
are the alternatives?

I think international inspectors should not just be for Iran. They should be
the norm, including in the USA etc. They should be composed of randomly
selected (nationality-wise) groups of people, have access anywhere with
warrants, and should make sure that human rights abuses aren’t happening en
masse. Think of it as a program funded by the international community to check
that no individual “body politic” abandons human rights in favor of Fascism.

Culture is the other. We need a free internet, free spread of ideas, free-ish
movement of people. Without censorship of any ideas (however, ideas that might
radicalize populations to violence could be made hard to share alone, but easy
to share accompanied by opposing viewpoints... think Wikipedia rather than
Twitter).

With no sanctions anywhere, free trade and culture, we may save human rights
from the AI-powered states, at least for a few decades. China is already
exporting the AI system to Venezuela and other places.

But I am afraid that, as with Climate Change, we won’t be able to coordinate
ourselves to take effective action.

~~~
rmtech
> an AI-powered State, with fewer and fewer people at the top in control, or
> maybe no one really in control, not even the leaders, perpetuating itself
> against all competing organized thought

Yeah, this is a serious risk. I don't quite think you have the right solutions
but you've certainly identified the problem.

I think the solution to this kind of thing might be found in zero-knowledge
technology. A good system doesn't just harvest all the data and basically hand
it out to any hacker or alphabet agency. A good system should only hand out
aspects of the data that are actually needed. You should not be able to run
arbitrary queries, but rather queries like "were any of these 5 suspects
within 1km of the crime scene". If the answer is "No", then you don't need to
know where those people actually were.
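The query-restriction idea can be sketched as a minimal data-minimising API.
All names here (`LocationStore`, `any_within`) are hypothetical, and this
sketch only shows the interface shape: a real system would enforce the
boolean-only answer cryptographically (e.g. with zero-knowledge proofs or
secure multi-party computation) rather than by trusting the server.

```python
import math

def _haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

class LocationStore:
    """Holds raw location records; callers can only ask yes/no questions."""

    def __init__(self, records):
        # records: {person_id: (lat, lon)} -- never exposed directly
        self._records = records

    def any_within(self, person_ids, scene, radius_km):
        """Answer only "was anyone in this list near the scene?" --
        the caller never learns where anyone actually was."""
        return any(
            _haversine_km(self._records[pid], scene) <= radius_km
            for pid in person_ids
            if pid in self._records
        )

store = LocationStore({
    "alice": (48.8566, 2.3522),   # central Paris (illustrative data)
    "bob":   (51.5074, -0.1278),  # central London
})
print(store.any_within(["alice", "bob"], (48.8600, 2.3400), 1.0))  # True
print(store.any_within(["bob"], (48.8600, 2.3400), 1.0))           # False
```

Even this trust-based version illustrates the design choice: the interface
answers the investigative question without ever handing out the raw data that
a hacker or alphabet agency could bulk-harvest.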

------
NelsonMinar
I'm disappointed by Schneier's presentation of requiring consent as a way to
protect people from privacy invasion.

The practice of consent on websites right now is ridiculous. Look no further
than the GDPR-required cookie notices every European website has. You know,
the popup boxes you only look at long enough to find the "I accept" button?
That's not meaningful consent. Nor is any clickwrap software whose 20-page
license has buried in it "by using this software you consent to letting us
track your every movement and sell it to data brokers" or whatever nonsense is
current.

The only meaningful application of consent would be opt-in consent. Where you
could use the software / website / whatever without agreeing to the invasion
of privacy, and where the balance is tipped towards requiring consumers make
an informed decision to opt _in_ for some significant benefit. Of course no
surveillance capitalist company, nor authoritarian government, would tolerate
that kind of consent requirement.

~~~
theptip
> Look no further than the GDPR-required cookie notices every European website
> has. You know, the popup boxes you only look at long enough to find the "I
> accept" button? That's not meaningful consent.

The good thing about the GDPR in this context is that it grants you the right
to be forgotten; in the case of the OP:

1. It would not be legal for a company to collect information on you without
your consent (if you are using their site, they are the data Controller).

2. It would not be legal for Facebook to share this data with another
company, without listing the data collector as a Processor.

3. If you later discover that a company has shared your data with a broker,
you can have them delete that data after the fact.

4. You're allowed to opt out of any collection that isn't required to provide
the service; it's not permissible to just claim "I need to transmit your PII
to this data broker in order to serve you news articles".

So at least we have remedies for the cases where a site is later found to be
acting outside of the norms that folks assume when they click through the
terms, unlike in the US where you cannot really put the genie back in the
bottle once your data has leaked.

I agree with your point that ideally people would pay attention to the
permissions that apps/sites request, and closely analyze them, but it's
quite clear to me that most humans don't care enough about privacy to take
these actions ahead of time. So inasmuch as GDPR lets you fix those mistakes
retroactively, it's providing a benefit.

~~~
esotericn
People do pay attention to the permissions but it's very often intentionally
obfuscated. Opting in is instant whereas clicking 'no' comes up with spinners,
complex trees of options, whatever else (i.e. it's not actually opt in; the
default, fast path has the box checked, so it's opt out, because the active
action is to have to opt out).

~~~
LargoLasskhyfv
The most ridiculous ones are the popups labeled something like 'choices',
wherein you have to click every single one of up to 200 companies to _NO/OFF_,
by setting a cookie on their sites, which often doesn't work, because they're
overloaded, or intentionally? _That_ is schizophrenic!

------
jdkee
Jack Balkin makes a compelling case that in this age of Surveillance
Capitalism, the tradeoffs between free speech and privacy are complex.

[https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3253939](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3253939)

If you have the time, the first audio of his 2018 lecture is a treat:

[https://www.youtube.com/watch?v=4daIk8PCPIc&feature=emb_logo](https://www.youtube.com/watch?v=4daIk8PCPIc&feature=emb_logo)

------
jaclaz
Does anyone else find it interesting how, right after the conclusion of the
article:

>We need to have a serious conversation about all the technologies of
identification, correlation and discrimination ...

There is:

> Like other media companies, The Times collects data on its visitors when
> they read stories like this one.

------
wildermuthn
The idolization of privacy undermines American economic progress. Especially
the new virulent variation which defends privacy in public.

1) Privacy in Public is impossible

Besides being a contradiction in terms, imagine what an ideal society of
public privacy looks like: a society where everyone is masked, where voices
are obscured, where names are anonymized, where everyone is isolated from
everyone by a wall that says, “why do you need to know?” It is patently absurd
to promote privacy in public. And ultimately the winners are those who don’t
follow the law — those in power in the first place. Do you really think
governments and corporations are going to follow their own laws, or fail to
ensure loopholes which only they and their cronies can fit through? Yes, look
at China. The solution to advanced technology is not to pretend it can be
stopped. The solution is to make it available widely and freely. Technology
can’t be stopped. But it can be monopolized and misused.

2) Privacy in Public is economic sabotage

Economically, the more corporations know about their customers, the more their
business decisions — both strategic and tactical — will be based on data
rather than intuition. The less they know about their users — as any HN
reader should know — the more their business is at risk of dying. The
presumption of corporations as evil is simply unintelligent. It is especially
galling to see such knee-jerk stupidity in a forum dedicated to making
corporations. The greatest engine of wealth mankind has ever invented is
corporations unhindered by market manipulations. The one exception being rules
against monopolization and labor abuse — i.e., practices that undermine the
existence of corporations.

3) Privacy in private is overrated

I received an amber alert on my phone a few days ago. If even one child could
be saved from abominable acts of evil, even if it meant privacy was abolished
in public and private, I for one would be willing to make that sacrifice. Yes,
I’d be embarrassed. Yes, it would put an enormous weight on me to know that
anyone could find out about anything I’d ever do or am doing. But for me, a
society of truth is a society without a place for evil to hide — whether that
evil is kidnapping or corporate corruption.

4) Privacy is going to be lost.

The war is over. We shouldn’t pretend that those who stand most to abuse
privacy — the powerful — are the ones best suited to control their own
behaviors. Laws won’t stop those passing the laws. Only unhindered permeation
of technology will.

I recognize I’m saying something unconventional if not provocative. But when
people are scared they make irrational decisions. I’ve never seen so much
irrationality among the intelligent as when it comes to privacy. People’s
“thinking” on this topic is akin to religious devotion, and that’s what scares
me the most.

~~~
LeifCarrotson
Wow, that is provocative. Rather than downvoting, which I strongly want to do,
let me try to address each point in turn.

Privacy in public was possible 50 years ago, and no one was wearing masks or
using pseudonyms. The existence of huge numbers of cheap machines that can
identify people and which have perfect memories changed the system. We can't
roll back to a time when the technology didn't exist, but that doesn't mean we
have to allow its installation.

> _If even one child could be saved from abominable acts of evil, even if it
> meant privacy was abolished in public and private, I for one would be
> willing to make that sacrifice._

That's textbook emotional manipulation and a failure to rationally consider
the margins. You admit you'd be embarrassed; if two kids out of a billion make
a childish mistake that can't be forgotten, and they choose to commit suicide
as a result, you've caused more harm than good. I don't even think that amber
alerts do much good: most of those kids are just with one of their parents who
lost a custody dispute, there have been zero proven cases where an amber alert
saved a child, and I'd be willing to bet that more children are hurt (not to
mention adults, as they're less emotionally potent victims) when ten million
drivers look down at their cell phones.

And your defeatist attitude towards the application of laws ignores the fact
that we are not living in tribal caves ruled by the man with the biggest war
club. Humans are able to organize into governments and make decisions and laws
that limit some powerful people so that society is better overall.

My father assumes that a camera in a store is going to a fuzzy CRT display on
a CCTV system, maybe watched by a sleepy security guard, and that the tapes
will be overwritten in 24 hours. My son assumes that like a Facebook photo
he's automatically tagged as having visited the store. We're likely to have a
bit of a learning curve between the time that HN readers are aware of what can
be and is being done with tracking cookies, IMEIs, cameras, and databases, and
when the general public is tech-savvy enough to know what that means.

~~~
daxorid
_Rather than downvoting, which I strongly want to do_

The bootlickers don't bother leaving an explanation for their downvotes;
you're under no obligation to operate on a higher moral plane to placate the
demons who advocate for our subjugation.

~~~
dang
Please don't take HN threads further into flamewar.

[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

------
exabrial
Banning ${thing} only stops the honest people, who are usually the people
you don't have to worry about.

~~~
simion314
Isn't the entire point of a community to have rules on how to behave inside or
else we are kicked out?

There are stories we tell the children where some people form a new community
without rules because they hate rules, and then one guy is a jerk and
eventually they have to re-invent the rules one by one.

~~~
uncle_j
> Isn't the entire point of a community to have rules on how to behave inside
> or else we are kicked out?

We are typically talking about nations and states (which aren't necessarily
the same thing). You are born into one and typically stay within it for the
majority of your life. Community is one of those words that are commonly
abused these days to the point it doesn't really mean anything.

~~~
simion314
I was thinking of historical communities like tribes, not internet
communities.

------
mtgx
The public understanding, as well as _awareness_, of these new technologies
is so much lower than it needs to be for the public to start demanding that
politicians do something before these technologies permeate society
and new generations just get used to them (just like say the new generations
in China -- by and large -- get used to all the Chinese propaganda, the Great
Firewall, and so on).

The vast majority of people don't really understand what it means for Facebook
and Google to track every single click and action you take while on your
device, to track everywhere you go and who you talk to (both online and soon
offline, if that isn't already the case, or even on the phone), what it means
for every service on which they register to sell their personal data, or what
it means to tap Accept on all of the "free apps" that ask for contact-list
permission for no good reason.

They don't understand the implications because it's pretty difficult for just
about anyone to understand the thousand ways in which that data could be used,
or who the thousand future owners of that data will be (legally or illegally).

