
Brains are the last frontier of privacy - hhs
https://www.axios.com/robotic-brains-data-technology-companies-6ba7269e-1553-4395-a6db-3560fead7e24.html
======
segfaultbuserr
I remember some tweets from DJB.

[https://twitter.com/hashbreaker/status/709314886384427008](https://twitter.com/hashbreaker/status/709314886384427008)

> Fun game to play: Take statements from Comey et al. Replace "smartphones"
> with "brains"/"memories"/"thoughts". Technology will get us there!

> "Everybody is walking around with a Swiss bank account in his brain if
> government can't get in. You cannot take an absolutist view on this."

> "How do we solve or disrupt a terrorist plot if law enforcement can't access
> the memories and thoughts inside suspected terrorists' brains?"

Both funny and scary.

Time to rewatch the TV animated version of _Ghost in the Shell_ [0]. Released
in the early 2000s, it portrayed and predicted our world remarkably well, and
it'll give you a lot of inspiration about what a future society might look
like when everyone uses an electronic brain.

[0]
[https://en.wikipedia.org/wiki/Ghost_in_the_Shell:_Stand_Alon...](https://en.wikipedia.org/wiki/Ghost_in_the_Shell:_Stand_Alone_Complex)

~~~
marmaduke
> Technology will get us there!

Perhaps but not quickly. Decoding brain activity is still a hard thing to do.

~~~
wisty
Lie detectors are already a thing; some studies have reported up to 100%
accuracy (using fMRI and machine learning) -
[https://en.wikipedia.org/wiki/Lie_detection#FMRI](https://en.wikipedia.org/wiki/Lie_detection#FMRI)

Yes, there are countermeasures, but these seem to be security through obscurity
(can the detector spot the use of countermeasures?).

~~~
y4mi
> _fMRI was 21% more likely to detect a lie than the polygraph._

21% more likely than a polygraph isn't very inspiring, though. Polygraphs'
success rate is abysmal.

~~~
meowface
It'll likely remain only marginally better than polygraphs for decades or
longer, but it seems probable that eventually it's going to become so
effective that law enforcement organizations will feel like they need to use
it to keep up with other law enforcement organizations. Once one or two big
organizations see high success rates with it, many will feel like they're
letting a lot of criminals get off scot-free because they're behind on
technology, and most departments/agencies will have very strong incentives to
jump on board as they start to see more and more of their peers adopting it
(just like with DNA collection and testing).

Even if consent is required, it'll just be like traffic stop no-win
situations: if they ask you to consent to a search due to suspicion of drug
possession and you say yes, they'll search, and if you say no, they'll wait
until a drug dog comes, and then the dog will effectively conduct the search
and/or will be used as a tool to permit a full search. Or if you refuse to
take a breathalyzer test, they essentially presume guilt and treat you like
you took the test and failed. Anyone who refuses to consent to the Truth-O-
Matic test will probably suffer similar consequences. It's mostly an illusion
of consent. I think this future is inevitable without some very serious
legislation; perhaps a constitutional amendment.

A counter-point is that one could imagine a potential distant future exigent
circumstance where some sophisticated criminal might have information that
could truly prevent an imminent attack that'll very likely kill millions of
people, or something, and which could be discovered through such a detector.
In those fanciful scenarios I honestly do find myself wanting to agree with
Comey: fuck privacy, siphon those fucking thoughts. It's certainly a much more
ethical and effective alternative to torture, at least, which is how the US
government would currently handle such a scenario (if they could cover it up).

Security and privacy/liberty are always a tradeoff, but even if you weigh
privacy exponentially higher, there's always still some theoretical risk that
would tip the scale back towards security, in my view. Nuclear armageddon, for
example. The privacy hill to die on shouldn't be Mega-Corpse Mountain.

This scenario is absurdly far-fetched in 2019, of course, but in 2079? 2119?
Who knows? If Aum Shinrikyo and ISIS can do what they did and plan what they
planned in the 1990s and 2010s, what about new incarnations of zealous death
cults in a world that may have extremely intelligent AIs and/or DIY-potential
for almost anything?

Even a single instance of losing the final bastion of privacy is such an
incredibly slippery slope that it's hard to imagine any safe way to allow its
use in very extreme situations like these without setting up the conditions
for horrific abuse and a Stasi/1984-esque subjugation of all of society. But I
think this aspect of the law enforcement perspective needs to be taken into
consideration as well, even while simultaneously acknowledging that law
enforcement will inevitably cry wolf and exaggerate the likelihood or impact
of imminent threats. Of course this'll happen, but what happens if they do see
an actual wolf one day?

~~~
wolco
Terrorist attacks seem scary. Many will gladly give away any freedoms to take
the fear away.

A terrorist attack is the result of pressure that has been building up finally
erupting. If you want to stop attacks, new outlets need to be created to
channel that energy elsewhere.

The country is a democracy, but when dealing with other nations things are more
of a dictatorship. That power imbalance makes the other side powerless.
Powerless people without hope do stupid things, like blowing
themselves/friends/family up. Find ways to give them a voice.

The numbers don't add up. Being scared of terrorist attacks is like being
afraid of winning the lottery. Your chances are lower than you think.

~~~
zo1
> "The numbers don't add up. Being scared of terrorist attacks is like being
> afraid of winning the lottery. Your chances are lower than you think."

Don't want to detract from the rest of your comment. But one of the
fundamental "points" of terrorism is to make this the case, i.e. to instill
fear (even if an attack is not remotely likely to happen). The fear is what
they want, unless they're trying to wage a guerrilla war.

~~~
JetSpiegel
Which is why not fearing terrorist attacks makes them worthless.

The collective unconscious makes it real.

------
foxhop
I would have said this will never be socially acceptable ... until my
parents-in-law installed multiple Alexa microphones in their house, and my
family of origin all submitted DNA to Ancestry.com to be indexed forever...

Generally speaking, the way to buy a soul is to offer convenience. It doesn't
even have to be major.

For all those thinking there will be regulation or protections, where are
those protections for consumers of the current privacy violating services and
products?

You sign on the dotted line and are bound by the terms; there is no big other
to protect you from your own choices.

You choose whether you sell yourself, but once you do, don't expect a bigger
force to keep you protected...

You are the only person who is going to advocate for you and the people you
care about.

~~~
Invictus0
> You choose whether you sell yourself, but once you do, don't expect a bigger
> force to keep you protected...

Even that isn't true. Plenty of information can be gathered on even the most
privacy-oriented person due to the actions of others: see Facebook's shadow
profiles, or even the use of the DNA your relatives uploaded to learn about
your medical history.

~~~
foxhop
Yes, your friends and family can sell you as well... I have a huge shadow on
Facebook because of my wife, as do my children who are not even at the age of
consent.

If I ever commit a crime, or am framed for one, and my DNA is used against me,
you better believe my family has sold me out by giving up our shared and
essential life force signature.

My point is, there is no big other, no religion or government, that will keep
you safe from yourself and your choices (or the choices of your family and
friends). Any human-made laws are shown throughout history to be malleable,
very flexible.

Those that bend the rules take power over those who don't, or who don't understand.

A great example of this is tax law ... there's a huge power disparity between
those wealthy enough to find ways to bend the law and those who cannot or do not...

------
needle0
Wow, thoughtcrime is ceasing to be fiction. Even in the original 1984 novel,
thoughtcrimes could only be detected when those thoughts manifested as actual
words and actions. This goes further.

I fear for how much restraint we will show once we become technologically able
to detect thoughts deemed reprehensible by current society. Some ideas that
were considered reprehensible in the past have become part of ordinary norms
in the modern day; I don't consider us today to be infallible in always making
those judgments correctly.

For now, even having ideas currently considered most heretical/reprehensible
is still legal, as long as it never leaves your head - but I'm not quite
confident it will stay that way in the face of mounting pressure.

~~~
bhouston
First thought: I think that given the polarization in politics these days,
everyone is likely harboring a thought crime according to someone.

But generally we are not an authoritarian society, so we will likely not
start this type of witch hunt. Other, more authoritarian countries may --
those that already carry out a lot of arbitrary arrests and forced
disappearances -- and this will unfortunately make those state agencies more effective.

Second thought: The first people to be subjected to stuff like this are the
same ones who are most likely right now to be interrogated or have their social
media inspected when crossing the border. This will just be a continuation of
that existing trend. If you do not allow your neural state to be monitored and
inspected in response to various stimuli, you will be turned away at the
border. Give it 5 years.

Third thought: There would be a lot of health and education opportunities. See
brain patterns changing over time -- like a Fitbit-style attention/focus score
based on passive 24-hour monitoring rather than active testing (e.g. brain
training). One could use those metrics to optimize one's life to maximize
brain efficiency. Second, long-term, one could see how patterns change during
development, which may lead to better detection and understanding of mental
illness, especially autism, bipolar disorder, schizophrenia, depression and
ADHD. One could likely see early warnings of these, do interventions, and
judge their effectiveness better than with self-reports alone.

~~~
77pt77
> If you do not allow your neural state to be monitored and inspected in
> response to various stimuli you will be turned away at the border. Give it 5
> years.

5 years?!

Not even 10. We are very far from that, even excluding practicality.

~~~
bhouston
It is possible now, just not in great detail. It cannot tell exactly what you
are thinking, but given responses to various stimuli it could infer where one's
non-explicit loyalties lie. It would essentially be a super lie detector.

~~~
77pt77
> but given responses to various stimuli it could infer where one's
> non-explicit loyalties lie

I'm going to need a source for that.

------
Mirioron
I'm far less worried about commercial companies doing this than I am worried
about governments doing this.

Usually when it comes to privacy I can begrudgingly accept that governments
will violate it to some degree. This, however, is a field from which I think
governments must be outright banned. We _must not_ allow a situation to
arise where the TSA or the police will run a quick "brain check" on somebody.
It must be avoided. This would be a one-way ticket to disaster.

~~~
bboygravity
> I'm far less worried about commercial companies doing this than I am worried
> about governments doing this.

I don't understand this statement. I hope after Snowden we all know that data
at a commercial company = data at the government. What's the difference? I don't
see any practical barriers to various government agencies having access to all
data at all companies. With and without company knowledge, mostly secretly,
with and without company and customer consent, legally and illegally. This
goes for the US, all US allies and all US enemies. The intentions of
commercial companies when it comes to data protection from government are
pretty irrelevant.

If you (as a people) want to ban your government from access to anything
specific, you must first have control over your government. Which is
inherently impossible in a system that is anything other than a direct
democracy. Just my very unpopular opinion. There is no nation in the world
that comes close to a people having control over a government, except arguably
Switzerland. But even there I have my doubts...

~~~
atemerev
I live in Switzerland. Direct democracy works perfectly well. I don't
understand why it is not the default mode and no other country uses it -- to
me, it is the obvious way of running things.

~~~
tsbinz
I live in Switzerland as well and wouldn't agree that direct democracy works
much better than representative democracy. You can certainly argue that it
does, but it's far from obvious IMO. There's less focus on lobbying and more
focus on spreading FUD about other parties' proposals; people have no time to
investigate the consequences of proposals, so they rely on what they read in
political ads, see on TV, etc. Men were voting that women shouldn't be allowed
to vote as recently as 1990(!) in one part of Switzerland.

I can't say it's worse than other systems, but it's also far from obvious to
me that it's better.

~~~
bboygravity
Have you ever tried living in another country / watching the political news in
other countries?

~~~
kmlx
What about Japan? It's many times bigger than Switzerland, and they seem to
have had it quite good for quite some time.

~~~
glandium
Japan is not a direct democracy.

~~~
kmlx
exactly. you don't have to be a direct democracy to be successful.

------
arugulum

        IAGO
    
        Good my lord, pardon me:
        Though I am bound to every act of duty,
        I am not bound to that all slaves are free to.
        Utter my thoughts? Why, say they are vile and false;
        As where's that palace whereinto foul things
        Sometimes intrude not? who has a breast so pure,
        But some uncleanly apprehensions
        Keep leets and law-days and in session sit
        With meditations lawful?
    

(Othello 3.3.135-141)

~~~
mplanchard
For folks on mobile:

IAGO

> Good my lord, pardon me:

> Though I am bound to every act of duty,

> I am not bound to that all slaves are free to.

> Utter my thoughts? Why, say they are vile and false;

> As where's that palace whereinto foul things

> Sometimes intrude not? who has a breast so pure,

> But some uncleanly apprehensions

> Keep leets and law-days and in session sit

> With meditations lawful?

------
mattlutze
> And next month Chilean lawmakers will propose an amendment to the country's
> constitution enshrining protections for neural data as a fundamental human
> right, according to Yuste, who is advising on the process.

In the US, I have a right against forced self-incrimination. The rights
against indiscriminate search and seizure and protection of speech also seem
to apply.

Being able to read my thoughts would seem to negate my protection against the
disclosure of self-incriminating speech (presuming thoughts are considered
speech).

This frontier is all the more concerning while our dominant model of commerce
on the internet is the exchange of content for attention. The attention market
already drives businesses to develop sophisticated, targeted models of users,
so that the businesses can most efficiently encourage addiction to their
services.

Being able to tune those mechanisms in real-time could be disastrous. It is
encouraging to see at least one government making an effort to get ahead of
this, and their work can be an example for the rest of the world on how to
protect people from a looming dystopia.

------
cam_l
Was thinking about why privacy is important while reading the Snowden post on
HN this morning. And it stupidly occurred to me that privacy is only important
insomuch as it relates to power.

I say stupidly because it is such an obvious thing, but one that might be
ignored or forgotten without a sustained reminder. In the face of such an
understanding, the distinction between government and commercial privacy is
meaningless. If the loss of privacy can give someone significant power over
your life, the source doesn't matter. The content doesn't matter.

~~~
nnd
Could you share the link to the post?

~~~
cam_l
[https://theintercept.com/2019/09/21/edward-snowden-
permanent...](https://theintercept.com/2019/09/21/edward-snowden-permanent-
record-book/)

------
perfunctory
The title reminded me of the sci-fi novel The Dark Forest, where potential
alien invaders can read all human communication except the contents of the
human mind. The UN selects four men to be "Wallfacers". Each of them is
supposed to devise a defence strategy known only to himself, and they are
granted access to UN resources to carry out their plans.

~~~
ausbah
What's also worth noting about the Wallfacers is that for 3 of the 4, the
aliens and their human supporters were still able to figure out their
"inaccessible plans" simply by deducing from their actions.

In some ways it feels like a waste to try to directly read the mind when
humans can be predicted pretty well by their outward actions alone.

------
andrerm
Why worry? Facebook's privacy policy says they value our privacy. And
Neuralink's home page doesn't even have a privacy policy. And Neuralink's
paper [1] doesn't mention the word privacy once.

My point is, we will never have privacy from tech because tech's most valuable
field is ourselves. So we all respect other people's privacy, but we need to
harvest every data point we can, because otherwise what will we do, kernels and
drivers?

[1]
[https://www.biorxiv.org/content/10.1101/703801v3.full](https://www.biorxiv.org/content/10.1101/703801v3.full)

------
titzer
Our society still hasn't processed just how fundamentally different the
psychological experience of today is from the rest of humanity's existence,
even as recently as, say, the 1980s or 1970s. Prior to the advent of
smartphones and ubiquitous computation and surveillance, social media and the
narcissism economy, one could actually live one's own life from a first-person
perspective without worrying about people looking over one's shoulder every
single second. Now we are literally carrying an audience around with us
everywhere, whether that be the photos we may consider posting to Instagram,
the machine learning algorithms monitoring our location to figure out what
advertising to direct at us, or government surveillance determining whether we
are some kind of threat. Heck, people even install apps that monitor their
sleeping patterns and upload that data to a third party. It's almost as if we
have created either the all-seeing eye of Sauron, or a tiny portable film crew
following everyone around all the time for our egos.

I, for one, do not plan to put anything in my brain I have not designed
myself, and for sure nothing that auto-updates!

~~~
sixstringtheory
Hell, it took me hours to straighten out my update to iOS 13 because my backup
had trouble restoring... not the first time that’s happened either! Can’t
remember the last workday (I'm a programmer) where I got through the whole thing
without encountering bugs in my tools, bugs in my bank’s app/website, bugs
while trying to text a photo to a friend, and on and on and on.

If I want to hack my consciousness, I’ll go read a book. Naked Lunch was
enough of a brain bug, thanks. Or how about Black Mirror?

And once we get past not knowing wtf we’re doing, take a look at how we’ve
weaponized every major scientific advance and tell me this would be the
exception. Rabble-rousing and voter manipulation on social media will look
like arts and crafts hour.

------
rothron
We used to laugh at the tin-foil hat people. Maybe they were just ahead of
their time.

~~~
abacadaba
No, the brainwave satellites have been operational for quite a while now.

~~~
quickthrower2
And sadly they could have used those satellites for faster rural internet
instead.

------
idclip
I am not worried about this. More amused and excited about the future of
cognitive exploration and expansion.

I believe in the way, and all that's happening is natural. I don't think the
collective organism of us would really create a hell for itself. Human civil
courage and compassion are a golden guard against the truly reprehensible.

Let them try.

Also +1 switzerland. Hope to see that style of being on a global scale.

~~~
wongarsu
It's much easier to get into power if you are ruthless and without empathy. As
a consequence, we are mostly ruled by psychopaths who have no problem with
creating hell for other people if it benefits them.

~~~
guerrilla
They already do...

------
gateKeeper
From the article:

• Driving the news: Neuroethicists are sounding the alarm.

• Earlier this month the U.K.'s Royal Society published a landmark report on
the promise and risk of neurotechnology, predicting a "neural revolution" in
the coming decades.

• And next month Chilean lawmakers will propose an amendment to the country's
constitution enshrining protections for neural data as a fundamental human
right, according to Yuste, who is advising on the process.

• A major concern is that brain data could be commercialized, the way
advertisers are already using less intimate information about people's
preferences, habits and location. Adding neural data to the mix could
supercharge the privacy threat.

• "Accessing data directly from the brain would be a paradigm shift because of
the level of intimacy and sensitivity of the information," says Anastasia
Greenberg, a neuroscientist with a law degree.

------
SubiculumCode
You can have brain privacy, no problem, just as long as the FBI gets a master
encryption key, just in case.

~~~
taneq
Just make sure you only install TSA-approved implants.

------
etiam
There is certainly the disturbing possibility that "Welcome to Life" [
[https://www.youtube.com/watch?v=IFe9wiDfb0E](https://www.youtube.com/watch?v=IFe9wiDfb0E)
] will rapidly be looking less and less like satire.

------
lchiang
[https://www.wsj.com/video/under-ais-watchful-eye-china-
wants...](https://www.wsj.com/video/under-ais-watchful-eye-china-wants-to-
raise-smarter-students/C4294BAB-A76B-4569-8D09-32E9F2B62D19.html)

The intrusion into brains has already started. Schools in China are
experimenting with brain-wave trackers to monitor whether students are paying
attention in class, and the parents do not seem to care. The trackers likely
give false readings, so their accuracy is unclear.

The picture is scary. They do not seem to care about the privacy issue even
though the benefits are unclear. They are OK with selling out for ... maybe
nothing.

------
drdeadringer
There are reasonable people today who buy "RFID wallets" and are not generally
considered to be overly paranoid.

It seems at least a possibility that we will enter a future where similar
people will buy the related product of "RFID hats" [as branded with whatever
acronym this technology adopts, and perhaps as an "also suggested by Amazon"
to boot].

------
aasasd
I don't agree that it's the ‘last frontier.’ Let's say the government and
corporations watch everything you do outside of your house, both physically
and on the web. What's left for you to do in your brain has a term, ‘mental
masturbation.’ It doesn't matter what you're imagining in your head if you
can't act on those thoughts.

------
nnd
Just out of curiosity, does anyone have a comprehensive argument of why
absolute privacy is essential? I understand the possibility of abuse of the
power granted to governments, but is there a solution for identifying
malicious actors that doesn't involve privacy compromises?

------
almost_usual
The only interface that makes sense to me for this kind of application is
something non-invasive that can be removed, like headphones. Governments could
still forcibly apply it to you under certain circumstances, but you might be
able to retain some semblance of privacy otherwise.

------
Medicalidiot
We're getting close to being able to tell whether someone is lying based off
an EEG; it seems like it's just a matter of time until we're able to read
other people's thoughts. No one in this thread will see that happen in their
lifetime, though.

------
buboard
Well, actually ... you'd have to incorporate the peripheral nervous system.
People may start using that to think, in an attempt to escape brain
surveillance. And also any extracorporeal BMI device they might use to augment
their mind.

------
numbol
CGP Grey has a cool short video, "I, phone", about that topic:
[https://youtu.be/e-ZpsxnmmbE](https://youtu.be/e-ZpsxnmmbE)

------
thrwayxyz
And people laughed when I surgically installed a Faraday cage inside my skull.

~~~
lostlogin
Some decent orthodontic braces and a couple of little screws in the skull and
you’ll keep the fMRI at bay for a while yet. Though it might be cheaper to
just move a little and ruin it that way instead.

------
andrewfromx
from [https://cyborg.st](https://cyborg.st) FAQ: Q. What exactly is the
surgery?

A. You can think "type these letters" and make your fingers type them, or you
can route that same electronic message via Bluetooth to the phone near you.
This surgery is like going from dial-up internet speed to broadband. You can
type much faster without fingers!

------
LocalH
I don't have much faith in governments getting this right. They _already_
infringe heavily on mental autonomy.

------
juanuys
Neuralink will give the border agencies good bandwidth on our brains, so at
least the queue will move quickly!

~~~
octosphere
For those needing to know more about Neuralink:
[https://en.m.wikipedia.org/wiki/Neuralink](https://en.m.wikipedia.org/wiki/Neuralink)

------
m3kw9
Not really; there will probably still be ways to keep secrets, as that's
basically what privacy is.

------
carapace
"The computer says you're lying."

------
plaidfuji
In principle this makes sense; where the line around “your thoughts” is drawn
is much more difficult to define in practice.

What is a thought? Is it the words you hear inside your head as you think? The
images? Do those even have a concrete biological representation? Are we just
talking about patterns in neural signaling cascades? If so, I doubt that
“thinking about a tree” looks the same from one person to the next. Which
means that these patterns will have to be learned from training data (once the
sensing technology exists).

Brains exist to decide to take actions. In the above case, companies with an
interface to your brain will only “understand” your thoughts insofar as they
map to actions you can take through their services, or “API”, so to speak. In
that sense, it feels like we've already crossed this line.

Let me be more concrete: unless there were, say, a bomb manufacturer with a
brain-API that you "trained" and used regularly, nobody would be able to decode
that you think about bombs in your spare time.

Unless, of course, a (hypothetical) brain computer interface just translates
your thoughts into “stream of consciousness” strings of words - but if that
were the case, how does that add value over a speech-to-text google search?

TL;DR, if you’re worried about the thought police, look around you - we’re
already there. Your “thoughts” are just the actions you decide to take using
technology.

