
I'm Testifying to Congress about Data Breaches – What Should I Say? - Ajedi32
https://www.troyhunt.com/im-testifying-in-front-of-congress-in-washington-dc-about-data-breaches-what-should-i-say/
======
nathan_long
To me, the main issue is accountability.

A citizen's data can be collected, badly secured, stolen, and used by
criminals without the user ever being aware of step 1. Just like a citizen can
get ill from swimming in a river without ever being aware of the factories
upstream.

The solution is not to force citizens to constantly be on the lookout. It's to
severely punish polluters and leakers.

When a CTO says "let's collect geolocations", the CEO should have legal
and business reasons to say "no way, it's not worth the financial risk of
losing them; it could destroy our company."

\---

Update: I do _not_ think the government should mandate specific practices;
it's too complicated, too fast-changing, and too hard to police.

It should be entirely results-based. You lose people's data, you pay big
bucks. Figuring out how not to lose it is _your_ problem. The government sets
the rules, and the market plays the game.

~~~
mratzloff
I had a non-technical friend whose fatalistic impression was that "these
things happen and there's nothing that can be done given a determined
attacker." Well, look, I said, these hackers aren't going in like Mission:
Impossible. Equifax was incompetent, and there's zero penalty for utter
incompetence. There must be.

The Equifax hack was the equivalent of data malpractice. It's like a hospital
mixing up the labels on saline and hydrogen cyanide and then saying, "Whoops.
Sorry about that." The cavalier attitude that companies have about data
security infuriates me. Americans will be dealing with the repercussions of
this for the rest of their lives.

Meanwhile, Equifax keeps making money.

Equifax's entire business was trading off of _our_ data, but protecting that
data was evidently not a priority for them. They should be fined into
oblivion.

~~~
RachelF
My opinion is that information about you should belong to you. Other entities
should not be allowed to keep or trade information about individuals without
paying them directly.

That way people will realise that their personal info is worth something.

~~~
astura
HN is all like "speech should be free and unlimited and protected always...
Unless someone's talking about me."

That sort of hard stance seems extremely unworkable. If I sign up for Hulu
they need my email and billing information at the least. So they are supposed
to pay me now because I gave them that to get a service? So then it evens out
and my streaming is free now? Goodbye any useful paid service. Goodbye any
free service.

What you're really advocating here though is abolishing free society as we
know it.

You think I'm kidding? All court proceedings would have to be in secret and
unrecorded, as saying "so and so was convicted of manslaughter" is now illegal. All
public records abolished entirely, no accountability anymore. No journalists
could publish a story about anyone ever. Harvey Weinstein sexually assaults
hundreds of women? Can't tell anyone or publish that information, Harvey
Weinstein owns it. Even telling someone else about your date last night would
become illegal.

~~~
thatcat
I think gp meant commercial data, with the aim of preventing resale rather
than collection.

~~~
cinquemb
Though, it would seem like data in general is more commercializable now than
it was a century ago.

I certainly don't want to be the person to have to draw a line in the sand
between "non commercial data" and "commercial data".

Then again, maybe I want to sell access to brain scans of people using
hardware I made to a 3rd Party in the future… :P

~~~
thatcat
It's not any harder than labeling things for tax purposes. If you sell it, it
is commercial and should have restrictions on how it's held and transmitted.
It's not unreasonable to require part of that process to involve payment to
the data originator, as a royalty for the unique data they created. This would
also allow those creating the data an easy way to see how it is ultimately
used.

~~~
cinquemb
> _It's not any harder than labeling things for tax purposes._

And in how many countries are companies able to skirt that through financial
engineering?

Yeah, people can create all the rules they want, but I'm more interested in
how people plan on enforcing it… because from where I sit, that's where
countries are lacking (esp with politicians accepting some kind of
bribes/kickbacks/revolving door/election financing/lobbying doors in them
all), and it isn't getting any better…

Even when the hardware for brain scans gets cheaper and more open, the raw data
itself, in the hands of the average WeChat/Facebook/Line user, is useless
unless one understands how to process it into something usable…

Companies that can attract talent will increasingly start making all the data
that they collect public by default… side-stepping the pain of data breaches
by keeping costs down when everything is public by default… how they process it,
that's a different story… maybe those companies will get their users pissed
off enough to actually think about their choices of using their platforms and
not use them… though that last part seems increasingly unlikely.

------
colemannugent
I've seen it mentioned below, but I feel this is deserving of its own comment.

If security researchers could report vulnerabilities with impunity it would
drastically reduce the incentive to sell vulnerabilities to black market
buyers.

This is a full-blown national security issue. If security researchers could
work with our three letter agencies in defending our infrastructure it would
go a long way towards securing the US against increasingly advanced opponents.

As it stands right now, if I found a critical vulnerability in a government
system I think I would just drop it and tell nobody. I think it's more likely
that I would be punished rather than rewarded, which destroys any incentive I
would have to help.

~~~
mtgx
Yeah, if only the three-letter agencies actually cared about cybersecurity.
The agencies don't want all devices and servers to have "iPhone-level"
security or better -- that much is clear. They're still fighting Apple over it
and they try to get them to weaken their security "so law enforcement can get
in" whenever it's convenient to them. Cybersecurity compromises be damned.

This is how they think. Their priorities are pretty far removed from
"cybersecurity".

They also blew their chance with cybersecurity when they passed a _supposed_
Cybersecurity Act of 2016 that was meant to stop all of these data breaches
from happening. But as all the critics said back then, the law was nothing
more than additional surveillance powers given to the NSA and DHS/FBI. They
never actually intended to use it to stop any of the data breaches - they just
wanted to collect more data on people and add to the big stack of hay, in
which they want to find their needle.

~~~
rallycarre
The iPhone isn't as secure as Apple wants people to believe.

~~~
jjevanoorschot
Do you have any examples of how the iPhone is insecure? Genuinely curious.

~~~
rallycarre
In the age of the IME, and the hundreds of billions of dollars the US spends
on covert operations, I'll let you guess what the path of least resistance is.

~~~
willstrafach
I think “examples” refers to hard proof. Anyone could make an unsubstantiated
claim about anyone or anything else if they wanted to, so it is not very
productive to include.

------
kodablah
"Data maximisation is the norm" is a big point. Emphasize that. If they are
looking for a punitive legislative approach to curb breaches, remind them it's
not just the "how", but the "what". Between HIPAA and SOX and the like, they
are familiar with the burdens that come with different data types. Hopefully,
between differentiating types of data and discouraging storage of data for
other (law enforcement?) use, we can just have less data stored in general. At
some point, companies should see data as a liability.

~~~
ryandrake
> At some point, companies should see data as a liability.

This is key. Currently, there appears to be no business downside to collecting
PII, so companies collect as much as they possibly can. If, through regulation
or some other means, data became a liability (or at least came with some
downside risk) then perhaps companies would become more thoughtful about what
they tried to collect and store.

~~~
robinwassen
Seems to differ between companies. We have seen PII as a liability since early
on, but more so as we move closer to GDPR.

GDPR forces us to justify and be transparent about each data point we collect.

This has increased the urgency of cleaning up any non-required data points,
adding expiration to data, and moving data from subcontractors in house.

Guess that a lot of companies are in our position.

~~~
syrrim
GDPR seems to be particular to Europe. It might be better to say it differs
between countries, rather than companies.

~~~
robinwassen
GDPR affects any company that has European customers, so a lot of non-EU
companies will be affected.

Guess it will be harder to enforce when a company does not have an EU sub
though.

------
snarfy
Say there is no such thing as identity theft. Identity is not a material thing
that can be stolen.

The issue is bank fraud. When someone uses stolen personally identifiable
information to make fraudulent purchases/accounts/whatever, it's the bank's
liability for allowing the wrong person to perform those actions.

~~~
SAI_Peregrinus
There's also the libel issue. The bank reports (falsely) in writing that you
have a debt: that's libel. The credit reporting agency reports (falsely) that
you have the debt. Still libel.

~~~
ryanwaggoner
[https://en.wikipedia.org/wiki/Mistake_(criminal_law)](https://en.wikipedia.org/wiki/Mistake_\(criminal_law\))

------
richard_todd
The article says: “Incidentally, I've decided not to mention specific data
breaches but rather to focus on the patterns...”. When trying to influence
people who won’t be subject matter experts, I tend to think the opposite
approach would be more effective. People readily latch on to tangible examples
of what goes wrong, and while legislation is too often a knee-jerk reaction to
single events for that reason, here is a chance to use that phenomenon to your
advantage. If my measure of success was positive influence rather than pure
education, that’s how I’d go.

~~~
jerf
I would seriously consider taking the list of Congresspeople who are going to
be running this hearing, extracting out the information you already have on
them from your dumps, and preparing them a personal sheet of paper for each
one showing them what you have about them online.

If it's in public, you can't read it to them, but you might be able to hand it
to them personally.

This will be far more powerful than almost anything else you could do. It's
not a problem "Americans" have... it's a problem _they_ have.

~~~
jordanthoms
Produce that list with a note next to each piece of info for which company
lost it...

~~~
kronin
And when. And the lag between when the breach occurred and when the public was
notified of the breach, along with all relevant information the company
provided to help customers determine the extent of their information that is
now public.

------
tareqak
I have one area / worry that I would like Troy to introduce as a question and
to give his advice / opinion as a subject matter expert:

1) The role and responsibilities of security researchers in discovering and
investigating data breaches (maybe also discuss the spectrum from white hat to
black hat; the tools they all use are effectively the same).

2) The role and responsibilities of journalists in reporting the said
breaches.

3) The impact of laws and litigation leveraged by governments and corporations
to protect themselves in the event of data breaches.

4) Why a fair and happy balance between the interests of 1), 2) and 3) is a
necessary part of mitigating and reducing the possibility of data breaches
along with unhappy examples and their consequences.

I'll admit that the four points above are a kind of scope creep on the
intended discussion, but my concern is pretty real. The laws and norms that we
have today, while imperfect, are the reason why Troy is being asked to appear
as a subject matter expert. Whatever the laws and norms of the future are,
they will need to be sufficiently flexible to allow future subject matter
experts to learn and operate, so that they too can make meaningful
contributions to the issues of their time.

Edit: formatting

~~~
zAy0LfpBZLC8mAC
Not to forget the DMCA anti-circumvention provisions which make it illegal to
report on security breaches you discover on your own devices.

------
mtgx
Look to GDPR for ideas. The EU is moving in a "privacy by default" direction,
making companies increasingly liable for hoarding user data that's not
absolutely necessary for the functioning of the service. I believe a technical
committee of the EU Parliament has even proposed encouraging end-to-end
encryption for services.

Here's an excerpt:

> _The providers of electronic communications services shall ensure that there
> is sufficient protection in place against unauthorised access or alterations
> to the electronic communications data, and that the confidentiality and
> safety of the transmission are also guaranteed by the nature of the means of
> transmission used or by state-of-the-art end-to-end encryption of the
> electronic communications data. Furthermore, when encryption of electronic
> communications data is used, decryption, reverse engineering or monitoring
> of such communications shall be prohibited. Member States shall not impose
> any obligations on electronic communications service providers that would
> result in the weakening of the security and encryption of their networks and
> services_

[http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-%2f%2f...](http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-%2f%2fEP%2f%2fNONSGML%2bCOMPARL%2bPE-606.011%2b01%2bDOC%2bPDF%2bV0%2f%2fEN)

And here's an article, too:

[http://www.bbc.com/news/technology-40326544](http://www.bbc.com/news/technology-40326544)

------
Jach
Trying to make data breaches much harder has some unsettling implications.
"Make CEOs responsible" or "make companies in general responsible". Ok. Sucks
if you're a startup, but ok. The big ones are going to turn around and put a
clause into every engineer's contract to the effect of "you are now directly
responsible for the code you write, if the company is sued because of a bug
you caused, we're suing you". Then we'll either accept it, or go against our
profession's history and form a union to fight it -- which will inevitably
lead to the union representatives accepting a compromised version on our
behalf that's the same at all companies, plus various additional requirements
that raise the barriers to entry into the profession and slow down getting
work done. Either way has its downsides.

An alternative approach: assume everything is compromised all the time.
Identify the material harms of such compromise, and work to minimize those
harms. The SSN-is-proof-of-identity system is obviously a big source of harm,
not so much applied to whoever was deceived, but inflicted on the actual
person trying to put everything right again. There are many many changes to
this one system that would help minimize the damage, ranging from complete
overhauls to just minor things like being able to change one's SSN with more
ease, or putting more pressure on companies to verify people instead of
trusting the SSN. This is probably one of the few cases where doing just about
anything to improve the system even a little is much better than doing
nothing.

I doubt anything will come of it though. Congress routinely talks to domain
experts who warn about the problems in the future should they not do what the
expert suggests. Nothing gets done, and the problems manifest, sometimes worse
than predicted.

~~~
voidmain
> if the company is sued because of a bug you caused, we're suing you

It's totally implausible that companies are going to start regularly suing
their engineers personally for writing bugs. They'd recover very little money
and make it impossible to hire. Programmers already write bugs that create
liability in businesses all over the world every day. Worry about something
else.

~~~
Jach
Probably so, but an indemnification/hold-harmless agreement becomes a much
more important part of contract negotiation, and individuals and businesses
have to pay a lot closer attention, when so far everyone's been happy enough
with the "don't sue me, this software has no warranty" clause we slap on just
about all our software, open or proprietary. If that is ever threatened, do
you think a professional liability insurance industry won't pop up and become
effectively mandatory even if you're just building a CRUD app? Now each
company's gotta pay up. And you would have to pay for your own individual
coverage too if you don't plan to work for the same big company your whole
career like classical engineers often do, or don't want the risk of a company
laying you off and then hiring you back as an independent contractor as is
happening to some electrical engineers.

> Worry about something else.

Oh, this is a very long way down my list of worries. But programmers should
look very carefully at all the drawbacks of professional engineering before
trying to shape software engineering into it, which is what regulation on
something as domain-cross-cutting as data is doing. The fallout of GDPR will
be interesting to watch. With the potential fines being in the millions, have
any insurance companies appeared yet to offer insurance for US companies
wanting to do business in the EU but not wanting to spend the time and money
making sure they're fully compliant with each rule?

~~~
voidmain
I'm not sure I think it would be a bad thing if proprietary software licenses
stopped disclaiming all liability, but there's no evidence that creating
liability for data breaches (for the entity that collected and stored the
data, not whoever wrote the software they used to do it!) is going to change
that. Plenty of companies already _use_ software in ways that create
liability for them, and yet, as you say, shrink-wrap software rarely comes
with any warranty.

Individual liability seems much less likely to me, in the absence of statutes
explicitly forcing this liability onto individuals. If you are a consultant,
and the market equilibrium actually shifts so that customers are demanding
security warranties, your insurance costs would go up, but unless you are less
competent than average your equilibrium income should go up to match. The
legal incidence, and plausibly the economic incidence, is all on companies
that actually handle personal data.

------
paulgb
When Troy Hunt is testifying before Congress you know that at least some part
of the government is functioning :)

I don't know where it fits in, but I wish Congress understood how silly it is
that knowing our birthdays and SSNs is still treated as proof of our identity
in 2017.

~~~
jstanley
> When Troy Hunt is testifying before congress you know that at least some
> part of the government is functioning

That's if it's not just a ruse to lure him to the US where they can then
arrest him for being in possession of too much illegally-obtained personal
info.

~~~
Fej
That would be an international scandal. Heck, it's not like we're unfriendly
with the Australians anyway.

------
moduspol
To take a different perspective from the others here:

I think it is fundamentally problematic to try to regulate the sending /
receiving of information. That's why the movie and music industries have had
so much trouble, that's why classified info leaks are more common, and that's
why cryptocurrencies haven't been squashed (yet). Trying to regulate this will
have far-reaching bad side effects for all parties.

I also think the vast, silent majority is not actually as concerned about this
stuff as perceived. They'll say so in polls, but if they're allowed to choose
between the status quo or paying (even a little) to use Facebook, Google, etc.
without data collection, they'll choose the free option almost every time.
People don't like that their data is collected, but they like not paying for
things more.

That said: I think the fundamental issue that needs to be addressed is that
the data needs to be valueless.

The Equifax hack is a problem because the data leaked has value. Some of it
is public-record stuff (effectively valueless), but SSNs follow you forever
and cannot be changed. If an SSN could be changed (or destroyed and
replaced), then it would hold little value.

Mailing addresses are similar. In 2017, we should have a way of giving
anonymous (perhaps even re-assignable) addresses to the organizations we
interact with so we're not explicitly telling each of them where we live. If I
could generate a new "pseudo mailing address" for each of these companies, I
could then destroy it if it is ever compromised--and as a bonus, I'd be able
to see how it's being shared (since I'd see other companies using the same
one). At that point, having someone's "mailing address" becomes nearly
valueless.
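A scheme like this can be sketched with a keyed hash: one user-held secret deterministically yields a distinct alias per company, so a leaked alias both identifies who shared it and can be destroyed without touching the others. A minimal sketch (the secret, company names, and alias format are all invented):

```python
import hashlib
import hmac

# Hypothetical user-held secret; in practice this would live with a
# forwarding service the user controls.
SECRET = b"my-private-alias-key"

def alias_for(company: str) -> str:
    """Derive a stable, company-specific pseudo-address tag."""
    tag = hmac.new(SECRET, company.encode(), hashlib.sha256).hexdigest()[:12]
    return f"user-{tag}@forwarding.example"

# Each company gets a distinct address, so a leaked alias identifies the
# leaker and can be revoked on its own.
assert alias_for("acme-retail") == alias_for("acme-retail")  # deterministic
assert alias_for("acme-retail") != alias_for("globex")       # per-company
```

The same construction works for pseudonymous email or parcel-locker identifiers; only the forwarding service ever learns the real address.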

Some of this responsibility falls on the consumer, too. Obviously companies
are still going to stockpile data to cross-reference and provide value, but
that's reality. We've got ad / tracking blockers, anonymizers, and VPNs for
the people who truly care, which help to make their own collected data
valueless. But the way I see it is that the government needs to ensure they're
not creating a system where citizens' data _must_ have value, which is what
has been done with immutable SSNs and mailing addresses.

------
unabst
I would add two things.

1\. Encryption is the only way to secure data, and without it, data will
always get stolen. And encryption with a backdoor is not encryption.

2\. Data needs to be owned by the people. I should be able to go to Target or
American Express and ask them to delete everything they know about me. Failure
to do so should result in, say, a $10,000 fine paid entirely to me, along with
an admission of guilt.

A $200 million fine may still have banks and telecoms gaming the system, but
if they had to admit guilt, the math becomes easy. Their egos won't permit
it!

And with that, finally our data will be kept safe.

------
confounded
I’d suggest:

\- Regulate that pseudonymous use of transactional services must be permitted
(with exceptions for, e.g. banks, universities). You do not _need_ my legal
name or telephone number to ship me a package, or accept payment, for example.

\- Get explicit, arduous permission from users whenever PII is obtained, if
you intend to store it in any way, with exactly what it will be used for.

For existing data, over some long period of time (say, a couple of years) get
users to explicitly agree to each use, or delete it.

This could work pretty much like default-off OAuth2.0 scopes. For example, you
could give Facebook the ability to know your name for display on the site, but
not for exchange with third parties for additional data on you, or for the
purposes of advertising.
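A toy version of such default-off scopes, where every *use* of a data point is its own grant and anything unmentioned is denied (names and scopes invented for the sketch):

```python
# Hypothetical per-user consent record: each use of a data point is its
# own scope, and anything not explicitly granted is denied by default.
grants = {
    "name:display": True,        # show my name on the site
    "name:third_party": False,   # default off: no exchange with data brokers
    "phone:2fa": True,           # phone number usable for 2FA only
    "phone:advertising": False,  # default off: not for ad targeting
}

def allowed(scope: str) -> bool:
    """Deny any use that was never explicitly granted."""
    return grants.get(scope, False)

assert allowed("phone:2fa")
assert not allowed("phone:advertising")
assert not allowed("email:resale")  # never mentioned, so denied by default
```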

While the ‘default off’ requirement would reduce the amount of PII in
circulation, it would also remove the _false choice_ between security and
privacy that Facebook, Google, Amazon, and others present. Want my telephone
number for 2FA? Possibly, but that doesn’t mean that you can use it to
buy/exchange a file on me from Equifax/Acxiom/MasterCard/whatever.

~~~
folknor
> You do not need my legal name or telephone number to ship me a package, or
> accept payment, for example.

That's not always true in my country. Sometimes if a package is not labelled
with a name connected to the delivery address, the post office will simply
return it once it reaches their sorting facility.

Which has annoyed me on two occasions. And I can't see a legitimate reason for
it. And I don't know if it's connected to actual law, or just a practice.

------
curun1r
One thing to make clear about the Equifax breach is that it wasn't caused by a
single security update that was never applied, as has been put forward by
Equifax. It was caused by an insecure information architecture that let web
front ends have direct database access. Good security requires defense in
depth, not an unbreachable perimeter. Equifax's security was a failure long
before it was hacked.

Obviously the technical details may not be the appropriate level to discuss
with Congress, but an analogy that they can understand might be helpful. What
Equifax did was the equivalent of keeping the country's nuclear codes at an
outpost in Afghanistan and then blaming a sentry for falling asleep when an
enemy slipped in and stole them.

------
monksy
I would suggest being very visual.

Bring a few things:

1\. A box with a lock and key

2\. A cardboard box + tape

3\. An adult education device + a clear plastic bag

Put the adult education device (Texas terms) inside the clear plastic bag. Put
that bag in the cardboard box and seal it with tape. Put the box in the lock
box. (Kind of like a Russian doll situation.)

(If you're allowed to do this at all.. do it after security)

Once you're up and presenting mention:

This is the best analogy that I have: this lock box illustrates an honest
attempt at protecting data. What is in this box is your data, and it is
potentially embarrassing. It's yours, but should the public know about it?

Tell them how you and only very specialized individuals know how to open the
box. Open it.

Then explain that the next layer illustrates a company that doesn't know how,
or doesn't want to invest much in, protecting the data: anyone can tear open
that box.

Once you have the box torn open and the bag + item is out on display, ask
Congress: this is how well Equifax protected your data. Are you going to make
them pay for the FCC fine that C-SPAN is going to be hit with?

~~~
zionic
tomorrow on hackernews:

"Does anyone know a good lawyer?"

------
criddell
I think there are really only two possible ways to fix the problems with data
breaches.

1\. Impose a fine for every individual account leaked. Even if it were only
$10, the recent Equifax loss of 143 million records is a $1.4 billion fine. It
should probably be higher if gross negligence is involved. This would create a
new industry (data breach insurance).

or

2\. Make it so simply having all the leaked information on an individual isn't
enough to cause any harm. I'm specifically thinking of some PKI-based scheme
where I verify my identity by signing something with my private key.

There may be other variations, but the choice seems to be between forcing data
brokers to be responsible or making it so that their irresponsibility is
harmless.
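The PKI idea in point 2 amounts to challenge-response: the verifier sends a fresh random nonce and only the key holder can answer. The dependency-free sketch below uses an HMAC over a shared secret as a stand-in for an asymmetric signature (a real scheme would sign with a private key and verify with the public one, so the verifier stores no secret at all):

```python
import hashlib
import hmac
import secrets

# Stand-in for the user's private key. A real PKI scheme would use an
# asymmetric key pair, with only the public half on file at the verifier.
user_key = secrets.token_bytes(32)

def respond(key: bytes, challenge: bytes) -> bytes:
    """Prove possession of the key without ever revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# The verifier issues a fresh random challenge each time, so knowing a
# name, birthday, or SSN (or replaying an old response) proves nothing.
challenge = secrets.token_bytes(16)
response = respond(user_key, challenge)
assert hmac.compare_digest(response, respond(user_key, challenge))

# A captured response is useless against the next challenge.
assert not hmac.compare_digest(response, respond(user_key, secrets.token_bytes(16)))
```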

~~~
cm2187
How do you get a granny to use public/private key cryptography? It would
probably have to be a physical device.

~~~
criddell
Yeah, some kind of card. I think Estonia was issuing something like this.

------
wyldfire
The discussion is a valuable one, but the only buttons Congress has are
"regulation" and "funding". Some regulations might not make much sense,
although I think the accountability one is a good place to start.

But the bar for gathering my data is so low! Read a few tutorials on how to
write software with Your Favorite New Web Framework and off you go making a
new site that will happily leak data once it's discovered. The core
competencies of my power company and my local cinema are definitely not IT --
nor security. Can we expect good results here? (I hope the answer is 'yes',
but I think it's unlikely.)

So to take a different tack -- could funding help here? What if there were
funding and/or accreditation of some libraries or frameworks? These would use
best practices for minimizing data loss (salting, for example), with regular
auditing of actual deployments of this technology, fuzzing of the underlying
software, etc. A marketing/branding effort around the accreditation could also
be helped with funding. It needn't be a US-local solution, nor even a US-local
agency -- though that would certainly minimize the bureaucracy to "only" the
level of the US Congress.
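As one concrete instance of the best practice alluded to above (salting), here is a minimal password-storage sketch using only the standard library; the iteration count is illustrative, not a recommendation:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt=None):
    """Salted, deliberately slow hash: a breach yields neither plaintext
    passwords nor anything a precomputed rainbow table can attack."""
    salt = salt if salt is not None else os.urandom(16)  # per-user random salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Constant-time comparison avoids leaking where the mismatch occurs.
    return hmac.compare_digest(hash_password(password, salt)[1], digest)

salt, digest = hash_password("hunter2")
assert check_password("hunter2", salt, digest)
assert not check_password("wrong-guess", salt, digest)
```

An accredited library would bake choices like these in so the power company and the cinema never have to make them.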

Instead of "Stop, Drop, and Roll" or "Only You Can Prevent Forest Fires", it
could be "Never Roll Your Own User Account Database" (and OMG please help us
if you rolled your own crypto).

------
politician
Tell them to look no further for a solution than Europe's GDPR. The GDPR gives
individuals the right to the privacy and security of their personal data.
Companies that collect this data do so at their own risk, and have an
obligation to secure and not proliferate that data.

This legislation has huge, sharp teeth. It comes into full effect in May 2018,
and every single multinational is running around in a "pants-on-fire" panic
trying to figure out how to comply.

If Equifax had to pay up to 4% of their global annual turnover for
non-compliance, would they act? Yes, of course.

\---

> How is the fine calculated?

> Article 58 of the GDPR provides the supervisory authority with the power to
> impose administrative fines under Article 83 based on several factors,
> including:

> The nature, gravity and duration of the infringement (e.g., how many people
> were affected and how much damage was suffered by them)

> Whether the infringement was intentional or negligent

> Whether the controller or processor took any steps to mitigate the damage

> Technical and organizational measures that had been implemented by the
> controller or processor

> Prior infringements by the controller or processor

> The degree of cooperation with the regulator

> The types of personal data involved

> The way the regulator found out about the infringement

> The greater of €10 million or 2% of global annual turnover

> If it is determined that non-compliance was related to technical measures
> such as impact assessments, breach notifications and certifications, then
> the fine may be up to an amount that is the GREATER of €10 million or 2% of
> global annual turnover (revenue) from the prior year.

> The greater of €20 million or 4% of global annual turnover

> In the case of non-compliance with key provisions of the GDPR, regulators
> have the authority to levy a fine in an amount that is up to the GREATER of
> €20 million or 4% of global annual turnover in the prior year. Examples that
> fall under this category are non-adherence to the core principles of
> processing personal data, infringement of the rights of data subjects and
> the transfer of personal data to third countries or international
> organizations that do not ensure an adequate level of data protection.
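The two fine tiers quoted above reduce to a greater-of computation; a small sketch (the turnover figure is invented for illustration):

```python
def gdpr_fine(annual_turnover_eur: int, severe: bool) -> float:
    """Greater-of rule from the excerpt: 2% / EUR 10M for failures of
    technical measures, 4% / EUR 20M for breaches of core provisions."""
    if severe:
        return max(20_000_000, annual_turnover_eur * 4 / 100)
    return max(10_000_000, annual_turnover_eur * 2 / 100)

# Turnover figure invented for illustration:
assert gdpr_fine(3_100_000_000, severe=True) == 124_000_000  # 4% beats the floor
assert gdpr_fine(100_000_000, severe=True) == 20_000_000     # floor applies
```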

------
zilitor
Data Breach Transparency

    
    
      - At a minimum, their should be a penalty that grows from the time the breach was learned to when they disclose it publicly.
      - There should also be penalities for not being transparent about what exact data was leaked for what users.
    

Social Security Number

    
    
      - SSN is similar to a password- you want to keep it hidden, and if it leaked, you should change it. However, we can't change it. Perhaps it should be considered more as a password?
    

User Data Rights

    
    
      - People should know what personal data companies have on them. A good example of this is Equifax storing peoples home addresses- this could be disclosed. On the other hand, a it is probably fine to exclude other types of data, such as an advertiser storing your zip code- people probably don't care as much.
      - Should people have a right to have certain kinds of data (e.g. SSN) removed from websites?
    

Adoption from USA Nutrition Label

    
    
- Is it a good idea to mandate that companies disclose the security measures they use? For example, at one time reddit stored its passwords as plaintext and got hacked. Disclosing basic security hygiene (e.g. password storage) somewhere standardized on the website would make such breaches much less outrageous.
    

Technology Improvement

    
    
- Certain technologies enable hackers more than others. SQL seems to enable a lot of hacking. Should we discourage it?
- Get rid of Intel ME technologies: https://schd.ws/hosted_files/osseu17/84/Replace%20UEFI%20with%20Linux.pdf
- Get rid of Intel hidden instructions: https://www.youtube.com/watch?v=KrksBdWcZgQ
- Get rid of Simon and Speck: https://www.reuters.com/article/us-cyber-standards-insight/distrustful-u-s-allies-force-spy-agency-to-back-down-in-encryption-fight-idUSKCN1BW0GV
- What is "best for National Security" is actually worst for our own. It feels like people don't have a democratic say in the right balance either.
    

(edit: trying to figure this formatting out)

~~~
thethirdone
> SSN is similar to a password- you want to keep it hidden, and if it leaked,
> you should change it. However, we can't change it. Perhaps it should be
> considered more as a password?

I assume you meant

> SSN is similar to a password- you want to keep it hidden, and if it leaked,
> you should change it. However, we can't change it. Perhaps it should be
> considered more as a _username_?

~~~
zilitor
Yeah, that is a good point. Either way, we need a universal password.

------
WBrentWilliams
The important thing to remember is that everything of substance has already
been written up and will be read by congress members and their staff. Keeping
that in mind, the more important objective is to make good use of the five
minutes of microphone time.

I think studying how other people have effectively used their five minutes is
instructive. I'd start with the testimony of Fred Rogers, where he gave a
Senate statement on PBS funding:
[http://www.americanrhetoric.com/speeches/fredrogerssenatetes...](http://www.americanrhetoric.com/speeches/fredrogerssenatetestimonypbs.htm)

------
galeforcewinds
When you have technical knowledge and are entering a room of less technical
policy-makers, it can be particularly important to leverage existing industry
messaging rather than winging it.

I would focus on the CIA triad + Accountability + Assurance. It's helpful to
use standard terminology that is understood by existing privacy practitioners.

Personal information should be Confidential from unwanted disclosure.

Personal information should have Integrity with the creation, modification,
and deletion of personal information only as authorized and intended.

Personal information should be readily Accessible by authorized parties.

Personal information should have Accountability, with traceable ownership to a
party responsible for Confidentiality, Integrity and Access.

Personal information should have Assurance, with appropriate audit of
Confidentiality, Integrity, Access and Accountability; including the right to
inspect.

Just as the Amendments to the Constitution form a latticework of protection
for each other -- e.g. that freedom of press helps ensure other rights are not
eroded -- the elements of CIA+A+A do the same.

Recommendations can then be framed for direct implementation:

* Confidentiality: Requirements for timely breach notice

* Access: The right of the consumer to be aware of, and to have access to, data about them

* Integrity: The right of the consumer to repudiate data about them and demand removal

* Accountability: Direct ownership and legal teeth (fines, jail, and barring of eligibility from data or business management roles, etc.) to compel the presence and adherence of an appropriate privacy management program

* Assurance: Standardized audit reporting, guaranteed consumer right to inspect, etc.

Folks noting "accountability" often mean the entire CIA Triad + A + A, not the
technical term "Accountability". This is likely the gap to bridge -- turning
the sentiment that businesses are not operating appropriate privacy management
programs into an actionable path to compel the existence, adherence, reporting
and audit of such programs.

------
_Codemonkeyism
That CEOs should be accountable for lazy security.

That everyone does security theater and no one does real security. Mostly for
the convenience of marketing, data is not encrypted at rest, data is not
segmented into different data stores (its own store for passwords, etc.) but
stored in the same MySQL database, employees can dump millions of records
instead of one record at a time, people let unencrypted Excel sheets fly
everywhere, etc.

~~~
Spivak
Why? What does a CEO know about InfoSec? Why make CEOs punching bags for
fields they aren't experts in?

~~~
pbhjpbhj
Then they manage the company and make sure they have someone in charge of
overseeing InfoSec who does.

------
lifeisstillgood
There was a meme a while back that the USA should act on Data Piracy the same
way it acted on High Seas Piracy at the turn of the 20th Century - acting
unilaterally to clean up the high seas, making trade safer, cheaper and faster
for all nations.

A similar approach might be the best tack to take here

* Have a public register of breaches, with auditor sign-off on the details of the event, so we can all learn

* Publicly registering the breach gives some degree of protection from liability/punishment, but there is an expectation of competence and good practice (very much like accounting)

* Work with the EU on Data Protection definitions and approaches - if both the US and EU are singing from the same hymn sheet, it will become the global de facto standard

* Probably the biggest area to push in that: personally identifiable information should belong to the person identified - and be treated like an asset held in trust by those who hold it...

* Beef up whistleblower laws and the role of researchers

* Have the NSA buy back some of the world's trust by identifying and hunting down cyber criminals the same way actual violent terrorists are hunted

------
Glyptodon
Point out that the criminalization of "hacking" creates strong incentives
towards negligence _and_ both weakens and distorts the pool of cybersecurity
talent, while making people afraid to report. Responsibility for data needs to
be with its owners, not people who see it through open windows.

------
hoosieree
The biggest problem is that we've become conditioned to using our
name/address/SSN/birthdate/etc for everything, so we give this info, without
batting an eye, to services who have no legitimate need for it. We're
basically using the same username/password for all of our most sensitive
accounts.

For example, an orthodontist in my area asks for SSN, employer, marital
status, spouse's SSN, spouse's employer, and states "you must complete the
entire form". I only enter name/phone/insurance info, but I bet most people
will just do what the form asks.

So part of the responsibility is on the user to not willingly give away
irrelevant data. Part of the responsibility is on services to be good stewards
of data.

What should Congress do? How about unifying PII and IP? Give Equifax the
Napster treatment.

------
JepZ
While my first intention was to set a high penalty too, after a few minutes I
thought there might be an alternative.

How about prohibiting breachers (companies which have had data breaches in the
past) from collecting data which is not essential for their business, for a
limited time span?

Something like: Hey, you have not secured your customers' data? So why do you
want to store that data anyway? You want to promote only the relevant products
to your customers? So we will give you some time to get your storage security
right, and therefore you are not allowed to collect any non-essential data for
the next two years.

Yes, the essential data is probably the more important data, but at least it
would force companies with low security to store less data for a while.

Just an idea, what do you think?

~~~
AnimalMuppet
Two years is too short. I'd try maybe five.

------
TheCondor
I’d advocate mandatory disclosure at all levels: if you know of a breach, you
should be compelled to disclose it. I’d close the legal loopholes that use
attorney-client privilege to hide breaches. I’d impose extremely stiff fines
for coverups, potentially criminal penalties. The same for willful ignorance
of a breach.

I suspect that the unintended consequence would be that EULAs and various
business relationships would be adjusted to attempt to limit liability. Maybe
let liability be a court matter; just knowing about the breaches would be a
huge step for consumers, though.

------
unimpressive
So I'm going through this thread and what I'm reading over and over is "huge
penalties", "personal responsibility for people involved" and what I'm
wondering to myself is if you've all gone mad.

Okay look, I get it, it is absolutely despicable what these companies are
doing. But _think about ordinary website operators_ for a minute. A lot of the
proposals in this thread would basically criminalize running a basic web forum
unless you're some kind of security ninja. Please, think before you write.

------
richardknop
The ability to delete all data related to me is key. It should be possible for
me to go to Google or Facebook and ask them to delete all data they have
collected on me over the years (emails, photos, text messages, list of
friends, phone number, job history, geo location, search history etc). And by
delete I mean actually get rid of the data so it is not recoverable. NOT soft
delete.

If a company fails to do so (or doesn't provide some relatively simple way to
do it like an online form), there should be harsh financial penalties.

------
thrownaway954
My personal opinion is that in order to make sure that breaches like the
Equifax one are preventable and handled correctly, these measures should be
imposed

1) An entity storing personal data must be audited by a government-approved
3rd party, and its rating must be made public.

2) Any breach (or suspicion of a breach) must be reported to this 3rd party
within 24 hours.

The issues I saw with the Equifax breach were that there was NOTHING that told
us how bad their security was, and that they were allowed to not report the
breach for a month.

------
cdevs
The immutable data they hold on us has always scared me. I have always
imagined a middle man or "middle service" would be helpful. This service would
be like a database; for my example I'll just use email addresses, but imagine
also passwords, birthdays, maiden names (encrypted of course, but that's
beyond the detail I'll go into). This email service would let me tell the
mailman my email is somethingUnique:myRealEmail@domain.

Well, if I start getting spam at somethingUnique:myRealEmail@domain, then I
can call out the person I gave it to ("I'm looking at you, Facebook"). Some
spammers now have that address, but I can shut it down or change it; I could
also keep a list of who had that address, and if they still use it I will
still get messages. There's accountability, and even though something static
was given away, it was only a "pointer" to a static item, so I can change the
pointer. I should know who knows what, how they think they know it, and have
the option to take it away, etc. But that's not really possible when companies
like Equifax collect data on us; it never really seems like they ask, though
I'm sure we signed our lives away while buying a car, thinking about picking
up chicks in it, signing papers in a daze.
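The "pointer" idea above is essentially what plus-addressing already gives you; a minimal sketch, assuming a mail setup that routes user+tag@domain to user@domain (the names and domain are hypothetical):

```python
from typing import Optional

def alias_for(service: str, mailbox: str = "me", domain: str = "example.com") -> str:
    """Hand each service its own disposable 'pointer' address."""
    tag = service.lower().replace(" ", "-")
    return f"{mailbox}+{tag}@{domain}"

def who_leaked(leaked_address: str) -> Optional[str]:
    """If spam arrives at a tagged address, the tag names the leaker."""
    local, _, _ = leaked_address.partition("@")
    _, _, tag = local.partition("+")
    return tag or None

addr = alias_for("Facebook")
print(addr)              # me+facebook@example.com
print(who_leaked(addr))  # facebook
```

If the pointer is compromised, you revoke just that alias; the underlying mailbox (the "static item") never changes.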

~~~
geogriffin
"recipient_delimiter = -" in postfix, and have actually (annoyingly) yet to
see an email address leaked. Considering all the spam I got on my Gmail
address, I was thinking it would happen fast, but I guess 1) I'm probably
being more careful and 2) these things take time.

~~~
cJ0th
I experimented with different email addresses for different purposes and came
to the same conclusion.

Then again, perhaps SPAM isn't that lucrative anymore. You might earn much
more by selling profiles these days. This only dawned on me after Disqus was
hacked and 'lost' personal information such as e-mail addresses (my personal
one among them).

What criminals could do now is simply collect breaches, put them into a
database and make "e-mail" the join criteria for a query. Based on the output
they can generate comprehensive profiles of people that weren't available on
the market before.

Data from 'shop a' may reveal my DOB and my real name, data from Disqus allows
them to extract what values I hold, and so on...
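The cross-breach correlation described above is literally a one-line join; a toy sketch with hypothetical table and column names:

```python
import sqlite3

# Two hypothetical breach dumps loaded into an in-memory database.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE shop_a (email TEXT, real_name TEXT, dob TEXT);
CREATE TABLE forum_b (email TEXT, interests TEXT);
INSERT INTO shop_a VALUES ('alice@example.com', 'Alice Doe', '1980-01-01');
INSERT INTO forum_b VALUES ('alice@example.com', 'privacy, cycling');
""")

# Joining on the shared e-mail address merges the two partial
# records into one profile that neither breach contained alone.
rows = con.execute("""
    SELECT a.real_name, a.dob, b.interests
    FROM shop_a a JOIN forum_b b ON a.email = b.email
""").fetchall()
print(rows)  # [('Alice Doe', '1980-01-01', 'privacy, cycling')]
```

Per-service aliases break exactly this join key, which is why the profile-building angle matters as much as spam.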

In a way they bank on _not_ sending you spam, because if they did, you might
change your e-mail and they'd be thrown off the track ;)

------
glossbar
Treat it like HIPAA. You will be banned from doing business if you fail to
secure data properly.

------
evantahler
I'm happy to see that the context for this testimony is related to ID
verification. The bank example Troy gives is a simple & understandable problem
that we all can wrap our heads around (Congress too!), which provides a
concrete scenario to explore.

The way I always approach this problem is one of ownership. Congress,
implicitly, assumes that individuals own their data. That is why it used to be
possible to ask individuals to prove their identity by asking them to verify
data they own, and presumably, haven't shared too broadly.

Identity theft is really a robbery, then, because data which belongs to you is
stolen. The government's opening position then needs to be that data belongs
to consumers.

What isn't clear is the physical metaphor for what happens when you give your
data to a third party, explicitly or implicitly. Am I 'leasing' my data to
Facebook? Am I granting them shared ownership? Or is it more like a Bank...
I'm allowing Facebook to "re-invest" my asset, they can make money off of it
while they have it, but in exchange, they have to keep it safe.

I personally really like the Bank metaphor. Like banks, you can get certified
as safe by the government to be a Bank, and have that certification taken
away... We already have data protection rules like this, BTW: HIPAA.

~~~
evantahler
I think I also agree with @nathan_long's point, in that perhaps the government
shouldn't be concerned with _how_ data is kept safe, but rather should track
breaches and complaints... that would be what leads to a revocation of status.

------
bb88
The security triad is the following:

1. Something you know
2. Something you are
3. Something you have

Unfortunately, the security of an SSN is that it's something which the
government gave you, and it doesn't fall into the security triad.

So when a loan is approved, it's approved with data that may have been made
public, either through public records or through data breaches. The SSN and
birth date were never meant to secure financial loans, and should never have
been used that way in the first place.

------
netman21
It is time for a federal breach notification law. There are 30+ state breach
disclosure laws, and Congress has been working on a federal law to supersede
them since 2005 with no success. A federal law must be stricter than
California's SB 1386 or Massachusetts's law in order to be relevant. It will
make compliance easier, and companies like Uber will think twice about
covering up breaches. Nothing as drastic as GDPR, but something.

------
dailyvijeos
We need a suite of “HIPAA/FOIA”-inspired bills that radically shift power over
private/financial details and information security with these changes:

\- decommodify personal details by making them illegal to resell or distribute
without permission

\- restore individual control: require giving distribution and
update/modify/delete rights back to the individual

\- mandatory incident reporting requirements, with personal criminal liability
for executives who fail to disclose breaches

\- mandatory compensation, determined by independent government risk
management, to customers based on the expectation of risk incurred plus
insurance against losses for 5 years

\- ensure minimum security requirements, similar to PCI-DSS, by formation of a
federal information security standards agency that produces practical,
effective configuration and architectural standards and collects
external/internal audits and conducts spot-check compliance audits similar to
the IRS for taxes

\- institute an opt-in national identity virtual & physical card with provably
secure public/private key management, open-source infrastructure that isn’t
based on social security numbers. Perhaps managed by a non-profit which
includes security researchers and consumer advocates, with industry advisers
with less power.

\- phase out use of social security numbers as a primary key, eventually
making them illegal to use, and require a unique identifier generated for each
service by the previous system that is not shared with any other system.
Connecting two identifiers requires the person's approval; the person can
change their per-service identifier at any time, and it changes once per year
anyhow

------
sova
Troy, you have bulleted out most items of interest. I would also advise you to
add mitigation methods: as a computer scientist I am shocked that more
conglomerates are not legally obligated to encrypt information and to ensure
that information is only decrypted while it is actually being used and looked
at. Data that is not currently in use should be encrypted and inaccessible to
anyone without the credentials (unless you're the NSA and have the brute force
to work the search space). Data like a person's address and social security
number can be encrypted such that only a small portion of the correct data is
needed to decrypt the entry, but we need to tie the ends of the conversation
together: "can I have your secret answer and the last 4 digits of your SSN?"
should actually initiate the decryption of the data, and should be a real
preventative layer instead of just a delay for a clever social engineering
ploy.
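One way to read this proposal: derive the record's decryption key from the verification data itself, so the stored entry is opaque until the caller actually supplies it. A minimal key-derivation sketch (the function name and parameters are illustrative; note the input has low entropy, so a slow KDF plus rate limiting would still be essential):

```python
import hashlib
import os

def derive_record_key(secret_answer: str, ssn_last4: str, salt: bytes) -> bytes:
    """Derive a per-record encryption key from the caller's
    verification data, so the record only decrypts when the
    correct answer and SSN digits are actually supplied."""
    material = (secret_answer.strip().lower() + ssn_last4).encode()
    # Slow, salted KDF: the stored salt is useless without the inputs.
    return hashlib.pbkdf2_hmac("sha256", material, salt, 200_000)

salt = os.urandom(16)  # stored alongside the encrypted record
k_right = derive_record_key("blue", "1234", salt)
k_again = derive_record_key("blue", "1234", salt)
k_wrong = derive_record_key("red", "1234", salt)
assert k_right == k_again and k_right != k_wrong
```

With this shape, a database dump alone yields only ciphertext and salts; the verification exchange becomes the gate rather than a formality.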

So please recommend that conglomerates be forced to encrypt data in ways that
protect the vast majority of accounts in case of a data breach.

Thanks a lot for taking the initiative to ask the community, regards to you my
friend in freedom.

------
overhang
Define 'breach'.

Did Equifax obtain information about me via a breach?

Show me where in my mortgage contract I gave the bank permission to disclose
anything to Equifax.

Show me any agreement with my employer that authorizes them to disclose salary
information to Equifax.

Out of the thousands of points of data Equifax has collected about me, how
many of those were obtained through 'breaches' by some definition?

------
munk-a
Companies have no inherent right to data collection; if your business is based
on information regarding other people, then it's entirely on you if that
information gets compromised. In the case of Equifax et al., nobody ever
walked out their door and asked some shadowy company to track various data
points about them; Equifax decided to do that of their own volition, thus all
the damages related to this issue should be put entirely on their shoulders.

In addition, incentivized and high-pressure opt-in have the same issue:
companies are stealing data about customers to improve their business without
offering compensation in kind. The business should be liable for any
mishandling of this information and any fallout from it; if a loan was taken
out falsely in your name due to the Equifax breach, they should have full
liability and, due to their voluntary role in the incident, should be presumed
guilty unless they can prove their innocence.

------
xorcist
I think the starting point should be some sort of ownership of personal data:
all my data is mine, and any company processing that data should do so only
with my approval. It is not their data to give away. Obviously one cannot
physically own data, but the similarities are closer than with, for example,
intellectual property.

If anything happens outside the boundaries of what data processing is allowed,
by negligence or malice, this then becomes a matter of civil law. In the case
of data leaks it would be up to a court to assess any economic value to the
data. A multiple of the value it could have been sold for is what settlements
are based on in similar matters, and this also scales well to multiple
claimants.

European data protection laws have taken small steps in this direction and I
think it is a sound principle.

------
sabaker
Not to put you out of a job, but I do think it's important for the government
to take responsibility to coordinate and inform citizens of known issues (via
CFPB) and provide a clearing house of information and research for citizens.
CERT, who tracks vulnerabilities, is almost useless it seems today, but if it
would coordinate with the CFPB to determine impact to citizens by various
business, creating a proactive service for business as well as citizen.
Information coordinated between CFPB an NSA on the vulnerabilities they find
as they float their way thru the internet, goes to CERT and CFPB, then used to
both inform the companies of their issues and make for speedier
fix/notifications/reparations. While NSA wants to exploit what it finds for
purposes of cyberwar advantage, has a duty first to protect it's own citizens.

In such a system both businesses and citizens gain value. The arrangement
gives businesses the advantage of outside expertise, letting them quickly
resolve issues they may not have the money or expertise to find themselves, so
they appear proactive and responsive to customers. Citizens gain visibility
into, and accountability from, the companies who don't resolve issues, because
that same information is made public. I'm not opposed to a reasonable grace
period between telling a company it has a problem and making the problem fully
public.

Companies generally only focus on security to the extent they understand the
risks of the impact. Make the impact very clear, and give those impacted a
strong selection of reparations (a credit monitor being only a pale solution,
useless to many today). I think the CFPB could come up with additional
recommendations on reparations from companies that citizens would accept.

Weakness of this argument: government is slow, evidently even with
information. Coordination and competing purposes between government orgs are
nothing to sneeze at - they are hard to resolve. Businesses who feel they
cannot quickly or inexpensively resolve a problem given to them will attempt
to hide it, working against the idea that businesses will gain value.

------
maweki
What I find important to talk about is the narrative that "if you think your
company hasn't been broken into, you just don't know it".

Of course no system is 100% secure, but the narrative that it's inevitable
anyways is often used as a defense for bad practices (like in the Equifax
case).

Breaches might happen but it is not inevitable. And good practices can still
have an impact on how often breaches happen, how much is stolen, and how
useful the data is to the intruder.

I always get angry when I see some company head saying "we will be broken into
anyways, why even try?". They wouldn't take that attitude with their physical
company location, so why with their network? Because they don't really care
about other people's data.

------
bpchaps
I'd love to see a discussion around data breaches in connection with public
records laws. There's a huge lack of responsibility and auditing around them,
which leads to large batches of information being wrongly released to the
public. This is a systemic problem whose fix likely sits somewhere within a
strongly enforced legal process and accountability framework.

For example, when Seattle accidentally gave me millions of emails:
[http://crosscut.com/2017/10/seattle-information-
technology-d...](http://crosscut.com/2017/10/seattle-information-technology-
department-email-leak-city-scrambles/)

------
hotsauceror
\- Current penalties do not weight the cost-benefit equation sufficiently to
overcome the financial and agility costs of implementing strong data security
policies.

\- Current remedies are designed for the convenience of the entity (write a
check to a credit monitoring service and issue a public apology). The burden
of action remains on the person whose data was collected, possibly without
their knowledge or consent.

\- Lack of accountability. It is difficult to overstate the impact of a breach
like the Office of Personnel Management's SF-86 database, or any of the NSA
leaks. That degree of negligence could arguably have been treated as a
treasonable offense.

------
mjevans
Power and prestige are another aspect.

If companies like Equifax held only collections of public records (facts),
then a 'breach' like this would have no consequences; all of the data would
already be public.

What presently gives them power and what makes this breach so bad, is that
these facts are used as proxies for an actual form of
identification/authentication.

A national ID based on strong cryptographic solutions and issued to all
citizens, preferably with their own private keys being also signed by the
government if they desire, is how we properly enable digital signatures and
progress to an age where forgery of identity is far more difficult.
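To make the digital-signature idea concrete, here is a hash-based (Lamport) one-time signature sketch built on nothing but a hash function; a real national ID would use a standardized scheme such as Ed25519, but the verify-against-public-key principle is the same:

```python
import hashlib
import os

def keygen():
    """256 pairs of random secrets; the public key is their hashes."""
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(256)]
    pk = [[hashlib.sha256(s).digest() for s in pair] for pair in sk]
    return sk, pk

def _bits(msg: bytes):
    d = hashlib.sha256(msg).digest()
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    """Reveal one secret per message bit (one-time use only!)."""
    return [sk[i][b] for i, b in enumerate(_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    """Anyone holding only the public key can check the signature."""
    return all(hashlib.sha256(s).digest() == pk[i][b]
               for i, (s, b) in enumerate(zip(sig, _bits(msg))))

sk, pk = keygen()
sig = sign(sk, b"citizen attests: this is my address")
assert verify(pk, b"citizen attests: this is my address", sig)
assert not verify(pk, b"forged attestation", sig)
```

The key property: the public half can leak in every breach on earth without enabling forgery, which is exactly what SSN-as-identifier lacks.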

------
SenecaCarthidge
Shouldn't consumers have privacy and anonymity by default? Unless there is
specific, explicit permission to collect data that can be personalized, that
data shouldn't be created.

It's fine for companies to collect basic usage and telemetry data - but when
they start personalizing it without a user's permission/consent (i.e. when
they start tying it together with personally identifiable information, as
Facebook and Google do), it becomes weaponized and can do great harm to
individuals. Privacy by default - and anonymity by default - would essentially
prevent this.

------
BeetleB
Storing passwords in plaintext or without salting is something even teenagers
don't do.

I'm not kidding - some teenager using Django will have a site with better
security than what we've seen from some large companies in their data
breaches. This is inexcusable. The narrative is often "smart hackers", when
it's really "we did less than my teenage kid did in securing the data".
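For reference, the baseline being described is just salted, slow hashing; a minimal sketch using Python's standard library (Django's default, for comparison, is salted PBKDF2):

```python
import hashlib
import hmac
import os

def hash_password(password: str):
    """Salted scrypt: a unique salt per user defeats rainbow tables,
    and the memory-hard KDF slows brute-force attempts."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1, maxmem=64 * 2**20)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=2**14, r=8, p=1, maxmem=64 * 2**20)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
assert verify_password("hunter2", salt, digest)
assert not verify_password("hunter3", salt, digest)
```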

------
rrggrr
Extend existing product liability/tort legislation to data privacy.

Product safety in the US is the world standard precisely because plaintiff
attorneys extracted enough cash from manufacturers that shareholders, banks,
investors, board members and insurers forced reform.

Make failure expensive. Limit liability for small business. Make officers
personally liable in cases of gross negligence.

This and about 5 years will deliver results.

------
hh2222
The CEO and company management should eat their own dogfood. Their personal
data should be stored in the same insecure systems as their victims.

~~~
JumpCrisscross
I don’t think LifeLock’s founder regrets making his Social Security number
public. Sure, his identity was stolen over a dozen times [1]. But he made
millions. Making businesses liable for data loss is the only stable long-term
solution.

[1] [http://www.businessinsider.com/lifelock-symantec-ceo-
identit...](http://www.businessinsider.com/lifelock-symantec-ceo-identity-
theft-ftc-charges-2016-11)

~~~
MiguelHudnandez
Was it stolen because he literally broadcast it, or because it was stored in
his service? Big difference.

~~~
JumpCrisscross
Point is the cost of putting his information at risk outweighed the benefits
he derived from the company.

------
jstewartmobile
Most of our representatives are so subservient to money that this seems more
like an exercise in exploit-the-nerd for CYA than anything constructive.

I think the best you could do is condense some of the worst offenses into
tweet-worthy " _sick burns_ " that will _hopefully_ be remembered for more
than a few minutes.

------
mjevans
Advice AGAINST reference solutions or examples for how to achieve compliance;
they will cause paralysis and market stagnation.

An example of this can be seen in data-science related to FDA activities;
there is an incredibly heavy bias towards specific proprietary software and
data-storage formats (and I don't mean Microsoft).

------
walid
Derive from other people's testimony:
[https://www.schneier.com/blog/archives/2017/11/me_on_the_equ...](https://www.schneier.com/blog/archives/2017/11/me_on_the_equif.html)

------
garfieldnate
Start by typing the names and email addresses of every congressman/woman into
[https://haveibeenpwned.com/](https://haveibeenpwned.com/). If the results are
scary, share them in your speech.

------
yndoendo
One thing that could limit the exposure would be to allow individuals to limit
and set controls on the information about them that is being collected. Less
information collected and stored means less of a liability for both parties.

------
johnhenry
Data breaches by an unauthorized third party are a likely result of forcing
companies to use weak encryption and back doors in order to allow government
access.

Also, if you can throw something in about continuing net neutrality, that
would be great.

------
w8rbt
Tell them to pass a law that makes it illegal to set obvious passwords. That
is negligent and it harms others.

It's illegal for a bus driver to get drunk and drive a bus full of people. Why
is it not illegal for a sys admin to set 'password2017!' or a developer to set
'developer2017!' as the admin password on a website and increment the year
when it's time for a change? I've seen first-hand (on multiple occasions) how
bad passwords like these harm people on-line. It ought to be against the law.

If you do the security basics right (patching, passwords and logging) you'll
be fine 99% of the time. But people won't even do that (it's tedious and
boring and not sophisticated). Instead, they obsess over APTs, zero-day
exploits and nation-state actors, when they really just need to start by
patching and setting decent passwords.
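The incrementing-year anti-pattern above is so mechanical that even a toy filter catches it; a hypothetical check (the regex covers only the specific pattern described, not password strength in general):

```python
import re

def is_year_incremented(password: str) -> bool:
    """Flags the 'word + year + punctuation' pattern described
    above, e.g. 'password2017!' or 'developer2018'."""
    return bool(re.fullmatch(r"[A-Za-z]+(19|20)\d{2}[!@#$%]?", password))

assert is_year_incremented("password2017!")
assert is_year_incremented("developer2018")
assert not is_year_incremented("correct horse battery staple")
```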

------
bmh_ca
SIGINT resources are under-allocated for domestic defence of civilian assets.

------
cjhanks
Tell them that the data security of US companies is a matter of national
defense. The US military would not stand for foreign powers attacking US
interests abroad (shipping ports, oil wells, fisheries, etc), and they should
not stand for it here.

A reasonable amount of money should be dedicated to an intelligence service
attempting to penetrate companies which are of significant national interest.
Fines with increasing severity should be assessed to the responsible parties -
and the vulnerabilities should be communicated in private.

For cases of the most persistent gross incompetence and negligence, companies
should not be permitted to continue operation. Such powers exist in other
agencies.

------
specialist
Tell them the truth:

All demographic data everywhere is vulnerable, because it must be stored as
plaintext, because we don't have nationwide unique identifiers.

------
tryingagainbro
Make it like health records: huge penalties based on what you store / what
gets hacked. Then watch them try to get insurance without security.

------
DougN7
Please make sure they understand if they mandate any encryption 'backdoors',
these breaches will happen much more often.

------
zymhan
Social Security Numbers are broken as a "unique secure identifier". We need
some sort of certificate-based solution.

------
tofflos
Encourage them to start using password managers so that it becomes more
difficult to steal the identities of congressmen.

------
ngold
We never signed anything giving permission to anyone who is selling our
private lives and personal data.

------
swiley
It should be illegal to use knowledge of ID numbers to authenticate things
that will affect your credit report.

------
anonymous5133
Why should a company have all of our user data to begin with? Why do you store
it in the first place?

------
JohnnyConatus
Tell them you have their private browser history.

------
hguhghuff
Say "Well I don't know, but I asked hacker news and here is what they said"

------
StreamBright
Not even a single company should have our private data. Probably this.

------
make3
tell them to return to paper voting

------
puppetmaster400
"Companies that need to be accountable need a specific person to be made
accountable - for example, a CIO/CSO."

------
wang_li
Tell them that data breaches will be a thing of the past if they pass two
laws:

1. Criminalize moving PII out of the country.

2. Personal liability for every person involved in gathering and protecting
the data, and for those involved in managing the teams and companies.

The fact that people can get paid while externalizing the downsides of their
failures is why this is a problem. Make them personally responsible and it
goes away.

~~~
jstanley
What's the point of "Criminalize moving PII out of the country" ?

Most of the risk comes from people who are already inside your country, or
from people who break into your servers and therefore won't care about moving
it outside your country.

~~~
wang_li
Point 2 becomes meaningless if a company can just move its operations offshore
while fully operating in your country.

