
Sen. Wyden proposes bill that could jail executives who mishandle consumer data - walterbell
https://www.theverge.com/2018/11/1/18052254/ron-wyden-privacy-bill-draft-consumer-tracking
======
techntoke
Wyden has been doing a great job lately keeping current with changing
technology:

[https://gizmodo.com/sen-wyden-urges-dhs-to-adopt-new-encrypt...](https://gizmodo.com/sen-wyden-urges-dhs-to-adopt-new-encryption-tech-to-pr-1830001179)

------
joatmon-snoo
Official press release: [https://www.wyden.senate.gov/news/press-releases/wyden-relea...](https://www.wyden.senate.gov/news/press-releases/wyden-releases-discussion-draft-of-legislation-to-provide-real-protections-for-americans-privacy)

The links below are from the above press release.

Text of the draft:
[https://www.wyden.senate.gov/imo/media/doc/Wyden%20Privacy%2...](https://www.wyden.senate.gov/imo/media/doc/Wyden%20Privacy%20Bill%20Discussion%20Draft%20Nov%201.pdf)

Official sparknotes:
[https://www.wyden.senate.gov/imo/media/doc/Consumer%20Data%2...](https://www.wyden.senate.gov/imo/media/doc/Consumer%20Data%20Protection%20Act%20section%20by%20section.pdf)

Official 1-pager:
[https://www.wyden.senate.gov/imo/media/doc/Wyden%20Privacy%2...](https://www.wyden.senate.gov/imo/media/doc/Wyden%20Privacy%20Bill%20one%20pager%20Nov%201.pdf)

~~~
joatmon-snoo
Super nice to see legislation being proposed about this. Miscellaneous
thoughts:

- S7.b.1.D seems nice, but I can't help but wonder what it means to "verify"
a consumer, and the potential for breaches that this poses.

- S7.b.1.A "establish and implement reasonable cyber security and privacy
policies, practices, and procedure": the devil is in the details here,
particularly in how "reasonable" is defined.

- S6.b.1.B (paraphrasing, because the original text is a clusterfuck to
comprehend, and I think I've got it right): companies must respect a consumer's
opt-out, unless they allow the consumer to pay a fee, capped at the estimated
value that the company would get out of said consumer's data.

I have no idea how S6.a.1 is going to go: "implement and maintain a ‘‘Do Not
Track’’ data sharing opt-out website— (A) that allows consumers to opt-out of
data sharing, view their opt-out status, and change their opt-out status; "
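The three capabilities S6.a.1 lists (opt out, view status, change status) amount to very little state per consumer. A minimal sketch with illustrative names (the draft specifies behavior, not an API):

```python
class OptOutRegistry:
    """Toy model of the backend state behind the S6.a.1 "Do Not Track"
    opt-out website. All names are illustrative; the draft only
    describes the required behavior."""

    def __init__(self):
        self._opted_out = {}  # consumer_id -> bool

    def opt_out(self, consumer_id: str) -> None:
        # (A) allow consumers to opt out of data sharing
        self._opted_out[consumer_id] = True

    def status(self, consumer_id: str) -> bool:
        # view opt-out status; consumers default to not opted out
        return self._opted_out.get(consumer_id, False)

    def set_status(self, consumer_id: str, opted_out: bool) -> None:
        # change opt-out status (including opting back in)
        self._opted_out[consumer_id] = opted_out
```

The hard parts the draft leaves open (how the site verifies a consumer, and how opt-outs propagate to data sharers) are exactly what the comments above worry about.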

I have no policy expertise, sadly, so I have no idea how this will fare. As a
techie, though, it looks like a good first step.

------
slededit
Seems to be the only solution in the USA: throw people in jail. Both parties
seem to agree on this; they just argue over who specifically should get sent.

~~~
FranzFerdiNaN
Sending rich people to jail is the only real punishment, because fines don’t
matter for rich people and large corporations.

ING was fined almost a billion euros in the Netherlands for willfully and
knowingly laundering money for many years. Within a month of paying the fine
they had already made enough profit to earn that money back. So who cares
about the fine? It's just the price of doing business.

The same goes for fines for rich people. Who cares about a fine for parking in
the wrong spot if it is the equivalent of 10 minutes' work? But if you're
poor, that fine may represent a whole working day, which is a lot, and thus
fines work on poor people.

The solution, of course, is to make fines depend on income and wealth. Jeff
Bezos would pay 10 million for a traffic ticket, and Joe Schmoe, who earns
minimum wage, perhaps 50 dollars.
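Several European countries already do something like this with "day fines". A minimal sketch of the idea (the divisor and floor are illustrative, loosely modeled on the Finnish system; real schemes also account for wealth and dependents):

```python
def day_fine(monthly_net_income: float, day_units: int) -> float:
    """Income-scaled fine: the offense fixes the number of day units,
    the offender's income fixes the price of each unit."""
    per_unit = max(monthly_net_income / 60, 6.0)  # illustrative floor per unit
    return per_unit * day_units

# Same offense, very different fines:
# day_fine(3000, 10) -> 500.0, day_fine(120, 10) -> 60.0
```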

~~~
orblivion
It depends on whether the fine is commensurate with the damage done and goes
toward correcting it. That may arguably be the case for parking tickets.
Otherwise it just acts as a convenience fee for rich people.

------
onetimemanytime
Not gonna work. The CEO can't know everything, and unless people testify
against him (they'll avoid a paper/email trail) he'll plead ignorance. It's 38
pages now, and if it passes it will be 380; everyone will want to get their
two cents in.

Hold companies responsible with fines of 10% of their revenue for a first-time
violation. In cases of absolute negligence and/or malice, jail C-level execs.
But just because Google gets hacked, we can't jail their CEO.

~~~
titusjohnson
Nah, lock up the CEO. Who cares if they can't know everything; that's not a
good excuse. If the company is so big that the executive team can't keep a
firm grasp on what's going on, then maybe it's too big to exist.

Locking up executives (a risk for which they are richly rewarded already) puts
a natural cap on how large a corp can get.

------
tzs
I'm a bit unclear on the criminal penalties section. (I'm using the draft
linked in joatmon-snoo's comment)

There is an annual report that the company must file with the FTC. That report
must be accompanied by a written statement from the CEO, the chief privacy
officer, and the chief information security officer certifying that the report
fully complies with section 5(a) of the Consumer Data Protection Act.

The criminal penalties for those three officers arise if they so certify the
report knowing that the report does not comply.

What confuses me is that there are two levels of possible penalties.

The first level is if one of those officers "certifies any statement as set
forth in subsections (b) and (c) of this section knowing that the annual
report accompanying the statement does not comport with all the requirements
set forth in this section".

That can subject them to a fine of up to $1 million or 5% of their largest
annual compensation over the past 3 years (whichever is bigger), and/or up to
10 years in jail.

The second level ups those limits to $5 million, 25%, and 20 years. The
conditions for the second are word for word the same as the first level except
the word "intentionally" is inserted before "certifies" so it becomes
"intentionally certifies any statement..." instead of "certifies any
statement...".
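Numerically, the two levels differ only in their constants. A sketch of the caps as described above (figures from the paraphrase; the draft's exact computation may differ):

```python
def fine_cap(largest_annual_comp: float, intentional: bool) -> float:
    """Maximum fine under the draft as paraphrased above: the greater of a
    flat amount or a percentage of the officer's largest annual compensation
    from the past three years. (Jail terms, 10 vs. 20 years, not modeled.)"""
    if intentional:
        return max(5_000_000, 0.25 * largest_annual_comp)  # second level
    return max(1_000_000, 0.05 * largest_annual_comp)      # first level
```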

I'm having trouble seeing how an officer could run afoul of the first without
running afoul of the second. How do you certify the report without
intentionally certifying it?

At first I thought the distinction was going to be whether or not the officer
knew the report was bad, but no, both cases require that the officer knows
that the report is bad. The lower penalties are for certifying a report you
know is bad. The higher penalties are for intentionally certifying a report
you know is bad, so the distinction is whether or not the certification is
intentional.

~~~
thechao
Negligence. Well-written law always includes the notion of _mens rea_: did you
do it on purpose, or did you just do it? As an analogue, consider the
difference between manslaughter and murder.

~~~
tzs
What's confusing me here is that I'm having trouble seeing how it is possible
for a CEO or other officer to _unintentionally_ produce a statement certifying
that the annual privacy report meets the legal requirements.

Suppose Alice is the CEO and Bob is the CPO. Their minions produce the annual
privacy report, and give it to Alice and Bob to certify and send to the FTC.

Alice and Bob both know that the report does not meet the legal requirements.

Alice and Bob nevertheless both certify it and it is sent to the FTC.

I don't see what Alice and Bob could do different from each other that could
lead to one of them being liable for "intentionally certifying" the bad report
and the other being liable for merely "certifying" it.

Compare to, say, killing a pedestrian by driving through a crosswalk that you
know is occupied. It would then make sense to distinguish between

• driving through a crosswalk that you knew to be occupied, and

• _intentionally_ driving through a crosswalk that you knew to be occupied.

You could do the former by seeing that the crosswalk is occupied, intending to
stop when you got closer, getting distracted by something, and losing track of
where you are.

The "intentionally" modifier makes sense there because you can unintentionally
drive through a crosswalk that you know is occupied.

With certifying a report...what? You were not planning on signing the
certification, but your secretary accidentally put it in a pile of things
awaiting your routine signature and you signed it without realizing what it
was?

------
dbg31415
I think you need a few prongs, but I'm not keen on CEO jail time. CEOs likely
don't have enough visibility into the day-to-day to have an impact on data
security. They set the tone, but they won't have time to personally validate
that the work is being done to proper security standards.

A few ideas, to be used in conjunction:

1) Fine the shit out of violating companies. Base fines on the same aggressive
tactics used to compute fines for piracy -- leak 1M records, pay $100 x 1M
records. And make them set up identity protection for their consumers -- all
that good stuff.

2) Reward good behaviour. Set up a third-party standard; if a company submits
to it, and passes, give them one strike per year where they don't have to pay
the fine. Worried about how to set these standards? Don't be; EY or Accenture
or Deloitte will set fairly comprehensive ones, as long as they are allowed to
charge for the audits.

3) Protect whistleblowers and offer rewards. If you go through proper
channels, you won't be fired. And if your company fails to act, and you then
narc on them, you should get a bonus. A massive fucking bonus. You see shady,
you report shady, you get a hero's bounty.
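Points 1 and 2 combine naturally: a flat per-record fine, waived once per year for companies that pass an accredited audit. A sketch with the illustrative figures above:

```python
def breach_fine(records_leaked: int, passed_audit: bool = False,
                strikes_used_this_year: int = 0,
                per_record: float = 100.0) -> float:
    """Per-record fine from point 1 ($100 x records, an illustrative figure),
    with the one-free-strike-per-year audit reward from point 2."""
    if passed_audit and strikes_used_this_year == 0:
        return 0.0  # first strike of the year is forgiven for audited companies
    return records_leaked * per_record
```

E.g. a 1M-record leak costs an unaudited company $100M, while an audited company on its first strike pays nothing.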

~~~
tzs
The jail time is only for people who know that the required annual data
protection report the company files with the FTC does not comply with the
requirements of the law, but certify to the FTC that it does.

------
Cyclone_
Jail seems a little extreme; the punishment doesn't fit the crime here.

~~~
Apocryphon
Powerful corporate elites need a deterrent. If only more financial evildoers
had joined Bernie Madoff in prison after 2008.

~~~
jbob2000
There are two outcomes to this:

1) Quality executives say "fuck this" and the only people who hold the roles
are fall guys for invisible puppet masters.

2) The role is split up across a bunch of departments so that nobody can be
held responsible. The side effect of this is that the products the company
produces become much more expensive.

~~~
Apocryphon
1\. The law, and its enforcers, are presumably flexible enough not to
prosecute only the figureheads, but any accompanying accomplices as well.
Puppets can be made to accept plea bargains and squeal.

2\. Corporations are people, my friend, and should be punished accordingly as
persons if there isn't an actual human who can be made liable.

And given the current fracases over outsourcing, immigration, big-box chain
stores, vertical integration, monopolies, ad-funded tech, etc., it isn't
convincing to shoot down a proposal simply because "the products become much
more expensive." The public is no longer as easily mollified by cheaper goods
as it was a decade or two ago.

------
nil_pointer
This effort is welcome! Sen. Wyden is doing a great job pushing this.

------
jacquesm
In other news, Mark Zuckerberg just resigned as Facebook CEO.

For companies whose shareholders are not executives, the executives will now
have an incentive not to mess up. The problem is that executives rarely have
enough overview of what is happening at the lowest layers of a company to
ensure that data leaks cannot happen; they may want to set the tone and
environment, but that by itself is absolutely no guarantee.

Though it is good - and probably effective - that it isn't just the
shareholders of a company being punished through fines (they have even less
insight!), I highly doubt that jailtime is the right solution.

Personally I'd rather work with the carrot than the stick:

Any employee who comes forward with reports of mishandling of consumer data,
or of failure to report a data leak, should be awarded a percentage of the
fines levied. That way there is a much bigger chance of actually learning that
these things are happening, and where, and companies with a sick culture would
still end up being reported.

~~~
setr
>The problem is that executives rarely have enough overview over what is
happening at the lowest layers of a company to ensure that data leaks could
not happen, they might want to set the tone and environment but that by itself
is absolutely no guarantee.

But given the law and the personal risk, they're now incentivized to find a
solution to that problem (in the same fashion that an employee coming forward
creates a PR risk). If this weren't possible, and viably so, then data leaks
would be an inevitability, and it wouldn't be sensible to fine the company
either (they can't get insight into the matter, so they can't do much to stop
it from occurring; are you going to fine them for succumbing to a natural
disaster?).

If you don't expect top-down insight to be possible, then I'm not sure how
your carrot would plug the leak. It would announce the leak and let people
prepare for the damage, but not stop it from occurring again (the only people
with insight are the middle and lower ranks, and the lower you go, the less
you are affected by the fine).

But of course it is possible, and it is sensible to fine the company, and I
don't see why it wouldn't be sensible to expect a C-level officer to have some
insight into the matter. Of course, a punishment as strong as jailtime for
_any_ data loss is too much. But data loss after failing to reach some defined
minimum standard of protection for the type of data (essentially leaving it in
a state where loss is _expected_)? It seems reasonable to enforce larger
punishments against the people in charge of ensuring that those minimums are
met. E.g. if security audits are never done, it's negligence. If the data is
important or even just large enough, it's criminal negligence.

tl;dr: your carrot has the same dependency as the stick: if one can hope to
effect change, the other should too. (Which is more fair/effective is a
different matter.)

~~~
jacquesm
> But given the law and personal risk, they’re now incentivized to find a
> solution to that problem

The only thing an executive can really do is issue a directive along the lines
of 'we will do things this way', and then leave the interpretation of that
directive to the lower ranks. Any mistakes in the implementation - even if all
the interests are aligned - could lead to later accusations of mishandling of
data.

I think that by now we can conclude that security is a hard problem, and that
even companies that do their best (unlike Equifax) are at risk.

Criminal negligence for executives is going to be very hard to prove,
especially when a CEO could point to some directive that had the right
intentions.

Also great to see a 'defined minimum standard of protection', but that
definition had better be ironclad if you expect people to serve jailtime when
those standards are violated.

I see a different company every week, and it is quite surprising how wide the
range of security implementations is, from 'perfect' to 'incredibly sloppy';
I've yet to meet an executive who understood things well enough to know what
risk exposure they had. Jailing such a CEO would just be petty revenge, and
would not have the desired effect.

If you are going to enforce jailtime, then punishing the CTO/CISO/CCO would be
more effective; at least they have the relevant knowledge and would be far
better placed to effect meaningful change.

We already have similar mechanisms for lower level employees violating various
reporting laws regarding suspicious transactions (notaries, accountants, bank
employees).

~~~
jimnotgym
> The only thing an executive can really do is issue a directive along the
> lines of 'we will do things this way', and then leave the interpretation of
> that directive to the lower ranks.

That is not really true.

1) Executives can allocate sufficient resources to data protection.

2) Executives can set up a data protection committee that has similar powers
to the audit committee. In the UK this would normally be placed under the non-
exec chairman's control.

3) Executives can insist on an external audit.

Corporate governance is complicated, but not impossible. Executives manage to
keep control of the shareholders' money despite never touching it themselves.
They can learn to do the same with data.

