
CCPA Will Hit Dev Teams Harder Than GDPR - icoe
https://www.tonic.ai/blog/ccpa-will-hit-your-dev-team-harder-than-gdpr
======
jarvuschris
Counting an IP address as PII is kind of crappy; you need a court order to
turn an IP alone into PII.

Operators should be free to log traffic at the network level; PII should only
come into play once you're asking someone to provide personal information.

~~~
ehnto
Yeah it is odd. You decided to hit my server, I should be able to record the
occurrence. How am I supposed to deflect DoS attacks if I can't maintain a
list of nefarious IPs? I know that's a fairly low-tech attack, but they still
happen constantly. Is Fail2Ban no longer compliant?

I wouldn't be surprised if some policies pertaining to record keeping in some
sectors contradict that requirement as well.

~~~
IanCal
Not sure about this law, but that sounds completely fine under GDPR. You need
to keep your log files secure and not keep them longer than necessary for what
you're doing, though.

[https://termsfeed.com/blog/gdpr-recitals/#Recital_49_8211_Ensuring_Network_Security_as_a_Legitimate_Interest](https://termsfeed.com/blog/gdpr-recitals/#Recital_49_8211_Ensuring_Network_Security_as_a_Legitimate_Interest)
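
If you're worried about logs counting as personal data at all, one
belt-and-braces option is to truncate addresses before they're ever written,
e.g. zeroing the host bits. A rough sketch (the prefix lengths here are just
illustrative, not anything the regulation prescribes):

```python
import ipaddress

def anonymize_ip(addr: str) -> str:
    """Zero the host portion of an address so the logged value
    no longer points at a single machine."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(net.network_address)

print(anonymize_ip("203.0.113.57"))           # 203.0.113.0
print(anonymize_ip("2001:db8:abcd:1234::1"))  # 2001:db8:abcd::
```

You still get enough of the address for rate limiting and abuse patterns,
without keeping a per-host identifier around.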

------
lxe
> if a data breach occurs, the law permits consumers to recover up to $750 per
> incident

This is great!

~~~
kevin_b_er
Simple, you just add this to the clickwrap agreement:

The Parties mutually agree that any and all disputes arising from or relating
to this Agreement, including the interpretation or application of this
Agreement, will be submitted exclusively to final and binding arbitration
pursuant to the Federal Arbitration Act. The arbitration will be conducted in
the state of Delaware or such other location as the Parties may agree, by a
single arbitrator in accordance with the substantive laws of the State of
Delaware.

Boom. No more pesky California law.

~~~
carbocation
I dislike how the minute someone mentions a legal hack, the responses are "oh,
are you a lawyer?"

Why not consider this reply on its merits?

~~~
ChrisSD
"Legal hacks" are rarely, if ever, as clever as their proponents think.
Scepticism is natural and warranted.

Judges aren't complete morons and will take a dim view of "hacks". There could
be loopholes somewhere but you'd need a lawyer to spot them.

~~~
jordigh
One of the most famous "legal hacks", Richard Stallman's copyleft, had to be
rewritten by a lawyer. rms wrote GPLv1 by himself and you should never use it.
GPLv2 is the version that was actually vetted by a lawyer.

A similar thing happened with Perl's Artistic License. Its version 2 is
basically also a lawyer-approved rewrite.

In other words, hackers, don't try this at home. There are professionals who
can do this for you.

~~~
nickpeterson
I find it somewhat sad that law is basically a guild where arcane language is
used to gatekeep what should be a much more straightforward exercise.

~~~
Karrot_Kream
It's not. It's the equivalent of saying "I can do this better" and producing
unreliable, buggy code. Sure you can, but a more experienced professional can
point out all the corner cases you missed.

~~~
sbov
Then when it fails, you blame the programming language rather than your
experience in programming.

------
nixpulvis
"It defines de-identified as “information that cannot reasonably identify,
relate to, describe, be capable of being associated with, or be linked,
directly or indirectly, to a particular consumer.”"

I'd love to know what they mean by reasonable... I've seen some demos of tech
that can do some pretty amazing things at de-de-identifying.

~~~
icoe
So, huge caveat (I'm NOT a lawyer), but right now most interpretations seem to
suggest that masking and synthesizing would constitute appropriate
de-identification, even if a motivated adversary could reverse-engineer it
given appropriate time and resources. Again, this is something that will
likely be clarified over time.
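
To give a flavor of what "masking" usually means here: direct identifiers get
replaced with keyed, one-way tokens, so test records stay internally
consistent (joins still work) but can't be mapped back without the key. A
minimal sketch, where the field names and key handling are purely
illustrative:

```python
import hashlib
import hmac

# Illustrative only: in practice this lives in a vault, not the codebase.
SECRET_KEY = b"store-me-outside-the-dataset"

def mask(value: str) -> str:
    """Deterministic pseudonym: identical inputs map to identical
    tokens, but reversal requires SECRET_KEY."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:12]

record = {"email": "alice@example.com", "plan": "pro"}
masked = {**record, "email": mask(record["email"])}
```

Whether that counts as "reasonable" de-identification under the statute is
exactly the open question above.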

------
gwbas1c
Great article, until the end.

Who uses PII in test data derived from real customers? That's just an absurd
practice to begin with, and no one who takes security seriously would even
consider doing this.

~~~
alkonaut
I have never seen a “dev” instance of a DB that wasn’t just a snapshot of the
prod DB from earlier. I admit I haven’t seen many, but I have seen zero of any
other kind (e.g. anonymized or synthetic)

~~~
sjjshzvuiajhz
Just going to throw out there that I’ve never seen a dev database that was
anything other than fake data, or internal dogfood data. Have worked at major
public tech companies and late-stage startups.

~~~
alkonaut
I think one reason might be that this was never sensitive personal data. Phone
numbers, emails and addresses were mostly corporate. But real passwords
(hashes) from real users, on 50+ laptops with unencrypted drives, was pretty
normal.

I think culturally there may be a difference since I'm in a place where some
data (addresses, phone numbers, ...) is public info, i.e. given your name I
can get your address and phone number from a public DB anyway.

------
kodablah
Going into effect in a year? Seems like a business opportunity. Someone let me
pay them $X and review my systems every so often and give me a seal saying I'm
compliant with all these laws, and include some insurance up to $Y. Especially
given the selective enforcement, there's money to be made from the chill
alone. Compliance audit companies can probably just roll this into their
package.

Also, I'm a bit annoyed at laws only affecting companies of a certain size. At
some point right at crossing the line, there's a negative effect to having
50,001 users. (Really, I'm annoyed at how these data protection laws are
implemented in general, and I wish the discussion would be about that instead
of being idealistic and only looking at the supposed intent.)

~~~
dmitriid
> how these data protection laws are implemented in general and I wish the
> discussion would be about that instead

Let’s do that, shall we?

Before GDPR there were laws in each European country protecting private data
(GDPR is basically Sweden’s data protection law in that regard).

Not a single “poor company that will need to comply” gave a damn.

Then GDPR was introduced, discussed, amended. Quite publicly. Not one of the
“poor devs that would be hit by it” gave a damn.

GDPR was passed and companies were given two years to adjust their
software/systems/business practices to comply. Hardly any of the “let’s have a
discussion shall we” devs gave a damn until the last few months of the
transition period.

And only when they realized that they had to actually do something, something
they should have done literally _years_ ago, we had (and still have) this fake
outcry of “boohoo these laws make us work hard and do right things and we
don’t wanna”.

Cry me a river.

~~~
SomeHacker44
As a top engineer at an EU-headquartered company, I can be one instance of
saying this was not true of us. We started our preparations almost a year and
a half in advance of the May 2018 deadline. Once we engineers and our GC
were done interpreting the extent of what we believed we needed to do and the
resources to do it, we were basically ordered by the CEO to do as little as
possible as late as possible, automate as little as possible, and just wait to
see if anything came of it. I left the company a few months after GDPR-day so
cannot say how it worked out, but it was the CEO’s company and his choice to
do it in a way that it then became my responsibility to implement.

Compliance/legal is a company risk and, as I indicated in the Challenger
article here a few days ago, as an engineer I can advise on what the risks are
and the potential consequences of bad outcomes, as well as the costs to reduce
them. The business decides what level of risk to take. I personally would have
preferred a robust response to GDPR and thorough internal procedures, but it
was not my call to make.

Of course, I personally believe that we humans should own our data and digital
footprints, so I agree with a lot of the concepts behind GDPR and CCPA even if
I do not agree with all and as an engineer may think some are ...
silly/overzealous/misguided or what have you. Case in point: the IP tracking
discussion above. If I hit your network, that's on me (barring externalities or
bad actors, etc.). Retention periods and use definitions are fine, but a
requirement to treat it as PII or other super sensitive data seems a bit much
to the engineer in me.

~~~
dmitriid
Yes, true. In the end it comes to business decision. My focus on devs is
mostly because it's devs who comment and complain on HN, so my comments are
mostly geared towards them.

It's true, businesses (or the people who run them) will in the end judge the
direction where the company will go, and their judgment is often worse than
that of developers.

So yes, I would replace "devs" etc. with just "companies" in my comment.

~~~
SomeHacker44
I wouldn't say "worse" judgement in general, just "different". I have had both
"worse" and "better" cases.

However, the better integrated and communicated the company's goals and
rationales are, the more aligned the judgements become.

------
afpx
When using personal data is outlawed, only the outlaws will use personal data.

What about all of the state actors (and 'hackers') who are cracking
corporations for data and building a massive database on everyone?

~~~
chias
> When using personal data is outlawed, only the outlaws will use personal
> data.

This argument only works if you feel the thing being outlawed is _good_ (it is
most commonly used in the context of privacy). To your statement I would
respond the same way as I would respond to "When shooting people is outlawed,
only the outlaws will shoot people": sounds good to me!

~~~
afpx
I meant, in jest, that if laws get tougher on the private sector, I hope that
the government also throws a lot more money at data crime, too.

------
DiabloD3
So, devil's advocate here: why not just require your ToS to state that if the
user is from the state of California, that they are to not use the service and
find a local alternative?

It is a state law, they can't hassle you if you're not Californian and do not
service their target market. Most of America doesn't live there, and
California seemingly doesn't want you to do business there.

~~~
Novashi
>So, devil's advocate here: why not just require your ToS to state that if the
user is from the state of California, that they are to not use the service and
find a local alternative?

Silently redirect them to a similar-enough site run by a partner company
that's based in another state/country.

~~~
anticensor
That would be considered anticompetitive behaviour.

------
bcheung
Does this mean another swarm of privacy popups everywhere again?

~~~
coldacid
If we're lucky.

------
fuzzy2
OT: what's wrong with this website? It loads super slowly and behaves very
weirdly on my iPhone.

~~~
icoe
Apologies. We're using wix right now. We'll be moving off shortly.

~~~
Groxx
Ah, yeah, that's normal for Wix :| It works for making sites, but it never
works well.

------
briffle
This seems like a good discussion in which to ask: have any of you used a tool
like pg_Anonymizer [0] to mask your data when building test/dev databases? I
see several tools out there, but have no idea where to begin with them.

[0]
[https://pgxn.org/dist/postgresql_anonymizer/0.0.3/](https://pgxn.org/dist/postgresql_anonymizer/0.0.3/)
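
To be concrete about what I mean by masking: pull the snapshot, then overwrite
the PII columns with generated values before the copy ever reaches a dev
machine. An illustrative sketch (sqlite standing in for Postgres here; a tool
like the one above does the same thing declaratively):

```python
import random
import sqlite3
import string

random.seed(42)  # reproducible fake values across dev refreshes

def fake_email() -> str:
    user = "".join(random.choices(string.ascii_lowercase, k=8))
    return f"{user}@example.com"

# Stand-in for a prod snapshot; a real pipeline would read from Postgres.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
db.execute("INSERT INTO users VALUES (1, 'real.person@corp.com')")

# Overwrite PII columns in place before handing the copy to dev.
for uid, _ in db.execute("SELECT id, email FROM users").fetchall():
    db.execute("UPDATE users SET email = ? WHERE id = ?", (fake_email(), uid))
db.commit()
```

The hard part the dedicated tools solve is keeping referential integrity and
realistic value distributions across many tables, not the overwrite itself.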

------
rco8786
Already starting to deal with this where I work. It’s gonna be interesting...

------
zestyping
If a company distributes a program or app that processes your address,
personal data, geolocation, etc. _on your device only_ and the sensitive data
never leaves your device, are they subject to the CCPA?

------
raverbashing
So are we going to have websites blacklisting CA IPs and answering back with
some vague "this content is not available in your region"?

~~~
Isinlor
Hopefully not, unless we speak about some hyper-local businesses.

It's now GDPR + CCPA, so you are cutting off EU and California. Probably, more
to come.

For example, seems like LA Times does not block EU anymore.

------
newman8r
The way this came into existence is what scares me.

~~~
ahartmetz
The "if they knew what we know" part, or the "you can (kind of) buy policy"
part? If the latter is shocking to you, I have some very bad news for you.
(The process would have been more difficult and more expensive if the law
wasn't genuinely benefiting the people, but still possible.)

Also, Mr. Mactaggart, what a guy!

~~~
newman8r
The part that bothers me is how hastily the "compromise" was drafted, without
any public debate. I don't like the idea of an individual holding the
legislative process hostage.

There are limits on campaign contributions, perhaps there should be limits on
individual contributions for these signature drives, which are essentially
just large marketing efforts.

And just because this guy got x number of signatures, I don't see why he
should now have the power to make compromise deals with the government.

This isn't my area of expertise so I may be missing something here.

------
truesy
IMO the biggest difference between CCPA and GDPR is that GDPR does not
distinguish between large and small companies. Everyone needs to comply. At
least with CCPA you can bootstrap a company and not have this be another thing
you need to worry about on day 0.

~~~
coldacid
No, just once you reach 50k visitors to your site.

------
dmitriid
Can we stop talking about how privacy laws are hitting devs, and start talking
how they will benefit people?

Boohoo, poor devs need to finally pay attention to people’s private data.

~~~
ThrustVectoring
You need to talk about both costs and benefits when discussing public policy.
Otherwise, you end up with a ton of terrible policy that looks good due to an
obvious tangible benefit, but nets out to more harm than good.

For example, minimum bedroom sizes for rental units. Seems nice to have enough
space to live comfortably, right? End result though is the $20M apartment
complex has 35 units instead of 40, and is only built later when rents have
gone up to make the project make sense financially, exacerbating a housing
shortage.

~~~
dmitriid
Let’s look at the cost, shall we?

Invasive and pervasive surveillance. Private and sensitive data sold wholesale
not even to the highest bidder, but to anyone.

Hell, when news about NSA surveillance broke, it was a huge scandal that was
the focus of attention of all media for more than a year. Now Facebook alone
is reported to have the same level of maliciousness and willful ignorance on
a monthly basis, and it’s business as usual.

So yes, I don’t give a rat’s ass about the “poor developers” who couldn’t get
their shit together and provide privacy and security to the common people. And
who now pretend they are being unfairly punished by governments.

And yes, I’m a developer myself.

~~~
ThrustVectoring
"these costs fall on people who I feel deserve it" isn't a good reason to
completely ignore the size of the costs being imposed. Especially since these
costs are sublinear with respect to organization size, causing the tech
behemoths you complain about to get a free competitive advantage against
upstarts threatening their business model.

------
mark_l_watson
I like the general idea and I like that it specifically applies only to
organizations with more than $25 million in revenue. Give small startups a
break.

Does the GDPR also have a lower limit like this? It should.

~~~
fooey
The criteria are "one or more of the following", not a combination of them
all.

So if you make more than $25 million, OR you have more than 50k users or
devices, OR you make more than 50% of your money selling data.

~~~
AnthonyMouse
Seems like the second one is the real problem. "50K users or devices" is less
than 0.02% market share, even if you have only US customers, and for
businesses with margins in the $1/user/year range it doesn't even cover one
full time employee.

You can end up with that many users on a side project all of a sudden if it
gets posted to the front page of a site like this one.

~~~
pkaye
50K California customers.

~~~
AnthonyMouse
So less than 0.13% market share then.

Assuming you have any way to reliably identify which state your users are in
-- which means we're back to "privacy regulations" encouraging companies to
collect more data on their users.

------
IfOnlyYouKnew
“He started worrying about data privacy after talking with a Google engineer
and spent nearly $3.5 million in 2017/2018 to place an initiative on
California's November ballot.”

That Google employee must be somewhat nervous these days...

------
Novashi
>Process personal information of >50k consumers, households or devices

>Derive >50% of revenue from selling PII

So if I forward all of the data to another company outside of CA, does my
company count as processing data?

What if the code that forwards that data is written by another company and I'm
just hosting it on my site? Everything goes through their code and I'm paid to
just setup a website to host their code.

Maybe I do collect info in CA but I sell the data for $1, but the company also
buys some consulting services for the actual price of that data that I'm
selling them?

~~~
figgis
> So if I forward all of the data to another company outside of CA, does my
> company count as processing data?

You are still processing that data. Part of processing that data involves you
shipping it off...

> What if the code that forwards that data is written by another company and
> I'm just hosting it on my site? Everything goes through their code and I'm
> paid to just setup a website to host their code.

You are as responsible, if not more so, for making sure that compliance is
met. You are the one hosting the code. The data is moving through your
servers.

> Maybe I do collect info in CA but I sell the data for $1, but the company
> also buys some consulting services for the actual price of that data that
> I'm selling them?

That's just being a jerk. But better hope you don't pass the 50k mark...

~~~
Novashi
>You are still processing that data. Part of processing that data involves you
shipping it off...

>The data is moving through your servers.

So if a random company gets breached, everyone involved from cloud providers
to ISPs are also responsible because they facilitated moving and storing the
data and they are just hosting code?

This is problematic. Cloud providers give you permission to publish code. I
could position myself to allow another company to publish code on my popular
website to collect data, and my role is basically no different from a cloud
provider's. We don't have to agree that that's what it's specifically for, I just
need to give them access to upload their own code for whatever expensive fee.

~~~
figgis
>So if a random company gets breached, everyone involved from cloud providers
to ISPs are also responsible because they facilitated moving and storing the
data and they are just hosting code?

ISPs aren't (supposed to be) "storing" that data. They are transferring bits
between computers. You, on the other hand, are hosting a website with some sort
of form that people input PII into. You are accepting that PII; whether or not
it gets forwarded is irrelevant. You are processing it. So do your due
diligence, contact your users and let them know what is going on, and speak
with a lawyer for more information.

~~~
Novashi
>You on the other hand are hosting a website with some sort of form that
people input PII into.

That's what cloud providers do! If there's a spirit-of-the-law that is
supposed to protect them, this would be a good time to write that in!

~~~
heavenlyblue
Do they specifically mention rental cars in the code of law, when they say
that the driver can't drive over the speed limit?

~~~
Novashi
"Process PII" is incredibly vague. You could define that in a hilarious amount
of ways with the amount of complexity we introduce to our software products,
especially with code we don't even write ourselves that widens your security
surface.

This is especially true if you use a service that allows others to inject code
into your code base. If NPM has a security failure that leads to a breach at a
company, who is at fault? Both? Or only the company that chose to use the
code? An NPM package might be processing PII after all. Does that mean NPM can
never be held responsible for security breaches?

Secondly, your example would be backed up by historical cases and this law is
brand new, so it is not clear. I'm not even sure how you guys can confidently
argue that the new law ISN'T outright vague.

~~~
heavenlyblue
>> You could define that in a hilarious amount of ways with the amount of
complexity we introduce to our software products, especially with code we
don't even write ourselves that widens your security surface.

You could define in a hilarious amount of ways in which your chef can pee in
the broth you ordered in a local diner. But it generally doesn't happen, does
it?

