
New study: The advertising industry is systematically breaking EU law - robin_reala
https://www.forbrukerradet.no/side/new-study-the-advertising-industry-is-systematically-breaking-the-law/
======
Jolter
"– Every time you open an app like Grindr advertisement networks get your GPS
location, device identifiers and even the fact that you use a gay dating app.
This is an insane violation of users’ EU privacy rights, says Max Schrems,
founder of the European privacy non-profit NGO noyb."

Yes, for once "insane" doesn't seem like an overstatement.

~~~
toohotatopic
In which way is this a violation? It is not illegal for companies to process
the data which you have signed away with a contract. It is allowed if [1]

>they have a contract with you – for example, a contract to supply goods or
services (i.e. when you buy something online), or an employee contract

You just have the right to request that your data is deleted but as long as
you don't do that, why should your data not be processed?

Additionally, it is allowed to process private data without a contract if doing
so is necessary; only excessive use of private data is illegal.

But most services will have a contract and thus it is legal.

Minor detail: services cannot be restricted if you don't agree to share your
private data. But that doesn't touch sharing the data that was signed away
with a contract.

[1] [https://europa.eu/youreurope/citizens/consumers/internet-tel...](https://europa.eu/youreurope/citizens/consumers/internet-telecoms/data-protection-online-privacy/index_en.htm)

~~~
anonymousab
> Minor detail: services cannot be restricted if you don't agree to share your
> private data. But that doesn't touch sharing the data that was signed away
> with a contract.

That's not minor, it's major. It invalidates any sort of "let us use your data
in order to use the app" profiteering clickwrap nonsense, and any kind of
"contract" derived from that would be void.

~~~
toohotatopic
You don't have to exclude people, you just have to make it inconvenient to not
click ok. The majority will accept the default value.

It is the same with ad-blockers. YouTube would be bankrupt if people weren't
lazy.

~~~
kuschku
Also wrong: the "do not share" option has to be the default, and has to be
displayed larger and clearly visible.

~~~
toohotatopic
Can you give me a reference for that, please? I have only found [1]:

>It states that you should integrate data protection from the designing stage
of processing activities. Article 25 of GDPR lists the requirements for data
protection by design and default.

But that's the processing, not the agreement.

And about consent [2]:

>Freely given - the person must not be pressured into giving consent or suffer
any detriment if they refuse.

>Specific - the person must be asked to consent to individual types of data
processing.

>Informed - the person must be told what they're consenting to.

>Unambiguous - language must be clear and simple.

>Clear affirmative action - the person must expressly consent by doing or
saying something.

But "freely given" doesn't forbid using a default, or does it?

[1][https://www.cookielawinfo.com/gdpr-privacy-by-design-and-def...](https://www.cookielawinfo.com/gdpr-privacy-by-design-and-default/)
[2][https://www.privacypolicies.com/blog/gdpr-consent-examples/](https://www.privacypolicies.com/blog/gdpr-consent-examples/)

~~~
jlokier
You said yourself:

>>Clear affirmative action - the person must expressly consent by doing or
saying something

It's not "clearly affirmative" if it's a default that's difficult to find the
alternative to, or easy to select by mistake.

~~~
Supermancho
That's a subjective interpretation of "clearly affirmative" that matches more
closely with "obvious and clear or straightforward". An option can be obvious
to find and yet worded unclearly, as we have all encountered: "do you want to
opt out or not?" Y/N - wait, what did the text blob above this say?

Pretending this isn't a delicate issue creates more loopholes and bench time
than it helps consumers.

~~~
jlokier
I think "clearly affirmative" is only possible when it's "clear" - the
consumer cannot clearly affirm to an unclearly-worded question.

You can also cover it under "easy to select by mistake [the unintended
answer]".

I think the root of the problem in most cases is that companies _don't want_
to help consumers; they actively want to mislead them and then claim plausible
deniability.

I don't understand which sense you mean by "pretending this isn't a delicate
issue..." here. I can't even tell if you are pro-GDPR or anti-GDPR from that,
and which of those positions you consider to be helping consumers more.
Ironic, Y/N? :-)

~~~
toohotatopic
Can you tell if I am pro- or anti-GDPR?

They, like me, are not arguing for a side (I assume). We are pointing out that
the legal situation is not as clear as the article suggests.

Judging by the lack of successful cases, a default OK button seems to be
'clearly affirmative' enough.

The law states [1]:

>It shall be as easy to withdraw as to give consent.

Nothing states that consent has to be more difficult than non-consenting.

>the request for consent shall be presented in a manner which is clearly
distinguishable from the other matters, in an intelligible and easily
accessible form, using clear and plain language.

The request has to be distinguishable, not the consent.

[1] [https://gdpr-info.eu/art-7-gdpr/](https://gdpr-info.eu/art-7-gdpr/)

~~~
jlokier
> Nothing states that consent has to be more difficult than non-consenting.

Nobody is arguing that consent should be more difficult.

The complaint is that _non-consent_ is often much more difficult than consent,
sometimes ridiculously so.

In my personal experience I have been unable to find the no-consent option at
all on some sites. Just links that go around in circles, sometimes to hundreds
of ambiguous and mixed-polarity yes/no-or-was-it-no/yes-style options (one for
each of hundreds of "partner sites" I've never heard of), with the only clear
option being consent-to-all.

If I eventually click on "ok" that is _not freely given consent_ , it's
coerced due to me being unable to find or understand how to decline it.

It is technically easy to provide a "decline-to-all" option whenever they have
provided a "consent-to-all" option.

Therefore, clearly companies which provide an easy consent-to-all but make
decline-to-all virtually impossible to select, or actually impossible, are
doing so deliberately, intending to frustrate the consumer from exercising
their rights.

The law says that a person should be able to decline if they choose, that it
should be easy enough to do, and easy to understand which option they are
choosing. Such sites are not compliant with that principle, and it looks like
deliberate non-compliance to me.

> The request has to be distinguishable, not the consent.

Well, "the request" is what we've been talking about. It means the UI. Things
like "Ok" and "decline" buttons, how the options are presented, how they are
explained clearly and unambiguously, the ease and accessibility of selecting
the freely chosen option, that sort of thing.

~~~
toohotatopic
>Therefore, clearly companies which provide an easy consent-to-all but make
decline-to-all virtually impossible to select, or actually impossible, are
doing so deliberately, intending to frustrate the consumer from exercising
their rights.

Yes, that's their business concept and it is legal.

>The law says that a person should be able to decline if they choose, that it
should be easy enough to do, and easy to understand which option they are
choosing.

The law states:

>>It shall be as easy to withdraw as to give consent.

>Such sites are not compliant with that principle, and it looks like
deliberate non-compliance to me.

I rather think that they follow the law to a T. People would love it if this
behavior were illegal, but they forget that companies are involved in the
law-making process, too. The EU wants its companies to be competitive on the
internet. Making it impossible for companies to finance themselves with
advertising in their home market would kill their already weak internet
economy. Who would accept the sharing of private data if rejecting were as
easy as accepting?

GDPR is a compromise between the protection of netizens and the business
interests of the economy. As such, it protects against the worst abuse, but
the world is not free. One way or the other, somebody has to pay.

~~~
jlokier
>>>It shall be as easy to withdraw as to give consent.

>>Such sites are not compliant with that principle, and it looks like
deliberate non-compliance to me.

> I rather think that they follow the law to a T.

I think "as easy" is _plainly_ incompatible with "much harder" or
"impossible".

You cannot make something plainly much harder than something else, and still
pass the "as easy" test in the law to a T.

You also cannot pass the "accessible" test that way.

>> intending to frustrate the consumer from exercising their rights.

> Yes, that's their business concept and it is legal.

I don't believe it is legal, because these are statutory rights.

To use an analogy that involves another statutory right, it would be like a
company preventing you from exercising your right to return a broken product
"because it's their business model to ship defective products and we cannot
kill the economy by preventing that business". Companies do get away with
that, because people can't find the energy to pursue it, especially for small
violations, but when sued those companies do lose.

You cannot determine that it's legal just from the fact that companies get
away with it.

~~~
toohotatopic
It's one thing to refuse consent and another to withdraw consent.

The companies can make the refusal difficult as long as the withdrawal is as
easy as the giving.

~~~
jlokier
Not if the rejection is made so difficult, or impossible, or inaccessible, or
incomprehensible, or ambiguous, that the consent fails to meet the standard of
_freely given consent_.

Clicking the "ok I consent" button _does not count as consent under the law_
if the user believes they have to click it to use the service, assuming what
is attached to that button isn't technically necessary for delivery of the
service.

And holding PII for marketing and tracking purposes does not count as
necessary, despite any economic argument that it pays for the service. That
argument is disallowed.

~~~
toohotatopic
>the request for consent shall be presented in a manner which is clearly
distinguishable from the other matters, in an intelligible and easily
accessible form, using clear and plain language. [1]

On which part of the law do you base your first paragraph? The text that fits
for me is all about the consent, not the rejection. It must be easy to
understand to which a person consents, but the rejection can be difficult.

There is also:

>When assessing whether consent is freely given, utmost account shall be taken
of whether, inter alia, the performance of a contract, including the provision
of a service, is conditional on consent to the processing of personal data
that is not necessary for the performance of that contract.

Services have to point out that consent is not necessary. If that's usually
not done, then this abuse can be ended by notifying the EU. I thus assume that
most services offer that notice. Then it is very difficult to argue in court
that a user still believed they had to give consent. People would have to
argue their own legal incapacity if they want to get out. Who would do that?

The compromise of the law is that people in general mindlessly click OK, so
that targeted advertising is possible. People who mind tracking can easily opt
out. This leaves the ignorant to be tracked. How else should free services be
financed? The only other option is making people pay for everything, which is
fine but a radical shift for the internet.

[1] [https://gdpr-info.eu/art-7-gdpr/](https://gdpr-info.eu/art-7-gdpr/)

------
brootstrap
I have family in the industry (digital ads, bidding for ads) and some of the
stuff they have explained to me is pretty gnarly. One cousin was explaining an
idea he had for a new business and it was essentially just about collecting as
much personal data as possible. Imagine a simple DB table with 'user_id' as
primary key. Then you just start adding data on top of data with new columns
or joins. After a while you have hundreds (thousands) of datapoints per
person, then you start selling that data.
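
The schema pattern described above can be sketched in a few lines; every table, column, and value here is invented for illustration:

```python
import sqlite3

# Illustrative sketch of the profile-accretion pattern described above.
# All table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE profiles (user_id TEXT PRIMARY KEY);
    -- Each new data source becomes another joinable table keyed on user_id.
    CREATE TABLE locations (user_id TEXT, lat REAL, lon REAL, seen_at TEXT);
    CREATE TABLE app_usage (user_id TEXT, app TEXT, opened_at TEXT);
""")
conn.execute("INSERT INTO profiles VALUES ('device-123')")
conn.execute("INSERT INTO locations VALUES ('device-123', 59.91, 10.75, '2020-01-14')")
conn.execute("INSERT INTO app_usage VALUES ('device-123', 'dating-app', '2020-01-14')")

# One join and you have a cross-source behavioral record per person.
row = conn.execute("""
    SELECT p.user_id, l.lat, l.lon, a.app
    FROM profiles p
    JOIN locations l ON l.user_id = p.user_id
    JOIN app_usage a ON a.user_id = p.user_id
""").fetchone()
print(row)  # ('device-123', 59.91, 10.75, 'dating-app')
```

The creepy part isn't any single table; it's how cheap it is to keep bolting on new ones until the joined row describes a person in detail.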

So I want to help my family out because I have the tech skills and building
some prototypes would be easy. But I'm also not sure I want to feed the
advertising beast. Shit is whack right now, man. They listen to you talking to
your wife through your phone about buying a mattress, going to a concert,
taking a trip somewhere. Next time you open the browser you are bombarded with
relevant ads. Too creepy, no thanks!

~~~
cj
> they listen to you talking to your wife through your phone about buying a
> mattress, going to a concert, taking a trip somewhere. Next time you open
> the browser you are bombarded with relevant ads.

I've heard this from non-tech family/friends.

Is this actually happening though? If so, who is doing it and how?

Edit: I'm referring specifically to ad tech that targets ads based on
overheard conversations.

~~~
s3r3nity
Without doxing myself by giving away too much info, I can say with 99%
certainty that this is not happening with Facebook or Apple.

The remaining 1% is if policy changed in the past few years (read: < 3 ), or
if there was some top-secret team that few knew about working on this.

In general, these cases are coincidences based upon one or some combination of
the following:

1) Search history that became "subconscious" and the person forgot.
Anecdotally, check your history and see if you remember _everything_ you've
ever searched for... you may be surprised. Android especially saves more than
you might think.

2) Related searches or characteristics relevant to the market. Example: during
Black Friday in the 'States, many people are searching for TVs, so Amazon or
Walmart will probably serve you ads for that in anticipation. Let's say with
10% odds, 1/10 people will see this ad and think "huh - I was just telling my
wife we need a new TV. How did they know?" Law of large numbers and such...

3) The person installed malware that IS collecting sound samples, feeding that
data to an ad-server, and actually performing this malicious behavior - but it
isn't necessarily FB/Apple/etc. You tend to see this more on Android, as the
Play Store has too many apps with malware like this, but it can happen on iOS
too if the user isn't careful about privacy permissions.

Hope that long response helps answer your question =)
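
The law-of-large-numbers point in (2) is easy to demonstrate with a quick simulation (all rates here are made up purely for illustration):

```python
import random

random.seed(0)

# Hypothetical rates: some fraction of a large user base recently talked
# about a product, and a seasonal campaign shows the matching ad to a
# fraction of everyone regardless of what they said out loud.
users = 1_000_000
p_talked_about_tv = 0.10   # discussed buying a TV this week
p_shown_tv_ad = 0.30       # shown a TV ad anyway (e.g. Black Friday push)

spooked = sum(
    1 for _ in range(users)
    if random.random() < p_talked_about_tv and random.random() < p_shown_tv_ad
)
# Roughly users * 0.10 * 0.30 = ~30,000 people who "were just talking
# about a TV" and then saw a TV ad -- with no microphone involved.
print(spooked)
```

Even with modest made-up probabilities, tens of thousands of people per million get a "how did they know?" moment by pure coincidence.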

~~~
harry8
You are saying, "Trust me, I work with Apple and Facebook."

No, no trust, none at all, zero. This is not at all in any way personal, and
you're anonymous so it couldn't be.

Do you understand how that works and why? Pathological lying has taken place
in the world's most successful bait and switch. Nobody agreed to this. Not a
single person agreed to a surveillance state and the creation of a turnkey
fascist enforcement solution. The Stasi couldn't have dreamed of having so
much power.

Zero trust. Less than zero. Facebook and Apple (and others) have now been
caught and are desperately trying to pretend it's all ok. It isn't. Not even
close.

We now have to assume the content is lies; we don't have a choice. The fact
that you need to be anonymous in claiming everything is really ok is telling.

~~~
gmanley
Do you not trust anything on Wikipedia? Of course you can't ascertain 100%
whether a comment is true or not, but going around saying everything is a lie
because ad companies lie to you doesn't seem very helpful. OP wasn't saying
"trust me, I work at Apple/Facebook"; they were saying: I had pretty good
visibility into internal projects and the codebase, and from what I saw that
type of tracking wasn't going on. Of course you can only take that type of
comment at face value, but to assume it's a lie seems silly.

Not believing a PR/damage control statement from an ad company, on the other
hand, is probably the right thing. At some point ad companies may start doing
huge disinformation campaigns on social media with paid commenters, but that
doesn't seem to be the case yet.

------
blakesterz
The technical report is pretty interesting. They break down what's going
where:

[https://fil.forbrukerradet.no/wp-content/uploads/2020/01/mne...](https://fil.forbrukerradet.no/wp-content/uploads/2020/01/mnemonic-security-test-report-v1.0.pdf)

~~~
dgellow
The non-technical(?) report is also a great read, and a good one to share
with managers/non-engineers:

[https://fil.forbrukerradet.no/wp-content/uploads/2020/01/202...](https://fil.forbrukerradet.no/wp-content/uploads/2020/01/2020-01-14-out-of-control-final-version.pdf)

~~~
mkarlsch
Agreed! Both reports are pretty good and reflect the status quo in the
industry. There is overcollection and oversharing of data without proper
consent, and that has to stop. For Europe, forcing the ad-tech industry to
adhere to the GDPR is the correct next step, as self-regulation did not work.

Having said that, the report paints a pretty dark and one-sided picture. Let's
see how far the authorities will follow its argumentation and conclusions.

(full disclosure: I work in that industry.)

------
iamaelephant
All this invasion of privacy and still little evidence that user targeted
advertising is substantially more effective than simple content based
advertising.

~~~
alanlovestea
It has been proven very effective.

Quote: "So successful, by the way, that between 2000 and 2004 with their IPO
[Initial Public Offering] documents going public, the first time we got to
learn exactly what the impact of this new logic was. And the impact was a
revenue increase of 3,590%, just during those years 2000-2004."

source: [https://www.econtalk.org/shoshana-zuboff-on-surveillance-cap...](https://www.econtalk.org/shoshana-zuboff-on-surveillance-capitalism/)

~~~
iamaelephant
That's comparing tracking advertising with bad data to tracking advertising
with good data. Where's the comparison of tracking to non-tracking
advertising?

Tracking advertising costs the advertiser more (and makes the ad provider a
lot more), but there's little evidence to suggest it's significantly more
effective than content advertising. And this isn't even accounting for the
enormous social cost of private surveillance.

[https://techcrunch.com/2019/01/20/dont-be-creepy/](https://techcrunch.com/2019/01/20/dont-be-creepy/)

------
sixtypoundhound
Get ready for ads to get more annoying. The hidden benefit of using
personalized data to target and track prospects was that an advertiser could
use "soft sell" to build a brand over time.

As that market gets shut off, we're back to aggressively using interruption
marketing "shock jock" ads and auto-play video. Click now or forever hold your
peace...

The problem with contextually targeted ads is there is no real guarantee of
repetition and brand building...

~~~
matheusmoreira
> we're back to aggressively using interruption marketing "shock jock" ads and
> auto-play video

Advertisers thought they could get away with pop-ups until browsers shut them
down by shipping pop-up blockers by default. I have no doubt ad blockers will
shut down that market as well. Hopefully it will shut _all_ of it down,
driving advertisers out of the internet forever.

~~~
mjevans
I am only OK with one type of advertising.

1) I have a problem or a need that I am looking to fulfill.

2) I ask (e.g. a search engine) about fulfilling that need.

3) The results are no BS, no hidden fees, directly up-front responses that
tell me how much something is going to cost, where it is, and maybe why that
will solve my needs/desires.

That is the only place at all that 'ads' belong, informative messages that are
intended to actually help BOTH the consumer and any service provider.

~~~
matheusmoreira
What you described is reasonable and I'm fine with it too. Though is it really
advertising if you _ask_ for it? To me, ads are the stuff they shove down
people's throats whether they want it or not hoping that a fraction of them
will appreciate it.

------
Mirioron
I wish GDPR had some exemption for small companies or companies that are just
starting out. Maybe limits on revenue, number of users, and scope of data,
e.g. the strict rules still apply (to a small company) when dealing with
sensitive data such as health information, but email addresses are not as
strictly governed. Perhaps even require a plan of action to abide by the full
rules by X future date.

Take, for example, a one-person indie developer. A common way to monetize is
through ads plus in-app purchases that turn ads off. With the current rules
it's difficult for a single person to know that they abide by all of them
while still monetizing in such a way.

~~~
belinder
That would be way too easy to take advantage of by a big company making lots
of small side companies

~~~
ThomPete
How would that work out in practice? Sounds highly unlikely.

~~~
MiroF
Have you heard of contracting practices in the US? Rather than "highly
unlikely", this is the norm for tech companies that want to skirt corporate
responsibility.

~~~
AnthonyMouse
That's an unrelated issue. In this context it could be avoided simply by
counting more-than-half-time individual contractors as employees when
measuring entity size.

------
jka
The proposed ePrivacy Regulation[0] looked like it was set to introduce some
very positive recommendations to curb the worst of advertising cookies and
pop-up dialogs.

Article 10 of the draft regulation suggested moving consent settings into the
browser so that you could specify whether you will accept various forms of
cookies centrally and then have those settings apply to all sites.
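
Browser-level signals of this kind already exist in primitive form: the DNT request header (and later Global Privacy Control's Sec-GPC header) lets a browser announce an opt-out to every site. A minimal sketch of a server honoring such a default before ever falling back to a per-site prompt; the header names are real, the consent-lookup helper is hypothetical:

```python
# Sketch: honoring a browser-level privacy signal server-side.
# DNT and Sec-GPC are real request headers; the rest is illustrative.

def tracking_allowed(headers: dict) -> bool:
    """Honor a browser-level opt-out before ever showing a consent dialog."""
    if headers.get("DNT") == "1" or headers.get("Sec-GPC") == "1":
        return False
    # No browser-level signal: fall back to per-site consent (placeholder).
    return has_stored_consent(headers)

def has_stored_consent(headers: dict) -> bool:
    # Hypothetical lookup of a previously granted per-site consent.
    return "consent=granted" in headers.get("Cookie", "")

print(tracking_allowed({"Sec-GPC": "1"}))               # False
print(tracking_allowed({"Cookie": "consent=granted"}))  # True
print(tracking_allowed({}))                             # False
```

The point of Article 10 was essentially to make a signal like this legally binding, so per-site pop-ups would only be needed for anything beyond the browser's default.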

A ruling[1] related to shady consent practices by a website called Planet49
seems to have shifted the regulatory window towards the idea that sites have
to obtain definitive, informed consent.

Meanwhile the latest draft[2] of the ePrivacy regulation has removed Article
10 and the mention of browser-based controls for cookies entirely and thus
consent stays per-website.

I really wish the choice of privacy related to advertising was baked into the
browser and enforced there. Given the above developments, it's the only route
I can see that avoids pop-up fatigue for users. The number of pop-ups everyone
has to deal with causes user experience friction and wastes everyone's time.

It'd seem reasonable to me for sites to be allowed to pair with advertisers to
request _additional_ consent via pop-ups if they want, but with the defaults
in the browser.

That way a site would have to make a conscious decision that it's worth
getting consent from a user in order to monetize them -- and users would only
need to be informed and provide their consent when something outside their
expectations is being requested.

I'd love to hear from anyone who's tracking this - I'm not a lawyer and all
this is the bits and pieces I've picked up while reading on the web and trying
to determine an analytics consent strategy for a project I'm developing.

Edit: NB: I realize there's a context of apps rather than websites in the
article, but I'd hope and suggest that the fundamentals are the same,
especially if & when PWA's blur the distinction between browser/mobile-OS as
host.

[0] - [https://en.wikipedia.org/wiki/EPrivacy_Regulation_(European_...](https://en.wikipedia.org/wiki/EPrivacy_Regulation_\(European_Union\))

[1] - [https://www.cookiebot.com/en/active-consent-and-the-case-of-...](https://www.cookiebot.com/en/active-consent-and-the-case-of-planet49/)

[2] - [https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONS...](https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CONSIL:ST_13808_2019_INIT&qid=1577719123146&from=EN)

------
Cartonju
Another source:

[https://noyb.eu/three-gdpr-complaints-filed-against-grindr-t...](https://noyb.eu/three-gdpr-complaints-filed-against-grindr-twitter-and-the-adtech-companies-smaato-openx-adcolony-and-atts-appnexus/)

 _Research by the Norwegian Consumer Council (Forbrukerrådet) shows that many
smartphone apps send highly personal data to thousands of advertising
partners. The report uncovers how a large number of shadowy entities are
receiving personal data about our interests, habits, and behavior, every time
we use certain apps on our phones. This information is used to create
comprehensive profiles about us, which can be used for targeted advertising
and other purposes._

 _“These practices are out of control and are rife with privacy violations and
breaches of European law. The extent of tracking makes it impossible for us to
make informed choices about how our personal data is collected, shared and
used. Consequently, this massive commercial surveillance is systematically at
odds with our fundamental rights”, says Finn Myrstad, director of digital
policy in the Norwegian Consumer Council._

 _“Every time you open an app like Grindr advertisement networks get your GPS
location, device identifiers and even the fact that you use a gay dating app.
This is an insane violation of users’ EU privacy rights”, says Max Schrems,
founder of the European privacy non-profit noyb._

------
lidHanteyk
CCPA cannot kick in fast enough. We need consumer protection in the USA too.

~~~
PeterisP
I haven't delved into a lot of detail of how CCPA works, but it seems to me
like a step in the right direction that won't achieve much practical change.

It's worth looking at this from the perspective of EU privacy legislation
_before_ GDPR - it already included all the principles that data should be
safeguarded, that customers should be informed, that consent matters, etc.
However, it didn't really have a meaningful effect on actual privacy before
GDPR (and IMHO we'll need a GDPRv2 not that long in the future), because
companies were able to comply simply by adding legalese, privacy policies,
and notifications without actually changing their privacy-relevant behavior.

One aspect that I feel CCPA gets wrong and GDPR got right is the "default
condition". Under GDPR, companies are prohibited from processing your data
unless they can provide a specific legal basis that allows them to do so, one
of which is opt-in consent. Under CCPA (correct me if I'm wrong), the default
position is that companies are allowed to do whatever they want with your
data, but they are required to inform you and honor opt-out requests. This
means that it's plausible for a company which is currently doing shady stuff
to legally continue doing so, as long as it can write up some fancy words to
'inform' you and ensure that most people don't go through the opt-out
process.

The second difference is the distinction between having the data and using it
for a specific purpose. It is very common that certain private data are
clearly needed for a specific purpose (e.g. your address, to deliver some
goods), but we'd want to restrict the company's ability to use the same data
for other purposes, e.g. targeted advertising. With the way GDPR is
structured, this separation is built in; CCPA doesn't make such a
distinction. So if (for example) a company shares data with a third party
because that's needed to fulfill some business purpose (even if the customer
opted out of selling data, because this isn't a sale), then (as far as I
understand - correct me if I'm wrong) under CCPA the third party is legally
free to use the data for whatever other things it wants. And we should expect
that business arrangements will be intentionally designed to 'accidentally'
make transferring such data (without a sale) part of many deals, so that the
big data aggregators would still get all the same information they had
before; companies will be able to comply while continuing to do the same
thing.

~~~
lmkg
> _This means that it 's plausible for a company which currently is doing
> shady stuff to legally continue doing so, as long as they can write up some
> fancy words to 'inform' and ensure that most people don't do the opt-out
> process._

CCPA requires you to have a link with the exact words "Do Not Sell My
Personal Information" visible on the homepage. That is an explicit provision,
which somewhat limits a company's ability to legalese its way around it.

~~~
mondoshawan
Still pretty crappy that it requires users to take action to stop the selling
of that information. Worse: it should be stopping the collection of it. Ugh. :-/

~~~
mgreg
While that is true - a consumer must take action to protect their individual
information - it may be that if enough people do this, the cost of compliance
(responding to the CCPA requests) will make it less profitable to hold on to
a consumer's information.

There are also lists with links and contact info for important businesses -
such as the data brokers Experian, Epsilon, and TransUnion - where you can
exercise your CCPA opt-out or deletion rights. There are even CCPA email
templates that may be easier to use than going through a form (not sure if
they will work).

------
shadowgovt
Yep. It's almost like the privacy law in the EU was written with no mind paid
to how the world of online interactions actually works.

I anticipate enforcement will go about as well as enforcement of drug policy
goes in the United States.

~~~
stubish
The products are not illegal, just the business model. If the businesses can't
follow the law, they will get blocked or banned, and their market share will
become available for the same or similar product backed by a legal business
model. Unlike drugs, where the product itself is illegal.

~~~
shadowgovt
I don't think it's nearly so clear that the product isn't illegal.

------
fnord77
thanks for reminding me to reset my advertising identifier on my iphone.

------
fergie
The world is beginning to rely more and more on Scandinavia to defend human
rights and democracy.

~~~
mrweasel
That's not really the feeling you get living here. I'm not saying others
don't have it much, much worse, but the Danish government really does like
invading people's privacy.

~~~
Gwypaas
In what ways do they invade people's privacy?

Swede considering moving across the sound.

~~~
mrweasel
Currently telcos are required to collect data on EVERYONE, not just criminals.
That's clearly illegal, according to EU law, but the Danish government and
minister of justice have basically said that they don't care.

There's an increased interest in CCTV and privacy concerns are ignored, even
though crime rates are lower than they ever have been, but you know: Terror!

You can't drive around the country without the police knowing where you are,
because license plate scanners are everywhere.

Oh, and the Danish government has exempted itself from the GDPR.

------
bsenftner
I realize we need formalized reports to produce legal action, but this sure
reads like "News! Water is wet!"

~~~
Finnucane
The overall conclusion is hardly surprising, but the details matter in
formulating a response.

------
aguyfromnb
Every time there's a corporate scandal of sorts (Boeing and Wells Fargo come
to mind as recent examples), the people here scream for accountability.

So what say ye, Googlers, Facebook employees and others? Should fines or
prison fall on your shoulders? Why do you continue to do it, knowing you are
breaking the law? Knowing you are harming people? Glass houses and all that...

~~~
stjohnswarts
No, but the government should start imposing crippling fines, like "Google,
thou shalt be broken into 3 companies and pay 50% of last year's revenues."
If it's just a couple million bucks, that's nothing to Google, but permanent
damage to the company would put the fear of Dog into them.

~~~
harry8
Probability of succeding in that aim is small. Probability of making that an
immaterial expense to google by burying it with lawyers and taking it as far
as the supreme court is close to 100%. Personal accoutability or bust. It's a
very different management analysis if "I could go to jail."

