
Invisible Manipulation: ways our data is being used against us - crunchiebones
https://privacyinternational.org/feature/1064/invisible-manipulation-10-ways-our-data-being-used-against-us
======
ex3xu
I get annoyed even by the purported advantages of data collection, like
Youtube or Amazon recommendations. Just because I curiously clicked on a
stream of data points six weeks ago does not mean I suddenly want to be
inundated with more of that random tangent today.

So I hate logical positivism. From logical positivism we get the philosophical
foundation of modern tracking and advertising, that every data point obtained
about a subject is necessarily meaningful and actionable, and that which
cannot be measured does not exist. From there we get the unhealthy Nielsen-
style reductionist tendencies in marketing that believe all they need to know
about you is your age, race, gender, sexuality, and income bracket. I think
Frankl's discussion on dimensional ontology is the most complete picture I've
found of why this is a flawed point of view, though I think any person who has
been discriminated against based solely on one of those criteria should be
able to understand the limitations of that model.

I hope we can start to understand better as a tech culture that,
unfortunately, human beings are decidedly unscientific in the sense that past
data collected will not always provide accurate conclusions about future
behavior. People grow and change, and collecting data silently in the
background and trying to draw actionable conclusions without active user input
and feedback is a fundamentally inefficient process, with harms that can be
quite difficult to measure, though this article articulates some of them quite
well.

I suppose that if we lived in a society free from the incentives generating
the kinds of abuses this article lists, and someone had developed an algorithm
that could help you find meaning and success and fulfillment and loving
relationships based on all the tracking data you provide, that could be the
dream of big data for which the sacrifice of privacy might be a necessary one.
But given that we still live in a society that prefers exploitation to
altruism, salvaging what we can of our privacy is probably still the best
option until we can reach that next stage of our evolution.

~~~
candiodari
You realize that you're just complaining "they" are not using your data well
enough. You certainly can't realistically sort through YouTube yourself, nor
can you realistically find what you need on Amazon.

The fact is, you cannot realistically use these sites without these
algorithms. It just can't be done. Or at least one can say you can use Amazon
and YouTube better when guided by these algorithms. I think it's the
difference between not being able to use them at all and being able to use
them, but maybe you think it's a bit less than that.

(Also, logical positivism and inferring things through statistics are pretty
much diametrically opposed. No logical positivist is ever going to accept
statistical evidence about anything, but no matter.)

> I suppose I can imagine that if we lived in a society free from
> exploitation, and someone has generated an algorithm that could help you
> find meaning and success and fulfillment based on all the tracking data.

That's exactly what these algorithms do. They try to find what YOU would find
meaningful and fulfilling and they really do try to give it to you. When that
means money will be exchanged they prefer one vendor (that has paid them) over
another (which hasn't, or paid less), but does that really change anything?

~~~
neffy
No, actually, I, as an adult human being, am quite capable of using search to
find what I want.

Now, I agree, search is also governed by algorithms - but I'm also quite
capable of sitting there for a few minutes, paging through until I get past
them to find what I want. I do not want anything controlling my filters
except me. Edge cases can be dealt with. Latex lovers need to learn that it's
a text editor first.

As for Amazon, I have never found their recommendations helpful, or bought
anything via them. If Amazon wants to help me, it really needs to clean its
house on fake recommendations, and bogus products.

~~~
candiodari
A simple Google search for, say, "string" confirms to me that what you say is
not true. I program, so it shows me... well, not clothing - things I would
_not_ want to sort through.

So you should realize that even something as simple as search cannot work well
without "using my information". And I very much want it to do that.

~~~
l0b0
If that were true, people would not be able to find anything useful on
DuckDuckGo.com and similar sites. Tracking is definitely not a prerequisite
for quality search.

~~~
candiodari
[https://duckduckgo.com/?q=string&t=h_&ia=web](https://duckduckgo.com/?q=string&t=h_&ia=web)

[https://www.google.com/search?q=string](https://www.google.com/search?q=string)

Judge for yourself whether context matters. I suppose the results will differ
from person to person, but I feel comfortable saying that while Duckduckgo
isn't a lost cause, it contains a lot of bullshit to filter through. Example:

> STRING: functional protein association networks
>
> STRING is part of the ELIXIR infrastructure: it is one of ELIXIR's Core
> Data Resources. Learn more ...

I mean ... really ? I suppose it must be a string, but, and maybe this is just
me, I'm not looking for genetic manipulation tips or databases when I search
for "string".

~~~
paulgrant999
What I find funny is that you think some crappy algorithm can beat the real
neural network, your mind ;)

You convert a fixed problem ("locate search terms that have a high correlation
to what I am searching for") to a fuzzy problem ("how can I guess what this
person is searching for, based on a sequence of search terms, where I
manipulate the results, thereby guaranteeing that the search sequence is
poisoned by what I presume they are searching for").

Oddly enough, the latter is how you get high dwell time on a search engine
(COUGH COUGH).

As soon as I started using DuckDuckGo - bang, back to one-minute searches, as
I just altered my search terms and actually got back what I wanted.

~~~
candiodari
Actually, when you link up 2 neural networks, or even a neural network and,
say, a search algorithm, the neural network rapidly learns to use the external
algorithm to do what it was planning to do anyway.

So there is no "beating" a neural network. The reward for having such a search
algorithm is simply the ability to put thoughts into your mind. Your mind may
still reject them, and if you don't feed it enough relevant data, it will
rapidly abandon you.

~~~
paulgrant999
Presuming you don't just abandon the search, it's irrelevant. :)

Which is why I don't stress about it: "agree to disagree over similar
premises" can be both a hindrance and a benefit.

------
mario0b1
> Soon the default will shift from us interacting directly with our devices
> to interacting with devices we have no control over and no knowledge that
> we are generating data.

That's already the case. The big problem is that no one cares about it,
because it seems like everything just works fine. I can't tell whether bad
things will happen to us with our data everywhere, but it's possible. But how
could we possibly convince "normal" people that giving your data out for free
is bad?

~~~
diminoten
Because everything _is_ fine. The big conceit from privacy advocates is they
think people don't know their data is being collected and used, but the
reality is, they do know. They just don't care.

Show me a concrete negative impact on my life as a result of the passive data
collection perpetrated by Google, Facebook, et al., and you might change my
mind, depending on how negative it is.

I get a _lot_ out of the products I pay for with my data. It feels like a
bargain, honestly.

~~~
blub
You believe this is all about _you_, when in fact it's about our societies
and vulnerable categories of people (mostly the poor, and various religious,
political, ethnic, and other minorities).

You're basically declaring: first they came for everyone but me, but that's
okay, because I'm still okay!

If the effects of pervasive surveillance are a net negative for society, it
doesn't matter that you're benefiting - your "bargain" will have to cease
existing.

~~~
diminoten
My "bargain" is _my_ choice, and the choice of anyone who wants to make it,
and no one gets to rip that away from us because they're afraid of some
boogeyman who will never come.

The real conceit comes from people who think the rest of us don't know what
we're doing; we do, and we don't care because it doesn't matter. Those people
have no special information, and they're not smarter than us.

There is no boogeyman, there are no "dire consequences", and the people
peddling that nonsense are tilting at windmills.

~~~
blub
No one will come to you and forbid you from using Google or Facebook.

If everything goes right, government and society will instead pressure those
companies to offer honest alternatives to their existing spying-based services
and make it crystal clear how one pays.

The masochistic, careless, and indifferent will likely still be able to use
the services they know and love, but their lack of interest will no longer be
paid for by everyone else.

Smoking was also pretty cool at one point...

~~~
diminoten
"Spying-based services" is some truly disingenuous labeling.

------
louprado
How is _calculation of health insurance premiums_ not #1 on this list?

Recently I purchased a DIY mole removal product from Amazon. An expansive
actuarial table would show that a "concern about moles" has a slight positive
correlation with "skin cancer". Arguably my premium should go up because of
this.
this.

It also blows my mind that Amazon won't allow you to delete a purchase from
your Amazon order history. It is there forever and there is no reason that
data can't be sold even if the account is closed. To be fair I am making
assumptions without having read the privacy policy in Amazon's EULA.

~~~
lightswitch05
I agree that this is a glaring omission. Car insurance companies already have
monitoring devices you can plug into your OBD-II connector that monitor all
your driving behavior. OnStar also has the ability to report driving behaviors
to insurance companies. Or how about companies that claim your
creditworthiness can be judged by your Facebook friends and posts? How long
until these extremely invasive practices become so profitable that companies
make them required? I worked for a company whose health insurance offered $50
to employees who got a health screening through them. The next year it was no
longer a discount, but a fee if you did not participate.

~~~
lotsofpulp
Insurance makes sense for random events, especially ones not under the
insured’s control. Ignoring healthcare, since it’s not even an insurable risk:
people who drive safely are subsidizing people who drive in riskier ways, but
only because in the past the technology wasn’t there (or cheap enough) to
provide all of the data needed to accurately price them. Now that it is
possible, I don’t see why people who take more risks on the road should be
subsidized by those who don’t.
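The pricing argument above can be sketched in a few lines: a usage-based
premium that scales a flat base rate by a risk multiplier derived from
observed driving data. The factors, weights, and base rate here are entirely
hypothetical illustrations, not any insurer's actual model:

```python
# Hypothetical usage-based premium sketch. The signals, weights, and base
# rate are illustrative only; real actuarial models are far more complex
# and heavily regulated.

def risk_multiplier(hard_brakes_per_100mi: float,
                    night_miles_pct: float,
                    avg_speed_over_limit: float) -> float:
    """Combine telematics signals into a single risk multiplier."""
    m = 1.0
    m += 0.02 * hard_brakes_per_100mi   # frequent hard braking
    m += 0.5 * night_miles_pct          # share of miles driven at night
    m += 0.03 * avg_speed_over_limit    # average mph over the limit
    return m

def usage_based_premium(base_annual: float, **signals: float) -> float:
    """Scale the flat base rate by the observed risk multiplier."""
    return round(base_annual * risk_multiplier(**signals), 2)

# Under this toy model, a cautious driver pays less than a riskier one
# instead of both paying the same flat rate:
safe = usage_based_premium(1000.0, hard_brakes_per_100mi=1,
                           night_miles_pct=0.05, avg_speed_over_limit=0)
risky = usage_based_premium(1000.0, hard_brakes_per_100mi=10,
                            night_miles_pct=0.30, avg_speed_over_limit=8)
```

Which is exactly the trade the comment describes: the data makes the pricing
more accurate, at the cost of the insurer seeing every trip.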

~~~
lightswitch05
My comment is on the loss of privacy. I am not an aggressive driver, and I
suspect that I might get a discount if I joined one of these programs, but I
value my privacy too much for that. I do not want my every move monitored by
my insurer. OnStar would even have the ability to report GPS location. How
much longer until the locations you visit are also factored in, or perhaps
that info is used to ‘enrich’ other insurance types? What if your car
insurance shared with your health insurance that you visit fast-food
restaurants twice a week? That’s hypothetical right now, but my point is that
the data is so valuable that companies will become more and more invasive to
get it, especially insurance. At some point these optional privacy-violating
practices will become required unless we have legislation protecting us.

------
pasbesoin
Just look at the asymmetry. These organizations push to collect as much data
as possible on every aspect of our lives.

But when it comes to information on what they are doing -- and the truth of
their internal world? No, that's proprietary, a trade secret, etc., etc., and
they slap non-disclosure and non-disparagement strictures on everything and
everyone they can.

In other words, look at their _actions_, not their words. That should tell
you something about the value of all this data. And about their self-serving
hypocrisy.

------
pq0ak2nnd
I wish this article had actual substance and not just a listicle of off-the-
cuff one-liners.

~~~
ex3xu
You can click through the links to see the full case-study-length explanations.

------
mattbierner
Instead of trying to prevent data collection, what if we made simple tools
that let anyone make their data less useful: data pollution, or telemetry
vandalism if you will. Either generate a bunch of noise or exploit the
analysis such that a machine would not be able to tell real from fake.

Simple example: instead of an ad blocker, everyone runs a script that
“clicks” on every single ad they see.
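The "generate a bunch of noise" half of the idea can be sketched without a
browser: interleave each real search query with random decoys so that a
profile built from the query stream is mostly noise. This is a hypothetical
toy, not any existing tool; the decoy topics and ratio are made up:

```python
import random

# Hypothetical "data pollution" sketch: mix real queries with random
# decoy queries drawn from unrelated topics, so a tracker watching the
# stream cannot tell signal from noise.

DECOY_TOPICS = [
    "cast iron skillet care", "tide tables", "bird migration maps",
    "medieval glassblowing", "tractor parts", "origami instructions",
]

def pollute(real_queries, noise_ratio=3, rng=random):
    """Return the query stream with `noise_ratio` decoys per real query."""
    stream = []
    for q in real_queries:
        stream.append(q)
        stream.extend(rng.choice(DECOY_TOPICS) for _ in range(noise_ratio))
    rng.shuffle(stream)  # hide which queries came first
    return stream

stream = pollute(["python asyncio tutorial"], noise_ratio=3)
```

The real queries still go out, so the user's own results are unaffected; only
the aggregate profile degrades.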

~~~
jstanley
Here you go: [https://adnauseam.io/](https://adnauseam.io/)

------
ddebernardy
> Cities around the world are deploying systems collecting increasing amounts
> of data and the public is not part of deciding if and how such systems are
> deployed.

Really now? Aren't city officials elected in North America and Europe - and
most other places nowadays?

Go vote if you mind it.

------
noja
A short video would reach the non-choir audience better.

~~~
nukleosome
They've already done that, as part of their campaign:
[https://privacyinternational.org/campaigns/uncovering-hidden...](https://privacyinternational.org/campaigns/uncovering-hidden-data-ecosystem)

------
crunchiebones
why did '10' just disappear from the title? pretty sure I didn't edit it by
accident ._.

~~~
pault
HN has an explicit rule against "$N reasons why..." headlines, AKA listicles.

