The Fantasy of Opting Out (mit.edu)
611 points by pshaw 20 days ago | 195 comments



This is one of the better articles on the subject, not written in inside-tech-bubble terms. I think they also describe something we may call "obfuscation of power": formally you can opt out, in practice you cannot, and people who have power over you are lost in a soup of legal entities. So first, you are alienated to face surveillance alone as an individual, taking all blame for losing your freedom because you "chose" it. Second, it's hard to even name the surveillers (one of the 500 "trusted partners"?). There's now a parallel discussion on AI black boxes which can further serve similar purposes. ("It was not our decision, some AI did it".)

The sooner this is commonly framed as a question of freedom (i.e. the absence of arbitrary power of other people over you [1]), and as a question in the interest of society as a whole, the better.

[1] I'd take the opportunity to plug Quentin Skinner's "How should we think about freedom", "A genealogy of liberty" or a similar talk, where he challenges the tradition of understanding freedom as absence of direct interference. I think this stuff would be beneficial for politics nowadays.


I'm reminded of Foucault's work, Surveiller et punir. He discusses this obfuscation of power, specifically that we are being conditioned by unseen entities in our daily lives in ways which are entirely unnatural. He contrasts the more modern (and arguably unnatural) levels of discipline imposed by surveillance (specifically with respect to panopticism) to those imposed by the fear of bodily harm that existed in the days of executioners.

When we held public executions, we could at least see who was wielding the power, by whom we were being conditioned, and what there was to fear. Yes, the body was being tortured, but the soul remained intact. In the surveillance age, we are subjugated by forces we cannot untangle, see, or fully understand. We discipline ourselves in fear of these unseen entities who may or may not be constantly watching; what they do with our information remains a mystery.


For regular folk, the book is Discipline & Punish in English translation

Your explanation is fairly academic and I think we can be a little more concrete. Foucault's point is really that discussions of power in the West tend to revolve around law and right, jurisprudence, when in fact the exercise of power qua something that alters your behavior is present in almost every relationship we enter into with other human beings, no matter how anonymous: when we go to school (which is modeled on factories), when we go to work at the factory (where they wanna know how long you take having a piss) or (as many of us here are familiar with) the office, when we pack onto the train for the commute (which to me is a place where our behavior definitely meets the practical definition of panopticism, e.g. we act as if everyone is watching everyone else), when we "come out of the closet" (despite being a 'practicing homosexual', for lack of a better term, Foucault was not comfortable being labeled gay, and it's speculated that the conditions around his death from HIV were exacerbated by this; as well, I find sexuality to be strongly policed on all sides in America).

None of this is new in history, but modernity ramped up the scale and intensity to previously unseen levels (much like everything else).


"when we go to school (which is modeled on factories)"

I thought they were modeled on prisons.


Both, but the similarities in the three can be striking regardless of ordering.


But framing things differently has been done, yet we have smoking, drinking, gambling, mcd and coke.

To me it feels like certain people (a percentage of any population) are just wired to exploit human weakness.

Sometimes they are doing it unconsciously, and when made conscious of it don't know how to exit that path. And society needs to make it simpler for such exits to happen.

Those that are consciously doing it...well we are just stuck with them like we will always have bugs in code.


...which is the value of robust systems designed to limit the impact these people can have.

This is precisely why mechanisms like separation of powers and whistleblower protections matter. This is also why social conventions around honesty, reciprocity, fairness, etc. matter.

When these mechanisms and conventions operate smoothly, you get a sort of "security in depth" against abuse and exploitation. People acting in bad faith find their worst impulses thwarted, both by societal / legal systems and by average citizens who are empowered to help hold them to account.

When they don't, you get dangerous patterns of corruption and indifference; these weaken trust in the system, making it harder to fix over time.


"To me it feels like certain people (a percentage of any population) are just wired to exploit human weakness."

A theory is that all humans do it in various nuanced situations whose categorizations are specific to each individual. Sometimes it's as simple as getting into physical fights when you know you are stronger, or playing political games at work to get ahead. Sometimes it's as nuanced as choosing to run an ad campaign that makes people feel bad for being fat so they buy your product. Sometimes it's negging, or sussing out whether an employee might get pregnant soon, or choosing to make tests for ovarian cysts not covered by health insurance.


I would rather blame the systems we all create and enable. Our reward systems in particular. These are the systems that make the exploitation so enticing and irresistible.


>The sooner it will be commonly framed as a question of freedom (i.e. absence of arbitrary power of other people above you [1]), and as a question in the interest of society as a whole, the better

I agree, but I don't think I'd use the word "soon".

There's a reason this is an opinion piece by a person who specifically cares about these issues and not an article.

Things are going to have to get worse before they get better. Right now the slices of society closest to these issues (tech and academics) don't seem to care at all as long as it's not overtly being used to oppress the poors or some specific minority. Said slices of society are also mostly wealthy enough that they can comply with most laws most of the time, they're not gonna get robo-tickets for not having their car inspected. Nobody's gonna care until it hits them personally or has the potential to do so imminently.


The few privacy laws we do get are usually way off target.

It used to be easy to deal with medical things. You could ask about your spouse or parent or sibling, and you would get an answer. Now a family gets treated like random strangers who lack any sort of relationship with each other.

It used to be that grades were publicly posted. This was a powerful incentive to do as well as possible.


>Nobody's gonna care until it hits them personally or has the potential to do so imminently.

It's for this reason that I really, really want the US to implement some kind of formal, stated facial recognition across the board. I honestly don't think we'll see any mass movement on privacy issues from regular folks until something that big happens.


I think you’re overestimating what the public response to such an event would be. The vast majority of the public doesn’t care if their privacy is utterly violated by the government as long as there’s some flimsy justification for it catching bad guys or saving children.


Just, Devil's Advocate - but what happens if you allow such a program to go forward, and no one outside the usual suspects objects? How would you rein it back in?

I believe it would be best not to allow such a program in the first place, because counting on the population to be opposed to it is, well, let's just label it "unwise".


It's naive to think our entire online and physical location data isn't already indexed by name, phone number, and email IDs. Snowden said as much. The accelerometers on your phone already give away your gait.


I'm very pessimistic about all this. It seems to me that the horse has already bolted: we are all under constant surveillance and there is not much we can do about it at all. Many of us don't even think it's a problem, and even many who do don't want to lose the benefits that are attached to the constant automated surveillance.

As citizens of democratic countries we have utterly failed to protect our rights against private and public intrusions. We have failed to vote against politicians who introduce legislation to legalise pervasive automated surveillance. In the UK, Theresa May, who brought the Snoopers' Charter into law, became Prime Minister instead of being kicked out of her political career in disgrace.* A majority of citizens of Western democracies continue to use social media and online services like Amazon and Netflix etc.

The EU has introduced some legislation of course, but even in the EU, state institutions are free to spy on citizens and non-citizens alike. Every time I take the Eurostar to the continent I'm offered a "fast track" through a set of automated gates that use facial recognition. I keep declining, but my partner, who also normally declines, was coerced to go through by a border guard. Pseudo-scientific deep-learning lie detectors are deployed at the borders of my own country, Greece, as well as Hungary and Latvia.

But all this started long ago. When surveillance cameras were first put up on every shop and every street corner in every major EU city, only hippies and weirdos opposed them. And now we're all watched by them and there's no changing that.

Who is going to stand up against the new threat to our rights? A few dispersed academics that nobody listens to? Anonymous?

___________

* She was, but only later, because of Brexit.


History has consistently shown that rights tend to arise after the violations, not the other way around. The Magna Carta was written in response to an awful king, the FDA was established in response to widespread problems with people getting poisoned, slavery wasn't abolished until millennia after it was invented, and labor organization (leading to such modern niceties as the 40-hour week) did not arise until well after the industrial revolution. Give it time.


I think you’re right but the problem is that this time it’s different: maybe the rights will arise, but for many people it’ll be too late.

Assume a privacy law is enacted: all the data that was previously collected is still around. So essentially, people around prior to that law are still somewhat screwed. It's good for those coming afterwards, so eventually it washes out, but it won't be like the FDA or slavery, where everyone benefits quickly.


>> The Magna Carta was written in response to an awful king, the FDA was established in response to widespread problems with people getting poisoned, slavery wasn't abolished until millennia after it was invented, and labor organization (leading to such modern niceties as the 40-hour week) did not arise until well after the industrial revolution. Give it time.

> I think you’re right but the problem is that this time it’s different: maybe the rights will arise, but for many people it’ll be too late.

This time is not different. Take the FDA example, for instance: its founding (or its later gaining the power to oversee drug safety) didn't help the people who'd already been poisoned [1]. The reforms were too late for them.

[1] an example: https://en.wikipedia.org/wiki/Radithor


Oh of course, but those who had not yet been poisoned were protected. I guess I'm agreeing with you and adding the caveat that nearly everyone is already poisoned.


I guess the hope at this point is that our children won't be.

I miss the vibe of the late 90s and early 00s. I remember feeling like, sure, the world is fucked both socially and environmentally, but things would improve. The bad old lot would be kicked out, the internet would spread freedom and understanding, science would advance and we would all prosper. Hell, I remember being a kid back then and being frustrated by all the environmental themes in media, because of course we should all care about that sort of thing; it was obvious. Same as I remember getting sick of all the "don't be racist/homophobic" very special episodes for the same reason - obviously they were preaching to the choir, right?

Well now environmentalism seems to have fallen off and the planet is fucked.

People are saying white men are to blame for everything or instead saying it is the fault of women and immigrants.

The internet is instead a collection of platforms that are safe space echo chambers and is full of toxic culture wars.

Science marches on and it brings us drone strikes, China's social credit system, a surveillance state, and the expectation we are always available to answer work emails.

I want to get off this ride.


There will have to be time, but also immense harm and death, before something will be done. It won't just take decades of the discomfort of wondering what they're doing with the data. It will take some sort of massive act against the people, done with the data, before the outcry happens, as with any technology. People didn't invent and require seatbelts because "you could die"; people did die.

The question is what will happen, or how bad will it get? The worst fear is some sort of automated genocide.


I think that one of the pervasive problems of these kinds of technology is that each individual violation can be individually tailored to the person in question.

In my opinion, it's not going to be some blanket thing that is immediately obvious, but rather a collection of individual actions taken against individual people in exactly the way that is the least suspicious for that person specifically.

Plus with the level of surveillance, those in control of the centralization will also know the dissidents well before they become a serious threat and can take preventative measures should they need to.


What's the ultimate end game though if you're one of these future (not so distant?) powerful elites in control? You turn everyone else into slaves? Will they remain happy and productive? Will they still invent shiny new things for you if they know they're slaves? Is your only job now to prevent the slaves from figuring it out? Is that even a fun job or is it sort of like a prison? Can you still find meaning? How long can you conspire with your peers until some of them get bored and rock the boat?

Sounds like a moronic shit show to me. We either find enlightenment before the earth swallows us up or we don't. I think it'd be cool if we figured it out.


Interesting point. It's a hard problem to define exactly what the rules should be, especially when the violations are so personalized and well obscured. And there will be an awkward transitional period while companies that built their business around the assumption that they could monetize models of their users on behalf of their customers either collapse or totally change their business model.

I hate to look at specific outcomes, but perhaps a useful question would be... is there a way to both pay your tolls with an RFID tag on your car (or a license plate reader) and also not give the DOT a complete record of exactly when you go through each toll plaza? What would the counterargument to this be?

I have the feeling the counterargument is going to be something inane but emotionally based, like "so that police can find your car if it is stolen" (which I strongly suspect is nonsense: it should be entirely possible to make that impossible unless you actually give them some secret code to do so) rather than "so that when we get hacked, the mob knows when you're home".
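One hedged sketch of how such a toll scheme could work is a Chaum-style blind signature: the operator signs prepaid tokens without ever seeing them, so a token presented at the plaza verifies as genuine but can't be linked to the account that bought it. This is a toy illustration with the textbook RSA demo key (p=61, q=53) and invented function names, not a real implementation:

```python
# Toy sketch of a Chaum-style blind signature, one way a toll operator could
# verify prepaid tokens without linking them to the account that bought them.
# Uses the textbook RSA demo key (p=61, q=53) -- illustration only, NOT secure.

import secrets

p, q = 61, 53
n = p * q                    # 3233
e = 17
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # operator's private signing exponent

def blind(token, n, e):
    """Driver blinds a random token before sending it to the operator."""
    while True:
        r = secrets.randbelow(n - 2) + 2
        try:
            r_inv = pow(r, -1, n)        # need gcd(r, n) == 1
        except ValueError:
            continue
        return (token * pow(r, e, n)) % n, r_inv

def sign(blinded, n, d):
    """Operator signs the blinded value; it never sees the raw token."""
    return pow(blinded, d, n)

def unblind(blind_sig, r_inv, n):
    """Driver strips the blinding factor, leaving a valid signature."""
    return (blind_sig * r_inv) % n

# At top-up time: the driver buys a signature on a token the operator can't see.
token = secrets.randbelow(n - 2) + 2
blinded, r_inv = blind(token, n, e)
sig = unblind(sign(blinded, n, d), r_inv, n)

# At the plaza: the operator checks its own signature, but the (token, sig)
# pair is unlinkable to the purchase; double-spends are caught by remembering
# spent tokens, not identities.
assert pow(sig, e, n) == token
```

Double-spending is handled by remembering spent token values, which still reveals nothing about who bought them, so the DOT gets valid payment without a per-account travel log.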

Maybe what we need then is a regulatory system where you must assume that you have been hacked by the worst possible opponent and that they have administrator access on your servers. Then have a checklist for what things absolutely must not occur -- like that if Facebook can tell when you're not at home because of your phone's GPS under absolutely no circumstances should this information ever be linked to your home address via any identifier. But as noted, the line is currently comically fuzzy, and where that line should be needs to be a lot clearer.

Do I really mind Amazon knowing my purchasing habits on their website, even if they use algorithms to make recommendations or modify prices for me? Personally, not especially, as long as it's anonymized in their model well enough not to leak my PII if someone hacks the computer with their datasets.

Do I mind if them getting hacked will let someone figure out my shipping address and when packages are getting mailed? Or get my order history? Yes. So I'd say an Amazon employee should be totally unable to access that information and would need to have my explicit permission to access my account information. I don't think that's currently the case, and I think it should be enforced by regulation that doesn't allow them to collect it if they can't or won't do that. Of course, currently I have no power to make that the state of the world, but that's ideally the point of elected officials, to speak for the people.

Maybe the appeal to "do you want the mob to know where you work, when you're home, and how many kids you have" is sufficiently concrete to make this more obvious to people? But I kind of agree it's a lost cause after Obama betrayed his pledge to get rid of the Patriot Act.


The frustrating thing is, we HAVE ALREADY SEEN terrible violations of surveillance and profiling power: in Nazi Germany, Stasi East Germany, Soviet Russia, Communist China, etc. Tens or hundreds of millions have disappeared, suffered, and died. I guess we didn't take the "universal surveillance/profiling is dangerous" lesson from any of these experiences hard enough, despite it entering major Western literature through stories like 1984.

I guess the analogy is that Voltaire was writing about abuses decades before they stormed the Bastille. So maybe it takes MORE abuses. It's terrifying to think of how to scale abuse UPWARDS from those examples, though.


1.) those horrendous actions happened more than a generation ago, people forget.

2.) the lack of a terrible regime on the other side of the planet committing these acts removes the shame in proposing them here.


I think a 'better' example is what happened in France during WW2: Jews were deported to concentration camps. How did they know who was Jewish? Because the French government (before the invasion) had a registry/census that kept this information.


To be totally fair, we would have difficulties detecting the opposite (rights preceding abuses) because people would presumably be organising against the abuses.

For example, murder is generally highly illegal everywhere without need for particularly violent murdering sprees to prompt action.


One cannot begin to count the number of murdering sprees ("invasions," "sacks," "pillages") in the history of civilization that very much prompted early societies to legislate against murder.


Those were some very good points!

IMO, we're on the cusp of a level of individual freedoms that were not available in the past (in the Western world), and we are seeing those that are desperate for relevance, those unable to compete/comprehend, or those not involved/behind doing their darndest to stifle the growth and progress.

Most of the privacy exploitation sounds like that saying about "C managers" firing "A and B employees" due to their insecurities or more likely, job security/relevance.

My only question at this point is...

Does chasing the boogie{man,woman,trans} count as exercise?


> IMO, we're on the cusp of a level of individual freedoms that were not available in the past

I really, truly, wish I could share your optimism.

What I think will happen is that things are going to continue to go strongly the opposite direction, until everything gets so egregious that people actually start fighting back in numbers.

If that fight is successful, we'll just return to something close to, but not quite, the status quo that existed a couple of decades ago.


>I really, truly, wish I could share your optimism.

Lol. If you mix all of the colors, you get black. If you mix all of the data, you get a black hole.

I had to come up with that one myself to keep my perspective in line.

Hope that helps!


I have a plan! Can an individual class all personal information about them as: Trade Secret?

Companies can do this. Corporations can do this. If someone steals trade secrets, there are repercussions. Why is my personal info, which is valuable to me and can harm me if used by the wrong people, not protected in the same way that information about a company would be?

AND

If that doesn't work, how do we get Trade Secrets for human beings? Why do companies and corporations have rights to protect information that human persons don't?


>As citizens of democratic countries we have utterly failed to protect our rights against private and public intrusions. We have failed to vote against politicians who introduce legislation to legalise pervasive automated surveillance. In the UK, Theresa May, who brought the Snoopers' Charter into law, became Prime Minister instead of being kicked out of her political career in disgrace.* A majority of citizens of Western democracies continue to use social media and online services like Amazon and Netflix etc.

Compared to whom? We've de-munitionized RSA, we've mostly resisted the "outlaw strong encryption" meme (the U.K. is probably the most obtusely retrograde Western nation when it comes to tech, so I exclude them), we never pulled the "install this government cert to use the Internet" trigger (like Kazakhstan has tried to)... i-it could be worse!

x3

"Using social media" isn't a privacy-hostile stance like the above are; I'm a cryptonerd, but having my groceries delivered is just. I. Look. We live in a society. Where you don't have to go to the store anymore. It's too good to pass up. I order everything on Amazon through my offshore, Bitcoin-paid VPN, and I would have to be Stallman levels of insane not to. (Screw Netflix though, yeah; just torrent.)


>"We live in a society."


Who is going to stand up against the new threat to our rights? A few dispersed academics that nobody listens to? Anonymous?

Get to know your local anarchists.


> In the UK, Theresa May, who brought the Snooper Charter into law became Prime Minister instead of being kicked out of her political career in disgrace.*

To be fair, it is one horrendously broken electoral system. FPTP, combined with this un-official but totally official party system we have, has provided an illusion of democratic election. Just this week (now that election number 3 to be held within the last 4 years has been announced) we have national newspapers publishing articles on how to vote tactically.

Just let that sink in. How utterly disgusting it is that we are in this position.


> Obfuscation may be our best digital weapon.

From Dan Geer's talk "Cybersecurity as Realpolitik" (which everyone should hear[1]/read[2]):

>> There are so many technologies now that power observation and identification of the individual at a distance. They may not yet be in your pocket or on your dashboard or embedded in all your smoke detectors, but that is only a matter of time. Your digital exhaust is unique hence it identifies. Pooling everyone's digital exhaust also characterizes how you differ from normal. Privacy used to be proportional to that which it is impossible to observe or that which can be observed but not identified. No more -- what is today observable and identifiable kills both privacy as impossible-to-observe and privacy as impossible-to-identify, so what might be an alternative? If you are an optimist or an apparatchik, then your answer will tend toward rules of data procedure administered by a government you trust or control. If you are a pessimist or a hacker/maker, then your answer will tend towards the operational, and your definition of a state of privacy will be my definition: the effective capacity to misrepresent yourself.

>> Misrepresentation is using disinformation to frustrate data fusion on the part of whomever it is that is watching you. Some of it can be low-tech, such as misrepresentation by paying your therapist in cash under an assumed name. Misrepresentation means arming yourself not at Walmart but in living rooms. Misrepresentation means swapping affinity cards at random with like-minded folks. Misrepresentation means keeping an inventory of misconfigured webservers to proxy through. Misrepresentation means putting a motor-generator between you and the Smart Grid. Misrepresentation means using Tor for no reason at all. Misrepresentation means hiding in plain sight when there is nowhere else to hide. Misrepresentation means having not one digital identity that you cherish, burnish, and protect, but having as many as you can. Your fused identity is not a question unless you work to make it be.

[1] https://www.youtube.com/watch?v=nT-TGvYOBpI

[2] http://geer.tinho.net/geer.blackhat.6viii14.txt


Well, obfuscation is great for that sweet frisson of resistance and for laughs (I guffawed when I read about ah's datapools; https://ahprojects.com/datapools/) but if we're resisting by hiding ourselves, then we've already lost.

We should not need to hide ourselves from anyone, because there should not be anyone to spy on us. We should not have to think about how to obfuscate our activity, because there shouldn't be anyone watching our activity without a very, very good reason (and even then, with enough safeguards to discourage all but the most committed of spies).

If we're subverting the authority of democratically elected governments -governments that _we_ have given authority to- then we are in deep, deeep shit. We might as well obfuscate ourselves out of shame of being recognised for the negligent democratic citizens that we are.


> your definition of a state of privacy will be my definition: the effective capacity to misrepresent yourself.

This is true of the world we live in today, which consists almost entirely of centralised services. But in the future, I believe the second option will be building and using local FOSS replacements for those services.

Many of those local replacements will need no communication at all - so many things that run off a server today have no functional need to; it's about control. Others are more difficult, because communication is inherent to the service, and centralization is the simplest and most obvious choice when the service is provided by a corporation, so an infamous "decentralised" solution is needed, which is more difficult to create, organise and promote.


What is the local FOSS replacement for Facebook and Instagram? It is not inherent in the notion of communicating publicly and privately using photos, videos and short text messages that I disclose some ungodly amount of private data. I feel like writing some protocol and a few clients and servers, but I expect that there are already twenty-four competing protocols that no one uses because, if you're not the biggest, you're no one.


Yes, as I said in my last sentence:

> Others are more difficult, because communication is inherent to the service, and centralization is the simplest and most obvious choice when the service is provided by a corporation, so an infamous "decentralised" solution is needed, which is more difficult to create, organise and promote.

I think the only way a decentralised social network will win out over Facebook is if a universal but progressive/gracefully degrading protocol is created as the single way lots of different clients/federated servers etc. communicate, a protocol that also allows a greater level of control over where your data goes... i.e. email but better.

As I said, these problems are the most difficult, but there are a lot of things being "clouded" at the moment that have no need to be; those are the low-hanging fruit.

Note that I am suggesting FOSS as the preferred option to obfuscating, rather than the only option... some services will always be centralised, not only because it's simpler but because they're about/for a central authority... for those, obfuscation is the only way.


There is no alternative, because facebook isn't just "a protocol and some clients/servers". As long as it's -the- social network to be on, then it will always be just that. You can't compete with social culture just with mere programming.


It's not 'local', but the FOSS replacement for Facebook/Instagram is email.


> the effective capacity to misrepresent yourself.

Agreed. I describe this as The Right to Hide.[0] An important part of hiding is that when people ask you to provide information, you should be able to lie.

We've learned from app manifests on Android and on the web that if an app can tell whether or not it was granted a permission, it'll just try to harass the user into granting it. The better solution is to make it so that the app can ask for anything, but can't validate whether or not any of it is true.

I'm encouraged by the work people are putting into protecting privacy by blocking data collection itself, but I think that obfuscation/misinformation is a more promising direction for us to go.

[0]: https://anewdigitalmanifesto.com/#right-to-hide
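The "can ask anything, verify nothing" idea above could be sketched as a platform shim that always answers a data request, returning plausible decoy data when the user said no, so the caller can't distinguish a grant from a refusal. The names here (PrivacyShim, get_location, the decoy coordinates) are invented for illustration:

```python
# Sketch of "apps can ask, but can't verify": denied callers receive a
# believable decoy instead of an error, so there is no signal to nag about.
# All class/function names here are hypothetical, not a real platform API.

import random
from dataclasses import dataclass

@dataclass
class Location:
    lat: float
    lon: float

class PrivacyShim:
    def __init__(self, granted: bool, real_location: Location,
                 decoy_center: Location = Location(40.7128, -74.0060)):
        self._granted = granted
        self._real = real_location
        self._decoy_center = decoy_center   # e.g. a busy city center

    def get_location(self) -> Location:
        """Always succeeds. Denied callers get a decoy fix, re-jittered
        on each call so repeated reads look like ordinary GPS noise."""
        if self._granted:
            return self._real
        jitter = lambda: random.gauss(0, 0.001)   # roughly 100 m of noise
        return Location(self._decoy_center.lat + jitter(),
                        self._decoy_center.lon + jitter())

shim = PrivacyShim(granted=False, real_location=Location(51.5, -0.12))
loc = shim.get_location()
# The caller sees an ordinary-looking fix either way; there is no error
# path that reveals the user said no.
```

The design point is that refusal is indistinguishable from consent at the API level, which removes the app's incentive to badger the user.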


> If you are an optimist or an apparatchik, then your answer will tend toward rules of data procedure administered by a government you trust or control.

Make the data public. All of it. Don't allow them any secrets. Knowledge is power, power to the people.


That would also spawn outrage mobs.

"This boy said right here in his Call of Duty chat that he was going to go to school with his dad's rifle and shoot people!!!"

The only thing you would do is put entirely unprofessional future mobs in place of the slightly unprofessional police we use currently. I would hope people could see the very real chance that your idea would make things worse.


> The only thing you would do is put entirely unprofessional future mobs in place of the slightly unprofessional police we use currently. I would hope people could see the very real chance that your idea would make things worse.

Worse than the status quo, yes. But you seem to be under the assumption that we will be able to maintain a police force at "slightly unprofessional" moving forward, when governments have the tools and incentive to become far more controlling.


Governments can be ultra-controlling with slightly unprofessional police. It happens all the time across the globe. But without exception, wherever those police have been removed, poor though those police may have been, it has resulted in untold misery on the populace.

Giving everyone the power to know exactly what everyone else is saying and doing is tantamount to taking data and removing all its police, and removing all its locks.

It will not end well.


I'm not sure how much stock I put in this analogy, but you could be right.

It could also create an incentive to avoid collecting data in the first place, which would obviously be a good thing. But perhaps this could be better achieved through other methods.


Power is power, and secrecy is a multiplier. So make the data of the powerful public. If you are a government, or choose to have a sufficiently high net worth, institutionally or individually, then sure, make it all public. We gain most of the same benefits while not incurring most of the harms. It's the 1% loss in privacy for 99% of the benefit.


It's heartening to know that I'm not the only person who has had this thought. Eliminate the power gradient and level the playing field.


> "Misrepresentation means putting a motor-generator between you and the Smart Grid. Misrepresentation means..."

All of that sounds exhausting. Some of it is easy... but some of those tasks would take over 10 hours to do. I've got a life to live. Plus, it doesn't really fix the problem, does it? Wouldn't it be better if we all took those hours to lobby the government to change the laws?


No, it would not be better. You are asking someone who is psychopathic and greedy to give up power, influence, control, and money. Short of forceful resistance (which isn't necessarily violent; Gandhi, for example, used force but not violence), it isn't going to work.

It's also actually better if you only do the hiding that's within your patience level, because the point is to lie and not be easily connected. Remember that part of why DNT is on the way out is that it set another trackable variable, working against the point. If you use Iceweasel on Linux, send a UA string of Chrome on Windows. Just look 'normal'. People with mental illnesses or alternative lifestyles long ago learned that the first and best coping mechanism for hiding is to just look normal on the surface. The line from Ghostbusters comes to mind: 'Quiet, you're scaring the straights'. Look normal, and keep your head down when there's no reason not to.

Make it too obvious that you're hiding, and they take it as a challenge. Just learn to hide passively.
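The UA-spoofing tactic above can be sketched in a few lines. This is a hedged illustration, not a recommendation of a specific tool; the exact UA string below is an example Chrome-on-Windows value, not a guaranteed-current release:

```python
# Sketch of the "look normal" tactic: advertise a generic
# Chrome-on-Windows User-Agent regardless of the real browser/OS.
# The UA string is an illustrative example, not a current release.
from urllib.request import Request

SPOOFED_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/78.0.3904.108 Safari/537.36")

def make_request(url: str) -> Request:
    # Every request blends into the largest crowd instead of
    # standing out as, say, Iceweasel on Linux.
    return Request(url, headers={"User-Agent": SPOOFED_UA})

req = make_request("https://example.com/")
print(req.get_header("User-agent"))  # the spoofed Chrome UA
```

Note that `urllib.request.Request` stores header names capitalized, hence `get_header("User-agent")`; a real setup would also have to match the spoofed browser's other fingerprintable surfaces, which is exactly why "hide passively" is hard.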


People who attempt to obfuscate probably go to a special list when obfuscation is detected. You get scrutiny by attempting to avoid scrutiny.


The goal isn't selfish. It's about collectively reducing the power of the scrutinizer by taking a small hit overall to protect yourself and others.


Misrepresentation is a good way to improve the training data sets for intelligence agencies.


Yep, as an individual there is a very limited pool of activities in which you can and will misrepresent yourself, and you will tend to do so with low entropy as well.


On the other hand, if you are just a mostly normal guy with a dark secret, your normal life generates an enormous amount of data, and it's fairly easy to misrepresent the tiny bit you want to hide.

If you are an avid Facebook, Instagram, YouTube, Reddit, and Twitter user, it should be fairly easy to hide your plan for world domination on HN.


Actually, misrepresentation is an especially bad strategy if you're a normal guy with a dark secret.


Only if they have the truth set to compare against already.


Thanks for sharing this.

I subscribe to this notion but this is very elegantly put.


Beautiful


The cost of ubiquitous surveillance is going down, and at the end of the day, that's the dominant effect. Given the intersection of low cost and demonstrated utility, we're seeing a sea change in society of significant scale, such that "opting out of surveillance" really does begin to equate to "opting out of society." Imagine if you had a moral objection to, say, walking on roads. What percentage of society would you have to avoid to maintain your personal constraint?

Rather than asking how we opt out, it's probably best to ask how we live in. How do we all get along and feel safe and secure given the new existence of multiple cheap-to-maintain panopticons? David Brin's "The Transparent Society" is a bit dated, but I think it provides a decent starting framework to consider the question. Because the genie's out of the bottle, and "being in the community" is going to be synonymous with "being seen by the community." We have some liberty to determine as societies what that means.

(Consider, as a bad example but perhaps a useful framing of thought: nobody in Star Trek worries about the fact that the computer on a ship knows what they're doing, continuously, at all times, and can report that to anyone who asks. Why? And how does that fictional reality differ from what people actually want?)


Everybody in Star Trek is serving in the military and has no expectation of privacy while on duty. Plus it only tracks their commbadges.

Another consideration might be: what privacy would someone have had living in a premodern or early modern village or even a small town at the turn of the century? Aside from being surveilled in our own homes by Alexa and her kind, are we worse off today?


> Plus it only tracks their commbadges.

Indeed, there's a few examples where someone leaves it in their quarters when they don't want to be tracked.


Not to rabbit-hole too far on this, but in the Next Generation, civilian families are also aboard-ship. However, that's almost certainly modeled as "military families and contractors," and the analogue to modern life is some autonomy is understood by those groups to be given up for the privileges of direct military support (i.e. you have to abide by some policies set by the military when you're granted on-base housing).

But we're talking a fictional universe, so best not to unpack this bad analogy too far. ;)


Also the civilians don't tend to wear commbadges.


What an irony that I open the page source of this article and see this literally on the second line:

  var mi_track_user = true
Then there are all the usual suspects: Google Analytics, Facebook, Twitter and a few others at the bottom. Is this the price I should pay for reading this article for free?


It’s good that authors are free to write about things that might reflect negatively on the underlying technology or business practices of the platform they are being published on.


I like the poetry of the third line too:

    var mi_no_track_reason = '';
We have no reason not to track people, so let's track everyone.


Could be worse:

  var mi_track_reason = '';



The use of the word "privacy" is misleading the conversation. We should set the goal of making collection itself illegal. It should be illegal for commercial entities to collect information that can be considered personally identifiable without explicit consent. Additionally, that consent should not be buried within a commercial terms-of-service agreement, and if collection is automated, there must also be a means of removal with the same timeliness as the automated collection.


Consent is a poor tool for such a wide category of data. If you're looking for actual informed consent, we are talking about something either practically impossible or at least a massive strain on people's time and decision-making capabilities over things they often couldn't care less about.

See e.g. F.J. Zuiderveen Borgesius, 'Informed Consent: We Can Do Better to Defend Privacy', IEEE Security & Privacy, vol. 13, no. 2, pp. 103-107.


In the context of the cited article, the assumption is that the data was already collected. My position is that that should not be allowed. I do not expect companies to be able to collect first and then ask for consent. Doing so without an explicit, prior request should carry criminal liability.


Additionally, if we agree to have our data collected, we should still own that data, meaning the 3D x-ray that the orthodontist generates is your data. You should have access to it (from anywhere if digital, or a copy of it if physical) and the choice of its removal when said data no longer serves you.


This would be a huge step in the right direction but I think it's important to limit surveillance by government entities as well. After all, you have _no_ opt-out options where they are concerned.


I agree; however, expect government regulation to find it easier to target commercial activities. I am also considering existing U.S. law that requires government agencies to perform PII analysis of all projects and make that information available via FOIA and OMB reporting. That does not cover non-U.S. entities.


I cannot help but wonder whether tools like the ones listed in the article are just "privacy theater", in the same sense that we have come to realize that much of what happens in the name of security is just "security theater". The irony, of course, is that much of what happens in the name of security is exactly what takes away our privacy in the first place.

What is needed is for people to tangibly suffer from lack of privacy. It is obvious to people how they suffer from lack of security.

Germans remember what a lack of privacy can do (Stasi in East Germany), and as a consequence, they are much more privacy-minded than in many other places.

What will be the wake up call for other nations?


There is a wake-up call already, but it needs to come from us (tech workers). We do understand what is going on; we do use adblockers and application firewalls, obfuscate our online fingerprints, use multiple accounts, etc.

On the other side, we also produce tools for mass surveillance and ad networks, build tracking, implement telemetry, and add ad-provider frameworks to our applications.

In the same breath, we are protesting against selling technology to the military and ICE, forbidding our children from playing online games where gambling (loot boxes and the like) is the revenue model, objecting to GitLab's enforced telemetry, ...

I see a huge discrepancy between what we want for ourselves (privacy) and what we preach, and on the other side what we actually do to fuel the surveillance society.

The solution is to stop on our side and not do to others what we don't want done to us (... and even if there will always be someone "selling drugs to kids", most people don't do it).


>We do understand what is going on, we do use use adblockers, application firewalls, obfuscate our online fingerprints, use multiple accounts etc.

That seems like a broad statement. I tend to use adblockers (but not religiously). But I generally don't obfuscate my online behavior or firewall different identities for most purposes. I certainly would for certain types of online behavior--and certainly for things like political dissent in some countries--but by and large I try to avoid doing things online I wouldn't want someone to find out about.

(Obviously I'm talking about actions that don't carry an expectation of person-to-person communication as opposed to broadcast. Though, even then, I'm pretty careful about what I commit to digital text or image that could potentially leak.)


Do to others whatever they knowingly accept. But tell them what you do. One way or another, don't hide what you do; explain what you do.

On your drug analogy: sell drugs, but say exactly what drug you are making. Otherwise others, less scrupulous, will make all sorts of drugs while hiding exactly what's in them.

Do what you are good at, and make sure to inform. If whoever you are doing it for isn't being transparent, then blow the whistle on what they do. Then quit or stay; that part will depend on what impact you want to make on this world.


> What is needed is for people to tangibly suffer from lack of privacy. It is obvious to people how they suffer from lack of security.

And because it is obvious we have more and more security theater...


"The Lives of Others" of a moving film about the people on both sides of the East German microphone.

https://en.m.wikipedia.org/wiki/The_Lives_of_Others


The various 'cancellations', celebrity image leaks, revenge porn, etc. should serve to remind people of the dangers. Maybe we need more hackers to put stuff online?


AdNauseam and similar obfuscation mechanisms strongly remind me of Neal Stephenson’s ‘bogons’ he introduced in his cinderblock-sized (and eminently enjoyable) novel Anathem. In that regard, he appears to have been highly prescient: hiding signal in reams of random noise (and random meta-noise about both signal and noise). http://www.virtustate.com/fake-news-advertising-and-bogons-s...


The author: Privacy does not mean stopping the flow of data; it means channeling it wisely and justly to serve societal ends and values and the individuals who are its subjects, particularly the vulnerable and the disadvantaged.

You are wrong; privacy means stopping the flow of data completely. No one should spy on your life.


No one should spy on your life.

Absolute statements are always wrong. ;)

The problem with this way of thinking is that it's very hard to define what 'spying' is and what 'your life' is. This is best illustrated with an example: when you read this comment you'll have loaded a page on HN. That means HN's server probably has a log of your IP address, browser user-agent string, etc. There's a complete history of every article you've ever read, upvoted, or commented on there for you (and, for some things, the public at large) to see:

Your profile: https://news.ycombinator.com/user?id=Altheasy

Your comments: https://news.ycombinator.com/threads?id=Altheasy

Your favourited articles: https://news.ycombinator.com/favorites?id=Altheasy

Taking the first page of your comments and that favourite I can reasonably assume that you're a developer, you don't like testing much, you have a cat, you have a judgemental attitude about how other people spend money, you have a smart phone, etc. Not great insights but you're pretty new here. If I trawled through the comments of someone who has 20,000 comments I could learn a lot.

So... is HN spying on you? Am I spying on you when I read those pages? I don't think so. You put that data out there in the open. The same is technically true for most data that people say is spying - eg Google Analytics isn't spying on you when it tracks everything you do on 50% of the websites you visit. You're giving that data away. That's fine. It's useful. It makes the internet better.

The flow of data in itself is OK. It only really becomes spying when the data is misused. That's what people want to control.


Grandparent's statement is pretty absolute, but I find myself in agreement with it. Data collection is the right place to intervene, because once collected, data can be copied and misused at any time in the future.

> when you read this comment you'll have loaded a page on HN. That means HN's server probably has a log of your IP address, browser agent string, etc.

Such logging isn't technically necessary to serve web pages, and ideally shouldn't be done without consent.

> Am I spying on you when I read those pages?

That's not spying, because the user consented to making their comments public. (Not sure about favorites though, there's a small note on the profile page but maybe the favoriting action should make it more explicit.)

> Google Analytics isn't spying on you when it tracks everything you do on 50% of the websites you visit.

It's spying if you didn't consent to it.


> Such logging isn't technically necessary to serve web pages, and ideally shouldn't be done without consent.

It's needed as soon as you want to do non-trivial spam protection, gather context for errors/exceptions, mitigate DoS attacks, correlate issues across browsers, and a few other things.

For most of those you could theoretically hash the IP, because you're interested in matches, not actual values (although matching either the AS or at least the /24 makes things easier). But until we migrate to IPv6, hashing doesn't make sense (and once we move, keeping individual addresses doesn't make sense).

Basically the bigger the site, the more important that information is for operations.
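As a rough sketch of the hash-plus-/24 idea above (the key name, truncation width, and digest length are illustrative assumptions, not anything the parent prescribes), one could keep correlation at the network level while never storing raw addresses:

```python
# Sketch: truncate IPv4 sources to their /24 network, then store a
# keyed hash. Hosts in the same /24 still collide, so abuse
# correlation works, but raw IPs never reach the logs.
import hashlib
import hmac
import ipaddress

SECRET_KEY = b"rotate-me-regularly"  # assumed operator secret

def anonymize_v4(ip: str) -> str:
    # strict=False lets us pass a host address and get its /24 net.
    net = ipaddress.ip_network(f"{ip}/24", strict=False)
    digest = hmac.new(SECRET_KEY, str(net).encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

# Two hosts in the same /24 map to the same token:
print(anonymize_v4("203.0.113.7") == anonymize_v4("203.0.113.200"))  # True
```

A keyed HMAC (rather than a plain hash) matters here: with only ~2^24 possible /24 networks, an unkeyed hash could be reversed by brute force.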


You can do all of those things without logging that information. It’s a cheaper solution to the problem, but that does not mean it’s required.

Which reduces your argument to: collecting this information is significantly more profitable. That, I think, is generally accepted as true, but not necessarily enough to make it acceptable.


How would you match traffic from the same source without keeping the record of that source?


That’s a technique not a goal. What are you trying to do?


Find when a specific endpoint / AS / country starts sending dos levels of traffic, (or hack attempts) so they can be banned.


Rate limiting prevents a specific IP from causing a successful DoS. You can log higher-level information like country without linking it to a specific user.

In terms of hacking, building a secure site prevents this problem at the source. Banning specific IPs in a world of proxies and public WiFi is almost useless.
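A minimal sliding-window limiter keyed on a coarse bucket (say, a /24 or a country code rather than a raw per-user IP) might look like the sketch below; the limits are example values, not recommendations:

```python
# Sketch of rate limiting without raw per-user IP logs: throttle
# per coarse bucket (e.g. a hashed /24 or a country code).
import time
from collections import defaultdict, deque
from typing import Optional

class RateLimiter:
    def __init__(self, max_requests: int, window_s: float):
        self.max_requests = max_requests
        self.window_s = window_s
        self.hits = defaultdict(deque)  # bucket -> recent timestamps

    def allow(self, bucket: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        q = self.hits[bucket]
        while q and now - q[0] > self.window_s:
            q.popleft()  # forget hits outside the window
        if len(q) >= self.max_requests:
            return False  # over budget: reject
        q.append(now)
        return True

limiter = RateLimiter(max_requests=3, window_s=1.0)
print([limiter.allow("net-203.0.113.0/24", now=0.0) for _ in range(4)])
# [True, True, True, False]
```

The bucket key is whatever coarse identifier the operator chooses; nothing here requires retaining the full source address.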


You don't ban them forever; banning specific ranges that impact you right now is very effective too. Also, "building a secure site" at some scale is impossible. At some point you try to figure out where the risk is, how to mitigate it, and what happens after a break-in. You can't prevent it. Logging helps track specific behaviour and catch those situations. That's similar to fraud prevention as well: the fact that someone who just logged in from Germany tries to spend credit in a request from Brazil is important and prevents real crime. That kind of information needs to be connected to an account.


I specifically said you can get and log country information without logging specific IPs.

Working at the /24 level does everything else you mentioned.


It's spying if you didn't consent to it.

There's no explicit consent but the fact you've told your computer to download some code and run it looks a lot like implied consent.


I think that argument proves too much. To a user browsing the web, clicking a link that says "check out this nice article" signifies intention/consent to read that article, not to suffer the effects of all possible JS tripwires including pwning their computer and such.


This is the point I was making about misuse of data. Thinking usage analytics on a website is a tripwire is quite extreme. Thinking that building a complete profile of someone based on their activity on lots of websites is a tripwire is quite reasonable. Hence the difficulty in defining what 'spying' really is.


If by analytics you mean something like a hit counter from the 90s, which doesn't require recording user sessions, then I agree with you. But if it's recording user sessions, I think it's a good idea to require consent for that.


Sure, but all this tracking isn't a product of JS tripwires pwning computers: it's a natural result of downloading an article from a server.


No, it's a result of the article telling the browser to also download and execute analytics scripts. It's abusing the good faith HTTP protocol was built upon. That's why I consider ad/content blockers OK and desirable. They're a way for users to express that they don't consent to loading and execution of some resources.


That's true of malware too. Consent is different from actions.


Then it's the misuse itself that needs to be fixed.

It's like saying a knife can be used to kill people, so let's get rid of knives.


> Absolute statements are always wrong. ;)

Love the irony here :)


No, spying does not require malicious intent. If you look through your neighbours' blinds every night, just to ensure they're doing well, you're still invading their privacy and spying on them. By definition spying is to collect information furtively, which I think qualifies the behaviour of all internet trackers as 90% of the population is unaware of their existence and 99% of the population doesn't know the extent of their consolidated online profile. The uncontrolled collection of data is itself harmful, that's why the GDPR requires companies to justify the pieces of information they collect.


>neighbours' blinds every night, just to ensure they're doing well

What harm does it cause?


>> No one should spy on your life.

That actually sounds like a pretty straightforward, self-explanatory request. I don't see where the confusion arises.


Disagree. Information wants to be free; good luck trying to stop the flow of data. The solution for privacy should be to embrace the flow of data, or even increase it by increasing transparency. Sure, there are going to be issues with data being public, and that's what should be fixed instead.


"Should". But they do nevertheless. It's a fact.


In Germany recording folks without their consent is illegal (or at least not admissible in courts) in many contexts. It is possible to build incentives that don't lead towards mass surveillance.


I think this is the correct solution. The author's recommendation - a cocktail of browser extensions and other technologies - is only a temporary fix. This needs to be fixed at an institutional level, but there are entire industries (facebook, other ad-service companies) that depend on bulk data collection so a bill like what Germany has will be fought against tooth and nail.


That's great, but not much protection if the surveillor has no intention of bringing the information to court.


Do stores there not use security cameras? Or is a sign at the door considered consent?


Definitely check out the browser extension TrackMeNot, mentioned in the article: it runs searches in the background to obfuscate what internet behavior is actually yours.

https://trackmenot.io/


This article talks about the cost of opting out, but doesn't go into some of the more insidious ways in which companies have made it harder and harder to do so. I recently opened the Downloads app on my new Mi phone only to find a notice asking me to grant authorization to Xiaomi to "... collect, process and use [my] personal data" in order to use the app[1]. Why does a basic downloads app need to gather and analyze my personal data? Why can't I use a version of the app that doesn't collect that data, and why is the agreement framed almost as a carte blanche for Xiaomi to collect personal data? I can see that the downloads app is running in the background, because it sends me notifications every now and then about recent downloads, which is what prompted me to open the app in the first place, as I wanted to disable those at the app level. Now there's an app on my phone that's constantly running in the background whose behaviour I cannot control except by blocking all permissions and internet access, unless I decide to give Xiaomi the keys to my data. I won't even go into the "Security" app that comes pre-installed, cannot be removed, and has permission to record audio granted by default!

Similarly, GitLab recently prevented people from logging in or even deleting their own repos unless they agreed to data collection[2]. They had a feature to "opt out" by setting DNT in your browser, but only if you granted them permission to collect data first. They did walk that back, but only after widespread criticism from all quarters. I might just be cynical, but I think we will see more and more companies use basic features as Trojan horses for data collection that is unnecessary to the functioning of those basic features themselves, with options to "opt out" as cover and PR.

[1] https://i.imgur.com/yxokdzY.jpg [2] https://www.reddit.com/r/programming/comments/dm72oa/gitlab_...


> Why can't I use a version of the app that doesn't collect that data

Because that particular developer hasn’t chosen to offer you that deal. If you don’t like the deal vendor X is offering, don’t use vendor X. Particularly in the case of a phone app, you tend to have a lot of choice. If vendor Y provides you a deal you like better, why not use them?

You might find that vendors who offer those terms might bundle them with other terms you don’t prefer (like charging money directly for the app or service).


The linked article is literally about why your reasoning doesn't pan out.


That's absolutely daft. You can pick your lord, but they'll all demand unnecessary permissions. The road to serfdom looks like capitalism working according to spec - right up until you can't meaningfully vote.


The difference is that in a great many cases, you can pick "no lord". Don't like being tracked with an electronic tolling transponder? Don't use a toll transponder. Don't like giving Xiaomi permission to use your personal data? Don't install a Xiaomi app. Don't like the supermarket tracking your purchases? Pay cash and don't use the loyalty card.

The key is that you have to be willing to give up what those things give you as well. The article even lists several: "long waits at road toll cash lines, higher prices at grocery stores, inferior seating on airline flights."

There were very few things in the article that I read that a principled person literally could not reasonably live without. Electricity was perhaps the closest, but even most automated meter reading or smart metering programs have a way to pay a monthly fee for a dumb meter to be installed and read manually.

If that's the case, then the question is one of convenience vs privacy tradeoff. I choose to get a driver's license, to have a car and to register it and put license plates on it. I could bike or walk. I choose to use credit cards, frequent flier and other loyalty programs, to use electronic tolling and automated meter reading, to use Kindles and highlight passages, to use a TiVo, Netflix, and Amazon to consume television, to use a smartphone and many apps, to connect with friends and family on Facebook and colleagues on LinkedIn. I do so because I perceive the benefit of doing so to be extraordinarily higher than the loss of privacy.

That's quite a bit different from being a feudal serf living on a lord's land in my book.


You have erected a hilarious false choice.

The supermarket is recording your image as you browse through the store, and within five years will assuredly be plugging you into facial recognition software.

It is not necessary for you to have a toll transponder in order for your car to be tracked. You have a license plate which can be trivially collected and indexed with off-the-shelf hardware and software. If you have a car made within the last few years, it probably has a permanent tether to the Internet.

Your cell phone is triangulated and your location is approximated to a handful of feet by your carrier.

In some not-too-distant future, it’s easy to imagine the death of cash, the outmoded way to pay: cryptocurrency is just so much sexier.

Newsflash: every man is not his own island.

In excusing those responsible for polluting my life, you are culpable also.


Obfuscating your personal profile is like obfuscating the entrance to your house so that thieves have a hard time finding it. If that is a concern, society should consider strict laws and legal trouble for them, not hiding tactics.

Opting out is a thing created by fraudsters, and why it is tolerated is a mystery. The way the EU implemented it, allowing sites to drag a user through the burden of un-checking a dozen checkboxes, is also ineffective en masse. It is still easier to just use complete ad-blocking (passive measures) and click "okay" to get rid of the popup.


Their proposals for obfuscation are a mix of real solutions and things that are easy to work around in an automated way:

* Tor: real workaround, though it does have some vulnerabilities (ex: correlation of ingress and egress)

* Browser plugins that block trackers: real

* Browser plugins that click on ads: super easy to filter out, since you're a massive outlier

* Browser plugins that choose random FB reacts: even easier to filter out since FB has a strong system of identity

* Clothes that fool facial recognition: they can update the software faster and cheaper than everyone can update their clothes


One of the issues with Nissenbaum and Brunton's Obfuscation book is that an extensive argument has been made that it actually supports pro-state forces rather than opposing them [0]

[0] http://computationalculture.net/poisoned-fruit-booby-trapped...


I think it's reasonable to turn the philosophical question around and look at it from the other direction.

Why should a person interacting within society have the option to opt out of surveillance?

We don't imagine that in pre-industrial-age communities, people believed in a right to not be seen by, say, the butcher or blacksmith when they went in to buy meat or metalwork, or a right to not have their commerce recorded in books of accounting. And people weren't immune to town gossip---the low-speed, high-latency, low-accuracy equivalent of having your disparate data points aggregated. Has society changed or has what it means to be seen changed? Is gossip qualitatively different now because it's fast, low-latency, and high-accuracy?

Framed from this angle, I think the question is less about opting out and more about "How much gossip is it polite to engage in before people should see one as an asshole for being that nosy?"


> Why should a person interacting within society have the option to opt out of surveillance?

A non-exhaustive list of a few specific reasons:

- Private, unregulatable actions are historically large components of social change, particularly for minority groups.

- Historically, powerful groups (both governments and societies as a whole) have used surveillance to harass critics and undesirable parts of the population.

- People have a natural tendency to self-select and avoid honestly engaging with controversial ideas when they know they're being watched.

I tend to sum most of that up into a more general point:

People have the Right to Hide because society can be cruel, vindictive, and arbitrary, and because people have the right to protect themselves from others in the same way that a deer has the right to camouflage itself from wolves.

Opting out of society is not a real choice that anyone here can realistically take, so a just society must balance collective and individual rights. That means respecting individual rights of privacy and self-autonomy.


> Private, unregulatable actions are historically large components of social change, particularly for minority groups

I agree, but I'd note there are reasons not to maximize that effect universally. Some balance is important; the Ku Klux Klan in the US is a minority group. Apart from that, I generally agree with your assessment of the benefits of the ability to have a private life.


Yeah it would crush dissent, limit change, make us weaker as a species.


To me, it's the lack of personal responsibility. Once it's automated, there's basically no single person to call an asshole.


I have a different take

The information is out there, we cannot rely forever on the inefficiency of an attacker.

Computers and their internet are making it easier to index and correlate this information.

Our location, gait and so on is constantly tracked by phone companies.

But everything else can be found in public by CCTV cameras, drones etc. They are able to literally know where you are at all times. They can analyze your gait, face, tag your car, and so on. Law enforcement can use wifi to see if you’re inside a building and find a good correlation between that and any video.

AI can predict when and where a network of dissidents will organize, and flag police to intercept (Palantir is just one company that does this). It would be trivial for an AI-powered government to predict and defuse any uprising or organizing.

People have been outed by timing attacks (being in real-time chatrooms and having their internet cut to confirm it's them). J.K. Rowling was outed by analysis of her writing style.

Robots will be able to track down a human, outrun them, and incapacitate them easily. Forget speeding tickets from a cop. They'll have drones spot the car from the sky and roadside cameras snap the car's info. If it's a criminal suspect, they can release little bots to roll up next to a car and puncture its tires to slow it down. Come and collect. No more high-speed chases.

We need to let the public have access to this. We need encrypted CCTV video from cameras in college dorm rooms, stored and decrypted when there is a need to solve, e.g., a rape case in court.

In some way this may also be helpful in the future for preserving alibis because video and audio can now be totally faked with deepfakes, and adversarial iteration will make it go past the uncanny valley into indistinguishability from real video. So the future will have immutable records with timestamps of everything. Where you were and when. Deal with it.

The genie is out of the bottle. More here:

http://magarshak.com/blog/?p=169


I've been saying for ages now that the ubiquitous surveillance panopticon is inevitable: technology makes it possible and politics and economics drive it.

(BTW, FWIW, I don't like it. I just don't see any way around it.)

So, given that, the obvious problem becomes making sure it is self-referential: the powerful must be subject to it as well as the "little guys". Otherwise you get Morlocks and Eloi, a split-level society with most people living as de facto slaves. Ask the people of Hong Kong about it, eh? (If they can still speak freely by the time you read this?)

If we and our children and grandchildren and their progeny are going to be irretrievably strapped into this global machine we'd better make sure it's humane, eh?


I wrote an essay for school on this, and the central thrust was that we can apply feminist theories of informed consent and adaptive preferences to digital consent, and the waters are very, very murky, especially when that consent is manipulated.

If I consent to something, fully believing it is in my best interest, it can still be said my consent is not legitimate in certain circumstances. For example, if an abuse victim is gaslit into believing their abuser is the only person who could ever love them. The applications to the digital world, I hope, are apparent.

For more information, I highly recommend Serene J. Khader's "Adaptive Preferences and Women’s Empowerment."


It raises the interesting question: how do other employees of platform companies protect their own privacy to prevent being put on a list after a political conflict of some sort? When you work in security, this kind of thing is always a distant blip on the radar, but it's there.

I know when I did work for a platform company, I didn't trust the internal controls or quality of people enough to keep their app on my devices. I was also sure there were people with access to user data who could not be depended upon to be noble.

These aren't faceless government agencies monitoring you; when it's someone doing a favour or using it to support some cause, it's people you know.


Start taking your privacy seriously, and opt out of bad services where you can. This sometimes means not getting to be on $latestplatform, missing a few conversations, etc. Sticking to your principles is never easy or costless. Even then, verify that what you are using isn't spying on you. That means read the EULAs and TOSs. That means packet sniff your network regularly to find potential issues. Etc, etc. It requires work.
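The triage after a packet capture can be scripted. A minimal sketch, assuming you have already exported a plain-text log of outbound connections (the log format, hostnames, and allowlist here are all invented for illustration; a real capture would come from tcpdump or Wireshark):

```python
# Toy triage of a captured connection log: flag outbound destinations
# that are not on a personal allowlist. (Hostnames and the log format
# are invented; real data would come from a capture tool's export.)

ALLOWED = {"example.com", "mail.example.org"}

def flag_unexpected(log_lines):
    """Return destinations seen in the log that are not allowlisted."""
    seen = set()
    for line in log_lines:
        # assumed line format: "<timestamp> <src> -> <dst>"
        dst = line.split("->")[-1].strip()
        seen.add(dst)
    return sorted(seen - ALLOWED)

log = [
    "12:00:01 laptop -> example.com",
    "12:00:05 laptop -> tracker.adnet.invalid",
    "12:00:09 phone  -> mail.example.org",
]

print(flag_unexpected(log))  # ['tracker.adnet.invalid']
```

The point is not this particular script but the habit: a destination you don't recognize in your own traffic is exactly the "potential issue" worth chasing down.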

I think there will be a huge market for entities to help individuals without these skills do these things.


The article's point is that individual actions really don't scale.

This is a general problem of race-to-the-bottom dynamics -- a form of Gresham's Law. At best, personal resistance simply opens more space for the bad actors / RTTB dynamic.

Instituting systemic blocks, through regulation, legislation, taboo, or other mechanisms, is required.

Individual action can help spread awareness. But it's a first step, not a last.


I guess in that case I'd be interested in hearing more about the "other mechanisms".


Anything that's systemic and collective rather than individual.

A few come to mind, but the primary intent was to indicate my list was not comprehensive.


I understand that, I just feel like the others in your list have tended to fail to produce change (regulation, legislation, taboo), via regulatory capture, K-street and money influence on congress, and media manipulation, respectively. So I am more interested in anything you might consider to go in the other category. Not looking for a comprehensive list, just something to get started with.


Most of the actions referred to in the article are mundane: you left your home, you got cash from the ATM, you used the subway, etc. I can't see how that can be used for anything against us.

The real problem is the internet: when we express our political interests online, when we visit certain sites, a profile is built for us about not what we do but what we believe in.


>Those who know about us have power over us. Obfuscation may be our best digital weapon.

Interesting, we could use tools like deepfake to fake images & videos of ourselves in order to increase the level of obfuscation.

Perhaps we can even analyze our writing styles and then use GPT-2 to produce writing that matches our styles, and flood the web with that.
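A rough sketch of the kind of fingerprint such noise would have to defeat: comparing relative frequencies of common function words, a classic stylometry feature. The word list and sample texts below are invented for illustration:

```python
# Toy stylometry: compare relative frequencies of common function words,
# one of the classic features used in authorship attribution (the kind
# of analysis that linked "Robert Galbraith" to J.K. Rowling).

from collections import Counter

FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "it"]

def fingerprint(text):
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def distance(fp_a, fp_b):
    """Simple L1 distance between two style fingerprints."""
    return sum(abs(a - b) for a, b in zip(fp_a, fp_b))

# Invented samples: lower distance means more similar style.
sample_a = "the cat sat in the garden and the dog ran to the gate"
sample_b = "the bird flew in the yard and the fox ran to the den"
sample_c = "run fast jump high swim deep climb hard fight well"

print(distance(fingerprint(sample_a), fingerprint(sample_b)))
print(distance(fingerprint(sample_a), fingerprint(sample_c)))
```

Real attribution systems use hundreds of such features, which is why machine-generated noise would have to match the author's statistics across all of them, not just vocabulary.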

Anonymity through noise.


Not really, you're still giving them the real signal.


Meatspace is a lost cause. So you must blend in. Just do whatever you must to get along.

But online, there's still a chance. As in Vinge's True Names. You hide your uplink using nested VPN chains and Tor. And then do whatever you want. Using as many unlinked personas as you like.


I actually sort of thought this was the common POV for anyone within tech. Controlling Google searches for your name, putting fake content on social media, etc. are obviously the only way to create enough noise to direct algorithms to outcomes you’d prefer, no?


Yet I can see your full name, your LinkedIn profile, Twitter, etc etc. Use pseudonyms, people.


It is a viewpoint of the SEOers. Others prefer not to bother with that arms race of bullshit and never "opted in" to the dancing monkey trap in the first place.


> But if we are nearly as observed and documented as any person in history, our situation is a prison that, although it has no walls, bars, or wardens, is difficult to escape.

The defining characteristic of a prison is a loss of freedom. What is it that you want to do that these technologies are preventing you from doing?

It's unlikely that we're going to put the technology genie back in the bottle. Instead we need to preserve freedom by sharply curtailing government and corporate power, and demanding radical transparency from any organization that is collecting data.


Always-on surveillance is the ultimate prison. Foucault articulated this very clearly back in the 70s in Discipline and Punish. You don't need walls, you don't even need guards. You just need an ever-present observer and a fear of being observed doing something 'undesirable'.

Perhaps the genie isn't going back in the bottle, but rolling over and reiterating the 'if you have nothing to hide...' fallacy is not a productive course of action either.


Nowhere did I say anything about "...nothing to hide". I am saying we need to, via legislation, weaken governmental and corporate authority, and increase personal authority so that the data collected cannot be used to harm us. Because the real fantasy here is that there is some way the data is not going to be collected.


If we agree that data will be collected whatever laws we put in place, under the same premises can’t we agree that it will be misused as well ?

Problem with laws to me is that they are only binding to some portion of the actors. Some government actors have a mandate to work around the law, and other organisations have divisions dedicated to avoid triggering these laws. Regulation is only part of the solution.


> What is it that you want to do that these technologies are preventing you from doing

Organizing tea parties without being surveilled and oppressed?

Inviting friends to a meeting, without tech giants knowing about it, and taking actions against said meeting.


Which actions are tech giants taking against people organizing meetings?

Does Google have a goon squad that is busting people's doors down and breaking up meetings? It seems like something like that would be in the news somewhere.


> Which actions are tech giants taking against people organizing meetings?

Feeding the data straight to the NSA, which then feeds the FBI, which then crashes the party with pepper spray and dogs. Sometimes the Feds can send undercover agents to join the party, then stir dissent and division.

See Occupy Wall Street.

Another example: tech giants forbid/censor messages (try sending a torrent link with WikiLeaks data in it to your friends), making your party so small you think nobody is actually interested. But the message is just silently dropped. Spooky, eh?


It's a bit like global warming, probably: until it has an obvious effect, nobody acts, but it's too late to do something when the goon squad is already knocking.


>What is it that you want to do that these technologies are preventing you from doing?

Please make public all your (accounts, passwords), or at least post all your communication publicly. What is it that you want to do that these actions would prevent you from doing?


The technologies discussed in the article do not record account passwords, nor are there any such technologies proposed that would do so.

Services you use already have a record of your account password (or a hash of it at least). It is necessary in order for you to be able to use the service. You supply your Amazon password to Amazon every time you log in. Are you saying you need to keep it a secret from them?


There is a famous analogy, I forgot the source (maybe Cory Doctorow?): why do you have a door on your toilet? Everybody knows what you are doing there.

So I don't need any of your private data. I voluntarily post my own, as concerns my passions. But let me choose what I make public and what I don't. That's the point.

(I don't use Amazon. All my work is public, I use only free software, even at home.)


The article is not talking about recording people on the toilet. It's talking about the surveillance cameras that are placed outside, in public. There is no conceivable way for your actions outdoors, in public, to be private because they are taking place in public.

The article also seems to presume that there is some single source where the data from all these disparate systems is tied together, but no such thing exists afaik.


You know what "the map is not the territory" means, right? That one should not confuse reality with (any form of) bureaucracy. Now, technically any interaction a real person has on the net is influenced by the map of the person, i.e. by the classification of that real individual. The classification does not have to be geographically centralized; it is enough that the tags the real individual is blessed with by algorithms can travel effectively with the individual. This is technically possible and it is the root of the problem. So it is not that _they_ know, but that my interactions with other people are mediated not by the territory, but by the map. Who owns the map owns the territory, this way.

You can see that this is the problem in many places. One of those places which springs to mind is that, for example, Google argues that they are just like any user of the (infrastructure of the) internet. But whenever two individuals interact via anything Google, the communication goes mediated by Google. In a sense it is centralized, and there is a centralized map, even if physically spread between variously located data centers.

hnuser77 19 days ago [flagged]

There is a conceivable way for actions outdoors to be private. Just roll back the clock a whopping 3 decades. Maybe nosy old Ethel notices when you leave for work in the morning, but she's not tracking your face's position and emotion across the city. Your location and reading history is not being recorded from your cell phone and logged permanently. Would it bother you to send me your exact GPS coordinates right now, or if I dispatched a drone to follow you around at all times the instant you stepped outside your front door, then post the video feed on the internet?

There are multiple billion-dollar companies whose entire business is tying together data from all these disparate systems, e.g. Palantir. There are multiple leaked government programs whose entire purpose is the same, e.g. XKeyscore. Or look at something more general-purpose like Splunk, where it is trivial to filter 1TB of logs from different sources and map identifiers.
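The join described here really is a few lines of code once two sources share any identifier. A toy sketch with invented records and an assumed shared device id, standing in for what log platforms and data brokers automate at scale:

```python
# Toy cross-source correlation: once two data sets share an identifier
# (here an invented device id), joining them is trivial -- the core of
# what log-analysis platforms and data brokers do at scale.

cctv_log = [
    {"device": "d42", "place": "subway", "time": "08:10"},
    {"device": "d99", "place": "plaza",  "time": "08:15"},
]
wifi_log = [
    {"device": "d42", "name": "alice"},
    {"device": "d77", "name": "bob"},
]

def correlate(a, b, key):
    """Inner-join two record lists on a shared key."""
    index = {rec[key]: rec for rec in b}
    return [{**rec, **index[rec[key]]} for rec in a if rec[key] in index]

# Merges the sighting with the identity wherever the device id matches.
print(correlate(cctv_log, wifi_log, "device"))
```

Each new data source that shares even one field with an existing one collapses into the same joined profile, which is why "disparate systems" is little comfort.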


> What is it that you want to do that these technologies are preventing you from doing?

Have secrets.


>What is it that you want to do that these technologies are preventing you from doing?

It's a spiritual thing. What does it mean if something is private? What do you allow yourself to do if nobody is watching? E.g. do you eat more sloppily when you are in private than in a restaurant?

If everything is public, there is no privacy. This comes with its own spiritual benefits but some people don't like that.


Security by obscurity is no security at all.

What is needed is stronger individual rights to access, correct, delete, and control the use of any information collected about us, as well as strong prohibitions on the use of data against our individual and collective interests.


The point is not security though, but privacy. At its base privacy is obscurity.

Also you seem to advocate for every single individual to run a race against the rest of the world to keep their data from being misused. We both wish there were decent safeguards; I am just way more pessimistic about any effective rights regulation (I think it’s just way too hard for any legislative body).

Not saying nothing should be done on the legal and regulation side, but it’s only a tiny drop in what is needed for us to preserve privacy.


The problem with privacy by obscurity is that once broken it cannot be fixed. Once information is released and becomes public, there is no way to retrieve or expunge it from the public record. The only remedy is law, regulation, and a social compact that the continued dissemination and use against your interests is forbidden.


The real issue historically hasn't been privacy itself in many cases so much as people unduly giving a fuck about things that don't actually affect them at all, or whose "requirements" are objectively barking mad. While surveillance is full of evils, the true root of most damage from releases is society's stupid reactions, and society would prefer to change anything except its dumb presumptions in the face of reality and morality.

It brings to mind the stacked absurdity of pageant winners losing their crowns over released nude pictures. Clearly the organizers should inspect the models beforehand to be sure they have Barbie-doll crotches, if having a naked body is a disqualifier. It would save everyone embarrassment once they realize there are no qualified human models who meet their standards and they will have to make do with giving trophies to department store mannequins.

Or the many who suffered from being outed as gay, including the dumbass tautology of making a secret firing-worthy because of a risk of blackmail - a risk the policy itself created.

ddiq 20 days ago [flagged]

It's the year 2027. You get a notification from the H8Watch™ by the ADL app, default installed on every phone, and can't be deleted without jailbreaking. You open the notification, and it shows a video of you from 2 weeks ago, from a camera embedded in the table of the fast food restaurant from where you were eating lunch with your friend. Your voice has been reconstructed through lipreading software and played back to you. "You're such a feandra, I paid last time!" you said to your friend playfully, using the new insult people just started using to refer to someone thrifty.

After the video and reconstructed voice plays back to you twice, it changes to a message. "Minor class-based insult detected." The nonsense word "feandra" had been added to the H8Watch™ database 2 hours ago, and your offense had come up as part of their program to catch racists and classists retroactively. This prevents the use of new words or euphemisms that may otherwise be used to bypass H8Watch™ detection.

Moments later, you get more notifications. Twitter: account suspended for 1 day, your bank: $25 has been deducted from your account for terms of service violation, Facebook: 2 offenses so far this month, 1 more and all three videos will be posted to your Facebook feed.

You quickly swipe away the rest of the notifications, you'll deal with them later, and text the friend you were going to meet for dinner. "Hey John, I can't make it tonight. Cash is a little tight after another violation, and I can't risk a public conversation this week."


This account has been using HN primarily for ideological battle. That's against the site guidelines and we ban accounts that do it, regardless of which ideology they're battling for or against. It's not what this site is for.

Breaking the site guidelines will eventually get your main account banned as well, so please don't.

https://news.ycombinator.com/newsguidelines.html



It's remarkable how someone living in 2019 still finds grounds to complain about stifling political correctness.


A man in the UK was investigated by the police for retweeting a limerick about transwomen - https://www.telegraph.co.uk/news/2019/01/24/man-investigated...

>After Mr Miller questioned why the complainant was being described as a “victim” if no crime had been committed, the officer told him: “We need to check your thinking”.


In 2019 the hand symbol for OK has been retconned as hate speech. There is no sign yet of this trend reversing.


We live in a world where you can lose your job just for making the OK hand sign at the wrong time.


There are people still sore about swastikas and the Roman salute being "retconned" too. But well, maybe don't use them for supremacist signaling in the first place; that sure could help.


If you think this story is limited to, or even about, political correctness, you are mistaken. It could be your insurance rates going up for jaywalking, even with no cars around. It could be your credit score dropping if you have too many drinks at a bar. It's about your personal activities all being tracked, measured, and in the end, controlled by corporations.


I don't know why you are downvoted. I think the dangers are real with social credit systems on the rise. Much has been written on them, e.g.

> The social credit system is used to punish citizens for bad behavior with numerous blacklists preventing them from traveling, getting loans or jobs, or staying in hotels, and even by limiting internet access.

https://www.businessinsider.com/china-social-credit-system-b...

Maybe people find it hard to believe these will be implemented in western countries?


> I don't know why you are downvoted.

I don’t agree with the downvoters at all, but I think it’s fairly obvious why he’s being downvoted: either folks think he’s wrong that such a thing is possible, or folks think he’s right but don’t care (i.e., they think that the society he describes is desirable, or at worst neutral).

I don’t think that his described society is impossible, and indeed we already see a slow, less-regimented version of that playing out.

I cannot emotionally understand someone who would find such a society desirable, but intellectually it kind of makes sense: one needs to be just authoritarian enough to want to control non-violent behaviour, while just naïve enough not to realise that there are aspects of one’s own behaviour which would be controlled, too (i.e., every one of us disagrees with the majority on at least one or two items).


It's shocking to me that here, on Hacker News, posts like these get voted down. I could understand them being regarded as unpopular on mainstream media, but HN? Wow.


It already is implemented in America. Political dissidents are harassed by airport security, get their bank accounts closed and credit cards canceled, are banned from services like AirBNB, and get railroaded in the courts.


Source?


Chase Bank shuts down prominent conservatives https://www.nationalreview.com/2019/04/chase-bank-conservati...

AirBnB seeks out and blocks people going to the American Renaissance conference https://gizmodo.com/airbnb-doesnt-want-extremists-on-its-pla...

Credit cards cancel payments to "hate" groups https://nypost.com/2017/08/16/credit-cards-are-clamping-down...

These are all political censorship against people or groups that aren't doing anything illegal, and the standards are applied differently based on ideology rather than any objective standard.


Thanks, although the claim about security checks at the border based on traced political views isn't sourced; I found some sources myself.

That's the one that bothers me, as there is hardly any competition possible in the security-checks business.

Airbnb, banks... all it takes is some open-minded provider to make those political-police institutions lose ground. Although it's a matter of time before open-minded providers also get pressured to police their customers.


I see what you're saying, but it's just a bit naive. It's not that easy to create a new provider to help political dissidents. A number of web services have been started to do so, and forced to shut down by being cut off by upstream providers. "Just make your own ..." at some point isn't viable, because other companies at every level are making sure you don't succeed.


"American Renaissance" - actual fucking nazis. If first we came for the nazis the poem would have but one line.


It's ok to go after the free speech of political dissidents if you slander them with generic insults first?


Maybe you should have used one of those examples instead.


Doesn't seem to matter, since that comment is getting downvoted as well.

I guess technologists don't want to see the future they're creating, and would rather just come up with tactics that barely defend against the surveillance we have now.


Technologists are for hire. They get paid. They need to get paid for something. They do see the future they're creating; some of them build tools (for free) to counter what they or their peers are building.


With a clamor of bells that set the swallows soaring, the Festival of Summer came to the city Omelas, bright-towered by the sea. The rigging of the boats in harbor sparkled with flags. In the streets between houses with red roofs and painted walls, between old moss-grown gardens and under avenues of trees, past great parks and public buildings, processions moved. Some were decorous: old people in long stiff robes of mauve and grey, grave master workmen, quiet, merry women carrying their babies and chatting as they walked. In other streets the music beat faster, a shimmering of gong and tambourine, and the people went dancing, the procession was a dance. Children dodged in and out, their high calls rising like the swallows' crossing flights, over the music and the singing. All the processions wound towards the north side of the city, where on the great water-meadow called the Green' Fields boys and girls, naked in the bright air, with mud-stained feet and ankles and long, lithe arms, exercised their restive horses before the race. The horses wore no gear at all but a halter without bit. Their manes were braided with streamers of silver, gold, and green. They flared their nostrils and pranced and boasted to one another; they were vastly excited, the horse being the only animal who has adopted our ceremonies as his own. Far off to the north and west the mountains stood up half encircling Omelas on her bay. The air of morning was so clear that the snow still crowning the Eighteen Peaks burned with white-gold fire across the miles of sunlit air, under the dark blue of the sky. There was just enough wind to make the banners that marked the racecourse snap and flutter now and then. In the silence of the broad green meadows one could hear the music winding through the city streets, farther and nearer and ever approaching, a cheerful faint sweetness of the air that from time to time trembled and gathered together and broke out into the great joyous clanging of the bells.

Joyous! How is one to tell about joy? How describe the citizens of Omelas?

They were not simple folk, you see, though they were happy. But we do not say the words of cheer much any more. All smiles have become archaic. Given a description such as this one tends to make certain assumptions. Given a description such as this one tends to look next for the King, mounted on a splendid stallion and surrounded by his noble knights, or perhaps in a golden litter borne by great-muscled slaves. But there was no king. They did not use swords, or keep slaves. They were not barbarians. I do not know the rules and laws of their society, but I suspect that they were singularly few. As they did without monarchy and slavery, so they also got on without the stock exchange, the advertisement, the secret police, and the bomb. Yet I repeat that these were not simple folk, not dulcet shepherds, noble savages, bland utopians. They were not less complex than us. The trouble is that we have a bad habit, encouraged by pedants and sophisticates, of considering happiness as something rather stupid. Only pain is intellectual, only evil interesting. This is the treason of the artist: a refusal to admit the banality of evil and the terrible boredom of pain. If you can't lick 'em, join 'em. If it hurts, repeat it. But to praise despair is to condemn delight, to embrace violence is to lose hold of everything else. We have almost lost hold; we can no longer describe a happy man, nor make any celebration of joy. How can I tell you about the people of Omelas? They were not naive and happy children – though their children were, in fact, happy. They were mature, intelligent, passionate adults whose lives were not wretched. O miracle! but I wish I could describe it better. I wish I could convince you.

Omelas sounds in my words like a city in a fairy tale, long ago and far away, once upon a time. Perhaps it would be best if you imagined it as your own fancy bids, assuming it will rise to the occasion, for certainly I cannot suit you all. ..

..........

https://canvas.wayne.edu/files/3466337/download?download_frd...

https://webcache.googleusercontent.com/search?q=cache:IZDmzk...


I'm gathering some people here aren't familiar with Ursula K. LeGuin's short-story epic.

It's a bit of a stretch for this particular post, but I think there's a connection which can be made.


It is indeed a story about "opting out". I was hoping it would spur a little discussion. It's not exactly a long read, but I guess too long for the attention span of the current crowd.

Everyone in modern society has chosen not to walk away from Omelas. Every day we live on the backs of untold suffering - actual peasants and literal slave labor who mine our minerals and manufacture our goods.

The Ones Who Walk Away From Omelas personifies that as a single child, whose suffering we all know. And yet our banners could not be as bright or our people as happy if we didn't.

We of course all identify with the ones who walk away, yet none of us choose to do so. It's a thought-provoking piece.


I think it got interpreted as copypasta, of which there's been some uptick.

I got the point. Then again, I've also walked away, at least from much of the tech Omelas.


Yes, there is some friction in opting out just like there is friction in any action you want to take (from going to the grocery store to voting). So what?

The reality is that some people really care about this stuff but the wider population doesn't. The people that care tend to get really frustrated at the wider population and attribute the population's uncaring attitude to the small amount of friction that it takes to opt out and/or to ignorance and/or low-intelligence. This is followed by a push for a government policy to adjust the behaviour. If people won't care by themselves or are too stupid to do it, we'll make them care through laws ... and that usually results in idiotic policies like a cookie warning on every webpage on the internet.



