
Do you feel guilt over creating them?



Should they? The vast majority of users find it incredibly useful and have no reason to be concerned about governments or third parties being able to determine their geographic location, because governments and third parties don't generally care.


>have no reason to be concerned about governments ...

Many aren't, but everyone has reason to.

Governments change. Telling your government your religion in 1920s Germany was harmless; by 1940, many would have preferred that the government didn't have their religion on file.

Circumstances change too. In 1920, being Japanese in the US wasn't special. After Pearl Harbor came the internment camps.

And then there's the mundane stuff. You protest a government policy, someone in the government takes issue and tries to put some of these annoying people in jail.

Given that you don't know when you might become an enemy of the state, it's always a good idea to keep the power of the state over its citizens in check.


You can be upset about an aspect of a product, and seek to change that aspect, without abandoning use of the product. For example, 1.3 million people are killed by cars every year, and while we recognize the risk, we also constantly improve them through safety regulations, training, and improved technology. Just because people use cell phones and apps today doesn't mean we're okay with the downsides and should stop trying to improve them.


It's an interesting example you've chosen, since one of the dimensions along which car safety improvement is being researched is ubiquitous GPS signalling to share data about road and traffic conditions (and since every self-driving car is basically a panopticon and recording device rolled into one).


Mass surveillance is not really for investigating individuals.

The game being played is not '1984', it is 'Foundation'.

It is for steering entire societies, and this works far better on the boring people who think they have nothing to hide, as they are the easiest to model.


I agree the greater emphasis is Foundation-style analysis, but really, it's for both.


I've been working on a theory that what we've been seeing over the last 10 years or so is the escape of these techniques from government into private industry.

With a single powerful player, you get a consistent, but slightly false narrative. If you have lots of players though, you get multiple competing narratives and the news stops making sense.

That's partly why I still think Gibson is one of the people who got it closest to the mark.


If they "don't generally care", they wouldn't be collecting that data to begin with.


It's possible that they care about the aggregated data and not about the individual data.


They collect the data because they may find themselves needing to care in the future, at which point nobody wants to be kicking themselves for failing to collect it.


So they do care.


Cambridge Analytica did far more with far less.


Did they? Their sales pitch claimed they could, but what we've heard of their actual methods and impact didn't appear more effective than regular FB ads.


It's not about being able to track everybody. You're right, nobody cares about that.

It's about being able to track anybody.


1) Users get no benefit from information resale. 2) COINTELPRO


Keep in mind: most users are not part of a domestic political organization targeted by the FBI, so again, when the rubber hits the road, they'd rather not be inconvenienced for a risk that applies to other people. They don't care about COINTELPRO (disregarding, of course, the percentage of the population that actually thinks the FBI digging into "subversive" groups is part of its job).

Users get no benefit from the information resale directly, but they also aren't generally harmed by it. And the benefit they get from having a ubiquitously-connected device in their pocket outweighs the (apparently calculated to be low) per-person cost to their information being resold. The fact that you or I may do the calculus differently for ourselves (because we have different risk sensitivity) doesn't impact those who don't reach the same conclusions.


What I'd say is that until somewhat recently, I was interested in politics but not engaged. I took your position during that part of my life. Now that I'm actually engaging in political activities, COINTELPRO and its current incarnations scare the bejesus out of me, and I'm not doing anything that radical, just left of the Democratic Party. YMMV.

There may come a time in your life when you wish to have a say in the political system or are wronged by a powerful corporation. You'd care in that case. When your political rights disappear, they aren't easy to get back.


I agree that one in that context cares, but I think you can agree that most people are not in that context. So on the whole, they receive benefits from deep data integration and no immediate downsides.

Which circles back to the original question: should a person feel guilt over creating tools that help the average user and harm the political dissident? Seems an open question. Perhaps one heavily dependent upon whether the actor agrees with the political dissident's position.


Generally, dissent is a healthy thing and you'll get a better society that way. Once the capability for real dissent is eroded, the social controls of the society will be turned to the benefit of the victorious faction. This turns out poorly for everyone else.

We should not be creating a mass surveillance state. The actual abuses (domestically, generally against minority populations; abroad, generally against non-NATO civilian populations) and the potential domestic abuses (with many well-noted assassinations and infiltrations in the past) are alarming, and the state already has stronger, more precise tools for social control of the population than existed in past dictatorships. The Stasi would have killed for the NSA's database and the ability to plant live tracking beacons on most citizens.

I'm on the left, but the non-financial political freedoms of the right are a bellwether for my own (though the literally genocidal far right is a more complex discussion). In general this makes a lot of logical sense because conservatives and right wing ideologues wish to maintain the status quo (literally to conserve it) or to return society to a past state (e.g. the relation of men and women, the role of religion, etc.), and range from libertarian to authoritarian, neither of which really threaten established authorities (and often reinforce them) and so are treated with kid gloves (watch how police treat right wingers at protests on average). Blue lives matter is a right wing cri de cœur that's an example of a "protest" that celebrates existing civil authorities.

Liberals and, more so, leftists wish to change society into a new state, which threatens the established order. Therefore, the civil authorities do not treat them with deference. Typically, losses of political freedom that start on the right are later applied with a vengeance against the left.

Encourage dissent. We make fun of countries that don't. :)

EDIT: added info on why losses of political freedoms for the right are an especially bad bellwether


Perhaps we need more freedom in this scenario. We have public airwaves which are presently owned by corporations (the highest bidder), when in fact we ought to give everyone the freedom to carve out a slot of that bandwidth in their spatial region. We ought to homestead the airwaves, giving individuals the ability to both send/receive and route packets. These airwaves should be treated like byways where everyone can send/receive packets. We'd carry an envelope of bandwidth around with us wherever we travel by reserving common airwaves (e.g. 2.4GHz, 5.8GHz, etc.). In this way, you starve the companies of their revenue, making their existence more difficult.

I'd like to ask anyone within RF earshot to carry my packets. I'd even consider paying for faster bandwidth if others were offering it below some threshold. Some baseline low-bandwidth communication should always work: say, 20% of the link speed, split between freeloading users. I'll carry your packets if you're in earshot, rebroadcasting as needed and following the same rules. We could rotate our source addresses every so often (see the rough sketch at the end of this comment).

You'd be persona non grata (and it would be illegal) if you recorded and shared who you hear. At the heart of it, saving and recording in perpetuity who you're communicating with, and where, ought to be illegal. Certainly selling that data should be too, or you end up with what we have today.
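
To make the relaying rules above concrete, here's a rough Python sketch. Everything in it (the constant names, the rotation interval, the packet fields, how the 20% reserve is split) is invented for illustration; it's a sketch of the idea, not any real protocol.

    import secrets
    import time

    FREE_SHARE = 0.20           # fraction of link speed always reserved for others
    ROTATE_EVERY_S = 600        # rotate our source address every 10 minutes
    LINK_SPEED_BPS = 2_000_000  # assumed link speed for the example

    class MeshRelay:
        def __init__(self):
            self.addr = secrets.token_hex(8)   # current (rotating) source address
            self.last_rotation = time.monotonic()
            self.freeloaders = set()           # peers currently relaying through us for free

        def free_budget_bps(self):
            # bandwidth each freeloading peer gets: the reserved share, split evenly
            return (FREE_SHARE * LINK_SPEED_BPS) / max(1, len(self.freeloaders))

        def maybe_rotate_address(self):
            # rotate periodically so other relays can't build a long-term profile of us
            now = time.monotonic()
            if now - self.last_rotation >= ROTATE_EVERY_S:
                self.addr = secrets.token_hex(8)
                self.last_rotation = now

        def handle_packet(self, packet, heard_from):
            # rebroadcast anything heard in RF earshot that isn't addressed to us
            self.maybe_rotate_address()
            if packet["dst"] == self.addr:
                return "deliver"               # it's for us
            self.freeloaders.add(heard_from)
            if packet.get("ttl", 0) > 0:       # simple hop limit to avoid broadcast storms
                packet["ttl"] -= 1
                return "rebroadcast"
            return "drop"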


Does the UK encourage dissent? It doesn't seem to. And it seems to be doing fairly well.


All I have to say to that is that, regardless of how you feel about them, the US picked Trump and the UK picked Brexit. These were both events that demonstrated that elite opinion was so cloistered the rubes started throwing Molotovs at the political system, in some cases just for the cheap laugh in the voting booth.

Yes, the UK absolutely needs additional dissent.


Really? Didn't you just demonstrate that when the "rubes" are given power, they vote Brexit and Trump?

That seems like a terrible idea.


If you create a dielectric barrier, the charge that builds up to overcome it can discharge with more effective power than the constant flow that would otherwise occur at low resistance.


A potential victim's ignorance of their risk doesn't mean they aren't at risk.

Because I'm not specifically aware there's a cross-town bus with my name on it, I'm somehow not about to get pancaked?


Any source for this claim?


The general public's (and repeatedly-reported-upon) understanding of how data collection can be leveraged to find unexpected insights not obvious from the data, coupled with the Snowden leaks, coupled with the ever-increasing user counts for cellphones, Facebook, Twitter, and the Internet in general.

If people were deeply individually concerned about the risks vs. rewards of these technologies, they'd stop using them. That's the rubber-meets-the-road calculus I see.


Do you trust the public is informed about these technologies? I think you might be overestimating individuals... most folks still don't know about Cambridge Analytica.


> "If people were deeply individually concerned about the risks vs. rewards of these technologies, they'd stop using them."

Why do you think that? It clearly doesn't apply to stuff like oil, for instance.

I could give up my phone, but I would be in deep shit if I did it tomorrow. It would take a lot of arrangement to do so and it would piss off my family and lose me work.


Actually, I'd argue that it does apply to stuff like oil.

People say they're concerned. But the actual number of people attempting to zero the amount of oil they use? Much lower than claimed concern.

Words are easy. Actions have costs that people would prefer not to take on.


>the actual number of people attempting to zero the amount of oil they use? Much lower than claimed concern.

How do you know how many there are? Anyone doing that couldn't travel except by foot, buy any commercial products or use any available communication services.

edit - alternatively, there are loads of people attempting to zero the amount of oil they use. They are just using oil to get there.


Tu quoque.

See also: "Ayn Rand collected Social Security benefits." (And I abhor her oeuvre and "movement".)


Tu quoque requires someone to have made a claim in the first place.

I'm saying people make the claim on the average person's behalf that they want privacy and information such as their location (as triangulated by cellphone towers) kept generally secret from governments and corporations who can offer them benefits, and that claim is not actually supported by much evidence. I think the digital intelligentsia cares deeply; the average cell user, not so much.


And I'm saying that lack of care is a product of ignorance — ignorance in no small way imposed upon them by the shady behavior of the people who are doing this. As such, it can't be a reason to blame them for that "choice". It's a passive choice. It's opt-out, without being told there's an option. And there isn't actually an option.

That is, if Verizon was unambiguous with Joe Customer, "We may sell your real-time location information to companies known to re-sell that kind of information to the government, and you can't do anything about it" how many of them would be pissed? Isn't the state being restrained from un-warranted — literally — snooping into people's lives a core American value?

Your position is that most people would "meh". I think you're wrong. You're probably right that there's scant evidence either way, though.


Kind of like how automobiles are a luxury, and if people cared about the 4th Amendment they just wouldn't drive anywhere. Never mind that our way of life is literally not possible without the technologies in question.

Every single one of the revelations you've mentioned was met with public backlash, followed by either a misinformation campaign or intense dog-wagging. This is called manufactured consent. For example, let's look at Cambridge Analytica. When it was revealed that a military contractor was hired to subvert the 2016 Presidential election, the dominant story in the alphabet-soup media was a twitter tantrum from Trump. As it became clear over the next few days that the story wasn't going to be buried easily, the narrative was quickly shifted away from the subversion of democracy to blaming Facebook for leaking user data, culminating in parading The Zuck before Congress. He played his part perfectly: no bread, but enough circus to keep the masses from thinking too hard about what it means for an election to be free.


You'll have to unbox how driving is related to the 4th Amendment; I would have assumed you were going to observe people continue to drive even though 40,000 people a year die in car accidents.

People do the calculus to decide if risk is greater than reward all the time. It appears ubiquitous connectivity, for most people, is far more rewarding than risky.


In short, doing anything that requires a Driver's License severely restricts your freedom from search and seizure while traveling on public highways. To gain those rights back, you have to (de facto) forfeit your Driver's License and stop driving on public highways.


>People do the calculus to decide if risk is greater than reward all the time.

Technically you're right but what you seem to be missing is that people (in general) suck at risk assessment. Although they are doing "the calculus", most of their calculations are based on heuristics that just don't reflect a rational analysis.

That is why so many people fear plane travel more than car travel, immigrants more than cigarettes, and pharmaceuticals more than "raw water".


Several recent HN stories have had this kind of comment (first noticed with the Securus submission) that's a weird mix of "You have nothing to fear if you have nothing to hide" and "They will never come for you, you're too unimportant." Is this a sustained campaign or just a way for folks who have contributed to these issues to feel good about themselves?


> Is this a sustained campaign

This breaks the site guidelines. Could you please read and follow them when commenting here? https://news.ycombinator.com/newsguidelines.html

Insinuations of astroturfing or shilling without evidence (an opposing view does not count as evidence) are an internet toxin that turns out to be worse than the things it insinuates, because it's so widespread. I've written a ton about why we don't allow that here, if anyone wants to read more: https://hn.algolia.com/?query=by:dang%20astroturfing&sort=by...


Welp, sorry.


It's just how a lot of people feel about the issue.

I'm not sure why you would jump to concluding that it's a sustained campaign or some kind of reaction to guilt.


Wilsonnb hit the nail on the head: it’s just how some people feel. Though I don’t doubt that some people involved in the creation of this phenomenon use the argument to justify their work.

I had a hard time understanding why people wouldn’t be more conscientious of their privacy, until I had discussions about the issue with people close to me.

My folks had a very similar sentiment to the typical “if you have nothing to hide, then why do you worry about it”. My girlfriend had the same thought, but took it a step further and asked why I cared so much about people uninvolved in my life knowing personal details about it, then said I was “the most paranoid person [she’d] ever met”

Once the Cambridge Analytica scandal broke, they all understood my point. I think the majority of people who don’t work in tech don’t understand the massive implications that our lack of privacy has. They don’t know how cookies or backends or tracking pixels work, and may not even know they exist. They imagine an NSA agent sitting in a room looking for keywords, not companies that they entrust their digital lives to selling off every little piece of info about them. It’s so much more than your Facebook or Twitter posts being public; it’s data that we might not even know about ourselves being kept in the hands of unknown entities.

To sum up this rant: some people have to see it to believe it, because this is outside their scope of knowledge.
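
For what it's worth, the mechanics really are that mundane. Here's a toy sketch of what the server side of a tracking pixel amounts to (Python standard library only; the port, cookie value, and logged fields are invented for illustration, not taken from any real tracker):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # a 1x1 transparent GIF -- the "pixel" itself is just an excuse for the request
    PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
             b"!\xf9\x04\x01\x00\x00\x00\x00"
             b",\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;")

    class Pixel(BaseHTTPRequestHandler):
        def do_GET(self):
            # the request itself is the data point being collected
            print({
                "ip": self.client_address[0],
                "page": self.headers.get("Referer"),      # the page embedding the pixel
                "browser": self.headers.get("User-Agent"),
                "cookie": self.headers.get("Cookie"),     # ties this visit to earlier ones
            })
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            self.send_header("Set-Cookie", "uid=abc123")  # tag the visitor for next time
            self.end_headers()
            self.wfile.write(PIXEL)

    if __name__ == "__main__":
        HTTPServer(("", 8080), Pixel).serve_forever()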


I'm surprised you've had conversations with tech laymen that understand what Cambridge Analytica is guilty of. Everyone I talk to, even reasonably tech-literate people, still don't understand the repercussions. I even point out the possibility of throwing a presidential election, and my mother said, "so what, isn't that just people pushing for the guy they want?"


It would be better if they did, yes.



