
I am the author of the original tweet, so allow me to push back on this point of view.

Technology people are not the solution to this problem, we are the problem. And we really need to stop being the problem. There have been authoritarian governments before, and still are all over the world: but they've never had access to technology anywhere near as powerful as what we have today, e.g., ubiquitous user location, unencrypted smartphone backups, or vast unprotected email databases. These technologies themselves are brand new: smartphone cloud backup is only 10 years old, and ubiquitous GPS-enabled smartphones are younger than my middle-school-aged child.

But thanks to aggressive deployment by Silicon Valley we went ahead and created a surveillance utopia on a timescale so short that most citizens don't even realize what we've done. Future governments now have capabilities we could never have imagined a decade ago: centralized data repositories and collection systems that would make the Stasi blush. These systems now exist because people like us built them and decided to leave the problem of securing those repositories to the future. But we're running out of future.

Is more privacy the solution? I don't know. It's possible that authoritarian regimes are very efficient and will somehow force companies to build insecure systems that can be abused, and force consumers to adopt them at Google scale. I suspect this could happen but would be slow and inefficient and full of friction. None of that has to happen now -- there is no friction -- because Silicon Valley has already done the architecture work and deployed the technology at scale. They did it recklessly and with no thought to how it might be abused, and now they're finding it difficult to put the genie back in the bottle. It will not get easier.

TL;DR: The data repositories already exist and all that's protecting them is a fig-leaf of legislation and honest government behavior that could disappear at any moment. Removing those repositories might not save us, but we need to try. We sure need to stop creating more of them.




Okay, so we delete existing repositories and stop collecting that data for now. I'm in full agreement.

I just don't see how developing new systems is any sort of solution.

> It's possible that authoritarian regimes are very efficient and will somehow force companies to build insecure systems that can be abused

The genie is out of the bottle. Issuing an edict to "do things like you did them in 2022" doesn't exactly require genius-level administrative talent...

It's also a bit presumptuous to assume that a future authoritarian regime wouldn't have expert technologists at their disposal... why do you make this assumption? Technologists are just people; a few will be partisans and many more will go along to get along. (To wit: how many of your colleagues at Johns Hopkins who have personal issues with the defense industry don't take boatloads of $$$ from DoD? Or at the very least submit to NSF CFPs that are transparently adjacent to concurrently running DARPA programs? Ditto for colleagues who have beef with the tech industry?)

> and force consumers to adopt them at Google scale.

No forcing of consumers required. Just point a gun at whoever has market power and tell them to do things like they were done in 2022. After all, we are talking about an authoritarian regime.

You make a reasonable case for deleting existing troves of data, and a good counter-factual case for never developing smartphones in the first place. But, again, the genie is out of the bottle.


> It's also a bit presumptuous to assume that a future authoritarian regime wouldn't have expert technologists at their disposal...

I think your thinking here is too binary. Clearly there exist possible futures where authoritarian governments are hyper-efficient and have brilliant technologists advising them. In those futures we're clearly doomed and thus we might as well give up now. The thing is, those futures are not inevitable. There are also many (IMHO more realistic) futures where governments are messy and inefficient (as they are today), where their authority is blunted by organizational and jurisdictional issues (as it is today), and where their technologists are not hyper-competent (believe me, they aren't today).

In those worlds there is a huge difference between a scenario where the full weight of FAANGM's resources is pushing to build massive data repositories, and one where they've taken firm technological steps to limit it. We are in the first world and we should be in the second one.

> Issuing an edict to "do things like you did them in 2022" doesn't exactly require genius-level administrative talent...

I think that this will actually be more difficult than you think: this is why governments are spending ~millions right now to slow down the deployment of new encryption technology [1]. But stop worrying about 2022: what you should be worried about is 2036. Look at what we've done to privacy since 2007 -- the year the first iPhone launched. Now imagine someone from ~14 years in the future coming back to explain what Silicon Valley has done with even better technology like wearables and powerful ML tooling. When you're in a hole, the most important step is to stop digging.

[1] https://www.eff.org/deeplinks/2022/01/uk-paid-724000-creepy-...


I think you massively underestimate the banality of authoritarianism.

If an authoritarian movement takes over the US government, at least a majority of the Johns Hopkins CS faculty will continue taking grants from the NSF/DoD. Many of those grants will be more-or-less aligned with the objectives of that authoritarian movement. Non-authoritarian students will grind away on those projects.

Something similar would happen at FAANGM. No iCloud backups? NBD; lean on those companies to collect whatever data the state wants. You don't need super competent loyal technologists, because FAANGM and their employees will most of the time just do what you tell them. You don't need existing troves of data, because you can start collecting at any point and still get a huge amount of utility.

Could the authoritarian world be marginally better if big tech makes an about-face and stops collecting data? Sure. Is that difference enough to make any sort of significant difference in the lived experience of people or the trajectory of the authoritarian regime? Probably not.

I don't think you're wrong, per se, about the risks. But I don't think you have a compelling solution. And, anyway, there are much stronger arguments for reining in data collection at big tech than the risk of impending authoritarianism.


> Something similar would happen at FAANGM. ... lean on those companies to collect whatever data the state wants.

Well, that has already happened. The NSA went to Google and other companies and asked them to implement PRISM, and they did: https://en.wikipedia.org/wiki/PRISM_(surveillance_program)


Right? This comment thread reads like this sort of thing hasn’t already happened here. But it has. The US may not be the most authoritarian regime, but I think that its recent actions scream authoritarianism louder than any words claiming that it’s not.


>Is more privacy the solution?

It is the only solution to prevent massive widespread reduction of freedom.



