A radical proposal to keep your personal data safe (theguardian.com)
363 points by ngcazz 10 months ago | 125 comments

I wish we could find a way to apply the model of the vending machine to mass commerce.

A vending machine doesn't make notes of who buys its products. A vending machine doesn't keep logs of its transactions or of the payment methods used. A vending machine does not make "recommendations" of what the consumer might like based on their past purchases. You give the vending machine a dollar, it gives you a soda. The transaction between consumer and vendor is done, and no tangible evidence of the transaction remains, other than the exchanged goods.

But we haven't figured out a way to scale the vending machine model yet, because when the machine gets stuck and fails to dispense its product, the consumer gets an attendant to help or kicks the thing a few times in frustration and walks away, because the lost dollar isn't worth the hassle. If someone paid $1500 for their laptop, however, they're going to want that convenience of being able to prove they purchased the laptop from XX retailer and be compensated if the thing is a lemon.

I'm not sure we can solve this problem, completely. But we might do better if we actively aspired to make our services more like vending machines.

EDIT: Yes, I'm aware that some vending machines these days can be paid by credit card. My university even had one where you could pay using your Student ID, which was linked to the balance allocated for your meal plan.

Physical stores can be (used to be) big vending machines. They accept cash and give receipts as a proof of purchase and can (usually) be found in the event of issues. They're probably recording you on camera in case you steal or use counterfeit money, but vending machines probably are too.

I think the real problem is doing the same thing at a distance. At a store you know where to find the entity you purchase from in case of issues, and you get a receipt at the same time you purchase. You can sort of get the same thing with an individual face-to-face, but only if there is an established trust in some form (buyer has to trust they can find the seller if something goes wrong, seller has to trust buyer isn't using counterfeit money, etc). It's pretty hard to trade at a distance without revealing something about yourself. At the very least, there's going to be a middleman that will want to know something (a lot) about you if they're going to provide you services (P.O. Box, shipping company, etc). They want to make sure customers stay honest so they don't get in trouble when customers scam each other or make an illegal trade.

It all comes down to trust. I think all the retailers, on and off line, are blatantly violating that trust when they sell our data off, but not enough people seem to care or have a better alternative, so there are no consequences for retailers. New law is usually the last thing I want to see, but it seems to be the only way to introduce a consequence for sellers at this point.

> But we haven't figured out a way to scale the vending machine model yet,

Haha sure we have, it just isn't as profitable as the current solutions, which are to collect as much data as you can and use that information for secondary income, such as selling it to marketers or using it for "analytics" or "targeting".

Service builders just have to care about designing their systems like this. Every single person in this thread who works with or for one of these information collecting companies is responsible for the current state of the world.

> If someone paid $1500 for their laptop, however, they're going to want that convenience of being able to prove they purchased the laptop from XX retailer and be compensated if the thing is a lemon.

That can easily be solved: either by using GNU Taler as Stallman describes, or by issuing cryptographic receipts of a purchase, whereby the vendor digitally signs the receipt so they can recognize it later if presented, but internally stores no record of the transaction or of who made it.

It's not a hard problem to just not store personally identifiable information: you just don't store it.
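A toy sketch of the receipt idea, using Python's stdlib `hmac` (a real system would more likely use public-key signatures, e.g. Ed25519, so third parties could verify too; the function names and data here are invented):

```python
import hashlib
import hmac
import secrets

# The vendor keeps one secret key, and stores NO transaction records.
# A buyer keeps their receipt string; the vendor can later verify that
# any presented receipt is genuine, without ever knowing who bought what.
VENDOR_KEY = secrets.token_bytes(32)

def issue_receipt(item: str, price_cents: int) -> str:
    nonce = secrets.token_hex(16)  # makes each receipt unique
    payload = f"{item}|{price_cents}|{nonce}"
    tag = hmac.new(VENDOR_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{tag}"      # handed to the buyer; nothing is logged

def verify_receipt(receipt: str) -> bool:
    payload, _, tag = receipt.rpartition("|")
    expected = hmac.new(VENDOR_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

r = issue_receipt("laptop", 150000)
assert verify_receipt(r)                              # genuine receipt accepted
assert not verify_receipt(r.replace("laptop", "tablet"))  # tampering detected
```

The point is that authenticity of the receipt and a database of who-bought-what are separable: the vendor can prove the receipt came from them without retaining any record tying it to a customer.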

What is a hard problem is convincing people in these industries that they should care, that they should protect user privacy by not collecting the information in the first place.

>A vending machine doesn't make notes of who buys its products. A vending machine doesn't keep logs of its transactions or of the payment methods used.

Well technically the vending machine itself doesn't note this, but I've increasingly seen vending machines which accept credit cards. Thus your CC issuer knows where you were, when you were there, and can even make some inferences about what was bought based on the price.

Personally, I keep an "entertainment" budget which I withdraw in cash, both for privacy reasons AND budget reasons. (Staring down into your wallet at cold hard cash makes you hesitate before buying that cup of coffee or other sundry)

How close is Apple Pay? Is there anything in an Apple Pay transaction identifying the customer (that can be correlated across purchases)?

I've heard good things but I have no idea about the details.

Edit: https://www.engadget.com/2014/10/02/apple-pay-an-in-depth-lo...

merchant sees NO account information but has the final four digits of the PAN [CC number] as an account identifier to use as a mechanism for tracking volume/activity per account

Like another comment mentioned, no matter how secure Apple Pay is, any sort of anonymity is easily defeated by having a video camera pointed at your face.

You can prove that you purchased something by having a receipt. It can be as simple as a unique purchase number. If you pay by some cryptocurrency, you can even prove you own the sending address.

True, but cryptocurrency is not particularly anonymous once you're buying physical goods that need to be shipped to an individual. Imagine how useful the blockchain would be to Amazon if they knew the addresses of all their customers' wallets (which they would surely collect) and the addresses of all their competitors' wallets. They could see where their customers shop and micro-target deals.

I like the GNU Taler model mentioned in the article where the payer decides whether they want to be identified, but the payee is always identifiable. Combine this with a crypto-signed receipt from the payee and you have a system that better respects privacy.

Problem is that vending machines don’t scale. Consider that much of the “tracking” on a site like Amazon.com is there to overcome the inherent problems of finding products in such a massive catalog.

Maybe it is just my experience, but individual tracking (which is the sort of tracking under consideration here) does not seem to help, let alone be necessary. I do not think I have ever found something of interest in things that have been pushed at me, and the only thing that helps in navigating a massive catalog is broader and more discriminating search capability.

To add to this, what if the vending machine was able to dispense 1000 things? And let's say that you are diabetic? Wouldn't it be nice if the sugar-free treats were easy for you to find? And the organic stuff easier for those seeking them to find? Etc. etc.

Somewhere there must be an intelligent balance.

That's an argument for good search and discoverability - a sugar-free aisle.

Amazon could be that machine, if we could rely on them to (at the customer's option) properly delete the customer's credit card number and delivery address when the package is delivered.

> Somewhere there must be an intelligent balance

In practice though this is mostly biased towards the revenue of the company selling the goods, not towards your health requirements.

It's in the company's interest to align those, however, so they don't conflict. If your customer is healthier and lives longer because of your service, that benefits both of you.

Good health of the customers benefits competitors too, so it's a type of commons. By the Tragedy of the Commons, it's in the company's interests to damage the health of their customers for short term profit. If they don't then a competitor will.

How did diabetics even manage to buy sugar-free snacks before Amazon's recommendation engine?

Thank god for Amazon profiling them, they must have been starving before.

It sounds like you are describing the problem of trust. In a vending machine, you trust the brands to deliver a standardized product. You know what you are buying ahead of time with no doubt at all. The cost is also low enough that you trust the delivery mechanism within a given level of risk tolerance. Finally, you don't expect the product to last; it is a consumable by nature, so no relationship needs to exist - your relationship is with the brand, not the individual can of soda. If you choose to give the soda away, the person you gave it to also knows what they are getting, and the brand doesn't care who drinks the soda.

To take your laptop example, though, you may trust the brand, but you expect it to last. Therefore, your relationship is with the brand AND the product. You will need to service that laptop, and they, as the vendor, will need to set parameters around what services they will and will not provide in which circumstances. They have a kind of "contract" with you that doesn't necessarily extend beyond you. In fact, there are some situations with a laptop where giving it away to the wrong person could be a crime.

My only question is: if we could "solve" this problem, would we actually want to? Do we really want to make every interaction we have into some kind of transaction, with no obligations beyond payment for goods? I don't think so. I think most of our consumer protection laws have been developed to protect consumers from the considerable costs that such a system will create - warranties, lemon laws, etc. Critically, all aimed at improving the amount of trust between brands and consumers.

Instead, I think we probably need to more carefully define what kind of trust is required, and what the covenants of brand/consumer trust actually include regarding data. GDPR is an admirable model, and probably a step in the right direction, but certainly not a perfect effort.

I'm sorry, but you are unfortunately wrong. Some colleagues are involved in the development of a vending machine that does all those nasty things. And I think there are similar products already deployed.

I think the point is that a vending machine doesn't need to do all those things to fulfill its purpose. Dollar goes in, soda comes out. Yes you can make a vending machine do that, but that's not the point.

What is the purpose of the vending machine, from the consumer point of view?

Is its purpose to provide an oasis of snacky goodness to those who happen to encounter it while passing through? If so, then yes, all it needs is dollar in, item out.

I'd argue, though, that vending machines that are in places where they are regularly encountered by the same people over and over again, such as vending machines in office buildings, dorms, schools, and such, have a higher purpose.

They are to provide those who live or work near them a snack safe space. A place they can count on to take a quick break from their work day and find their favorite snacks.

A vending machine that could keep track of regular customers and their snacking habits could better fulfill that purpose. For instance, suppose Carol from accounting only ever buys Tab, and the machine is running low on Tab. Dave from marketing likes Dr. Pepper best, with Pepsi a close second, and only rarely buys anything else.

Dave tries to buy the last Tab (perhaps by mistake). The smart vending machine could recognize that it would be better to reserve the last Tab for Carol, and could tell Dave, "I'm sorry Dave, I'm afraid I can't vend that to you. Would you like a Dr. Pepper or Pepsi instead?".

The future didn't give us the flying cars and jetpacks we were promised, but at least it can give Carol from accounting a reliable Tab supply at work.

>They are to provide those who live or work near them a snack safe space.

>"I'm sorry Dave, I'm afraid I can't vend that to you. Would you like a Dr. Pepper or Pepsi instead?"

I realise this was humour, but you framed the problem as I see it perfectly. I don't feel safe if I don't know Hal's code, especially if he's talking to all the other Hal's (be they other vending machines or information aggregating corporations). And what are the odds that these things will be open source?

Nice idea, but you just created the echo chamber of soda pop, that gently restrains you within your existing habits. Substitute [liberal-slanted article] for Tab and [conservative-slanted article] for Dr. Pepper, and we've now invented the Facebook newsfeed.

Maybe Dave and Carol need to try some new soda flavors once in a while!

The vending machine would be a more equitable host by telling Dave that it was saving the last Tab for Carol, and offering the alternatives. Dave could accept or decline. Maybe Dave wants the Tab because he's fetching it for Carol.

But that would be revealing personal information; some customers would consider it creepy, and possibly even sue.

You could call it the Nutrimatic. Share and enjoy!

This seems like such a losing battle. Payments on public transit could become anonymous, but the video feeds would still be recorded, and facial recognition could be easily performed.

The cat is out of the bag. Information technology makes this data accumulation inevitable. Likely even for those who walk around with a face covering all day, pay only cash, and don't use smartphones. By doing those things you actually make yourself more noticeable, in a lot of ways, and are still trackable.

There are some cryptoanarchist ways to achieve practical privacy, to all but the most dedicated governments. Yet, those technologies have severe downsides for both individuals and society at large.

There's no obvious way out of this conundrum. The concept of the Participatory Panopticon is one proposed solution, and is less top-down, but not necessarily any less capricious than centralized authority.

The best safeguards are to create strong institutions, informed citizens, stable economies, and all the usual boring social reform stuff that tech geeks hate and think is mushy, unchangeable, hard to study, and likely irrelevant.

It's also interesting to look at Stallman's tone: he's defeated. "What can be done about privacy Mr Stallman?" "Society needs to care more." It's the same tone JPB had when asked about how the EFF can have an impact on younger generations who take the internet for granted. "They need to care more. And education."

They're not wrong, of course. And your comment is very similar: the problem isn't the technology, it's the impetus of society that needs to change.

I mean, they're kinda wrong.

You can care plenty about these issues but not see an actual way to have impact such that your concerns are addressed. If society isn't caring, then either society doesn't see why there's a concern, or society recognizes that atomically it does not have agency to address the issue, and the people who do have that agency are the ones that don't give a shit.

Saying "society needs to care more" is the last stop before the final destination of "this can't be done". Getting a great many people to coordinate on something that's beneficial to them long-term, but carries an (even insignificant) short-term cost is one of the hardest (if not the hardest) things to do, out of all the things humans have ever tried.

I don't think calling out 'society needs to care more' as bullshit means we collectively throw up our hands and walk away. It means we recognize that 'getting many people to coordinate on something' just plain isn't going to happen as the primary mode of addressing these issues and instead work on solutions that do work.

For a subset, sure, maybe a good PR push gets people sufficiently inflamed about a particular topic, but does that mean that key problems that aren't 'viral' enough aren't deserving of a proper solution? If "society needs to care more" is the answer, then that's the implication. But that's obviously wrong.

I interpret "society needs to care more" as a challenge: what can we do to make society care more? Put another way, I view this as a collective action problem.

The classic solution to collective action problems is to have the government take care of it.

But here, the government is probably the biggest offender. You'll pry their databases and TSA inspections from their cold, dead hands.

I meant it more as: you need both. (At least I'm not suggesting people shouldn't care--can't speak for the GP, RMS and JPB, but I suspect they also still care[d]--and you always can care.) When you've spent large portions of your life advocating for digital liberty and software freedom only to see it decimated in both the public and private sectors in favor of convenience & control, it probably makes you a bit jaded. And you wonder if you missed something. Where the battle stands today, at least in my opinion, is with convincing people that their life is better with these freedoms, not as much that these issues exist.

My point was simply that we can't skip the stable society part and sit in an ivory tower of activism.

They mean that the majority of society needs to care more collectively. If that were to happen then there would be impact.

He doesn't sound defeated to me. He's repeating the same message he has for a long time now. The only practical realistic solution is at the political level. I don't recall him ever implying the tech alone was a complete solution. Votes, not Code.

>the video feeds would still be recorded, and facial recognition could be easily performed.

As another pointed out, the article covers this topic, explicitly.

>The best safeguards are to create strong institutions

I would replace 'strong' with 'transparent and honest'. Further, calling it a losing battle is a bit defeatist.

>Yet, those technologies have severe downsides for both individuals and society at large.

Out of curiosity, what are some of these downsides?

Zuckerberg- is that you?

Seriously, WTF!?!

Not one part of that comment is true.

The cat’s not out of the bag. Nothing “permanent” has happened. A law could be passed tomorrow requiring that all of our personal data be erased. You might not like it and it would put you out of business.

And that part where you say “best safeguards are to create strong institutions...”. That sounds strangely like “the 4 dog defense” article that’s currently on HN. You just skipped the other arguments:

1. We don’t have your data.
2. If we had your data, it would have taken work to collect it.
3. If it took work to collect it, it’s ours, since you should have protected it.
4. If you didn’t protect it, the only answer is to form strong institutions to protect the information formerly about you, but which now belongs to us.

> A law could be passed tomorrow requiring that all of our personal data be erased.

Such a law would be politically viable in only the most enlightened of governments and societies. Which goes back to the parent's point of "strong institutions, informed citizens, stable economies, and all the usual boring social reform stuff".

> Payments on public transit could become anonymous, but the video feeds would still be recorded, and facial recognition could be easily performed.

In the article Stallman explicitly proposes deleting such video recordings after a timeout, and forbidding facial reco without prior cause.

His first argument was that accumulation itself needs to be prevented. If security requires that accumulation proceed, then the answer is that his vision of preventing accumulation is not really possible.

How many times have we heard of large institutions suffering data breaches or failing to delete data that they are supposed to delete? Once it is accumulated, all bets are off.

Once again, The Fine Article has you covered. Stallman says when such logs are necessary, they should never be network accessible. Go get the disk out of the physically secured machine if you need the data for $SERIOUSREASON.

Maybe the answer is to turn the dial to 11 on information sharing and require data transparency from managers/leaders/public servants who operate data accumulation outfits.

Track users' location data? Then your location must be public record & realtime. Etc.

At the end of the day, this is about information asymmetry. If the general public loses privacy, then so should those who benefit from that information, in order to level the playing field.

> At the end of the day, this is about information asymmetry. If the general public loses privacy, then so should those who benefit from that information, in order to level the playing field.

This could be said for literally any transaction, ever. The bank knows how much money you have, but you don't know how much money the teller has. You don't know which books your librarian has out or what the bartender drinks. They are also citizens with their own right to privacy.

I'm all for transparency and oversight, but information asymmetry will always exist. Some industries have regulatory structures around that asymmetry to prevent abuse, HIPAA for example, and others have not.

maybe replace "managers/leaders/public servants" with "shareholders/party donors" to really ruffle feathers

Also, why settle for a level playing field? Why not high privacy for citizens and high transparency for gov/large orgs?

How about wearing a huge Sombrero, so that your face is invisible to cameras mounted above your head?

Then they just track the person with the huge Sombrero!

What if we all wear huge Sombreros?

Or walk around in grey suits and bowler hats with apples in front of our faces

Expecting groups of people (tech companies or government) to purposefully do something that is disadvantageous to their bottom line, for nothing more than ideals, is a pipe dream.

The real question is personal and harder to answer: "Do I really need this information technology? Is it making my quality of life any better than my ancestors'?"

The more I ponder this question, the more I'm convinced I could live an equally happy and fulfilling life outside of the majority of consumer technology that has been introduced in the last 20 years. I benefit from a career creating that technology, but I'm partaking in it less and less once I clock out.

If you haven’t sacrificed anything for your principles, then they are just words.

I think the deeper problem is that we no longer even pretend to value people of character.

I'm prepared to sacrifice many things for various principles. But I won't sacrifice access to information, which is what I find most valuable about the internet.

Information is power, and if principled people sacrifice their access to information, then unprincipled people will hold more of the world's power.

So the danger of encouraging principled people to opt out of parts of the internet is that they will become less powerful, and unprincipled people more powerful.

> Information is power, and if principled people sacrifice their access to information, then unprincipled people will hold more of the world's power.

What if access to information necessitates handing over information about oneself, which gives other entities power to influence one? Eg: One can browse for information, but companies will track browsing activity, and use that towards targeted persuasion.

I'm not very vulnerable to targeted persuasion (partly because I consume so much high-quality information). There are enough vulnerable people to swing close elections, but it's a small minority. I don't think the answer is for those vulnerable people to cut themselves off, though.

How do you know this? Would you consider a system of targeted persuasion where the subject is aware they're being manipulated a successful one?

I'm being subtly guided towards products and services all the time through myriad sneaky marketing techniques I'm completely unaware of. I know enough of them to appreciate how much I don't know.

Well, the example given here is Transport for London, on which it is almost impossible to travel without using an Oyster smart card. So yes, pushing for legislation along the lines proposed in the article would make sense.

Would a government that has a CCTV on every street enact legislation to limit its surveillance abilities?

I agree that it's a good ideal, it's just incredibly naive to think that it will be achieved. Even if legislation were passed, it would very likely make things worse, in that the government wouldn't stop surveilling - they would just do it even more covertly.

There’s already quite a bit of legislation about how CCTV can be used in the UK.

Our democracies have lots of problems, but they basically work, and on issues like this, the main issue is apathy.

I’m a firm believer that if people are informed and they care legislation like the above could be enacted.

> they basically work, and on issues like this, the main issue is apathy

People are apathetic because their democracies don't basically work. Elections and occasional referendums are very crude kinds of representation and are easily manipulated by those who have a lot of cash. Legislation is great, but the reality is that you're on camera almost all the time in a large city like London, and no politician is going to lay themselves open to easy attacks by proposing to dismantle that infrastructure.

Bentham's panopticon was a design for a prison, a mill "for grinding rogues honest." Now the whole society is constructed around that idea; you have legal rights but the implicit premise is that people are rogues. You're not free.


> on issues like this, the main issue is apathy

Is it? Or is it the notion that if you scream about something you don't like enough, someone else will fix it, through legislation, so you can stop worrying about it? And, of course, that legislation will work perfectly and never be abused or radically reinterpreted a generation down the line.

I'm for the belief that if you don't like something then don't partake in it - short of it being something being physically forced on you.

> is it the notion that if you scream about something you don't like enough, someone else will fix it, through legislation, so you can stop worrying about it?

This notion is often called collective action, and it’s how most of the social change of the last couple hundred years was achieved in the US (and elsewhere), from black people and women attaining the right to vote, to the end of child labor, to gay marriage, to the existence of weekends.

This 'screaming' is the basis of representative democracies; the system of government that we live under.

> if you don't like something then don't partake in it - short of it being something being physically forced on you.

In every country I'm aware of, surveillance of at least some physical space is non-optional, and could be called 'forced'. The dichotomy of "physically forced" and (by implication) 'voluntary' doesn't seem useful. Very few of the current and hypothetical problems with surveillance are because it's physically-forced on people; it's that a cost-benefit equation is engineered to make it practically irresistible.

Let’s say you’re a poor high-school student. To participate in classes you need to use the school issued Chromebook and an associated Google account. You could resist and not participate in class, but even given a full understanding of the privacy cost, this would be irrational and self-destructive. If you'd like to join your residents' association that may require a Facebook account, if you'd like to travel on an airplane that likely entails video-surveillance which will be used for facial recognition, if you take a job at a company with significant-IP (or not much cash), you're very likely to be working on a machine containing a surveillance-rootkit etc. etc. ad nauseam.

This article is about solutions to the societal problem of mass-surveillance (both from the state and industry). Retreating from society for your own protection (while things get worse for everyone else) is never going to be helpful for addressing the societal problem.

Perhaps you think the status quo is fantastic and shouldn't be changed. Perhaps you have the resources to live a puritanical life avoiding surveillance; I'm impressed! But the fact that you have the freedom to do so is a direct result of the collective action of others.

The Washington DC Metro switched over to their equivalent a few years back, the "Smartrip Card", retiring their previous paper mag-stripe cards in the process.

My 'solution' to not being tracked is that the Smartrip Card I was forced to then buy was bought with cash, and has only ever been refilled with cash.

So their systems know that card # 828272823 (made up number) has traveled here or there (I don't use it much, so there's not many trips on it anyway) but they don't know 'who' is using that card.

If they ever drop the ability to use cash to top up the card, then the card will go in the garbage bin and I'll not ride the system again.

Your (attempted) solution illustrates the potential dangers of data collection.

There are many companies with some timestamped location data about you: cell phone company, email or social media, credit card company, etc.

Just a bit of that data combined with Smartrip data would identify you as the owner of that card, and all of your anonymity about your metro trips is lost. For example: three or four trips where you used a cell phone at either end of the trip; credit card purchases in different locations; maps directions on your phone; or so on.

This is not far-fetched. I could see the DC Metro naively choosing to sell "de-identified" data to third parties, who can also purchase e.g. credit card data.
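The linkage attack is simple enough to sketch in a few lines. All the data and identifiers below are invented for illustration; the idea is just intersecting the sets of subscribers whose pings coincide in time and place with each tap of an "anonymous" card:

```python
from datetime import datetime, timedelta

# "Anonymous" transit taps: (card_id, station, time)
transit_taps = [
    ("828272823", "Metro Center", datetime(2019, 3, 1, 8, 3)),
    ("828272823", "Foggy Bottom", datetime(2019, 3, 2, 8, 7)),
]

# Named, timestamped location data bought from a broker: (subscriber, location, time)
phone_pings = [
    ("alice", "Metro Center", datetime(2019, 3, 1, 8, 1)),
    ("alice", "Foggy Bottom", datetime(2019, 3, 2, 8, 6)),
    ("bob",   "Union Station", datetime(2019, 3, 1, 8, 2)),
]

def candidates(card_id: str, window: timedelta = timedelta(minutes=10)) -> set:
    """Subscribers whose pings coincide with EVERY tap of this card."""
    matched = None
    for _, station, t in (tap for tap in transit_taps if tap[0] == card_id):
        near = {who for who, loc, pt in phone_pings
                if loc == station and abs(pt - t) <= window}
        # Each additional tap shrinks the candidate set by intersection.
        matched = near if matched is None else matched & near
    return matched or set()

print(candidates("828272823"))  # {'alice'} — the cash-only card is re-identified
```

With real-world data volumes, a handful of observations is typically enough to make the candidate set a singleton, which is why paying cash for the card alone doesn't buy much anonymity.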

You could just charge it up with a Visa gift card or something, but (as others have pointed out) I'm betting that it's not nearly as anonymous as you think. If I'm a prosecutor trying to hang a charge on you and your Smartrip card is a key piece of evidence, all I need to do is correlate it to the point where there's a sufficiently tiny probability of it being anyone but you, and the jury will accept that.

They've now introduced the ability to pay by any contactless card, not just Oyster. I think long term they will phase out Oyster. One of the benefits to them of doing so would be the fact that it's not as easy to have multiple contactless credit/debit cards as it is to have multiple Oyster cards.

To be frank, though, I'm struggling to see what the fuss is about when it comes to TfL. It's their system and they can do what the heck they want with it. As their customer I am OK with them tracking me around their property.

I love RMS but I'm not holding my breath waiting for any government laws to protect me.

For what it's worth, neither is RMS, if you've read about how he uses the Internet.

Let's keep building the free software tools so anyone can easily get the level of privacy they want for whatever they're trying to do. We've made so much progress but there's still so far to go.

Do not to look UP to authority for hope - look AROUND at your fellow citizens working hard to bring free privacy software to everyone - that's who's going to fix this, because no one else will.

One of the problems is that the software you run on your own computer hasn't been the major source of privacy concerns in a long, long time. That isn't to say that software on your computer doesn't track you, but it's dwarfed by the amount of tracking and privacy issues that you need to compromise on to use the vast majority of the internet.

It's noble and laudable to take the same stance towards freedom and privacy that RMS does (essentially checking out of the modern internet and reducing your web usage to calling wget in a shell), but that's not a workable solution for the vast majority of people.

Give me a break, is this not the classic fallacy of making "perfect" the enemy of "better"? With an attitude like that, things never improve.

Privacy protection exists along a gradient. Just because one does not want to sit at the far end with RMS does not mean we just throw up our hands and say "well, I guess I have to give up all my privacy, because I can't do what RMS does."

And as I said, there's still a lot of work to do. So let's get to it and stop the nay-saying!

I'm not arguing that at all - precisely the opposite, in fact. I'm saying that so long as we put privacy in the hands of end users and don't hold service providers responsible (the point I was replying to), we more-or-less cripple our ability to utilize the internet that exists today.

We can use strong end-to-end encryption to evade snooping middlemen. Beyond that, we hold service providers accountable by not using their services if they are not willing or able to provide the security and privacy we require.

If you want an authority to hold them accountable, you are just shifting your trust from one entity you have little/no control over, to another entity that you have little/no control over.

Look around, not up. An authority will not solve this. We must solve it, together.

> essentially check out of the modern internet, and reduce your web usage to calling wget on a shell

Minor nitpick, but he also uses IceCat + Tor when needed.

https://stallman.org/stallman-computing.html (grep for icecat)

wget leaks via DNS.

The article raises good points. One important aspect is missing, though: the reliance on web services, SaaS, and the like. Free software is not enough. We need free software that runs locally and offline.

RMS is also against SaaS (or rather "Service as a Software Substitute", SaaSS), but AGPLv3 SaaS would be a step in the right direction. At least then you could examine the source code and see what data is collected.

Edit: s/GPL/AGPL/

The source you directly interact with perhaps. Can you say for sure that the code running at the server end is the one in the repo or tarball?

Of course not, but the same can be said for local software. You usually don't build from source. But at least you can inspect your own network traffic more or less reliably.

Having everything be AGPLv3 is not perfect, but I think it would be an improvement over the current state of affairs.

That sounds like exactly the Affero GPL.

Whew, I confused those. Thanks for pointing that out

No I mean, the license you’re describing sounds like the AGPL. If you use AGPL licensed code on a server which a user interacts with, you must provide them with the source code. That’s perhaps the biggest difference between the GPL and the AGPL.

AGPL is “GPL for SaaS”.

When it's RMS, free means Free (under Libre terms) & Open Source. And yes, I agree; back to local and offline first applications is a must.

Best I understand, open source is charitably seen as a subset of the freedoms fronted by the FSF.

If you claim that RMS does open source in his presence, he will sternly correct you.

Alan Kay has made this point rather eloquently in the past. Like Stallman, his good points were not enough to stop the free market from perverting the technology, with disastrous consequences.

Apple's privacy policy actually says they will always do "on-device processing whenever possible". Where the line between possible and feasible is drawn isn't really specified, but if they really live up to this, it is reason enough for me to only buy Apple products, as the other two options seem to have a completely different view of this.

I'd love it if this were true, but we don't really know if it is true or not, and Apple refuses to allow us to see for ourselves. I really want to like Apple, but they won't put their money where their mouth is.

Maybe the way to go with this is to have an auditing system where third-party auditors go through the source code (with NDAs) and certify it meets $PrivacyStandards. This might be a way to appease the folks who want to keep their software proprietary and yet make sure they don't just get to do what they want. Kind of like financial audits.

Sorry, but no. For this to work you would need a select group of people with too much power. Why would I trust those people instead of the original creators? This would just create a parasitic industry with not much value.

Man I'm so sorry I let this slip and didn't get back to you. It's not perfect for sure, but better than nothing and possibly something people will buy into.

The big problem is there is no licensing yet that could be revoked if they were caught lying. So you're right: we would have to build a trust model somehow.

Obviously this proposal will be rejected because both companies and governments are addicted to the power to control and manipulate that data gives them. We need a strong leader who will go against the grain of these massive entrenched interests and be willing to stand up to the media onslaught such a stance will bring. The mainstream media will attack anyone who threatens their advertising based business model. When will there ever be a leader that doesn’t bend to the will of the media and deep state surveillance machine.

1. We don't need a strong leader. Fascism and authoritarianism have zero interest in the privacy of citizens. Never had, never will.

2. Power corrupts, so the overall answer is: never.

Leadership != fascism.

Change from below requires organization and coordination, and those things require leadership.

Were MLK or Gandhi fascists, because they were effective leaders?

Frankly, I find more of a fascistic tinge in the stock HN response to surveillance:

- Governments will never do anything; democracy is not fit for purpose

- A ‘natural elite’ of hacker Übermensch can avoid dystopia for themselves with hacking skills and expensive crypto currencies. The masses cannot be saved as they are inherently stupid and submissive. Things will always get worse, especially for them; abandon them.

Why such fatalism? The GDPR hasn’t even hit the books yet, and it’s having a transformative effect on how large US tech companies process data for everyone. We’re just beginning to see the tide turn on public opinion on Facebook.

I’ve hope for popular interest in privacy and anti-authoritarianism yet.

> Leadership != fascism

Leadership + time = fascism

Are you really implying that all groups of organized people are fascist sleeper cells?

That MLK and Gandhi were saved from their Hitlerism by assassination by heroic ethnic-separatists?

I'm repeating the well-known saying that power corrupts. A leader is a baby fascist.

> We need a strong leader

That's how you ended up with a surveillance society in the first place, silly.

And if you have a weak leader, you have companies running around doing whatever they wish (e.g. Amazon and Google now) because there is no oversight or regulations, and pitiful punishments for betraying their customers.

I think institutions are more important than leaders for sustainable social goods.

Simply put, never.

Future strong leaders will employ all of those means of deep state surveillance and invest in new ones in order to remain strong.

That cat's out of the bag, and it ain't going back.

There are LED street lights that double as security cameras, and the cameras are nearly impossible to spot. They are being tested in a few cities, and I believe there is still an ongoing legal battle over them in San Jose.

The cameras are placed in public spaces, but they can see into houses on every street. I believe Google had some issues with peering into homes and storing that data; will cities face the same issues? The Google car only gathers data when it drives down the street, while these cameras will gather data 24/7 using low-lux (low-light) sensors.

We need new business models which depend on user-owned data, rather than treating privacy like a charitable contribution for saving the trees.

Instead of citizens sending data to the cloud, corporations can send code to citizen-owned devices where local computation can take place, with data only leaving via an open-source content inspection engine that enforces citizen-consent policy.
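As a minimal sketch of what that consent-enforcing inspection step could look like (the field names and policy structure here are invented for illustration, not any real product):

```python
# Citizen-edited consent policy: the only fields allowed to leave the device.
CONSENT_POLICY = {"zipcode", "age_bracket"}

def egress_filter(payload: dict) -> dict:
    """Strip every field the consent policy does not explicitly allow out."""
    return {k: v for k, v in payload.items() if k in CONSENT_POLICY}

# The corporation's code computes locally on the full data, but only the
# filtered result ever crosses the network boundary.
result = egress_filter({
    "zipcode": "20001",
    "age_bracket": "30-39",
    "gps_trace": [(38.9, -77.0)],  # never leaves the device
})
# result == {"zipcode": "20001", "age_bracket": "30-39"}
```

The hard parts, of course, are sandboxing the corporation's code so it has no other egress path, and keeping the policy comprehensible to non-technical users.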

It's interesting and something we're investigating at my company. Merely building out the marketplace/exchange for this stuff raises a ton of questions (which is good).

I think creating a legal regime where corporations are financially responsible for damages created by misuse or leaking of personal data would discourage companies from collecting the data in the first place.

By all rights, the Equifax data breach should have ended the company. Instead, they're likely to profit from it.

I don't know about this, but I'd love to see Apple make 3rd party data aggregators individually authorized or not via system dialogs. It would require some policing, but shifting these to opt-in instead of opt-out would be an amazing step in the right direction.

App authors would then be able to decide whether they wanted to let you use their app despite you declining to let them send/sell* your data to a third party.

* Most app creators would claim they don't sell user data. But they need to acknowledge that when they receive free telemetry services for their app from a 3rd party, they are doing exactly that: selling user data in exchange for a service that would otherwise have some expense (in dev time or otherwise).

> Video cameras should make a local recording that can be checked for the next few weeks if a crime occurs, but should not allow remote viewing without physical collection of the recording.

It would suck if the terrorist bomb also took out the cameras themselves.

I have cameras at my house wired to a local unit. It's mostly for convenience and looking for animals, not serious security; any self-respecting house prowler would take the recording unit from my server area so that I wouldn't have any video of who broke into my house.

The data could be stored encrypted on a VM and cycled off after {n} hours. With the right codecs and encryption, you could likely store a couple days of motion on a small amount of storage. If uploaded in chunks, you might miss a couple minutes prior to the unit being destroyed. I suppose protecting your internet uplink becomes important at this point.

Just store the encrypted files in AWS S3 (or equivalent)?

That works too. And maybe use ffmpeg to encode, 7-Zip (p7zip's 7za on Linux) to encrypt, and name the chunks based on date in a programmatic/deterministic manner.
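A minimal sketch of the deterministic naming idea (the camera name, chunk extension, and layout are arbitrary examples, not any particular tool's convention). Because the key is derived purely from the timestamp, you can reconstruct the names for any date range and fetch exactly those objects from S3, with no index to maintain:

```python
from datetime import datetime, timezone

def chunk_key(ts: datetime, camera: str = "cam1") -> str:
    """Deterministic, lexicographically sortable object key for a chunk."""
    return f"{camera}/{ts.strftime('%Y%m%d')}/{ts.strftime('%H%M%S')}.mkv.7z"

chunk_key(datetime(2018, 4, 3, 21, 18, tzinfo=timezone.utc))
# → "cam1/20180403/211800.mkv.7z"
```

Cycling old footage off then becomes a matter of deleting (or lifecycle-expiring) any prefix older than the retention window.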

Then I can view the recordings remotely without physical collection of the recordings, which is exactly what RMS proposes should be illegal.

That's awfully close to movie plot threat thinking. Put your NAS in a secure back room and you're going to be fine for practically every scenario short of a 9/11 scale attack. And if you insist, just save your recordings to an off-site location. I'm pretty sure RMS is cool with co-located servers, as long as you control them.

> I'm pretty sure RMS is cool with co-located servers, as long as you control them.

that wasn't my reading of what he wrote. He wrote, "should not allow remote viewing without physical collection of the recording.". If a town has security cameras all over the town, and they all transmit recordings to a co-located server facility, there's no reason to physically collect anything - anyone with access to the servers can easily collect surveillance of the whole town for no reason.

> Put your NAS in a secure back room and you're going to be fine for practically every scenario short of a 9/11 scale attack.

Security camera systems are meant to be useful precisely in the event of a 9/11-scale attack, which has actually occurred, or in any large-scale bombing that destroys an entire facility or buries the vault holding the servers under tons of rubble, when the footage is needed far sooner than excavation would allow.

Well written and argued, but I have two issues.

First, it's arguing against the natural evolution of technology. Centralization is a recurring feature of software systems due to its technical benefits. Data collection, organizational structure [0], and other considerations also affect centralization, but their existence doesn't negate the technical considerations. Whether an ATM or bus-card system should be (de)centralized is partially a technical issue: what networks are available and how reliable are they, what are the CPU/memory/storage of the endpoints, how often do we update, etc.

Second, restraining governments might be aided by making collection more difficult, but governments can deploy large groups of people for long periods, while individuals and smaller groups cannot. YouTube and cellphone cameras empower the weak, while the old technology of cameras belonging to local businesses [1] benefits only the government (or very large groups).

[0] "organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations." https://en.wikipedia.org/wiki/Conway%27s_law

[1] "There is a gap in the footage from about 9:18 p.m. to 10:39 p.m., which covers the time when McDonald was shot by Officer Jason Van Dyke on a nearby street." http://www.chicagotribune.com/news/ct-shooting-laquan-mcdona...

The Netherlands used to have https://en.m.wikipedia.org/wiki/Chipknip but then it was retired. Weird, I thought it was pretty sweet, and anonymous. Maybe not, or perhaps it was too good and that’s why it’s gone

I want OS vendors to wake up to this and give me a way of storing all the social-network'y stuff on a USB stick.

When I'm using that social network's interface, I put the USB stick in my computer, and my computer makes that data available to the network. Then, when I'm done, I turn off the interface, take the USB stick out, and put it in storage somewhere. The only parts of the data that remain available are the bits I copied to the social network while I was using it, and it can have those bits while I'm away to do with as it pleases.

I truly believe this is a function and responsibility of the local operating system, not the network.

If only the OS vendors were paying attention to these issues instead of .. you know .. trying hard to be the next big social network.

For a privacy-centric approach that is also compatible with GDPR and business needs, there's the MyData conference in the end of August: https://mydata2018.org/

The larger issue is the natural tug of war between government and industry. Even pretending such a law could be passed, and globally, industry would ultimately find some way to effectively collect this data while remaining completely legal under the text of the law. In some cases (say, a public data company) there could even be a legal duty to do so, in order to comply with varying edicts already on the books.

The real issue is that people opt to ignore Stallman - he was right. By opting to use these services, whether you pay for them or not, you're acquiescing to being the product.

Even if such a law would become effective somewhere I could imagine ways around this:

What would happen if a transportation service provider also offered a loyalty scheme, awarding points dependent on the stations used, with the points redeemable for special offers?

This way they 'need' to collect movement data to check for abuse of the scheme in order to offer the 'basic' functionality.

I agree with RMS, but it might be difficult to express such law without loopholes for companies.

So I can pay Amazon anonymously. That's nice. Now what about anonymous delivery? It could be done, with the address sent only to the Post Office as needed, for example, and with Amazon not delivering the package itself.

RMS might want a law saying that whoever sold goods, couldn't also deliver them?

Ironically, the article is published under a non-free license.

Look what I found at the bottom of the article: "Released under Creative Commons NoDerivatives License 4.0"

So you can distribute this freely, as long as you acknowledge the author. And since it's content, not software, it's not imperative to allow modification. (Fair use still applies to critical responses)

You're being way too picky.

I don't know if it's irony, but RMS has this (I assert WRONG) view that opinion statements like this article shouldn't be free culture. He wants to use the legal structures to make any translation or derivative work something he can censor unless he approves of it. He's worried about being misrepresented (even though the actual free CC licenses don't allow altered derivatives to be presented as though they were unaltered etc).

I think this article could very well be a valuable free-culture contribution. The chain of derivatives could go deep, and having to get permission at every step cuts it off.

This is a long-standing disagreement between RMS and those of us who support cultural freedom.

Hoping that governments are going to outlaw surveillance, when they are the biggest practitioners and customers of surveillance, seems... optimistic.

GNU Taler seems to be such a betrayal of the GNU project's principles. Baking state surveillance into software isn't something GNU should be doing.

It's as if GPG was an implementation of the Clipper Chip algorithm because, you know, otherwise it might be used for ~~tax fraud~~ terrorism.

Personally, of all the cryptocurrencies it's the one I'd most like to pay with. I want the companies I transact with to pay taxes, while preserving my pseudo-anonymity. I have no interest in using a volatile asset for a single transaction.

Happily there are plenty of others to choose from, too, but I'm glad that one with these overall design principles exists.

> There are so many ways to use data to hurt people that the only safe database is the one that was never collected.
>
> The basic principle is that a system must be designed not to collect certain data, if its basic function can be carried out without that data. Frills on the system, such as [some feature conceived by PM or developer], are not part of the basic function, so they can't justify incorporating any additional surveillance. These additional services could be offered separately to users who request them.

The title is "A radical proposal to keep your data safe."

But this doesn't sound "radical" at all.

To recap:

1. Don't collect data when it's unnecessary for the user.

2. Applications that perform a single function.

3. Additional "features" (i.e. more code) that add functionality and make applications more complex are fine, but they should be optional (e.g. not pre-installed or automatically added via "updates").

> But this doesnt sound "radical" at all.

It really is radical in the current culture of Silicon Valley, where every product must be "smart" and collect user data to provide recommendations and "analytics."

There is little restraint by companies on what data they collect, only on access to that data once collected. It really is a radical culture shift for these companies, and all the engineers who work for them, to switch to an attitude of collecting only what is truly necessary.

Tell me how someone anonymous travels from day to day and I will tell you who he or she is.

Decoupling identity data from other data does not work even in theory. Moreover, it conflicts with the ability to retract personal data: once data is anonymized, it is no longer possible to get rid of your personal records.

Homomorphic encryption is the only method that might make a dent here. Laws will be broken.
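For readers unfamiliar with the term: additively homomorphic schemes let an aggregator sum encrypted values without ever decrypting them. A toy, completely insecure additive-masking scheme is enough to illustrate the property (real systems use e.g. Paillier; the modulus and keys here are arbitrary placeholders):

```python
N = 2**61 - 1  # toy modulus; real schemes use proper key generation

def enc(m, k):
    # "Encrypt" by additive masking: insecure, but additively homomorphic.
    return (m + k) % N

def dec(c, k):
    return (c - k) % N

# The aggregator sums ciphertexts without seeing any individual trip count;
# only the sum of the keys (held by the riders collectively) reveals the total.
trips = [3, 5, 2]
keys = [123456, 987654, 555555]
cipher_sum = sum(enc(m, k) for m, k in zip(trips, keys)) % N
dec(cipher_sum, sum(keys) % N)  # → 10, the total, with no single trip exposed
```

This is the shape of the property that might let, say, a transit agency compute ridership statistics without ever holding individual movement records.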
