
When a human looks at another person, are they "collecting data without consent"? Most would say no.

But if that human follows and looks at that other person 24 hours a day, would you still say that no consent is required?

So then CCTV should be illegal too?

It is, in many cases.



>But if that human follows and looks at that other person 24 hours a day, would you still say that no consent is required?

This. Scale matters. Scale matters. Scale matters.

It's such an infuriating and intellectually bankrupt sleight-of-hand when someone implies that because some small thing X is fine, then 10000X is fine all the same.


> 24 hours a day [...] 10000X [...]

I find it infuriating that the framing is either something small or 24 hours a day plus many orders of magnitude more, with nothing in between. It's unfair to say "any look is invalid" because of perceived or potential scale. Scale only matters when it's actually defined.


Well, then quantify the scale at which data collection should become illegal with unambiguous terms.

I don't see anything wrong with making observations about people, especially if it helps my business. I'm allowed to do so without computers: "Ah, sir, I see you are sunburned, did you know we have ointment for that on aisle 4?" (no one would say: "How dare you observe that I am sunburned! That's a privacy violation! You didn't have my consent to observe that!")

So tell me how far I'm allowed to go with computers.


So, you would walk up to someone and say "Hello, I see you are clearly having mental problems! I have some products you are going to LOVE!" and then expect nothing but a positive response?


Observations need not always end in direct interaction with the potential customer. If I notice a lot of people in my store are depressed, for example, then why wouldn't I stock more products that depressed people buy? If I see that my clientele is mainly women, I might tweak my inventory in other ways.


Well that’s easy — with computers, you need explicit consent to even “observe that the user is sunburned”, because there are no inherent scaling limitations with computers.


So you are saying if we remove some of the scaling limitations of the human brain with biotech (i.e. by enhancing memory detail retention, figuring out a way to serialize memories to computer-compatible storage, etc.) it could become illegal to look at a person without consent (since you would effectively become a walking, breathing CCTV)?


Yes, I think that’s the logical conclusion of GDPR-style thinking about privacy. I would certainly protest against being able to index that data any which way.

I understand the flipside that you’re implying and the argument that you’re making, I just don’t agree.

When an individual has a computer-indexed memory that is admissible as evidence in court, I think it’s pretty okay to use it for all of the things that we use memories for today. But what about reselling that data? What about data sharing agreements that subsidize your implants? What about hackers?

I really hope we don’t get truly infallible, computer-backed memory.


Well, at least you are consistent with your position.

I just don't like that we rely on what amounts to DRM in order to give the human brain exemptions to privacy and copyright laws.

I feel that I "own" my memories and nobody should be allowed to tell me what I can do with them. If there's a device that lets me dump them to computers and sell them, first of all, it should be legal, and second of all, I should have that right.

I feel that you do not "own" what I observe about you with my own senses and that I do not need your permission to look at you, listen to you, or generally infer things about you. I don't see how a memory dumping device is different from computer sensors, and thus I don't have a problem with computers collecting information about me in public.


I feel you. Quite literally, I am emotionally attached to the idea of ownership and my memories seem like something I should own.

I just think the entire concept of “ownership” breaks down when you have zero- (or epsilon-)cost copies.


This is only a problem in a world with corporate personhood. Presumably, with individual people, the issue would be what they do with those observations.


> But if that human follows and looks at that other person 24 hours a day, would you still say that no consent is required?

What if I'm not even following anyone; I'm just sitting on my front porch all day and I notice the same person keeps walking in front of my house? Is consent required even though I haven't moved? I've made 10 observations in 10 hours, yes, but only because the person has walked in front of my house 10 times.



