The surveilled society: Who is watching you and how (rnz.co.nz)



> Billboards - The latest can watch you as you go by and tailor their ads to suit. The country had at least 1400 high-tech billboards last year; it had at least 49 that also have number-plate recognition cameras to count cars (not identify their owners, the operators insist). Some malls have smartscreens that can gauge your mood.

Hah, Minority Report is here. Digital signage around me tailoring ads to my preferences is a freaky idea though, will they start showing me "Hot singles want to date you"? I suppose they'll be able to scan the crowd and see who's the richest^W who's the easiest to pry money from and then show an ad targeted to them. And a network of these cameras can conceivably even identify where I live and work ("We see this face shopping at this supermarket most weeks", "most mornings we spot him at this coffee place"), and by doing so guess my income and spending levels.

And scanning cars is an interesting idea. "It's a Cybertruck, show the ad for the penis-enlarging pill!".


I worked with a vendor who had installed cameras at the doors of hundreds of bars, using facial recognition to count the people entering and exiting and estimate their approximate age and gender.

It was all sent to a real-time dashboard, and it was pretty interesting.
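(If you're curious what actually flows to a dashboard like that, it's roughly this little - a sketch in Python; the field names are my guess, not the vendor's real schema:)

    # Hypothetical per-door event pushed to the dashboard: no identity,
    # just a direction plus the model's coarse demographic estimates.
    event = {
        "venue_id": "bar-042",                  # made-up identifier
        "timestamp": "2024-08-11T23:41:07Z",
        "direction": "in",                      # "in" or "out"
        "estimated_age_band": "25-34",
        "estimated_gender": "male",
        "confidence": 0.81,
    }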


Does it also pull up their social credit scores?


At sports venues and in Vegas it does. They use facial rec to find people who have previously been banned. Not to mention the airport.


EU: Yep, straight to jail.


One alarming method of surveillance I learned about this year was cars. Many cars come with built-in surveillance and privacy violations. Several manufacturers allegedly sold the location data of car owners, tracked by their GPS systems (https://fortune.com/2024/07/26/general-motors-honda-and-hyun...). Some manufacturers revised their policies after the outrage, like GM (https://www.edmunds.com/car-news/gm-killed-program-that-sold...), but my recollection is that others, like Subaru, did not. Many cars also have options for opting out of having your personal data flow to data brokers and insurers, but they opt you in by default.


> One alarming method of surveillance I learned about this year was cars.

And robot vacuum cleaners :) Although there's some hostility to cybershow here, I'll reference a couple of episodes because surveillance is a frequent talking point there. In this one [0] the discussion is exactly what you're talking about: cars, and the incident of Tesla employees sharing 'funny' road-traffic accident footage.

In this other discussion about CCTV in general [1] two notable things came up:

1) From a security POV, digital surveillance is largely impotent. It works well for post-facto analysis (such as arresting a bunch of rioters the next day), but it has less value in real time unless you also have human resources - and in most cases (except in dangerous war zones), if you already have those people, they're superior to any electronics.

2) Communities are sold surveillance by an "insecurity industry". It's a big business, but they rarely reflect on its value or other effects. Surveillance is a sign of poverty if you take all quality-of-life factors into account. Overt security signals inner insecurity and social decay. The more cameras you can see in a place, the 'poorer' that neighbourhood is, since a truly wealthy society is one with high trust.

[0] https://cybershow.uk/episodes.php?id=12

[1] https://cybershow.uk/episodes.php?id=26


One step back: it is greatly imprudent to tie a tracking device (such as a GPS receiver) to one's name.

This would have worked in trustworthy societies - not these ones. (If a trustworthy society is still known in these times, please inform.)


Unless a person has given up carrying a phone or keeps it always powered off, they are already doing so. By definition, a cell phone constantly registers itself with the nearest cell tower, giving away its approximate location and movement history 24x7.


Yes, with the expectation of privacy growing the more personal and sensitive the data is.

You expect confidentiality from your lawyer, your medic, your wife, your telephony provider.


A society cannot be "trustworthy", but it can embody "high trust".

(Not wishing to be pedantic for its own sake, but there's an important distinction.)

High-trust is the average trust metric between any two randomly chosen individuals from the set at some time.

Trustworthiness (in one regard) is the historically accumulated record of positive performance against promises made. But a society is not a single entity. Maybe society's representatives might be trustworthy.


What I meant in the post was "in using some service, one trusts it will not be abused"; consistent abuse of services is societal - hence, said societies "are not trustworthy".


I didn't mean to undermine your comment, sorry.

So there's a push for "zero trust" society. What do you make of that?


> there's a push for "zero trust" society. What do you make of that

Given that what happens is what is allowed by the society in which it happens,

the priority is to flee, as all the bad indicators are there. The problem remains "to where" - identifying a society that still recognizes and defends Dignity.


You raise a keyword that is paramount in our time. Dignity. Privacy is a tangible and common talking point. Anyone who wants to flee to a forest hut can obtain privacy, but not dignity, because escape is undignified.

While the concept of privacy makes it into laws, it's still just a minor component of a broader "dignity" that so many technologies seem set on destroying. A dozen or more major thinkers since the 1900s have noticed that human dignity is very fragile in the face of technology (Weber, Marx, Fromm, Freud, Jung, Nietzsche...). Surveillance is just one modern facet of undignified life with technology.

Question: can technology ever enhance dignity?

We've seemingly built a system (society) in which material and social success hinge on a willingness to forgo dignity. People who have a strong sense of dignity are disadvantaged and marginalised. So to answer your "to where?" To the margins. Unless one is prepared to embrace indignity in visible opposition. Struggle may be the last refuge of dignity.


> escape is undignified

According to what would-be objective judgement? Under a threshold you fight; over a threshold you flee. It is just sensible. Dignity may or may not be impacted, heightened or lowered - it depends on details.

> can technology ever enhance dignity

Of course tools and devices are meant to enhance dignity; we build them for a purpose - to serve and assist us.

> We've seemingly built a system (society) in which material and social success hinge on a willingness to forgo dignity. People who have a strong sense of dignity are disadvantaged and marginalised

Very correct. (Careful with those «we».)

> To the margins

Of what is not your society? Look at reality: it is a Procrustean coexistence of squirrels and Men. There will probably be societies less vile.


Add the credit card readers/POS tablets at stores, Starbucks, etc. to that list; most of them have tiny cell-phone-style cameras built into them now (whether you knew it or not).


I think people would be alarmed if they knew the amount of detail that credit card readers can collect (Level 3 data).


> the amount of detail that credit card readers can collect (Level 3 data)

Please expand.


Traditionally we think of the information collected as:

8/11/2024 | Amazon.com | $50

But Level 3 data includes each individual line item:

8/11/2024 | Amazon.com | $50 | 1 Very Embarrassing item | some additional fields

This appears in all sorts of interesting ways, and is not restricted to B2B/B2G transactions as they state so prominently. Anyone can sign up if they have a certain number of transactions per year and save quite a bit on credit card processing fees for providing the data.
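To make the difference concrete, here's a minimal sketch in Python of the two shapes; the field names are illustrative, not the actual card-scheme specifications:

    # Level 1: what most people picture a card transaction looking like.
    level_1 = {
        "date": "2024-08-11",
        "merchant": "Amazon.com",
        "total": 50.00,
    }

    # Level 3: the same transaction, but carrying every line item.
    level_3 = {
        **level_1,
        "line_items": [
            {
                "description": "Very Embarrassing Item",
                "quantity": 1,
                "unit_price": 45.00,
                "tax": 5.00,
                "commodity_code": "0000",   # hypothetical classification code
            },
        ],
        "purchase_order": None,             # typically only filled for B2B/B2G
    }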

I can't find the article but there was a tire company that provided a branded credit card, and they had risk profiles for their customers. The riskiest went to some specific bar, and the least risky were buying snow removal tools. (Please forgive my memory if I have the details incorrect).

edit: Found it https://archive.md/gyde0

"Martin’s measurements were so precise that he could tell you the “riskiest” drinking establishment in Canada — Sharx Pool Bar in Montreal, where 47 percent of the patrons who used their Canadian Tire card missed four payments over 12 months. He could also tell you the “safest” products — premium birdseed and a device called a “snow roof rake” that homeowners use to remove high-up snowdrifts so they don’t fall on pedestrians."

Additionally, if you try to buy large amounts of Visa gift cards it can be problematic. This is one way they catch manufactured spend.

At the end of the day, some merchants are providing every single detail of your transactions down to the line item and all that information is being tagged to you.


Thank you. One note about the «Very Embarrassing item»: all purchases (in context) are private.

But if the "purchased item" column is filled in the credit card company's database of expenses, it means that the shop receiving the payment has transmitted the information. This is a deliberate action that is not required... The credit card company could just receive "Card ...1234 paid 20u to Acme Inc. shop". That the shop transmits further information to the credit card company is an additional action that should be made transparent to the card owner.


AFAIK level 3 data is essentially receipt line item level data.

I'd actually find it pretty cool to get access to my own level 3 data for smarter budgeting/analysis (e.g. automatic tracking of food stocks, separating spend on luxury foods from basics, etc.), but I've not found a way to get access as an individual yet.
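(If that access ever materialised, the analysis side would be trivial - a rough sketch in Python, with made-up line items and a made-up keyword classifier:)

    from collections import defaultdict

    # Hypothetical line items in the shape a Level 3 export might take.
    line_items = [
        {"description": "premium birdseed", "quantity": 1, "unit_price": 12.99},
        {"description": "truffle oil",      "quantity": 1, "unit_price": 18.50},
        {"description": "rolled oats",      "quantity": 2, "unit_price": 3.20},
    ]

    LUXURY_KEYWORDS = {"truffle", "champagne", "wagyu"}  # made-up classifier

    def bucket(description):
        # Crude keyword match: anything containing a luxury keyword is "luxury".
        words = set(description.lower().split())
        return "luxury" if words & LUXURY_KEYWORDS else "basics"

    totals = defaultdict(float)
    for item in line_items:
        totals[bucket(item["description"])] += item["quantity"] * item["unit_price"]

    print(dict(totals))  # roughly {'basics': 19.39, 'luxury': 18.5}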


Merchants seldom submit L3 data with transactions, for stupid legacy-tech reasons. The card schemes encourage them to do so by offering a few basis points off scheme fees, but I think it's a minority of transactions that carry even L2 data.


The merchants usually don’t (to the data brokers at least), that is correct. But backdoored firmware on the POS could send it anywhere, no?


Yes, it was learning about this level of data collection that made me stop using my credit card for routine purchases and go back to using cash instead.


As far as I know, these are used for scanning various types of coupon codes and vouchers.


Add a physical shutter to cover the camera when it is not in use. (In addition to avoiding spying, such a cover can also keep the camera from getting so dirty that it doesn't work when you actually try to scan something.)


That's an extra moving part that will break, get jammed, or will trap dirt/particles between the cover and lens and effectively sand off the lens over time.

The solution is proper, enforced anti-spyware and anti-stalking legislation (so not the GDPR), not hardware band-aids that are trivially bypassed.


The real solution is a better software culture that looks like GNU/FOSS. Such a culture would generate laws like that if a problem persisted but likely wouldn't need them.


The solution is cash.


Mere software difference


How else should it be implemented?


NFC. We have NFC tags embedded in single-use tickets for travel and events; the cost is marginal, and most of the uses relevant to card readers could reuse the cards themselves.


Does that mean I can request all the pictures of myself checking out at Starbucks under the GDPR/CCPA? Has anyone done that yet? If not, any idea why not?


No, because ignoring the GDPR under all kinds of technicalities is standard practice: https://noyb.eu/en/microsofts-xandr-grants-gdpr-rights-rate-...


There are two types of ignoring that have been very common with the American and Swedish companies I've battled with.

1. Protection against lawsuits. "We reserve the right not to delete any information we have about you, since if there's a lawsuit we would need it as proof."

2. Freedom of speech. "We are a publisher, so removing your personal information threatens our right to free speech, and since this is a foundational legal principle, it overrides any GDPR laws."


Could you expound on the first point in greater detail?


I can give one perspective...

I worked in the communications part of a lender. We couldn't delete anyone's texts or other correspondence for a number of years due to compliance requirements.


Adding to this, I tried to do the same thing, and after providing my uuid2 they said "we don't know where it is, but if it exists we will delete it" or something like that, which of course is ridiculous because you can f-ing query the database for the unique identifier. I'm gonna do it again at some point and try to file another GDPR complaint as soon as they tell me "nooo we can't do that silly ahah".


No, because no one really 'knows' it's being collected/not yet deployed at scale.


Until some DB leak happens.


This is news to me, although now that you mention it I do recall seeing a lens-like thing on some of them. What are those for? I assumed it was for some payment method I'm not using and therefore wouldn't have to think about.


90% of the images would be up someone's nose, so why bother? It'd be far simpler to sync the terminal transaction data to the shop's CCTV data.
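That join is only a few lines of code - a sketch in Python, with hypothetical field names for both feeds:

    from datetime import datetime, timedelta

    WINDOW = timedelta(seconds=5)  # arbitrary pairing window

    # Hypothetical shapes: the POS log carries a masked card number, the
    # CCTV index carries frame IDs; both carry clock times.
    transactions = [{"time": datetime(2024, 8, 11, 14, 3, 2), "card": "...1234"}]
    frames       = [{"time": datetime(2024, 8, 11, 14, 3, 4), "frame_id": "cam2-000187"}]

    matches = [
        (tx["card"], frame["frame_id"])
        for tx in transactions
        for frame in frames
        if abs(frame["time"] - tx["time"]) <= WINDOW
    ]
    print(matches)  # [('...1234', 'cam2-000187')]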


Anybody know how unique the blood vessels in our noses are? I smell (pun intended) a lucrative tracking business model!


Nasalprinting?!?

And I thought burning my fingerprints off was going to be painful.


Thing is, everyone will have to do it or you will be known as this one guy who burned off their nose vessels.


You’re right. Are we doing this at the same time? Or who is going first?

The other more inclusive “we”, not just you and I.


We could add taxis that use the Curb ride-hailing system to that list, with their "passenger information monitors" located in the rear of the cab.

https://www.gocurb.com/terms


You should be aware that with "connected cars", some contracts have been extended with a duty of warning from the vehicle owner to any guest: "I need to inform you that if you enter my car, you accept being profiled by the manufacturer and its partners" (see https://foundation.mozilla.org/en/privacynotincluded/article... ).

So, which "taxis"? Taxis are going extinct - along with cars. (You did not expect, by entering a taxi, to be signing a sinister contract with unclear entities - and some will plainly refuse it. The "unlivable society" proceeds.)


This was in New York City.

There absolutely should be some kind of notice, or at least an opting-in (where the "opt-in" is not the act of simply getting into a cab).

It's irritating that a vital service like this becomes an "all or nothing" deal, where I can't selectively opt out of some shady practice and still use the fundamental service.


Is there really anything to do? Everyone is constantly uploading photos of my kids to Instagram and that's generating the same surveillance dragnet all these other things are building.

I don't see a way to opt-out without plastic surgery.


Individual action will never do anything on this. It has to come from privacy laws; the GDPR was a good step, but we need to go further.

Make storing personal data like storing hazardous material. Something you absolutely avoid if possible, and treat with extreme care when you absolutely must store it.

Unfortunately the users of this site would rather tell you to move to the woods, go off grid, and paint dazzle camouflage on your face before admitting that a solution has to come from society rather than the individual.


It's worth noting that this article is from New Zealand, which has a Privacy Act that offers a degree of protection.

https://www.privacy.org.nz/privacy-act-2020/privacy-principl...

The concern is (as always) when the law is not adhered to by those tasked with enforcing it.

I'm tired of hearing minimizing language like "Police now admit their actions were not consistent with the law" and that being the end of the matter.


Your comment makes it sound like everyone's PII just spontaneously generates itself on platforms. The problem is that that's not how it works; individual action is what puts your information on adversarial platforms. Personal privacy is the sum of your actions, and the steps you take individually to not share your details with hundreds of parties go a long way toward stopping that.

If you want to scare businesses, ban arbitration clauses and other self-absolving Terms of Service. It won't stop Pornhub from getting hacked but it will make their lawyers piss themselves imagining the consequences. Trying to enforce SOC2 on the entire internet is an exercise in futility that will end with Russian hackers selling your credit card to teenagers.


> Personal privacy is the sum of your actions, and the steps you take individually to not share your details with hundreds of people goes a long ways stopping that.

Of course it doesn't appear "spontaneously"; it's the result of your actions and others' actions, hence the "cabin in the woods" solution. The commenter is implying that expecting each individual to carefully act to preserve their online privacy clearly isn't producing good outcomes, and would like to see collective action through regulation to encourage better ones.

> If you want to scare businesses, ban arbitration clauses and other self-absolving Terms of Service.

That is one potential way to implement the suggested "hazardous material" policy. If storage of any data opens a business up to legal action with teeth then they'll stop risking the storage of such data outside of when the benefit to them outweighs the potential risk. Ideally the risk would be such that it becomes standard practice to process data on-device and design protocols and services such that only the absolute minimum required amount of information leaves end-user devices.
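To make the "on-device" idea concrete, here's a minimal sketch in Python of one possible minimisation step; the categories and threshold are made up:

    from collections import Counter

    K = 5  # made-up suppression threshold

    def minimise(purchases):
        # On-device step: upload only coarse category counts, dropping any
        # category seen fewer than K times so rare (identifying) behaviour
        # never leaves the device.
        counts = Counter(p["category"] for p in purchases)
        return {category: n for category, n in counts.items() if n >= K}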


My understanding of data fusion is that it wouldn't matter much what you do.

If you live in the modern world you are producing data about your actions and that data is going to be collected.


> ban arbitration clauses and other self-absolving Terms of Service

Society really needs to do this as soon as possible. These businesses give themselves the right to do anything they want by putting some clause in some document nobody reads.


> Individual action will never do anything on this.

Except supporting NGOs fighting against surveillance: https://eff.org, https://edri.org.


GDPR is not a good start. GDPR is a joke due to its lack of enforcement, and I'd argue it gives people a false sense of security.

Not only do we have (non-compliant) consent flows that have destroyed the user experience everywhere (without improving privacy in any way, since, again, they're not compliant and not actually designed to give you privacy), but the lack of enforcement means companies can now claim various things are GDPR compliant - knowing full well nobody is going to actually examine the claim (and if they do, the consequences will be negligible) - to give their users/customers a false sense of security.


> GDPR is not a good start. GDPR is a joke due to its lack of enforcement

These two statements are contradictory. You can't have good enforcement without first implementing a reasonable law, which the GDPR is.


The surveillance problem is a matter of balance: if we are all able to surveil everyone else, or no one is able to surveil essentially anyone else, forces are balanced and there are only marginal issues. If a few can surveil nearly everyone, but nearly everyone can't surveil the small cohort surveilling them, then forces are not balanced.

Surveillance per se might be useful - say you want to know how much live traffic there is on your planned trip, or get alerts for incidents, natural phenomena and so on. The issue is just the balance of forces, and what can be done, when forces are unbalanced, by those who hold the knife by the handle.


Balance is definitely not _the_ problem. I am not willing to exchange my information for access to someone else's information. Both should be private.


It is, because willingness is largely irrelevant; there are countless bits of information about you left here and there. Of course, ideally we would choose a trade-off between the need to share and the will to keep things private, but the point is what we actually get, or not.

I like having Google Street View; I do not like having my gate on it, but that's the trade-off: I like Street View, so I have to accept having my door on it as well. The point is who owns what. If Street View were like OpenStreetMap, that would be one thing; if it's a private service where the owner decides what to keep and what to publish, then there's a problem.

I like being able to see my car's cam remotely, but that means other car owners will see me walking around as well. That's an acceptable trade-off if it's balanced (anyone can see their own car's cams); it's far less acceptable if vendors can see and sell the camera streams while owners depend on them to see, or not see, their own footage.


What if you could always find out who accessed your information, and why? Also, all "private" information leaks from time to time...


Needed (partially) but far from being enough.

Let's say you are a big edutech player, with all the info collected by your platform sitting on your own infrastructure. Even if children and families know what you have and "why" [1], they can't know that you are feeding out small bits of information, one ad at a time, to steer the school path of talented children you plan to hire tomorrow, or to push aside students with political/philosophical ideas you dislike, to avoid having them as active adults against you.

Long story short:

- we, of course, need personal ownership - the opposite of modern IT, where most info sits in third-party "cloud" hands and users have just modern dumb terminals ("endpoints") to interact with the third-party services that own their digital lives;

- we, of course, need to know where our information goes;

but it's not enough: we need information fairness. OpenStreetMap might have someone using its data for some business purpose, but that's still fair, since anyone else can use and own the same data - it's a choice whether to do so or not. Google Maps is not like that: Google is the owner, everyone else is a customer.

If what we share - everything, nothing, or anything in between - is accessible to all or to no one, we are in a balanced situation. There will be some who take better advantage of it than others, because they understand how to and want to, but it's still a fair situation. Otherwise it's a recipe for a dictatorship, which we can more and more call "a corporatocracy".

--

[1] A small anecdote: years ago a leading Italian bank decided to ditch physical RSA OTP tokens for accessing their services, mandating a mobile crapplication instead. I filed a formal protest asking for GDPR information, and aside from noting that they allow operations from the same mobile device - de facto nullifying the third factor, which is against EU law (the largely ignored PSD2 norm mandating a separate device for auth and operation) - they answered me, politely and after a significant amount of time, that:

- they ask for camera permission because the app allows scanning QR codes from various payment systems and supports live chat (see below); for similar reasons they need gallery access;

- they want speaker access because in-app they offer live audio-video chat assistance, so their operators can talk with customers while being able to see and act on the phone screen;

- they need filesystem access because they allow customers to pay bills sent as PDFs by email or downloaded from various portals; the app needs to let the user select them for automatic processing;

- they need precise location and phone sensors to be sure it's me acting on my device and not a remote attacker;

- ....

Long story short, there are a gazillion plausible reasons for this and that, but I can't know whether those are the ONLY uses made of my information. I couldn't be sure even with mandatory AGPL on all their systems, because I might have the sources, but there would be no way to be sure they are the very same ones actually running on their servers.


> but I can't know if there are ONLY such legit use of my information or not

AFAIK even if the bank has "legitimate" use cases for your private info (and I'm not convinced that those you mentioned are), they aren't allowed to use it for something else without your consent, according to GDPR.

> I can't be sure even with mandatory AGPL on all systems, because I might have the sources, but no way to be sure their are the very same actually running on their servers.

With AGPL, they must share with you actual source code running on their servers.


> they aren't allowed to use it for something else without your consent

But I can't prove they respect the law. That's the point.

> With AGPL, they must share with you actual source code running on their servers.

Same as above: they can share a nearly identical system that I can see matches, but I can't verify it's the same one actually running.

To go further, take a look at the xz "backdooring".

That's still a balance problem: some took advantage of others by mimicking something legit. As long as an NK project gets equally backdoored, we would be in balance. You spy on me, I spy on you. You can act behind my lines, I can do the same.

As long as there is enough balance there will be peace and prosperity, because personal advantage becomes the common one and we all evolve.


I would just like to have access to (control over) my information, which I don't have (at all, or anymore) but others do (and they often get it wrong).


It's not enough: let's say you have signed XML transactions from your bank, so you own your accounts because you have a provable balance and transaction history in your own hands, but... you still know only about yourself. The bank knows the finances of everyone, so it can make informed decisions you can't.

Let's say you download all the news articles you read in an accessible, searchable format - say a feed reader storing posts, with full articles in the feed, plus downloadable historical archives like old Usenet exports. You still get only certain news; others have access to much more, so they might know things you do not, and decide not to publish them in order to take advantage over everyone else...

In finance, insider trading is defined and forbidden for a reason, but nothing equivalent exists for information. Alphabet and Apple have an immense knowledge of iOS and Android users that none of their users has. How easy would it be for them to find talent in schools, thanks to their penetration of education, and, one ad at a time, convince children to take a certain path the company likes (not necessarily the one the child likes), then hire them, while pushing others - say, humanities students with ideas they dislike - into bad roles? What if the owner of an insurance company is also the owner of an insurance comparison service?

We can't rule nature, and we can't design a forever-perfect society, but the more fairness there is, the more positive evolution for all we might elicit. The more we slide back toward a feudal-like society, the less positive evolution we can hope for.

Of course owning our information - like our home, car, .... - is MANDATORY, but far from sufficient. And actually, if you look at the trends... The 2030 Agenda's "you'll own nothing" is already here in the IT world, already here in modern connected cars and so on; it will soon be there across the whole of society, and that is the greatest asymmetry of information and ownership we can imagine.


At least. (That may be enough to keep my personality from being split into 'myself', which is not considered, and pieces of 'me' that are not mine (*), used outside my context and beyond my understanding: holding me responsible for whatever I can't control, at the mercy of 'random factors' - if I knew, I would sue.)


It's more complex than that. Let's say your car states it wasn't on autopilot when, for "unknown reasons", a pedestrian was hit: how can you prove you are right - that the car was indeed in autopilot mode, that you tried to avoid the impact, but the car did not react to your actions?

So far 99% of cars have purely mechanical brakes and steering, so even without assistance systems you can still act A BIT, but a few are starting to go "by wire" (like the Tesla Cybertruck with its famous "lagging steering")... Ownership is broader than "just my files on my desktop storage". Similarly, recall the recent "Google location scandal" that wrongly placed some people at a crime scene. Materially it's still some data in some file, but...


IMO the problem isn't so much cameras everywhere per se, but rather how the data may be shared or centralized or otherwise enable abuse.


And that it's profitable to collect and trade such data for whatever reason


It's really easy to point out that these systems exist; the cameras are often only slightly hidden. The problem is knowing what is being done with the data - where it goes and how it is used. You don't know who has pictures of your face, your current location, or a list of people who share your religious beliefs/habits/likes/dislikes/political beliefs.

The combination of those items gives people a motive and an opportunity, and the only thing lacking is a choice of weapon. People have been mass-murdering others over their political beliefs since forever. Poison is a covert weapon, and car accidents can be arranged.

It is conspiratorial to think this way, but it is naive to look at history, even in recent years, and dismiss these concerns.


In retrospect I'm glad I spent the last 20 years wearing a hat and sunglasses virtually everywhere.


I quite enjoyed being able to mask up with sunglasses and a ballcap during the pandemic. Felt nice to be a little less concerned with corporate surveillance for a while.


You can keep doing it if you want.


I do, but before the pandemic, many establishments in my area banned masks.


Do we need public "digital twins" of watcher supply chains, as they construct digital twins of watchees?


Excellent article. Reminds me of an essay I read recently about surveillance capitalism: https://medium.com/discourse/defending-freedom-in-the-era-of...


I've noticed that whenever the subject of cameras in public places comes up on traditional media platforms, there's always talk about the 'debate as to whether we want these things/should allow them'. I suspect this is a planted misdirection (†) - the government and the corporations that profit from this information would love you to waste your time talking about whether we should have this kind of surveillance, because it's already here. It would be far better if we could have a debate about who should be able to access this data. For example, there's a difference between a sworn police officer accessing public camera footage, and the same footage being sent to a 3rd party by the police and analysed in a foreign country by foreign workers.

† I'm not alleging a giant conspiracy theory about direct corporate control of the media, but it is well known that businesses 'seed' articles by sending unsolicited 'fact sheets' and talking points to reporters.




This is one of those situations where government-enforced limits would be the only viable long-term solution. Personal choices can only go so far.

The question is whether governments benefit from this surveillance as well; if so, they don't have much incentive to change unless pressured by the people.


In some countries this is already starting to happen. For example, in Spain, cameras and camcorders installed in private spaces cannot capture images of public spaces unless it is indispensable for the intended surveillance purpose or unavoidable due to their placement. Which means you can’t even have a doorbell cam if it accidentally captures public space.



