mmWave radar, you won't see it coming (joshhorne.com)
258 points by hornej on Feb 2, 2022 | 183 comments



> Because mmWave radar doesn’t process visual light, what it sees is not personally identifiable.

With machine learning improvements, identification and more are possible. From a 2021 paper on IEEE 802.11bf Wi-Fi Sensing, https://arxiv.org/pdf/2103.14918.pdf

> Indeed, it has been shown that SENS-based classifiers can infer privacy-critical information such as keyboard typing, gesture recognition and activity tracking ... since Wi-Fi signals can penetrate hard objects and can be used without the presence of light, end-users may not even realize they are being tracked ... individuals should be provided the opportunity to opt out of SENS services – to avoid being monitored and tracked by the Wi-Fi devices around them. This would require the widespread introduction of reliable SENS algorithm for human or animal identification.

Prior discussion of WiFi Sensing:

Jan 2022, https://news.ycombinator.com/item?id=29901587

May 2021, https://news.ycombinator.com/item?id=27123493

400+ research papers on wireless activity sensing, showing steady improvement in detection techniques, https://dhalperi.github.io/linux-80211n-csitool/#external

Today, specialist devices or targeted attacks can monitor human activity through walls and closed doors. But that's a world away from the ubiquitous transparency scheduled for commodity WiFi 7 in 2024. If regulators or lawyers don't step in, homes and some businesses may need RF shielding.


Right on. The whole "mm wave radar is detail rich but gives privacy" line is a contradiction. Any data that is detailed enough to be useful is detailed enough to be a privacy concern. While it might not be the same as leaking nude photos from someone's camera, radar data is something that needs to be carefully thought out. Radar data from a car can provide information about people's driving behavior even without location data- do they follow too close, do they make reckless lane changes into tight spots, do they exceed the regular flowing traffic speed, etc. Radar data from inside a building used to track people can tell when you are at your desk or somewhere else, who you congregate with in the break room, how long you spend in the bathroom, etc. If it's detailed enough to assist you, that data can also be used to monitor you.


> leaking nude photos from someone's camera

Whatever happened to that meme about common CCD cameras picking up UV images that reveal details through e.g. swimsuits, hopefully filtered out in software before making it into the final image?


Cameras usually include an infrared filter (also called a "hot mirror" or "IR cut") that reflects infrared radiation, mostly because not blocking IR leads to wrong colours in scenes like sunsets, where a disproportionate amount of IR radiation is present.

Many smartphones (used to?) exclude this filter on the selfie-camera because no matter how hot you think you might be, it's not going to be an issue.

If you want to check whether an IR remote works, try the selfie camera of your phone rather than the main one, and test in a dark environment, because the LED on the remote is going to be rather dim either way.

https://en.wikipedia.org/wiki/Hot_mirror


As an aside, the IR filter is often removed[0] from many cameras when using them for astrophotography. Canon[1] sell dedicated astro cameras that are modified in this way, along with other modifications.

[0] - https://skyandtelescope.org/astronomy-blogs/imaging-foundati... [1] - https://www.canon.co.uk/cameras/eos-ra/


I recall the panic over that back in the day. IIRC it was near-visible infrared, not UV: if you didn't have sufficient IR filtering over the sensor, were using flash (and the flash included a decent amount of IR), and the subject was wearing something thin, you'd potentially get unintended skin detail from the IR reflection.

As I understand it, this is more or less a solved problem on modern cameras such as the one on your smartphone... instead of a bulb you tend to have a white LED for your flash, which doesn't really give off useful (any?) IR, and there's better/more consistent IR filtering on the sensor.


I still have a Sony CD camera (as in disc, not CCD) which has near-IR mode and an IR LED. Fun to play with for night recording (its intended use), and you can see through thin clothes a tiny bit, but the range is very limited. Skin patterns show much less due to lacking contrast, and resolution in that mode is also lacking, so detail is not really a thing. I can understand the problem, and I think that's the reason the IR camera in my laptop has a moiré pattern in the hardware to make it almost useless to the common eye. But that last part is just a guess.


Sony was forced to cripple the "night vision" feature of some cameras because it was too easy to abuse to see through clothing.


it’s likely this was actually infrared imaging capabilities

https://www.theverge.com/2020/5/15/21259723/oneplus-8-pro-x-...


UV or IR?

If it’s IR… I have a PVS-14, I’ll have to dig one of my wife’s swimsuits out of the closet and see if it’s transparent. I’m curious.


Those camera manufacturers quickly eliminated that capability, likely via optical filtering. That was about 20 years ago. Consumer-grade cameras do not have that ability nowadays (bad publicity).


Why not just fix the swimsuits? Seems like a much more straightforward solution.


The problem was not swimsuits. It was any clothing that was relatively thin; swimsuits were just an example.

In any case why would we be required to don tinfoil suits every time we go to the airport?


50/50 on privacy. Sure, privacy is cool, but I pity the people who must endure seeing the shapely flabs and curves of everyday people. That's got to take a toll on anyone's psyche over time, as an "occupational hazard".


I mean, the material has a bug that wasn't discovered until recently, so we fix the bug and re-deploy the material. It's that simple. Just embed some carbon fibers or something in the swimsuit; there are tons of ways to block IR.

It's like if your car seatbelts have a bug you issue a product recall. Same thing.


It would seem easier and more efficient to fix the cultural taboos around nudity and over-sexualisation.

Breastfeeding mothers would be collateral winners of that change for example.


You do realise that you have to add this to every swimsuit on the planet, right? And to all other clothes that may exhibit this behaviour?

It is not the material that’s buggy, it’s new technology which can look under clothes and damage privacy that’s the problem.


Not to mention changing materials would likely have continuous costs, and would impact the material's properties and longevity, potentially in ways that just don't add up to a feasible product.

And then there's the fact that this is a hugely international business, so unless we convince the whole world to do this, we'll be impacting supply lines and flexibility there too.

The whole undertaking seems like it would have an absurd scope - just to avoid adding IR filters that cameras need anyhow to achieve accurate color reproduction, and therefore most cameras already include!


No. That's like saying thieves are the problem, so tie up the hands of every would-be thief and don't bother locking doors.

It's much simpler to lock doors, and that prevents thieves 99.9% of the time.


The filter IS the lock. It can be circumvented by the determined but it prevents most of the abuse most of the time.


Or just add a near infrared filter to cameras. Much easier and can even be mandated by law if necessary.


No. I can rip off infrared filters from any camera I own.

I already do that, because I image nebulae and galaxies and sometimes landscapes in infrared.


Most people cannot, believe that they cannot, or would not even if they could. We can be sure of that because despite billions of cameras being out there in phones, pocket cameras and webcams, the web is not flooded with pictures like these. Putting filters into cameras definitely works. It is not an obstacle to the determined, but it prevents most of the abuse most of the time.


Hardware deployment is orders of magnitude more costly than software.


My main point is that the data is anonymous at the source. I think having control over how that data is correlated back to an individual is important.


Wi-Fi light bulb won CES 2022 award, expected to ship in 4Q 2022. Logs of detailed human activity, in-room 3D location and timestamps can tell a story over time. https://us.sengled.com/blogs/news/the-biggest-ces-2022-smart...

> We earned this year’s award for a product targeted to launch in the fourth quarter: our Smart Health Monitoring Light. Featuring a Wi-Fi, Bluetooth Mesh dual chip, the bulb will provide a number of features, including biometric measurement tracking of heart rate, body temperature, and other vital signs, as well as sleep tracking. By connecting multiple bulbs via Bluetooth Mesh and creating a virtual map across your home, this product can even help detect human behavior and determine if someone has fallen and then send for help.


With decent processing a mm-wave picture would be good enough that you could recognise a person, so I don't think there is any inherent anonymity. The image resolution is a function of both wavelength and aperture and a mm-wave antenna can extend over a significant number of wavelengths (ie. large aperture). A stable timebase would further allow processing across time, enabling synthetic aperture.

Also, a person is not a randomly shaped object. If the processing is specifically tuned to detecting and identifying people an awful lot of degrees of freedom can be eliminated, giving more detail in those features that do identify a person.
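To put rough numbers on the aperture point, here's a back-of-envelope sketch of the diffraction-limited cross-range resolution λ/D · R. The specific values (60 GHz carrier, 10 cm aperture, 3 m standoff) are my own illustrative assumptions, not figures from the article:

```python
# Back-of-envelope cross-range resolution for a mmWave aperture.
# Assumed illustrative numbers: 60 GHz carrier, 10 cm aperture, 3 m standoff.
C = 3e8  # speed of light, m/s

def cross_range_resolution(freq_hz, aperture_m, range_m):
    """Diffraction-limited cross-range cell: ~ (lambda / D) * R."""
    wavelength = C / freq_hz          # 5 mm at 60 GHz
    return (wavelength / aperture_m) * range_m

# A 10 cm array viewing a subject 3 m away resolves ~15 cm cells;
# a synthetic aperture (moving sensor) effectively enlarges D.
res = cross_range_resolution(60e9, 0.10, 3.0)
print(f"{res * 100:.0f} cm")
```

Coarse, but exactly the kind of limit that synthetic-aperture and person-specific priors can push past.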


I think you are still off about the privacy aspect. At the source, a return from a single radar pulse is pretty uninformative. But a single pixel on a camera sensor is also uninformative. Even at the source, an aggregated set of radar returns is still a privacy concern on the level of an aggregated set of camera pixel values. Not operating in the light domain doesn’t make the privacy concerns go away, it just means that laypeople aren’t going to understand the risks as intuitively.


No, it's not. There is no anonymity at the source. Once this is there, it will be coupled with other PII. To believe otherwise is plain foolishness and lunacy.


My reaction as well. Given how many ways there are to uniquely identify individuals, this isn't a selling point for the tech. It's just a different kind of camera, and of course it can identify people; it just won't reproduce a picture of people's faces to do so.


Not yet, because the resolution is too low for now, but you can count on progress. Thinking more globally, couldn't 5G mmWave make it possible to know exactly how people are moving? Say someone steps out of their house: the government could know it in real time, even without a phone, because the mmWave could map their position.


>couldn't 5G mmWave make it possible to get to know exactly how people are moving

If you've got the signal density to illuminate the stuff you want to look at there's no reason you couldn't.


Yeah. Up front they say:

> what it sees is not personally identifiable

Then later on in the article:

> [the new HomePod could] [detect] the user's identity based on their gait, height, and physical profile

Uh-huh.

…did they read their own article?


It just depends on whether they’re talking to Congress or the Police.


What an insane thing to say. 60 GHz is more than enough to get completely recognizable images with a sufficient transceiver

https://www.researchgate.net/figure/Millimeter-wave-40-60GHz...

There's a huge difference between a phased array scanner and the transceivers described in the OP, but mm waves are absolutely not "privacy-preserving"


I was about to come and say the same thing.

It's perfectly possible to create feature descriptors for 60 GHz radar, and use it to image rooms, things and possibly people.

It's a privacy problem we are currently trying to overcome in AR. The problem is that because feature descriptors are so small, researchers assume they don't contain PII. So it's going to take a scandal before it's taken seriously.

Given that most of FAANG are making some sort of 3d map, I'm sure we are going to get pictures of people's bedrooms leaking soon enough.


Not to the same degree, but Roomba seemed to survive the privacy problems of mapping people's houses, and everyone seems to love Alexa. The general public prefers features over privacy.


They certainly do, up to the point when they start to see the downsides.

Facebook for example. It took a few years, but people are now wary of information being grabbed by them.


And let's be honest here, mmWave radar won't be used on its own, but will instead be used in conjunction with visible light and other forms of privacy-violating sensing technology to increase the resolution of the panopticon.


We're closer to the Star Trek future of "sensors" that can scan a planet's surface from thousands of miles away.


This statement about the privacy of mmWave radar made me wonder whether someone hasn't already tried to implement SAR[1] with mmWave radar, and a quick search reveals that indeed, this is being looked into, e.g.:

- https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9136646

- https://arxiv.org/pdf/2011.00636.pdf

That said, it doesn't mean that every mmWave radar could work as SAR.

[1] https://en.wikipedia.org/wiki/Synthetic-aperture_radar


Fundamentally, any radar that can be configured to capture coherent data (ie: amplitude and phase rather than just amplitude) can be a SAR. SAR is basically post-hoc digital beamforming to sharpen the image. If you have amplitude and phase information (which you can get if your radar can be configured to output raw quadrature baseband) you can do SAR processing. There's no fancy hardware required, SAR is largely a software problem.

You do also need information about where the sensor was located at each pulse, but it doesn't have to be extremely precise - there are algorithms that do iterative processing to accomplish motion compensation.
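A minimal sketch of that "software problem": with coherent returns (amplitude and phase) from known sensor positions, backprojection just removes the expected phase at each candidate pixel and sums. Everything below is illustrative (a simulated point scatterer and an assumed 60 GHz carrier), not any particular radar's API:

```python
import numpy as np

# Toy SAR backprojection: a simulated point scatterer seen from 64 positions
# along a 40 cm track, with an assumed 60 GHz carrier (illustrative only).
C, FC = 3e8, 60e9
LAM = C / FC

target = np.array([0.0, 2.0])                       # scatterer at x=0, y=2 m
xs = np.linspace(-0.2, 0.2, 64)
positions = np.stack([xs, np.zeros_like(xs)], axis=1)

# Coherent returns: phase encodes the two-way path length to the target
ranges = np.linalg.norm(positions - target, axis=1)
returns = np.exp(-1j * 4 * np.pi * ranges / LAM)

def backproject(x, y):
    """Focus at (x, y): remove the expected phase per position, then sum."""
    r = np.linalg.norm(positions - np.array([x, y]), axis=1)
    return abs(np.sum(returns * np.exp(1j * 4 * np.pi * r / LAM)))

# Coherent gain: all 64 phasors align only at the true target location
print(backproject(0.0, 2.0))   # ~64 (exact focus)
print(backproject(0.05, 2.0))  # small: 5 cm off-focus already defocuses
```

Real SAR adds motion compensation and autofocus on top, but the core focusing step really is this simple.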


> If regulators or lawyers don't step in, homes and some businesses may need RF shielding.

Creating a new job market: RF shielding installation.


I agree that if all of a sudden our routers started mapping out the world like in the Batman movie that would be a bad thing.


Hm, in the EU, due to GDPR, these services are required to make you opt in to them.

Does that mean many of these applications are outright illegal in the EU, due to not having a good way to handle the case where the user doesn't opt in?


Thinking about it a bit more: in Germany there is a law about spy devices.

Most spying on users through websites etc. doesn't fall under it, due to it being formulated in the pre-internet age.

But would WiFi-7-enabled routers fall under it? It's not completely unrealistic that they might.

The thing is, this law is rather strict about how illegal spy devices are to be handled: destructive disposal, required of anyone owning such a device.

That could be quite a big fallout.

This law did hit some IoT devices, like some toys with integrated voice recording.


Well, it's nice that the best-behaved countries (not that they are THAT well behaved) play nice, but for countries in, mmm... Asia, Africa and the Middle East, this will be a boon, sadly.


Yes, I would say WiFi 7 falls under irresponsible technology.


They can already identify a person from their typing, their gait, or even how they move their mouse.

It is either very dishonest or extremely stupid for an expert to say mm waves can't be used to identify a person.


I've been somewhat disappointed that the learning material around MIMO radar processing that can be done with popular TI automotive radar chips (and others) is somewhat limited once you get past the absolute basics.

TI's documents are a good starting point: https://training.ti.com/sites/default/files/docs/mmwaveSensi...

But it doesn't really go into the different schemes for sending orthogonal waveforms from the transmitters, the design of 2D arrays, much of the basic signal processing (windowing, matched filtering), etc.

I was pleased to see https://hforsten.com/mimo-radar-antenna-arrays.html pop up on Hacker News quite a while ago, which was very interesting. It goes into how to plot the beamwidth for a MIMO array, with some nice Python code. I want more, but I'm afraid I won't get it unless I read through a PhD thesis. Any ideas/tips?
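For anyone in the same boat, the core MIMO trick is compact enough to sketch: with orthogonal transmit waveforms, each TX/RX pair behaves like a single receive element located at the sum of the two positions, so 2 TX × 4 RX yields an 8-element virtual array. The spacings below are the common textbook TDM-MIMO layout, not any specific TI part:

```python
import numpy as np

# MIMO virtual array: with orthogonal TX waveforms, each TX/RX pair acts
# like one element located at tx_pos + rx_pos. Layout below is the common
# textbook TDM-MIMO arrangement (illustrative, not a specific TI chip).
LAM = 1.0                               # work in units of one wavelength

rx = np.arange(4) * (LAM / 2)           # 4 RX at lambda/2 spacing
tx = np.arange(2) * (4 * LAM / 2)       # 2 TX spaced by the whole RX span

# All TX+RX sums: 2 x 4 = 8 uniform virtual elements, no gaps
virtual = np.sort((tx[:, None] + rx[None, :]).ravel())
print(virtual / (LAM / 2))              # positions 0..7 in half-wavelengths

# Half-power beamwidth of an N-element uniform array: ~0.886 * lam / (N * d)
n, d = len(virtual), LAM / 2
beamwidth_deg = np.degrees(0.886 * LAM / (n * d))
print(f"~{beamwidth_deg:.0f} deg azimuth beamwidth")
```

That ~13° beamwidth is also a concrete way to see why a handful of physical antennas can't reconstruct faces: at a couple of metres, one angular cell spans tens of centimetres.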


> Because mmWave radar doesn’t process visual light, what it sees is not personally identifiable. This makes it an ideal choice for monitoring where a camera would not be appropriate.

Is 5mm resolution not enough to identify a face?

What’s the difference between a 3d model of a person constructed from high res radar and a black and white photo?

The radar sounds even more identifiable.

Edit: Couldn’t you construct a mosaic and have much higher resolution?

And also remove their clothes?


I worked with mmWave radars previously, but not professionally. In my opinion, it would be quite challenging to extract enough features from the current generation of mmWave radars, such as the TI IWR1642 [1], which have around 4 receivers and 2-3 transmitters, because the incoming data, while having a lot of temporal resolution, is very limited in spatial resolution.

With a greater number of antennas, something like what you describe is theoretically possible, but it becomes cost-prohibitive.

By limiting the number of transmitters/receivers, we can have almost perfectly private monitoring. For example, it would be possible to detect an attack or a fight in a public bathroom without exposing anyone.

1. https://www.mouser.ec/datasheet/2/405/1/swru521c-1954464.pdf


To an extent, but depending on the sampling rate and frequency and your ability to control the area being observed, there is still a lot of information available for modeling and biometric identification. In the extreme you can detect things like heartbeat, rate of breathing, etc.

For example, I have a CDM324 24 GHz radar module here on my desk. I set it up to 'watch' me type this comment from across the keyboard. This is an extremely simple module that I have powered by a bench power supply, with the IF routed directly to the audio input on my desktop. It was sampled with Audacity and amplified a bit to help with visibility. Towards the end of the recording, expect a flat spot, followed by regular motion, followed by 'noise', as I pause motionless for ~10 seconds or so, then take about 5-6 exaggerated breaths, then resume typing.

This is with zero design, the wrong frequency for the job, and next to zero signal conditioning.

(PS: I've included a zoomed-in image of the 'motion demonstration' to show still/breathing/typing, then zoomed in on the typing to show the detail, then a spectrogram and waveform of me reaching up to scratch my ear.)

https://imgur.com/a/0JmENYu

Bonus: yo check out my soundcloud - This is what the doppler signal actually sounds like:

Scratching my ear - https://soundcloud.com/buckrunner2/scratch

Talking directly at the module - https://soundcloud.com/buckrunner2/talking
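For anyone wondering why the IF output can feed a sound card directly: the two-way doppler shift f_d = 2v·f_c/c for human-scale motion at 24 GHz lands in the audio band. A quick sketch, with the specific speeds being my own illustrative guesses rather than measurements from this module:

```python
# Two-way doppler shift: f_d = 2 * v * f_c / c. At 24 GHz, human-scale
# motion lands in the audio band, which is why the IF pin can feed a
# sound card directly. Speeds below are illustrative guesses.
C = 3e8

def doppler_hz(carrier_hz, speed_m_s):
    return 2 * speed_m_s * carrier_hz / C

print(f"chest wall, breathing (~1 cm/s): {doppler_hz(24e9, 0.01):.1f} Hz")
print(f"hand reaching up (~0.5 m/s):     {doppler_hz(24e9, 0.5):.0f} Hz")
```

So breathing shows up as slow amplitude wobble near DC, while gestures produce tens-of-Hz tones, matching what the spectrograms above show.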


“Privacy” seems to be a marketing term here; what they really mean is that you can’t take naked pictures with it.


Which is also nonsense of course, the pornotron of TSA fame is mm wave radar.


The difference is the number of transmitters/receivers. The TSA uses an array of them (from top to bottom) with narrow beams, and they also rotate them and ask you not to move, so that effectively their number is considerably higher (tens of thousands) when reconstructing the image. Additionally, they use a higher frequency (160-400 GHz), which further helps with resolution. See [1] and [2].

If we limit the number of transmitters/receivers to 2-3/4, image reconstruction becomes impossible, similar to how you would not really make a real photo from a few stationary color sensors looking at a scene, while a linear array of them moving around an object would make a perfect scanner. It's a non-ideal analogy, but the best I can offer.

1. http://mt-fedfiles.blogspot.com/p/tsa-frequency-updates.html

2. https://www.3dbodyscanning.org/cap/papers/2017/17263mcmakin....


You can reconstruct the images with neural networks even if you have little signal to guide them.


This is true in the sense that NNs offer a convenient way to estimate a distribution of hidden values conditioned on visible values. That is an approximation of what is usually understood as "a reconstruction". I would argue that a reconstruction per se would only include parts for which the credible interval could be systematically bounded within a suitable tolerance threshold.


Sorry, don't buy it.

Gait analysis is excellent at identifying people, and that's projecting from a 2D image at a distance.

Anything which can draw a moving voxel cloud around a human is going to figure out who that human is eventually.


I don't disagree but I think this version of privacy is reductive.

Plenty of real-world privacy exists on a spectrum. My family knowing I'm in the bathroom is a far cry from them pointing a camera at me.

If stores track you "anonymously" with this ... Who cares? It's less invasive than the ubiquitous security cameras they have pointed at you.


Google knowing every time I’m in the bathroom and keeping a detailed log of the times is a far cry from someone in my family happening to notice I go in there once.


I have no opinion on your reply, I was discussing technical feasibility in response to someone claiming it was infeasible. The normative claims aren't germane or particularly interesting to me.


By not exposing anyone, I meant reconstructing an image. Identification is a different story, and there are multiple ways to achieve that (heartbeat classification is an obvious one).

If you have ideas how to reconstruct an image with 4 receivers / 2-3 transmitters, please, buy a dev kit from TI (they cost <$200, https://www.mouser.com/c/?q=mmwave) and show us a demo.


I have a lot of experience with TI's IWR6843AOP chip and dev kit. It has 3 transmitters and 4 receivers and it'd be impossible to create a facial reconstruction with it.


Is it possible to grab the raw ADC output from that dev kit alone into a PC over USB, or is another capture board or something needed?


I've just used it for the point cloud data, which can be streamed via USB. If you want to stream the raw data, I think you'll need https://www.ti.com/tool/DCA1000EVM. It has a 1 Gbps Ethernet port on it.


What about correlation? If you have other information that can tell you who is alone in the bathroom (common area CCTV, log of RFID key usage, etc) and your radar can infer that something very private is taking place there because the person thinks they are alone, then isn't that still a huge breach of privacy?


I really should have phrased my message more clearly. I meant reconstructing images, not identifying who is inside. The latter is quite feasible, and you're correct that extra data helps with that as well.

Just like with browser fingerprinting, we have a variety of signals, which combined together give a very high rate of correct identification.


Got it, I think I read your message as defending the idea more than it was.


...so what I'm hearing is "hard today so probably trivial in 5 years"? Or is this an exception to the usual march of technological progress?


Temporal res can be turned into spatial res.


Theoretically: yes. Practically, there are limits to that.


A couple of examples to show what the mmWave data looks like (bathroom, living room, etc.):

Intelligent fall detection using TI mmWave radar sensors https://www.youtube.com/watch?v=lsIo_HIk4GY&t=93s

3D Occupancy Sensing For Home and Office https://www.youtube.com/watch?v=9EDhJQbLNyo&t=83s

As you can see the radar doesn't reveal too much, while still being able to discern a lot of what is going on in each scene.


The visual signal there is utterly meaningless for assessing how much is exposed.

For analogy, take https://en.wikipedia.org/wiki/RANDU: it was very popular in its day, and its distribution looked good to the naked eye, but plot it in more than two dimensions and you see that it’s actually extremely terrible, failing spectral tests badly.

The information has been captured; the fact that one visualisation of it doesn’t expose that information is irrelevant and gravely misleading (as in: I don’t want someone that hasn’t already realised this working on this kind of stuff, because they’re dangerously ignorant and/or naive; I would rather they stop, and go and learn about the risks in detail before continuing).


> As you can see the radar doesn't reveal too much, while still being able to discern a lot of what is going on in each scene.

No, you can't see that, because it's a video of a specific visualisation of the data from one specific sensor.

If it can see through walls and measure biometrics like heart rate, mass and body type, build a 3D point cloud of limb positions, and measure vibration, then it's more privacy-invasive than a camera, not less.


All sensors/technology are invasive to a degree, but I would argue radar fares well in usefulness and anonymity in comparison to other sensors.

if the data were leaked, would you rather have some of your vitals and point cloud data exposed, or a video of you taking a shower?


I'd rather neither. I've handled the latter by not installing a camera in the shower.


You don't need to identify a face; you can identify people by gait. If you can resolve something as big as a leg, you've got enough, as long as you have it over 10 seconds or so.


It's odd that people care so much about technology "identifying them" when they have cell phones in their pocket and license plates on their car. There's not much of your daily life left after that, but I suppose it makes them feel good to imagine they're important enough to be tracked.


I opt into carrying a cell phone, and only when I want to. I don't opt into my neighbor's Alexa tracking my comings and goings by scanning through my walls.


You can do that by sniffing wifi packets even if you can't decrypt them, I think?

Could be a good opportunity for home Faraday cages.


I don’t emit WiFi packets nor do I carry a device that does.


The point is that total wifi traffic in your house is correlated with you being at home, because when you're out, you're not using the computer.


And I'm saying that's bullshit. I don't use WiFi at home. I'm an old fuddy-duddy and have Ethernet plumbed everywhere.

The difference is that now the neighbor's WiFi can scan my house and see me inside it without me using WiFi.


If they were doing that on purpose, they could already do it with an IR camera. Don't see what a wifi standard has to do with it.


There is nothing odd about it, and it has nothing to do with being "important enough to be tracked".

It has become almost impossible to participate in society, without every aspect of your life being recorded in some way.

My bank sells my transaction data to third parties to track my credit history and sell advertising - they do this without my consent, or rather if I disagree, they will not have me as a customer. I could switch banks but it's basically the same everywhere. All I wanted was a place to store and transfer money.

My phone company sells my location and call data to the government and to marketing companies - they do this without my consent, or rather if I disagree, they will not have me as a customer. I could switch providers but it's the same everywhere. What I actually wanted was to be able to make phone calls and have data connectivity on the go.

My ISP sells which websites I visit to marketing companies - they do this without my consent, or rather if I disagree, they will not have me as a customer. I could switch ISPs but it's basically the same everywhere. All I wanted was to have data connectivity at my home.

My airline sells my flight data to the government. They do this without my consent. There is no alternative. I just wanted to visit some relatives.

The government tracks all my phone calls. They do this without my consent. There is no alternative. I just wanted to call my friends.

The government tracks where, when, and to whom I send emails. They do this without my consent. There is no alternative. I just needed to communicate for my business.

When I go out in public, my image is recorded by a multitude of cameras, by a multitude of actors. They do this without my consent. There is no alternative.

My license plates are scanned, depending on my locality, more or less regularly. What I wanted was to make getting around a bit easier over longer distances. There is no alternative.

My medical records are ... well there's something called HIPAA, but the reason it has to exist in the first place is that some seedy actors are trying to get their grubby little hands on this most private information of mine. I do not consent to this. I am sick. There is no alternative to medical care.

I go out in public, and depending on where I go, cameras are being equipped with all sorts of biometric detection capabilities. All I wanted was to go out for a drink; suddenly I am in some database.

I go online to browse the internet. No further explanation needed.

All I wanted was to have WiFi connectivity at home. Some asshole has crammed functionality into the standard for wireless connectivity that makes it possible to track where my body is. There is no alternative.

And so on and so on and so on.

All of these things taken together create an information asymmetry. One by one they aren't necessarily that terrible, but accumulating all this information - for which there is constant pressure from the more authoritarian elements of our societies - creates a digital me that absolutely has an effect on my day-to-day life.

The easiest example is my credit score. I am sure one can think up reasons to justify its existence. But the point is that this device exists as an incarnation just for me - without my consent at all. It's not that I need to be very important to have a credit score. I just have one by default.

It does not make me feel "important" to be tracked. It makes me feel oppressed. It's the opposite of important. Being treated like this, decisions being made about me by some usually invisible faces, makes me feel like cattle. I am being treated as a good that can be sold, and from which value can be extracted, in other ways than the exchange for goods and services for money. I am the commodity itself. By default I am presumed guilty, and evidence is being collected against me, even if I will never violate any law of society.

My right to live as a free, interdependent member of society is being violated by the people who think they can use me and the data they can glean about me for whatever their purpose is - and they do this without even considering my consent.

Information asymmetry enables control. It's the antithesis of freedom.

It has nothing to do with feeling "important", and the choice you are implying people have is not a free choice. It's forced.


> All of these things taken together create an information asymmetry. One by one they aren't necessarily that terrible, but accumulating all this information - for which there is constant pressure from the more authoritarian elements of our societies - creates a digital me that absolutely has an effect on my day-to-day life.

Well said.


(Genuinely curious) how do you know this?


There has been research done on gait recognition https://techxplore.com/news/2018-08-artificial-neural-networ...

mmWave can be used to identify unique traits http://www.cs.ox.ac.uk/files/10889/%5BDCOSS19%5DmID.pdf


i recall a comment by someone claiming to be working at an antarctic base - everyone working outside would be wearing the same suit, but you could subconsciously recognize who you were looking at, at great distances, simply by their gait.

walk without rhythm ...


we're about to enter worm territory. we can't walk like regular humans. https://www.youtube.com/watch?v=1YFrFSX4cNw


It doesn't even need a radar sensor. Everyone moves in a different way, so that extracting patterns from the phone accelerometer can help to identify a person. Two people carrying the phone in the same pocket or hand will produce very different patterns that can be analyzed to extract the different behavior. All it needs is one chance in which that pattern is successfully paired with the person's identity, and from that moment on all their activities can be identified.
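To make that concrete, here's a toy sketch (synthetic data, made-up cadence; real gait-identification systems use far richer features than a single FFT peak):

```python
import numpy as np

# Toy sketch: estimate step cadence from accelerometer magnitude.
# The signal is synthetic; real systems fuse many such features.
rng = np.random.default_rng(0)
fs = 50.0                          # Hz, a typical phone accelerometer rate
t = np.arange(0, 10, 1 / fs)
cadence = 1.8                      # steps per second for this "person"
accel = 9.81 + np.sin(2 * np.pi * cadence * t) + 0.3 * rng.standard_normal(t.size)

# The dominant frequency of the mean-removed signal is one crude gait feature.
spectrum = np.abs(np.fft.rfft(accel - accel.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(freqs[spectrum.argmax()])    # ~1.8 Hz
```

Two people carrying the same phone would show different peaks, harmonics, and variance in exactly this kind of plot.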


In terms of an IoT device or phone sensor form factor, not yet, but in the future I expect it will be possible.

Airport body scanners use millimeter wave radar, and they use software to mask the image so it isn't personally identifiable. There are even passive versions that just use background radio waves in a compact form factor:

https://www.qinetiq.com/en/what-we-do/services-and-products/...


Each time something comes up like this, I call for jammers. If someone wishes to exploit the shared physics of the environment, I'll have no qualms to do the same.


There's some research, but more funding is urgently needed, https://ans.unibs.it/assets/documents/COMCOM-ext-WONS_Prepri... (Dec 2021 paper)

> This work explores the possibility of countering CSI based localization with an active device that, instead of jamming signals to avoid that a malicious receiver exploits CSI information to locate a person, superimpose on frames a copy of the same frame signal whose goal is not destroying reception as in jamming, but only obfuscate the location-relevant information carried by the CSI. A prototype implementation and early results look promising;


tl;dr is that the technology is very very different than a phone camera. So it's not the same as an IR photo. E.g. Google Pixel 4 has a radar chip AND IR dot projector+camera for face unlock. Presumably, if you could do face identification with Pixel 4's radar sensor only they would not have needed the IR tech.

Disclaimer: Work at Google, but not on Pixel 4.


"One of mmWave's major advantages is its privacy-preserving nature. Because mmWave radar doesn’t process visual light, what it sees is not personally identifiable."

Bullshit!


Yup. I work with MRI data. When de-identifying brain scans we have to strip the face off [0] because that's easily identifiable.

[0] https://github.com/PeerHerholz/BIDSonym for example


Perhaps they mean it skirts current privacy rules through legislative technicality.


Good point. In that case, legislation can be amended to close previously-invisible loopholes.


I've been following and building mmWave radar stuff for a little over a year now and I wanted to write an intro to help anyone get up to speed on how it works, why it's special, and what's been happening in the space.


Please write up how to make a simple passive detector so people can have a chance to know if they are being tracked.

This is going to be in every home in 10 years embedded in phones, TVs, smart speakers. All public venues. Trams, buses, subways. Everywhere. But most importantly homes. Every. Single. Person. 24/7. Breathing rate. Abnormal motions. Current location. Current activity.

Maybe, just maybe, if we raise some awareness we can curb the proliferation a bit.

Surely I can't be the only one who sees this as a nightmare.


If it's going everywhere, making a detector won't be very useful. Perhaps a jammer / shield.


How is this any worse than smartphone tracking today? What does it matter if I buy technology that tracks me because I feel like it'd be useful to me?


How is the power consumption?

Is it sensitive to noise or interference?

The VR headset application appeals to me. If headsets could be dumb terminals they would be a lot lighter and untethered wouldn't suffer from performance loss.


Power consumption for radar can get down to around 0.5mW

But I imagine for 60GHz WiFi it's probably pretty power hungry.

At this point there's not much noise in the mmWave spectrum (it won't be competing with 2.4GHz and 5GHz anyways). And mmWave doesn't travel that far which is one of the main reasons the FCC unlicensed it in the first place.

Might be worth keeping an eye on UWB for AR/VR headsets https://www.embedded.com/wireless-transceivers-use-uwb-for-l...


Checking some point-to-point wireless radios, it doesn't look like 60GHz needs much more power. It might be a few times as much, but definitely the same order of magnitude.


great article, I really appreciate it! sorry that the knuckleheads here got caught up on privacy instead of discussing this interesting technology


I've been waiting for someone to finally make a consumer stud finder that works and this just might finally happen after this deregulation of the spectrum.

The construction industry would hand over wheelbarrows full of money for a BLE phone-paired puck that located studs and pipes behind drywall. Someone could even sell a "pro" version that combined the data from multiple pucks to render a 3D point cloud of the inside of the wall on the user's phone.


Would the Walabot fit the bill for you? I bought one when it first came out and seemed pretty good. I haven't messed with it in a couple years though.

https://walabot.com/see-through-walls

This seems like a pretty good review:

https://www.bobvila.com/articles/walabot-stud-finder/


I tried one and it's pretty disappointing. Plenty of room for improvement.


Most studs have things (like drywall) nailed to them, and the nails are usually made of iron. So, all you need is a high-power rare earth magnet. I use a particularly strong refrigerator magnet.

If you want to splurge (or are willing to skip a couple of trips to the coffee shop), you could buy one (or two) of these:

https://www.homedepot.com/p/C-H-Hanson-Magnetic-Stud-Finder-...

I was tipped off to these by someone in the construction industry. He buys a few whenever they're in stock so that he can have them in every toolbox, lose them, etc.

Note that, once you find a nail head, you can just leave the magnet hanging in place. If you have two, you can hang both, then use a straightedge or chalk-line to mark the point between the two where you want to make a hole. The fancy radar puck thing can't do that. (Maybe, someday, fancy AR goggles will solve that problem, but we're a long way off from that.)


This doesn't work for conduit and plumbing. mmWave radar does.



They also seem to have development hardware available https://walabot.com/makers


This stud finder uses UWB radar (lower frequency) https://www.amazon.com/Bosch-D-Tect-Floor-Scanner-Technology...


As a radar guy I'd love to hear more about this. Stunned that's not a solved problem.

To be clear you probably wouldn't get a 3D view of the whole room, just a narrow window like a conventional stud finder.


Yes, this would be incredible.


A technical error in one of the diagrams: Hall effect sensors are not binary, but one-dimensional, and are used so; their output voltage is directly proportional to the magnetic field strength. As for PIR sensors, fundamentally they’re certainly not binary, but I have no idea if they perhaps expose only a binary signal. A better example of a binary sensor might be an on/off switch.


that's a good point about the hall sensors. i'll update that. PIR sensors also output an analog signal, but in practice they are binary.


There’s a funny autocorrect typo in the Adams quote: Zaphod -> Zappos.


lol, thanks. fixed it


in practice they don't have to be binary


If you include the time dimension, hall effect sensors are two dimensional.


analog PIR is typically a differential signal between two receivers. The receivers are placed behind a Fresnel lens (whose facets are designed to focus onto each of the sensors). So when an IR source (like a person) walks in front of the sensor it creates a large spike, because it is being compared against a low-IR area.

There is a form of PIR sensing where it behaves like a thermal camera at a very low resolution (sometimes as low as 2x2 pixels)
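A minimal sketch of that differential behaviour (the threshold value is made up; real modules do this in analog hardware, not software):

```python
# Toy sketch of why analog PIR ends up used as binary: the differential
# signal between the two pyro elements sits near zero for a static scene,
# spikes when something warm crosses the lens zones, and firmware just
# thresholds the spike.
def pir_motion(sensor_a, sensor_b, threshold=0.5):
    """sensor_a / sensor_b: raw readings from the two elements."""
    return abs(sensor_a - sensor_b) > threshold

print(pir_motion(2.01, 2.02))  # False -- static scene, elements balanced
print(pir_motion(2.6, 1.9))    # True  -- warm body crossing the zones
```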


most hall effect sensor applications i've seen use a schmitt trigger and output a digital (on/off) signal
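For illustration, the hysteresis a Schmitt trigger adds can be sketched like this (threshold values are arbitrary):

```python
# Toy sketch of a Schmitt trigger: the analog hall voltage becomes a clean
# on/off output with hysteresis, so the output doesn't chatter when the
# field hovers near a single threshold.
def make_schmitt(low=0.4, high=0.6):
    state = {"out": False}
    def trigger(v):
        if v >= high:
            state["out"] = True
        elif v <= low:
            state["out"] = False
        # between the thresholds, keep the previous state
        return state["out"]
    return trigger

s = make_schmitt()
print([s(v) for v in [0.1, 0.5, 0.7, 0.5, 0.3]])
# [False, False, True, True, False] -- 0.5 gives a different answer each time
```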


the only true binary sensor i can think of is the good old fashioned mechanical switch.


Mercury switch on a bimetallic strip (old-school thermostat) counts, same basic principle.


reed switch


is an old fashioned mechanical switch.


> mmWave radar doesn’t process visual light, what it sees is not personally identifiable

that is not how "personally identifiable" works

as long as you can (re-)identify a person with it it's personally identifiable data. Even if identification is done by using a mmWave radar and some simple ML.


Using radars with only distance and relative speeds data, no direction at all, to identify equipment has a long history in the military.

https://en.wikipedia.org/wiki/Radar_MASINT#Non-Cooperative_T...

Humans come in greater variety than airplanes, but recognition, especially with things like gait analysis, should be quite possible. I'm skeptical that a photo of a person contains enough information to allow for mmWave recognition, but I'd assume video does.


Interesting, though it's enough that two independent mmWave radars can re-recognize a person to count as identifiable. No need to involve visual data. Especially if mmWave radars become common place.


Yes, it seems much too early to say that this is privacy preserving just because visible wavelengths aren’t involved.


"The promise of ambient computing is that technology gets out of the way while becoming more helpful at accomplishing what we want. It anticipates our needs by understanding the context of our environment and situation. It serves us, rather than us serving it. "

This is the optimistic case. Perhaps more likely is that 'ambient' computing will smother us in surveillance, ads, 'nudges', restrictions (you seem tired or drunk, you may not drive), etc.


It will probably be both good and bad, but what's certain is that we're going to live inside a computer. Every move monitored.


Challenges to keep devices affordable aside ($3.65 is a lot for a BOM add on), Radar is one of the first non-human senses I’ve had to design for. From an ML perspective it offers significant advantages over static images, being independent from ambient lighting, and sample rates are higher than low cost cameras without ISO noise.


> mmWave radar used inside the cabin as a sensor to detect a child left behind in a hot car

Call me a cynic, but since the power button stopped being a power button I have trust issues. In the end we'll have mmWave chips in our company devices that can tell middle managers whether we've been at our desks for all 28,800 seconds of the work day and exactly how hard we've been working; that's a more likely outcome than saving a kid in a hot car.


You could do that today with normal tech. That's why you need laws in place that prohibit workplace monitoring.


You could argue that today's circumstance is why you DON'T need explicit laws prohibiting workplace monitoring.

Anyone with the desire could set up such systems today. But they don't. Because they're expensive, inaccurate, and ultimately not all that useful in the social context. There are places that do employ such monitoring, but they're a corner case.

Trust is a core commodity in the workplace. It's usually more profitable to establish a trustworthy workgroup to accomplish some goal than it is to accomplish some goal WHILE monitoring the workgroup's every move for infractions.


> There are places that do employ such monitoring, but they're a corner case.

Don't amazon warehouses already monitor workers in this much detail? Just yesterday I read someone talking about how managers get reports containing, amongst other things, how often a person was standing still. Lots of companies squeeze their employees like this, especially low paid employees.


Amazon has contracted firms to produce boxes for their facilities which include 'gait recognition' code.


I have a theory / idea I call MOOP - massive open online psychology - the idea is that our phones / radar devices monitor our daily interactions - tone of voice, body gestures - to get a state of our anger / love / etc between our families (and work). And then uses simple epidemiology to help guide / train us.

The basic idea is your phone becomes your life coach.

This "privacy protecting" radar (which does not seem privacy protecting at all, just avoids leaking images which is just bad PR avoidance) looks like the same direction.


Nbd, but moop means garbage at the burn.

I agree this will happen, I suspect Apple is not only in the best position to exploit mmWave hardware but already well down the road in related interaction design.

Apple Watch already offers limited physical sense-based life coaching in the form of hand washing duration tracking and feedback.

Apple Watch’s assisted interaction with UI that quietly debuted in iOS 15 was an important set of physical sense-based interaction primitives in production.

They suggest to me the company has likely progressed capabilities sans mmWave.


I assume moop is a slang term then? what's the burn?

And yes, I am sure many companies are thinking (or at least have people in them thinking like this). A few issues are the legal and cultural framework (I am quite happy for my personal data to be shared with accredited Health (ie NHS) researchers, but not with the Apple Watch product manager)

Epidemiology at this level is going to have transformative effects, but it needs to be trusted. The only difference between a utopia of guided humans and a surveillance state is who gets the data and what they do with it. The people in the free country and the ones in the oppressive totalitarian state will be wearing the same devices with the same radar.


At Burning Man, moop is an acronym for Matter Out Of Place, a reference to the leave-no-trace principle of the event. People will run after a small piece of material. Again, NBD, but in tech circles this might be the first thing people think of with that acronym.

I do think Apple captures plenty of meta data, but I do not know how much physical motion is getting sent back. One of the goals of the secure enclave was to store biometrics. And despite some notable concerns, the company has generally aligned its business model and branding to be privacy focused.

I can't think of another large company I'd trust more to handle this kind of information.

>Epidemiology at this level is going to have transformative effects...

The transformation is already underway. When Apple delivered Google Maps on iPhone with its pulsing blue dot, those with the technology instantly leapt ahead of those without it. I wonder how much stress has been relieved just from people knowing where they are.

I agree the technology can be misused, even to oppress. But this is true with any tech.

My concern is more about the disparity of those who have this tech and those that don't. The enhancements could start to show in a matter of years and quality of recommendations could coalesce into what will look like a new plane of daily living.


Do any of you see avenues for a conventional radar company to contribute to this emerging trend?

We produce high-end FMCW radars (http://dopplerradars.com) and I have been looking for ways to align us more with Tech and less with Defense.


Arms race: help evolve 802.11bf with deep understanding of risk management. Sell souped-up localization and CSI countermeasure devices, e.g. for industrial laboratories. Work with two celebrity neighbors on a sanctioned, comedic public demo of seeing through home and business walls, in advance of Wi-Fi 7 launch. Publicity will mean a larger market for both attack and defense products. If it's done early enough, it could materially influence the WiFi Sensing standard and thus devices racing to be the first to support not-yet-official WiFi 7. There are legitimate use cases, but consent and bounded scope are critical for mass-market acceptance. Non-naive experts can help.

http://www.orca-project.eu/

https://ans.unibs.it/projects/csi-murder/

> Imagine that someone wants to illegally track the position of a person inside a laboratory, for instance to measure how much time is spent doing different activities at different desks, as depicted in the upper picture. How much effective can this attack be? ... With CSI-MURDER, the localization becomes impossible because results will seem random, thus preserving the person privacy without destroying Wi-Fi communications


Presence detection. Make a $50 device I can run on a battery/Power over Ethernet/etc.

Make it all locally manageable, and make it work with HomeAssistant, so I don't have to deal with it "calling home" or shipping data outside my LAN.

I'm not a huge fan of the current crop of presence detection, and mmWave seems like just the thing. Eventually, person detection would be fantastic, but it MUST NOT call home outside the LAN. I don't want that junk going on the internet in any way, shape, or fashion.


Use cases:

I need a presence detector that's able to detect children near my pool.

I need a presence detector that can, via gait detection (or other), detect when a specific person crosses a threshold (e.g. autism or Alzheimer's with a tendency to run off)


As a guy that works on old cars (and drives them fast) that will never have OEM support for e.g. ADAS, I'd love to have access to a small, discreet (i.e. not visually excessive, but "hideable"), modular set of sensors that could be used to build e.g. ADAS/parking sensors/(pipe dream) automated driving.

I'm sure the OEMs would be interested, but I'm looking for something I can retrofit.


This is a fantastic idea, i.e. refactor the timeline of automotive electronics to backport modern technology options into old car branches, retaining supply chain control in the hands of an individual human mechanic-sysadmin-owner. Reduce the false conflict between technological convenience and security/privacy.


Very dumb question: If mmWave is operating at 60GHz, how would it be possible for processors (which are operating at less than 1/10th of that clock rate) to process that sampling rate?


Yes. The key here is the difference between frequency and bandwidth.

The radar isn't using the entire 0 Hz to 60 GHz band; it will be using something like a few GHz or sometimes much less, e.g. 60-64 GHz.

Also much of the processing isn't done on general purpose processors but in FPGAs or ASICs which can do lots of the parallel computation very fast.

For comparison, 802.11ac has a max of 160 MHz channels, and that's cheap and everywhere these days (but runs on 5 GHz).

There are lots of tricks to reduce the amount of data in successive stages so that the processing is quite feasible.
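To put rough numbers on this, here's a back-of-the-envelope FMCW sketch (all parameter values are illustrative, not from any specific chip): the receiver mixes the echo with the transmit chirp, so the ADC only ever sees the low-frequency beat tone, not the 60 GHz carrier.

```python
# FMCW "dechirp": the beat frequency encodes range, and it's tiny
# compared to the carrier. Illustrative numbers only.
c = 3e8                          # speed of light, m/s
bandwidth = 4e9                  # chirp sweeps e.g. 60-64 GHz
chirp_time = 100e-6              # seconds per chirp
slope = bandwidth / chirp_time   # Hz per second

max_range = 10.0                 # metres, say an indoor sensing scenario
f_beat_max = 2 * max_range * slope / c
print(f_beat_max / 1e6)          # ~2.7 MHz -- easy for a cheap ADC
```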


In summary, 60 GHz is the CARRIER frequency (the central frequency) around which a modulated signal is centred or otherwise aligned. The modulated signal BANDWIDTH (channel width) is where the signal information is encoded, and is going to be measured in megahertz (MHz).

E.g. for 802.11a/n/ac the CARRIER is in the 5.x GHz band but the modulated signal BANDWIDTH will be one of 20, 40, 80, 160 MHz around a central frequency [0]. The 'base' 20 MHz bandwidth at best performance (signal-to-noise ratio) will be able to modulate (encode) at a raw 54 Mbits/second.

Intel publish [1] an easy-to-read set of tables that show the relationships between standards, frequencies, bandwidths, MIMO, and modulation schemes.

[0] simplified - actually there is a (frequency-hopping or direct sequence) spread-spectrum method

[1] https://www.intel.com/content/www/us/en/support/articles/000...


That’s a really clear answer, thank you!


The same way you can receive/transmit a 5GHz wifi signal using devices running at a fraction of that clock rate: most of the critical high-frequency signal processing happens using analog electronics. The actual sample rate of the digitized signal is much lower.


Yes. The relevant information may be contained in a narrow swath of bandwidth riding within that 60GHz. Say you use 60GHz as a carrier but only care about changes happening below 10kHz. You can separate that out from the returned signal and only process 20k samples/second.


The other comments are good answers to your question, but here's a fun little fact: if you actually want to directly read a 60GHz signal, you need to sample it at 120GHz to capture it all! Good old Nyquist.


How hard is it to make stealth clothing, so your body doesn't reflect back to the radar system?



> In October 2019, Soli shipped in the Pixel 4, marketed to consumers as "Motion Sense." It provided users with a faster face unlock along with some basic touchless gestures like controlling music. Google must have decided it was still too early as they haven’t included Soli in any of the Pixel phones since.

Wasn't the blocker regulations in India?


quite possibly. i didn't dive down that rabbit hole so just said 'too early' ¯\_(ツ)_/¯

would be interested if you have any links about it


The science fiction novel Snow Crash mentions and features mmWave radars frequently.


My original intro was about Snow Crash. Still waiting for mmWave on my skateboard http://pages.erau.edu/~andrewsa/Project%203/Thompson_David/S...


Hm, the article makes it sound as if mmWave radars are a fairly new addition to cars. But 77 GHz radar sensors have been in cars for over a decade now, thanks to suppliers such as Bosch and Continental.


Privacy is a political/legal problem, not a technology problem.


A cypherpunk would say just the opposite: that privacy is a technical problem that, properly solved, relieves us of political concerns.

Both views seem to miss the mark. The crypto folks see mathematical certitude while ignoring the dynamics of the social/legal context in which it must operate. By contrast, those who argue that "artifacts don't have politics" would absolve technologists of an ethical duty to consider how their products impact society.

In short, privacy requires both technical and legal protections, as well as a society that demands it as a fundamental human right.


Right, and a standard response to cypherpunk fundamentalism is pointing at rubber-hose cryptanalysis [1].

[1]: https://en.wikipedia.org/wiki/Rubber-hose_cryptanalysis


It is a technological problem too. Telegram has a much better UI/UX than Signal, and not having end-to-end encryption by default is part of the reason.


Wire has a good UI/UX with always-on E2EE.


technology is a political/legal problem. the printing press, gunpowder, the internet. technology has always been in the middle of political/legal problems.

can't be evil > don't be evil.


So far these applications seem pretty innocuous, but for things like gesture & other control systems, can mmWave be jammed or spoofed?


there is a risk of being jammed or spoofed which is why there has been a bunch of research done in the automotive industry. https://www.caee.utexas.edu/prof/bhat/ABSTRACTS/SecurityOver...

But I don't think there is much risk in gesture control.


this is terrifying. even more surveillance, but passively, all the time, in spaces that are deemed too private for cameras.



man that mmwave stuff built into phones and watches looks great. Would love to see that on my watch and Galaxy Fold3... less touching but gesture control. Want.


> "Consumer Technology Association (CTA) announced Ripple, an open API for developing general-purpose consumer radar systems"

Why use the name of a cryptocurrency for something that has nothing to do with it?


Ripple the cryptocurrency doesn't own that name in any domain other than crypto, so it can be used in other applications without an issue. The name makes a great deal of sense in this context, since mmWave uses tiny waves, otherwise known as ripples.


If we won't see it coming, and as many commenters have pointed out, it will indeed comprise personally-identifiable information - despite the (absurd?) claims - how does this all fit with the GDPR right to be forgotten?


"One of mmWave's major advantages is its privacy-preserving nature."

Ha, haha, hahaha!

It can measure your breathing, heart signature, gait, etc. With machine learning this data will certainly be enough to identify you.


I want to get off Mr Bone's Wild Ride.


The ride never ends.



