
A 60 GHz phased array for $10 - blueintegral
https://www.hscott.net/a-60-ghz-phased-array-for-10/
======
stefan_
I love how industry came up with ever crazier schemes to stream content from
phones and laptops to TVs. There must have been three different attempts
involving WiFi alone, but this phased-array mmWave 60 GHz million-dollar
basic-research abomination surely takes the cake.

Meanwhile, some Google engineer realized you could solve 90% of phone-to-TV
streaming applications and 100% of the hard technical problems by just telling
the TV to download and display the YouTube video itself. Genius!

~~~
TehCorwiz
Yes, genius if the content is asynchronous.

Any real-time or interactive display needs to be able to stream at sub-frame
latencies. At 60 fps that means under about 16.7 ms per frame; at VR-friendly
refresh rates of ~90 fps, about 11 ms.
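Those per-frame budgets follow directly from the refresh rate; a quick sketch
(plain Python, just 1000 ms divided by frames per second):

```python
# Time available to deliver each frame at a given refresh rate.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 90, 120):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```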

While their approach works beautifully for their core competencies, static and
non-interactive streaming content, it doesn't really work for any other
application.

~~~
jrockway
To be fair, the idea of a centralized computer generating content for dumb
terminals to display has been around since the dawn of computing. Terminals
connected to mainframes. Local X servers drew content as requested by remote X
clients. The idea of having all your software and data on a very powerful
computer inside your home (or pocket) is the crazy new one.

Certainly the concerns and dynamics of the situation are different now than in
the 70s and 80s, but some of the thought processes are the same. People want
to stream video games because they don't have $2000 up front to lay out on a
gaming PC. Streaming lets them pay $5 a month instead, and unlike credit,
there is no commitment. That's valuable. Greed is another reason for the
cloud. There is no reason why someone should pay $10 per month for Photoshop,
but since it's the only option, people do. That's free money for Adobe's
shareholders.

I can see why people try to pooh-pooh this stuff; computing is built on
hobbyist experimentation, and the cloud takes all that away. You can't write
your own video game. You can't tweak settings, or make mods. You just get a
game that someone else made. But from a technical standpoint, streaming stuff
is probably going to work. I have less than 1ms ping to a nearby datacenter
(speed of light distance: 8 microseconds), and so do 10 million of my
neighbors, so it's probably quite profitable to have a collection of high-
density GPUs and CPUs rendering games for a few peak hours a day and then
training machine learning models outside those hours. The technical challenges
are minimal; the idea has been around for 50 years. The actual challenge is
getting the people who own the cables in the ground between your house and
that datacenter to actually switch packets quickly enough to make it all work.
When you were connecting a mainframe in the basement to terminals upstairs,
you made it work because it was your job. But now, one company owns all the
cables and another wants to make content to send over those cables, and the
incentives no longer align. Sure, Spectrum COULD update their core routers...
but they could also not do that, and then your video game streaming service is
dead. (Meanwhile, they dream of showing up and making their own video game
streaming service. They have as much time as they want, because they own the
cables!)

~~~
fulafel
I think it'll become easier to write your own cloud-streamed video game in the
maker/hobbyist kind of way, even if the cheap or open source Stadia-workalike
backend hasn't arrived just yet. (Of course Google might open up Stadia itself
at some point too)

------
lachlan-sneff
Phased arrays are very cool tech. Personally, I can't wait for visible-
wavelength optical phased arrays to hit the mainstream (they're just now being
implemented), since they'd enable tech like legitimately holographic displays
and video cameras with digitally programmable optical zoom.

~~~
jobseeker990
Have any further reading I can do on this?

~~~
hwillis
[https://www.spar3d.com/news/lidar/mits-10-lidar-chip-will-
ch...](https://www.spar3d.com/news/lidar/mits-10-lidar-chip-will-
change-3d-scanning-know/)

Holographic displays would use eye trackers to show each eye a different
image. Solid-state zoom is maybe a bit of a stretch, but it would involve
pixels becoming sensitive to angles more inward or outward from the sensor's
center.

~~~
lachlan-sneff
I'm not an expert, but I believe that's not how holographic displays would
work with optical phased arrays. I believe a phased array can make it seem
that light is being emitted from any point above the display (within the
display angle of the opposite side of the display). There's no need to track
observers, because it would be an honest reconstruction of the light emitted
from a real three-dimensional object.

~~~
Fredej
I believe that's correct - see [http://www.phased-array.com/1996-Book-
Chapter.html](http://www.phased-array.com/1996-Book-Chapter.html) "Front
projection Images" for details.

------
Traster
>Now the bad news: SiBeam was bought by Lattice Semiconductor, and right
before I gave this talk, Lattice shut down the entire SiBeam organization and
ended support and production of this part. I didn’t find out about this until
months later, when I contacted the sales engineers I had been talking to about
this part and they told me what happened.

This is one thing that really pisses me off. Time and time again you've got
small(ish) companies doing interesting stuff, succeeding, and then they step on
a land mine: they do something that gets them in the crosshairs of a big
company, and suddenly BOOM, the big company buys the small company for
ridiculous money and then inexplicably shuts down 90% of what the small
company was doing. The sale happens at a nice premium, and yet the second it
closes, 90% of the things that made the company valuable are jettisoned. How
can these companies afford to buy companies at a premium and throw away
massive parts of their value? Somehow this obvious value destruction seems to
be standard operating procedure for large companies.

~~~
ethbro
> _How can it be that these companies can afford to buy companies at a
> premium, throw away massive parts of the value of the company_

It's almost like the lack of robust anti-trust prosecution by world
governments has so enriched large, rent-seeking companies that they can
literally afford to burn money and still come out ahead...

~~~
big_chungus
On the other hand, preventing acquisitions reduces available exits and might
discourage future innovation (which in turn might promote more trusts).

~~~
bordercases
There needs to be a larger gradient of funding options than "Waste cash until
unicorn" or "rent-seek until next bailout", and "dominate small-to-medium
market niche" or "sponsor and penetrate next manufacturing commodity".

We've seen so much wastage from the prevailing financial model in SV tech.

~~~
perl4ever
I don't understand. Your options are (1) be small, try to grow fast, (2) be
big, (3) be small, don't try to grow fast, and I'm not sure what (4) means.
What else is there?

~~~
bordercases
What would prevent a technology like ultra-cheap phased arrays from being
locked up when a corporation, seeing potential in either the technology or the
team, buys them but then gives neither the leeway to develop the market for
the technology further?

In this specific case I guess we don't know the full picture of what Lattice
Semiconductor intends to do, but there are many examples in software of
startups getting acquihired and then the team dissolving into new projects
that are more familiar or closely aligned with the pre-existing business model
of the company.

Since it's always possible to just turn the startup into a subsidiary, I'm
sometimes confused as to why this happens, unless it's an issue of brand
dilution, or of the market opportunity being too small to be worth the
overhead of keeping a separate entity tied to a larger one. That's part of why
more opportunities for low-growth or long-tail companies would be important:
in the case where the means of bringing the IP to market are eliminated, no
one gets anything at all.

~~~
ethbro
Use-it-or-lose-it provisions for all acquired technology, by entities over a
certain size?

If you are demonstrably developing a piece of technology, kudos. It's yours,
you bought its owner.

If you are not doing anything with it, you're required to offer FRAND license
terms to anyone interested in the technology.

Would at least make the tech available that's currently getting tossed in a
corporate closet in the basement.

------
tlrobinson
> What would be really cool is to build a USB board that plugs into one of the
> SB9210 boards and connects to gnuradio. You could do all kinds of neat radar
> experiments, presence detection, beam forming, you name it. Kind of like a
> 60 GHz RTL-SDR.

Maybe a dumb question, but how is it even possible to do SDR with a 60 GHz
signal on a ~4 GHz CPU via a 5 Gbps USB 3 connection?

EDIT: I guess via down-conversion?
[https://en.wikipedia.org/wiki/Digital_down_converter](https://en.wikipedia.org/wiki/Digital_down_converter)
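For the curious, the down-conversion idea is easy to demo in software. This is
a toy sketch, not the SiBeam part's actual signal chain: the sample rate and
IF below are made-up values, and a real DDC would use a proper filter rather
than a moving average. Mix with a complex exponential, low-pass filter, then
decimate:

```python
import numpy as np

fs = 1e6      # sample rate of the already-digitized IF signal, Hz (assumed)
f_if = 200e3  # intermediate-frequency carrier to shift down to DC, Hz
n = 4096
t = np.arange(n) / fs

# A test tone at the IF stands in for the modulated signal.
x = np.cos(2 * np.pi * f_if * t)

# Mix: multiplying by a complex exponential at -f_if moves the tone to DC,
# plus an unwanted image at -2*f_if.
baseband = x * np.exp(-2j * np.pi * f_if * t)

# Crude low-pass filter (moving average) to suppress the image, then decimate.
kernel = np.ones(32) / 32
filtered = np.convolve(baseband, kernel, mode="same")
decimated = filtered[::8]  # effective sample rate is now fs / 8

# The tone now sits at DC with magnitude ~0.5 (the cosine's other half of the
# amplitude went into the filtered-out image).
print(np.abs(filtered[n // 2]))
```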

~~~
blueintegral
The data itself isn't 60 GHz; you're just modulating it onto a 60 GHz carrier.
[https://en.wikipedia.org/wiki/Modulation](https://en.wikipedia.org/wiki/Modulation)

------
robocat
> Since the wavelength of 60 GHz is approximately 5 millimeters, this
> technology is sometimes referred to as millimeter-wave (mm-wave).

That explains how close together the antennas are: close enough relative to
the wavelength to be able to beamform.

Edit: it also explains why it would be extremely difficult to build something
yourself at 60 GHz, where every trace length needs to be matched to
sub-millimeter precision, and a sub-millimeter stub acts as both an antenna
and a circuit element.
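The 5 mm figure falls straight out of λ = c/f, and phased-array elements are
typically spaced around half a wavelength to beamform without grating lobes; a
quick check:

```python
# Wavelength and typical half-wavelength element spacing for a phased array.
c = 299_792_458.0  # speed of light, m/s

def wavelength_mm(freq_hz: float) -> float:
    return c / freq_hz * 1000.0

lam = wavelength_mm(60e9)
print(f"wavelength at 60 GHz: {lam:.2f} mm")         # ~5 mm
print(f"half-wavelength spacing: {lam / 2:.2f} mm")  # ~2.5 mm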

------
hinkley
Does anyone recall in the Long Dark Ago when there was a startup that was
planning to embed a phased antenna array into a cubicle wall?

It still gets me, the level of miniaturization that happens when you come back
to an idea 20 years later instead of watching the incremental changes along
the way.

------
nsxwolf
Is this appropriate for creating a wireless HDMI interface for VR headsets?

~~~
opwieurposiu
The datasheet says it adds 5 ms of latency, and latency in VR causes nausea.

~~~
londons_explore
5ms isn't yet at nausea levels though...

~~~
sp332
In 2014, Michael Abrash gave a talk summarizing what's needed for a feeling of
presence in VR. He said 20 ms motion-to-photon latency is required for the
virtual world to feel like it's "nailed in place". So 5 ms is 25% of the
latency budget.

~~~
nathancahill
The number I've heard, don't ask me from where, is 16 ms.

~~~
AYBABTME
That's the frame time (how often images need to be rendered) at 60 Hz. Latency
is separate from that: you can have a 60 Hz framerate with 24-hour latency if
you watch a video you recorded yesterday.

In the parent's case, the 20 ms of latency from movement to visible motion
covers a pipeline that:

- reads input

- evaluates the response

- returns the result to your screen

All kinds of things add to this latency: the polling frequency of your input
device, the bus speed of the device, how fast you can update the world, how
fast you can render the update, how quickly that updated image can be sent to
the screen, how quickly the screen turns it into visible light, etc.
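As a rough model, motion-to-photon latency is just the sum of those stages.
The stage numbers below are invented for illustration, not measurements of any
real system:

```python
# Hypothetical stage latencies in a motion-to-photon pipeline (milliseconds).
pipeline_ms = {
    "input polling":     1000 / 250,  # 250 Hz polling -> up to 4 ms
    "bus transfer":      0.5,
    "world update":      3.0,
    "render":            5.0,
    "scanout + display": 6.0,
}

total = sum(pipeline_ms.values())
print(f"motion-to-photon: {total:.1f} ms")
print("within the 20 ms presence budget:", total <= 20.0)
```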

------
philprx
Maybe a silly question: are there other commercially available chips with
phased arrays?

I believe the RTL-SDR community extended the RTL chips' end of life much
further than planned.

Could that happen with this (or similar) phased-array chips?

~~~
petee
The closest thing I can find is this similar-sounding 60 GHz radar chip on
DigiKey by Acconeer, but it doesn't look like you could control it or use it
the way the SiBeam part allows...

[https://www.digikey.com/product-detail/en/acconeer-
ab/A111-0...](https://www.digikey.com/product-detail/en/acconeer-
ab/A111-001-T-R/1891-A111-001-T-RCT-ND/8040771)

~~~
petee
Yes, I was indeed mistaken; this is just a Doppler module, not a phased array.
Bummer.

------
droithomme
I wonder what sorts of things the chips were used for inside laptops and smart
TVs. He mentions it being used for streaming, but it's a directional radar
chip; it seems it would be useful for doing a 3D scan of an area?

~~~
aidenn0
It's a directional transceiver. Directional transceivers happen to be usable
as radar, but these were not intended for that use case:
[https://en.wikipedia.org/wiki/WirelessHD](https://en.wikipedia.org/wiki/WirelessHD)

~~~
Animats
Whatever happened to that Google millimeter radar project to allow devices to
see your hand positions? It ended up in the Pixel 4 as Motion Sense, but all
it does is let you make swiping gestures in the vicinity of the phone. There
should be more useful applications for that technology.

~~~
Reelin
It's not limited to Google. For example, this TI mmWave sensor
([http://www.ti.com/product/IWR6843](http://www.ti.com/product/IWR6843)) has
an associated reference design for gesture control
([http://www.ti.com/tool/TIDEP-01013](http://www.ti.com/tool/TIDEP-01013)).

(Disclaimer: I've never actually used that chip or reference design and have
no idea how well it actually works in practice. I just think it's really neat
that mmWave radar chips are readily available at very affordable price
points.)

------
inetknght
I wonder what sort of API could be used to control that kind of phased array.
I guess that's the point of the blog post though: asking for help with reverse
engineering its interface.

------
dclaw
Bought some from eBay... cost about $26, but the post has been out ~2 weeks.
Can't wait to play with them and see what can be done.

~~~
glax
Hope you make a post about it.

