Show HN: FPGA Ultrasound Imaging on a Raspberry Pi (un0rick.cc)
199 points by kelu124 on April 30, 2018 | 34 comments



Seems worth mentioning that an incorrectly used ultrasound can burn bones (specific frequencies applied over bony areas, with the probe moved too slowly).

I didn't do enough research to understand this (but just enough to verify it). I learned it initially from my wife, who's an NP. When I mentioned a previous DIY ultrasound article to her, she looked skeptical and mentioned the risk of bone burns.


I have never heard of this being an issue for diagnostic ultrasound. Since I’m trained in (cardiac) ultrasonography, I think this would have come up... I also did a brief lit search and didn’t identify any obvious examples.

I suspect your wife is referring to therapeutic ultrasound, which can damage bone [1]. But this is not the kind of ultrasound people generally get of their heart, abdomen, or pelvis.

1 = https://www.ncbi.nlm.nih.gov/m/pubmed/11731056/


Even diagnostic ultrasound has been implicated in foetal brain damage (through cavitation). And ultrasound operators are known to often suffer injuries to their dominant hand.

https://onlinelibrary.wiley.com/doi/full/10.7863/jum.2009.28...


> ultrasound operators are known to often suffer injuries to their dominant hand.

Do you have a citation for this? I don't see anything about it in your link.


That comment was from memory, but I found this paper:

https://www.ncbi.nlm.nih.gov/pubmed/8066232


Bioeffects of diagnostic US are discussed at (painful) length in undergraduate radiography training here in New Zealand. We are fairly well aligned with the UK in our training pathway, but in my (limited) experience echo techs don't seem to come from a radiography background, while ultrasound ones do. Could be a local situation, though. Broadly, the principles seem to be ALARA (as low as reasonably achievable) and perhaps don't do 'souvenir' scanning of foetuses. I wonder if anywhere follows that.


Good point; my training excludes fetal ultrasound, so this might be one reason your experience and mine differ.


See! I learned something new today! :D


Funny: while trying to learn more about that, I found this https://www.verywell.com/ultrasound-for-faster-bone-healing-...

:)


The GitHub repo is a little more descriptive and answers a few more questions: https://github.com/kelu124/un0rick/ (though don't expect all your questions to be answered there either).

I'm curious what the FPGA approach offers over a regular CPU/DSP approach, in plain English (quod non intellegunt, "which they don't understand": technerdese attracts followers, Latin nerdese less so). I've been interested in FPGAs since dabbling with them a couple of years ago, but I haven't come across a use case that's tangibly beneficial yet. If this is one, I'd love to contribute, but so far I'm not gleaning that from the docs. Can OP assist here?


Ultrasound processing is actually pretty niche: you have a whole bunch of channels with a very high sample rate (tens of MHz). In the case of medical ultrasound, a typical handheld probe can have anywhere from 50 to 150 parallel channels.

Because you need fast, parallel DSP, FPGAs actually really shine for ultrasound. And because production volumes are relatively low, the NRE to jump to an ASIC generally isn't justified. Just about every modern ultrasound machine does signal processing on an FPGA.
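
A back-of-envelope sketch in Python of what those numbers imply (the 64 Msps rate appears in a comment below; the 12-bit sample width is an assumption):

    # Aggregate acquisition rates for a multi-channel ultrasound front end.
    fs = 64e6                      # samples/s per channel (64 Msps)
    bits = 12                      # assumed ADC resolution
    print("per-sample budget:", 1 / fs * 1e9, "ns")   # ~15.6 ns
    for channels in (50, 150):
        gbps = channels * fs * bits / 1e9
        print(channels, "channels ->", gbps, "Gbit/s sustained")

At 150 channels that is over 100 Gbit/s of raw samples, which is why the acquisition side lives in hardware.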


Today a graphics chip is also well suited to computing ultrasound images. I suspect you could even do it on a CPU. I'm not sure it's that practical to process on an FPGA if you want non-potato image quality: the FPGA would need to handle tons of data, so you'd either use a gigantic one (at a ridiculous price) or somehow page the data (which is quite complex to do). That seems far more difficult than simply addressing the GiBs available to any modern CPU/GPU.


You don't understand. FPGAs are not used for their computing prowess but for their ability to do highly timing-sensitive parallel data acquisition. At 64 Msps you have <16 ns to read each sample.

Their most important feature is determinism. You can reason about and get an upper bound on how long something takes even while programming them.


Determinism is certainly a huge part of it, but FPGAs also have insane computing prowess - you can create a custom architecture to execute whatever it is you're trying to compute.


Well, kinda. The flexibility comes at a cost. Clock speeds are lower, and for the same logic it's going to be less power efficient than a GPU, CPU, or ASIC. Logic element density will also be lower, so it'd be impossible to reproduce the entirety of a GPU or a modern CPU (huge caches, etc.) on a similar-sized FPGA. I'd also imagine that the physical distances of gate paths are highly optimized in a CPU, such that more gates can be traversed per clock tick there than in an FPGA.

If you have very specific compute pipelines with "colocated" data (there's probably a better word -- basically data directly on the wire, not memory lookups) in parallel, an FPGA can offer very high compute throughput.

But if your algorithms have to do memory lookups, then the memory bus is going to be your bottleneck and most of the FPGA's advantages go out the window. At that point, sure, you've got a few places where the FPGA does a few operations at once, but ultimately you just have an early-1990s-speed processing pipeline with a few hardware-based micro-optimizations. A GPU, with its custom-designed, baked-in logic around memory buses and caches, is going to blow it out of the water in terms of logical efficiency, power draw, heat dissipation, logic element count, and clock speed. Total throughput will be several orders of magnitude greater.
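
A toy roofline model in Python makes the memory-bus point concrete (every number below is assumed for illustration, not measured from real hardware):

    # Attainable throughput is capped by raw compute or by the memory bus,
    # whichever saturates first.
    def attainable_gflops(peak_gflops, mem_bw_gbs, flops_per_byte):
        return min(peak_gflops, mem_bw_gbs * flops_per_byte)

    # A kernel doing 2 FLOPs per byte fetched from external memory:
    print("FPGA + modest DDR:", attainable_gflops(500.0, 12.0, 2.0))     # 24 GFLOPS
    print("GPU + GDDR:      ", attainable_gflops(10000.0, 500.0, 2.0))   # 1000 GFLOPS

Once the kernel is memory-bound, the fat memory system decides the outcome, not the logic fabric.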

So, it depends what you're trying to do. It's dangerous to say FPGA is always badass. Sometimes they are, and sometimes they're not even close.


It's not just about the compute but also the data acquisition. You can connect directly to your ADC without extra glue logic (and in some cases use the one embedded in the FPGA). Also, a GPU is a lot more expensive than a $20 FPGA.


Not really: people have reacted to more capacity being available by adding more channels. You can control a 1980s 20-channel ultrasound using even an Atom CPU, no problem (though it will still get close to max CPU capacity). So what did we do? We made chips; I believe the best one now has 4,000 channels, and there are test models with 8,000 channels. A quad Xeon can't process that. But it gets worse.

Because information is encoded both in the direct signal and in the interference patterns between different probe elements, you want to evaluate both (if nothing else it gets you a wider field of view, but really it gets you more than that; for instance, you can get full 3D from a cross-shaped array). Unfortunately, that scales with the square of the number of channels. This doesn't even work on FPGAs at the moment, so they basically limit interference processing to only 100 or so of the 1,000-2,000 elements they have these days.
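
The quadratic blow-up is easy to see with a quick count of element pairs (Python; channel counts taken from this comment):

    from math import comb

    for n in (100, 1000, 2000, 4000):
        print(n, "channels ->", comb(n, 2), "interference pairs")

    # 100  ->     4,950 pairs
    # 2000 -> 1,999,000 pairs
    # 4000 -> 7,998,000 pairs

Going from 100 to 2,000 channels multiplies the pairwise work by roughly 400x, which is why interference processing gets restricted to a subset.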

This is a problem that keeps coming back. CPUs can do nearly everything in processing, but with one exception (see below) they suck at it.

That means that when doing these things you can use CPUs, and people are always pushing that, if you accept being 2-5 years behind the state of the art for advanced applications, or you accept spending $50 per piece where others can get more processing done for $1, or "much less but still enough" processing done for $0.10 (FPGA, DSP, microcontrollers, ...).

Good luck in the marketplace if you go CPU.

CPUs work where your complexity is above a quite high lower bound, and they stop working once you hit a certain data rate. Below that lower bound they are absurdly expensive; above the data-rate limit they just can't keep up.

Now there are niches where the strengths of CPUs outcompete other approaches. For instance, you want to create stuff to interface with ancient equipment. Lots of that is necessary, and lots of it is custom. For telecom companies, banks, factories, and so on, CPUs, and the fact that you can do one design at a time, are tough to beat. Sure, FPGAs can do it better, but that's not the point here. They're much harder to design, so one-off (or couple-dozen-off) designs use CPUs, plus they're very unlikely to hit the data-rate limits.

Even there, CPUs tend to have a disadvantage. Lots of environments where these things have to operate aren't exactly pristinely clean, cooled, and free of static and interference. CPUs don't deal with that half as well as FPGAs in particular do. You want a closed box that has to operate at 70 degrees, without cooking itself, with constant electrical arcing around it? Don't go for a CPU on a large mainboard. You can't use active cooling, because any cooler will get clogged and destroyed in a matter of weeks at most.


" For instance, you want to create stuff to interface with ancient equipment. Lots of that is necessary, and lots of it is custom. For telecom companies, banks, factories, ... here CPUs and the fact that you can do one at a time are tough to beat. Sure FPGAs can do it better, but that's not the point here."

A guy I worked with had a recurring gig interfacing FPGAs to older telecom equipment: multiplexing multiple gigabit Ethernet channels into one very high-speed channel, which was then fed into a microwave upconverter. On the last one they did, the FPGA put out something like 25 W. I can't imagine doing that with a CPU of any type.


Great information there!

Perhaps this sounds like a use case for streaming a parallel ADC board into GPU memory? Maybe an FPGA could run the ADCs, buffer into a FIFO, and transfer over PCI in large, efficient frames?
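
A quick model of why large frames help (Python; the overhead and bandwidth figures are assumptions for illustration):

    # Per-transfer setup overhead dominates small frames, so batching
    # samples into large frames raises effective throughput.
    overhead_s = 5e-6      # assumed per-transfer overhead (interrupts, descriptors)
    link_bps = 8e9         # assumed usable link bandwidth, bits/s

    for frame_bytes in (4096, 65536, 1 << 20):
        t = overhead_s + frame_bytes * 8 / link_bps
        eff = frame_bytes * 8 / t / 1e9
        print(f"{frame_bytes:>8} B frames -> {eff:.2f} Gbit/s effective")

With 4 KiB frames the fixed overhead costs you nearly half the link; at 1 MiB frames you get close to wire speed.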


Not OP, but FPGAs are good for realtime signal processing, including initial processing needed to filter and downconvert high frequency signals to make them accessible to conventional CPUs.

They are also parallel processing-oriented by nature, so a traditional ultrasound implementation based on an array of sensors is a good fit for an FPGA-based front end.

People tend to use FPGAs for tasks that used to be offloaded to dedicated DSP chips. Besides their processing capabilities, one key reason is that their I/O capabilities can handle just about any data acquisition/interfacing problem. When you need to move a lot of data around in a hurry, there are few solutions better than an FPGA, and certainly no more flexible ones. For mass production, of course, both DSP chips and ASICs tend to be more economical.

Some of the tasks involved in ultrasound could potentially be handled by the PRU units on the BeagleBone Black platform. Not sure if that would work well for this application but could be worth considering.
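
For a flavor of the downconversion step mentioned above, here is a minimal digital downconverter sketch in Python (the 64 Msps rate comes from elsewhere in the thread; the 5 MHz centre frequency, filter parameters, and decimation factor are assumptions):

    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 64e6                                  # ADC sample rate
    f_c = 5e6                                  # assumed probe centre frequency
    t = np.arange(4096) / fs
    rf = np.cos(2 * np.pi * f_c * t)           # stand-in for one received RF line

    iq = rf * np.exp(-2j * np.pi * f_c * t)    # mix to baseband with a complex LO
    lp = firwin(64, 1.0e6, fs=fs)              # low-pass keeps the baseband signal
    baseband = lfilter(lp, 1.0, iq)[::16]      # filter, then decimate by 16

    envelope = np.abs(baseband)                # the envelope becomes one B-mode line

In hardware, the mixer, filter, and decimator each become dedicated parallel logic running at the ADC clock; the CPU only ever sees the 4 Msps baseband stream.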


As in all engineering related choices, there are tradeoffs that make the decision between FPGAs and DSPs:

Similarities:

->Both are specialty items, with specialty dev. environments and proprietary libraries.

->Both are designed to do fast processing of highly parallel data (DSPs are typically VLIW processors, at least in all of the architectures I'm familiar with).

->Both require knowledgeable specialists to really get a whole lot of benefit from them.

Differences:

->DSPs were at one point much cheaper to build into a product, and the sibling comment's mention of the BeagleBone's onboard resources reminds me that they were a big part of TI's business once upon a time. Relatedly, they are often included in SoCs because they're built using the same process node at the fab as the main ARM core. I'm not a silicon expert, so I'll defer to someone else for confirmation of this, but my general understanding is that FPGAs have been built on different transistor designs and such to optimize the LUTs and their routing, making them uneconomical to include in SoCs. Given that both Altera and Xilinx offer such things now, I'm not sure how current that statement is, but there is some amount of engineering tradeoff in the silicon layout itself.

->DSPs are still processors, with a fixed layout, pipeline depth, etc. that all cause them to be optimal for certain sizes/shapes of data. If your application doesn't meet that, you might be better off with an FPGA.

->On the other hand, an FPGA is more easily scaled to weird data sizes... Do you have a 6-bit super-high-speed ADC? You very likely want to use an FPGA.

->FPGAs are pretty much the go-to choice when dealing with really high speed sampling. I'm not sure exactly where the line is drawn, but I've worked on a couple of medical imaging devices during internships that both used FPGAs to acquire the data and do some gentle massaging and packaging before offloading it to something else to do real image analysis, image processing, computer vision, etc.

->A sub-point to that: the resources on an FPGA that do that signal acquisition are pre-built blocks designed by the FPGA manufacturer and built in regular silicon, so they can acquire data at gigabit sampling rates. You can then apply place-and-route modifications to make sure that all processing of that data gets done on the LUT fabric as close to the signal acquisition point as possible to optimize timing, for example. This is the main area that FPGAs shine in.

->FPGAs have another interesting split: if you need to do complex verification of logic design, you might test on an FPGA and then build an ASIC. If, on the other hand, you're only going to produce, say, 10 MRI machines a year at north of $10 million each, it might not be worth building a custom ASIC, so you just stick an ultra high end FPGA in as a daughter board to the rest of your design. (Read this as "cost plays a major factor in picking FPGA vs. DSP")

Closing thoughts:

The other major consideration between the two is who you can hire and what they know. A major portion of wireless devices (wifi routers, femtocells, etc.) uses a DSP to do the heavy lifting on the signal processing after the analog front end, so those companies have a lot of built in history, available example code, training for new engineers and an expectation that older engineers from a competitor will be used to that same technology. Medical device manufacturers, oscilloscope manufacturers, etc. have the opposite expectation since they've been using FPGAs for years. Once you're used to paying a particular license fee to Xilinx for the IP Cores you just dropped into your design, you're unlikely to be able to sell your boss on getting a bunch of custom code written by TI to make their DSP work as well, so you'll be using what the tech stack already calls for.

(edit: formatting)


One thing that might tip the balance in 2018: there's now a growing familiarity and body of open-source knowledge around FPGA signal processing among the SDR community.

There are a lot of well-documented open-source FPGA-based data acquisition boards being sold today at comparatively rock-bottom prices - we just call them SDRs.

The bandwidth requirements for ultrasound would seem to fall easily within the open source state of the art.


I'd be curious if you could point me to some links? Feel free to dm me, I'll stick my email in my profile.


I would definitely be happy to chat with the SDR community to see how we could interface. Ping me?


An amazing portable US is coming soon from Butterfly Network. The transducer plus DSP is on a very large integrated circuit, and it can do breathtaking stuff for a device of this size and cost, like real-time Doppler US (showing blood flow in the image).

https://www.butterflynetwork.com


I'm looking forward to the introduction of this device, but it should be noted that color Doppler is also available on the VScan handheld. I think MEMS sensors are going to revolutionize the industry by making the probe an inexpensive part.


It uses Project IceStorm, which is absolutely great. It's so much better using an open-source toolchain than the proprietary horrors that FPGA vendors inflict on us. I wrote a quick intro here: https://rwmj.wordpress.com/2018/03/17/playing-with-picorv32-...

It's a real shame that larger FPGAs aren't yet supported.


Thanks for sharing this!


Cool project :)

I just got back from Shenzhen, where I purchased a handheld ultrasound probe with a built-in screen, about the size of a really chunky smartphone, for $1800, with WiFi and connection to iOS/Android.


I am very interested in hacking on some ultrasound projects. Do you know if what you have is available for online purchase anywhere?


OP here -- I had played with a wireless probe and developed a very basic Python framework around it; see https://github.com/kelu124/pyUProbe1 .

Same origin: Shenzhen. Very friendly contacts there; always a pleasure to chat with the fabs.


Google Sonostar, and bargain a bit for a good price :)


My understanding is that the probes are the expensive part. The rest is added value, for sure, but there is something of a monopoly on high quality piezo crystals used in the probes. This ensures that the current players don't compete much on price.


As a way to get around this issue, I sense a bit of progress through compressed sensing; see for example http://advances.sciencemag.org/content/3/12/e1701423 . This would enable one (with enough horsepower) to get 4D imaging with a single element. It's very early, but I do plan on using this hardware to test the feasibility of single-piezo-element pseudo-imaging.
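
For the curious, the core recovery step in compressed sensing can be sketched in a few lines of Python (the dimensions, sensing matrix, and solver choice below are illustrative assumptions, not the paper's setup):

    import numpy as np

    # Recover a k-sparse signal x from m << n measurements y = A @ x
    # via ISTA (iterative soft-thresholding).
    rng = np.random.default_rng(0)
    n, m, k = 400, 120, 10
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

    A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
    y = A @ x_true

    lam = 0.05
    L = np.linalg.norm(A, 2) ** 2                  # step size from the Lipschitz bound
    x = np.zeros(n)
    for _ in range(500):
        x = x + A.T @ (y - A @ x) / L              # gradient step on 0.5*||y - Ax||^2
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft-threshold

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

In the single-element scheme, the "measurements" come from a known scrambling of the acoustic field over time, and the reconstruction is a much bigger version of the same sparse inverse problem.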



