"New?" Things like RFID tags have been doing so for several decades, and prominent companies like Analog and TI have a whole range of energy-harvesting ICs.
I too was thinking that based on the title, but it looks like the goal is a little different and indeed quite interesting:
The idea here is to harvest energy from ambient RF noise, not from a pre-provided, relatively powerful signal like the one that powers RFID tags.
Based on what the article says, the power level of this ambient energy is too low to be harvested by current technology, except the one presented in this article.
I think the titles are confusing, but your links, for example, are not the same thing as the innovation in the article.
For Nexperia, if you read the datasheet, it is in fact a module that harvests energy from photovoltaic cells.
For e-peas, this is what the datasheet says:
"RF input power from -18.5 dBm up to 10 dBm (typical)".
So this is just the typical energy harvesting from an incoming signal.
In the original article, they say that their new technology allows harvesting energy below -20 dBm, which was impossible until now.
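For a sense of scale, dBm is a logarithmic scale referenced to 1 mW, so these thresholds translate to tiny absolute powers. A quick sketch of the conversion (plain Python; the specific comparison points are just the figures quoted above):

```python
def dbm_to_watts(dbm: float) -> float:
    """Convert power in dBm (dB referenced to 1 mW) to watts."""
    return 10 ** (dbm / 10) / 1000

# -20 dBm, the article's threshold, is only 10 microwatts:
print(dbm_to_watts(-20))    # 1e-05 W = 10 uW
# The e-peas floor of -18.5 dBm is about 14 uW:
print(dbm_to_watts(-18.5))
```

So the claimed improvement is about rectifying power levels in the single-digit-microwatt range, where diode threshold voltages normally get in the way.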
I've been looking to swap in a rechargeable battery for a 3-Button Mini Remote Control (LiftMaster 890MAX). The remote control's case is 60 mm x 35 mm, the device pulls 3V at 225mAh, and I'd like to replace the 3V CR2032 with an RJD2032C1 rechargeable coin cell battery from CDE.
The idea is to affix a thin, lightweight, high-efficiency 3V - 5V solar cell to the back of the case that can trickle-charge the battery, such as the SM141K06TFV from ANYSOLAR ($10.98, 184 mW, 58.6 mA, 4.15 V, 45 mm x 22 mm x 1.5 mm).
Ideally, the remote control would be battery-free, but not having to swap the battery more than once a decade would work. Any ideas on how to accomplish this?
3 V / 225 mAh is not what it pulls; that's the battery rating. "Pulls" refers to the current at a voltage: for example, you might say it pulls 100 uA at 3 V. So the first step is to measure what it actually pulls. Ideally, profile it over time and in different scenarios. Or record how long it takes to drain the battery, measure the battery voltage under a small but nonzero load, and approximate how much energy was drained versus time to get watts consumed. Then you can work out what kind of solar cell and charge controller to pair it with. I'm unsure whether MPPT would be needed here; you likely aren't going for the absolute maximum power efficiency.
I love energy harvesting; it's one of my favorite topics in embedded land.
I've been toying with a chip that harvests power from an NFC phone nearby, and it's super neat to have a microcontroller just do its thing with no directly attached power supply.
Isn't that effectively wireless power, thus inefficient, and not purpose-designed wireless power, thus even less efficient?
It's interesting regardless; I'm just trying to understand some of its potential (and maybe these things can be overcome or become irrelevant for certain uses).
Oh there’s almost certainly a ton of loss, that said it’s definitely still super fun to play with. And makes me excited for better advancements in that field.
The chip I’m playing with is the NAC1080 - supposedly designed for small lock motors but I’m using it to update an eink display on tap
Yeah, there are plenty of use cases for which overall energy efficiency doesn't matter, because the actual energy demands are so low.
If you can run a microcontroller on a fraction of a milliwatt (which is entirely feasible) then it might not matter that you're wasting another 10 milliwatts on wireless power delivery. Depends what your energy source is.
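As a back-of-the-envelope sketch of that trade-off (all numbers below are hypothetical, not from any specific chip):

```python
def avg_power_mw(active_mw: float, active_s: float,
                 sleep_mw: float, period_s: float) -> float:
    """Average power of a duty-cycled device: one active burst per
    period, sleeping the rest of the time."""
    return (active_mw * active_s + sleep_mw * (period_s - active_s)) / period_s

# Hypothetical MCU: 10 mW active for 5 ms once per second, 5 uW asleep.
print(avg_power_mw(10.0, 0.005, 0.005, 1.0))  # ~0.055 mW average
```

At a ~55 uW average budget, even a wireless link wasting 90% of the delivered power only needs to supply about half a milliwatt, which is why end-to-end efficiency can be a non-issue here.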
Cold-blooded animals generally use less energy than warm-blooded animals because they use environmental heat instead of producing their own. So in a way, that's one possible method of making something more energy efficient.
How much ambient power is typically available in different environments? I skimmed the article but didn't catch any clear information.
Also, if I collect ambient wifi power, for example, instead of reflecting wifi - to whatever extent I do that - I create a sort of black hole. How does that affect availability of the wifi signal for its intended use?
RF is cube root, I believe - nope, inverse square law, sorry - from an isotropic radiator. So whatever you'd capture of your WLAN AP's 1000 mW at 1 meter is down by a factor of 100 at 10 meters.
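A sketch of the free-space falloff using the Friis transmission equation (the 2.4 GHz frequency and unity-gain antennas are assumptions for illustration, not from the thread):

```python
import math

def friis_received_dbm(pt_dbm: float, freq_hz: float, dist_m: float,
                       gt_dbi: float = 0.0, gr_dbi: float = 0.0) -> float:
    """Received power in free space per the Friis transmission equation:
    transmit power plus antenna gains minus free-space path loss."""
    wavelength = 299_792_458.0 / freq_hz
    fspl_db = 20 * math.log10(4 * math.pi * dist_m / wavelength)
    return pt_dbm + gt_dbi + gr_dbi - fspl_db

# A 30 dBm (1 W) AP at 2.4 GHz, isotropic antennas on both ends:
for d in (1, 10, 100):
    print(d, "m:", round(friis_received_dbm(30, 2.4e9, d), 1), "dBm")
```

Each 10x in distance costs 20 dB, i.e. a 100x drop in received power, so a harvester a few rooms away from an AP is already down in the microwatt range.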
Lightning exists, and that's on the scale of 64 gigawatts of power. I have a differential lightning detector that can "hear" lightning across roughly the entire hemisphere; I had to make the antennas myself. Blitzortung System Blue.
Clear-channel broadcast stations (the FCC term, not the company 'ClearChannel') must output 10 kW to 50 kW, no more and no less.
If you're within a dozen or two miles of one that is beam-formed, you can pick it up with basically any semiconductor, a wire, and a speaker. Like, a resistor will work.
edit: I can probably do a demonstration video with a VNA/antenna analyzer and an HT radio, if such a thing doesn't already exist. But the "voltages" that radios can detect go as low as, well, I'll quote one of my radio's specs:
> Sensitivity: -140.0 dBm (0.02 µV / 50 ohms at 15MHz) MDS Typ. at 500Hz bandwidth in HF
> Sensitivity: -141.5 dBm MDS Typ. at 500 Hz bandwidth in FM Broadcast Band (64 – 118 MHz)
> Sensitivity: -141.0 dBm MDS Typ. at 500 Hz bandwidth in VHF Aviation Band (118 – 260 MHz)
I was only able to test that radio down to -136 dBm or so. I forget why; I think the service monitor I was using was really old and probably needed some new caps and a decent interior cleaning. But the radio's specs claim it can detect two hundredths of a microvolt in specific circumstances (like detecting a very faint beacon signal).
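The dBm-to-microvolts relationship in those specs is just P = V^2/R across a matched load; a quick check (50 ohms assumed, as in the quoted spec):

```python
import math

def dbm_to_microvolts(dbm: float, impedance_ohms: float = 50.0) -> float:
    """RMS voltage across a matched load for a given power level in dBm."""
    watts = 10 ** (dbm / 10) / 1000
    return math.sqrt(watts * impedance_ohms) * 1e6

print(round(dbm_to_microvolts(-140), 3))  # 0.022 uV, matching the quoted spec
```

Detecting a signal at those levels is a very different problem from harvesting usable energy from it, though; -140 dBm is a hundred attowatts.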
Broadcast AM stations sometimes have to pay other towers in the area to install a special ground match unit on their tower legs, that makes the non-broadcast tower "invisible" - otherwise there'd be a null in that direction. Like a cardioid shape.
The main advantage of a phased array isn't related to efficiency, it's that it can be "pointed" a different direction purely in software.
The ability of any source to focus electromagnetic power comes down to one question: how many wavelengths across is the antenna?
This applies to receiving as well as transmitting, which is also why radio telescopes need to be physically so much larger than optical telescopes, but for beamed power you don't care about the receiver's ability to focus, so the size of the receiver is determined by the smallest region the transmitter can focus on.
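That "wavelengths across" rule sets a diffraction limit on how tightly the transmitter can focus; a rough sketch (the 2.4 GHz dish example is an assumption for illustration):

```python
def focus_spot_m(wavelength_m: float, dist_m: float, aperture_m: float) -> float:
    """Rough diffraction-limited spot size at a given distance:
    about wavelength * distance / aperture (small-angle approximation)."""
    return wavelength_m * dist_m / aperture_m

# A 1 m dish at 2.4 GHz (wavelength ~0.125 m) can't focus tighter than
# ~12.5 m across at 100 m range, so the receivable spot is that large:
print(focus_spot_m(0.125, 100, 1.0))  # 12.5
```

This is why beamed power at radio frequencies either needs a physically huge transmit aperture or spreads its energy over a region far larger than any practical receiver.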
This is the typical academic research institution press release. The primary purpose is attracting industry interest in licensing intellectual property, not disclosing information the general public would be interested to know. These are written by university/research-institution intellectual property licensing groups and tend to be rather frustrating from the perspective of anyone who actually wants to know enough to assess likely real-world impact or probable time frames for widely available productization. The telltale phrase was in the last paragraph:
> "The researchers also aim to collaborate with industry and academic partners for the advancement of self-sustained smart systems based on on-chip SR rectifiers."
It's basically institutional research click-bait: a tease to attract interest from companies. Such licensing is a huge business for universities, potentially billions of dollars over time and across thousands of licenses. Those responsible for generating that revenue are essentially marketers, salespeople and IP/contract lawyers. The marketers interview the scientists and then write these releases. Their goal is to make the headline claims as expansive and exciting as possible so that the release gets picked up by the press and amplified on social media, because it's free advertising. However, they're careful to disclose as little as possible about the trade-offs, constraints or limitations, as that might reduce media interest or discourage a potential licensee from contacting them. So they strive for "maximum claim, minimum detailed information" while still maintaining a vaguely plausible similarity to the informative release you and I (and probably the researchers) want.
The best chance you have of getting relevant info is to look up the lead researcher's name to see if they've published a paper. To me, the best case is when a fellow HN reader who is in the field responds to the post with a summary of recent related research and their personal assessment of the actual state of real-world applicability, trade-offs and cost-effectiveness.
It sounds like you're assuming that RF energy harvesting is theoretical but in practice an unattainable dream. Samsung has for several years been selling a TV remote control that uses RF energy harvesting. It's real; it works outside the lab for actually-useful stuff.
In light of that, this news release has a reasonable amount of information about how their research improves on the prior state of the art. They shouldn't have to expend any effort in a release such as this to convince users that a technology that has already gone mainstream is real and useful and worth improving upon.
I've worked on many products which had hoped to incorporate it, but ultimately didn't.
It does work well for applications like you highlight, which require very, very low power levels for very intermittent use - things like a desk calculator or an IR remote. As soon as you need any kind of "real" power for anything other than intermittent use, it becomes a problem. You can only beam so much power without cooking everything in between, and usually that amount of power is less than we need.
Right, the applicability of RF energy harvesting is fairly narrow, and will still be narrow even with this technology. But this news release didn't seem to be making any sensationalized claims about being able to power something like a phone this way.