I got my undergrad in Engineering Physics from CU-Boulder, talked with many professors about fusion research, and have kept up with developments since then.
Putting aside all the controversy of LENR (low-energy nuclear reactions, the official name for cold fusion) and assuming that the theory actually results in usable tech (for once), the first line of the NASA article hints at where a device's power density would be competitive:
> "A team of NASA researchers seeking a new energy source for deep-space exploration missions"
which tells me that a theoretical device would be a replacement for current RTGs [1]. Low but consistent power for niche applications.
But in general I wouldn't get your hopes up. The higher-energy types of fusion power are far more promising for world-wide civilization-powering clean energy.

[1] https://en.wikipedia.org/wiki/Radioisotope_thermoelectric_ge...
The part that I'm not grasping about this is how you harness the power that is created. I think it's great that they can start a fusion reaction this way. But how does this become practical, in theory? Does the lattice radiate heat once fusion starts or something?
It appears to be the same mechanism as neutronic high-energy fusion: an energetic neutron gets kicked out, collides with some material in the cell (probably the erbium lattice), and generates heat. That heat then has to boil water into steam, which drives a turbine, etc.
I'm much more hopeful for someone creating a Dense Plasma Focus device with aneutronic hydrogen-boron (pB11) fuel because the reaction energy can be directly captured as electricity, instead of having to capture hot neutrons to boil water.
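To make that contrast concrete, here is a rough end-to-end comparison of the two capture chains. Every stage efficiency below is a round-number assumption for illustration (typical Rankine-cycle and direct-conversion design-study ballparks), not a figure from the NASA work or any specific reactor design:

```python
# Illustrative end-to-end comparison of two fusion energy-capture chains.
# Every efficiency here is an assumed, round-number ballpark for the sketch,
# not a measurement from the NASA experiment or any specific reactor design.

neutronic_chain = {
    "neutron energy thermalized in the lattice/blanket": 0.90,
    "steam cycle, heat -> shaft work (Rankine ballpark)": 0.40,
    "generator, shaft -> electricity": 0.98,
}

direct_chain = {
    "charged pB11 products decelerated electrostatically": 0.80,  # often-quoted design-study ballpark
    "power conditioning": 0.95,
}

def chain_efficiency(stages):
    """Multiply the stage efficiencies to get the fraction of fusion energy delivered as electricity."""
    total = 1.0
    for stage_eff in stages.values():
        total *= stage_eff
    return total

print(f"neutronic (boil water):   {chain_efficiency(neutronic_chain):.0%}")  # ~35%
print(f"direct conversion (pB11): {chain_efficiency(direct_chain):.0%}")     # ~76%
```

The exact numbers matter less than the structure: the neutronic chain is capped by the steam cycle in the middle, while direct conversion skips that step entirely.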
Fusion products have particle energies on the order of MeV to tens of MeV (the D-T neutron carries 14.1 MeV, for example). Thermal machines tend to melt or evaporate when their average particle energy gets to the order of 100s of meV, i.e. a few thousand kelvin.
The increase in entropy in that temperature conversion alone is absurdly wasteful.
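To put numbers on that gap, you can convert a particle energy E to an equivalent temperature with T = E / k_B. Nothing below is specific to the article; it is just the Boltzmann constant and representative energies:

```python
# Equivalent temperature of a particle energy: T = E / k_B.
K_B_EV_PER_K = 8.617e-5  # Boltzmann constant, eV per kelvin

def energy_to_temperature_k(energy_ev):
    """Temperature (kelvin) at which the thermal energy scale is ~energy_ev."""
    return energy_ev / K_B_EV_PER_K

for label, energy_ev in [
    ("D-T fusion neutron, 14.1 MeV", 14.1e6),
    ("fusion product, ~1 MeV", 1.0e6),
    ("hot turbine material, ~200 meV", 0.2),
]:
    print(f"{label}: ~{energy_to_temperature_k(energy_ev):.1e} K")
```

The gap between MeV-scale fusion products and meV-scale thermal motion is about nine orders of magnitude, which is where the "billion times" entropy figure below comes from.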
My understanding is that they are talking about bulk heating of a metal carrier, probably by tens to low hundreds of degrees, for power production. Not every particle in the medium is at MeV energies; otherwise it would be at something like ten billion kelvin.
In direct conversion, all of the particles used in the conversion have MeV energies. That bulk heating is exactly the problem.
When you have a very low-entropy source and your first step in using it consists of increasing the entropy by a factor of roughly a billion (the ratio between MeV-scale fusion products and meV-scale thermal motion), you lose a lot of flexibility and efficiency.
Nuclear fusion is pretty much the exact opposite of a low entropy source. In any case, thermal power plants (gas, coal or nuclear) reach up to 50% efficiency with modern gas or steam turbines.
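For context on the "up to 50%" figure: the hard ceiling for any heat engine is the Carnot limit, 1 - T_cold/T_hot. The operating temperatures below are typical textbook values, assumed for illustration:

```python
# Carnot limit for a heat engine: eta_max = 1 - T_cold / T_hot (temperatures in kelvin).
# The plant temperatures are typical textbook ballparks, assumed for this sketch.

def carnot_limit(t_hot_k, t_cold_k):
    return 1.0 - t_cold_k / t_hot_k

plants = {
    "supercritical steam (~873 K hot, ~300 K cold)": (873.0, 300.0),
    "combined-cycle gas turbine (~1700 K hot, ~300 K cold)": (1700.0, 300.0),
}

for name, (t_hot, t_cold) in plants.items():
    print(f"{name}: Carnot limit ~{carnot_limit(t_hot, t_cold):.0%}")
```

Real plants land well under those ceilings (roughly 40% for steam, around 60% for the best combined-cycle gas turbines), which is the range the "up to 50%" figure is drawing on.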