
'Universal memory' research passes new milestone - lelf
https://techxplore.com/news/2020-01-universal-memory-milestone.html
======
bsder
This is probably a little better: "Lancaster University shows how InAs/AlSb
resonant-tunnelling non-volatile memory consumes 100 times less switching
energy than DRAM"
[http://www.semiconductor-today.com/news_items/2020/jan/lanca...](http://www.semiconductor-today.com/news_items/2020/jan/lancaster-170120.shtml)

Sigh, it's a III-V semiconductor. We've been here before--and failed before.
I'll believe it when someone actually coughs up an array of these that can
hold at least a couple of bytes.

However, to be fair, we might actually start getting some progress outside of
silicon now that Moore's Law has broken. If silicon is at a standstill, other
technologies may actually be able to catch up for once.

------
fernly
Original paper (Abstract):
[https://ieeexplore.ieee.org/document/8948343](https://ieeexplore.ieee.org/document/8948343)

I don't have access to the full paper, but I note this sentence of the
abstract: "Simulations show that the device consumes very little power, with
100 times lower switching energy per unit area than DRAM, but with similar
operating speeds."

So perhaps no actual devices exist, only concepts and simulations?

Edit: I am reading the original via sci-hub and yes, all conclusions are based
on (quite elaborate) device models.

~~~
vermilingua
Is this not typically (read: always) the case with new technologies though?
Before any production or engineering could be done, a model must be created.
The fact that a model exists strongly implies that it is physically possible.

~~~
scottlocklin
No, as a matter of fact it's extremely rare a model precedes something
physical that makes it to the corporeal world. Especially in device physics
land.

~~~
icandoit
My PCBs existed in design software before they were manufactured. I expect
that is the more common case today.

I ran my code against a dev database before it went to prod.

~~~
akiselev
The GP isn't talking about products but about the exploitation of physical
phenomena. The first PCB was etched with the ancestor of our modern fab
process long before CAD software existed. And although the hydrogen theory of
acids and bases was discovered a few decades earlier, AFAIK there still isn't
a unified model of acid-metal interactions for different copper alloys, let
alone all metals.

------
reilly3000
I’m skeptical of any ‘best of all worlds’ claim for a nascent technology. In
this case, durability and read/write speed in one device would indeed be
revolutionary, but at what cost? If I were to guess I’d say it would be around
data integrity. If it works using resonance, could neighboring bits be
flipped sympathetically under certain circumstances? Would it have the same
durability guarantees from server to mobile device? In environs with high EM
radiation? Over millions of writes?

If there are indeed no trade-offs, hooray! Even if there are, it will still
have use cases for many scenarios, albeit not ‘universal’ ones.

More generally I wish sci/tech reporters would ask these kinds of open
questions, or lead their readers into doing so. Otherwise it feels like
vaporware as they presented it, even if the work is valuable and credible.

~~~
elfexec
> Otherwise it feels like vaporware as they presented it, even if the work is
> valuable and credible.

Whatever happened to all the graphene hype from 5 years ago? For a while,
graphene was the wonder material that was going to cure cancer, give us fusion
and make us immortal. Everything was supposed to be made of graphene material
by now. And suddenly people/media stopped talking about graphene.

~~~
krastanov
That progress is happening and I see it every day around the academic labs I
work at. There are improvements in manufacturing larger sheets, it is used as
an important step of various nanofabrication protocols, and it inspired a
whole family of more interesting 2D materials.

I suspect the hype died for the same reason it does with any important new
discovery - it is overhyped at the beginning by excited scientists because
they love what they are working on, then the work mostly continues silently
for decades, and then people just don't notice when it becomes part of the
everyday tech we use.

------
layoutIfNeeded
Aren’t we already at the point where a sufficiently large amount of RAM,
backed by a UPS with just enough capacity to save everything to SSD on power
loss, is a sufficiently close approximation of universal memory?
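To make the idea concrete, here is a minimal sketch (my own illustration; the class, file name, and trigger are assumptions, not any real product): the working set lives purely in RAM, and only a UPS low-battery event triggers an atomic snapshot to disk, which is restored on the next boot.

```python
# Sketch of "RAM as primary store, flush to SSD only on power loss".
# VolatileStore and the snapshot path are hypothetical names for illustration.
import json
import os
import tempfile

class VolatileStore:
    def __init__(self, path):
        self.path = path
        self.data = {}            # lives entirely in RAM during normal operation
        if os.path.exists(path):  # restore the last snapshot after a power loss
            with open(path) as f:
                self.data = json.load(f)

    def flush(self):
        # Called when the UPS reports imminent power loss (e.g. from a handler
        # hooked to the UPS daemon). Write atomically: temp file, then rename.
        tmp = self.path + ".tmp"
        with open(tmp, "w") as f:
            json.dump(self.data, f)
        os.replace(tmp, self.path)

path = os.path.join(tempfile.mkdtemp(), "snapshot.json")
store = VolatileStore(path)
store.data["counter"] = 42
store.flush()                     # simulate the UPS low-battery event
restored = VolatileStore(path)    # simulate reboot after power returns
print(restored.data["counter"])   # 42
```

The atomic rename matters: a crash mid-flush leaves the previous snapshot intact, which is the same durability argument the comment relies on.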

~~~
mantap
Yes, and there are few computers sold that don't have either a battery (phones
and laptops) or a UPS and on-site generation (servers). Only desktops are
affected, and they are the smallest segment by far.

What the world needs is cheaper DRAM, not non-volatile DRAM.

~~~
imtringued
DRAM costs less than 4€ per GB. It's extremely cheap right now. I remember
seeing prices above 7€ per GB a few years ago.

~~~
rdsubhas
Explains why new phones have as much RAM (8GB) as MacBook Pro laptops.

------
Nokinside
The largest memory technology breakthrough I can imagine would be the ability
to cost-effectively combine high-density RAM and logic on the same chip.

Bandwidth-intensive operations could happen inside a single RAM chip or
between two of them. The CPU would just send instructions with parameters and
no data.
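A toy sketch of that processing-in-memory idea (my own illustration; `PIMChip` and its opcodes are invented for the example, not any real chip's interface): the "RAM chip" holds the data and runs simple bandwidth-heavy operations locally, so the host ships only an opcode and parameters instead of streaming the data across the bus.

```python
# Hypothetical RAM chip with built-in logic, for illustration only.
class PIMChip:
    def __init__(self, size):
        self.cells = [0] * size  # the chip's own storage

    def write(self, addr, values):
        self.cells[addr:addr + len(values)] = values

    def execute(self, opcode, addr, length, operand=0):
        # The host sends only this small command, never the data itself.
        region = self.cells[addr:addr + length]
        if opcode == "SUM":          # reduction: one number crosses the bus
            return sum(region)
        if opcode == "SCALE":        # in-place multiply: nothing crosses it
            for i in range(addr, addr + length):
                self.cells[i] *= operand
            return None
        raise ValueError(f"unknown opcode {opcode}")

chip = PIMChip(1024)
chip.write(0, list(range(100)))
total = chip.execute("SUM", 0, 100)       # host receives 1 value, not 100
chip.execute("SCALE", 0, 100, operand=2)  # data never leaves the chip
print(total)  # 4950
```

The point of the sketch is the command interface: for a reduction over N cells, bus traffic is O(1) instead of O(N), which is where the bandwidth win would come from.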

~~~
Cthulhu_
I just find it strange that they haven't put the CPU and memory on the same
die - is there a specific reason for that? I mean, over the years more and
more components have been integrated into the CPU: north/southbridge (I forget
what those were for), GPUs, etc. Why not RAM? There's L1/2/3 cache on there as
well.

~~~
rasz
Different, incompatible fab processes (diffusion, temperature) are needed for
logic and for DRAM cells (capacitors).

------
sevencolors
ULTRARAM

Sounds like something decided by committee.

~~~
atoav
Everybody would call it _URAM_ which is kinda okay

~~~
zdw
Which will inevitably be confused with _UDIMM_

