IBM has found a way to store data on a single atom (cnet.com)
122 points by ValG on March 9, 2017 | hide | past | favorite | 84 comments



> a single atom of the element holmium carefully placed on a surface of magnesium oxide. A special-purpose microscope uses a tiny amount of electrical current to flip the atom's orientation one way or the other, corresponding to writing a 1 or 0. The researchers then read the data by measuring the atom's electromagnetic properties.

I'm not sure I could have recalled the existence of the element holmium; I've never heard or read much about it. I looked it up and found the likely reason it was used for this research:

"Holmium has the highest magnetic permeability of any element and therefore is used for the polepieces of the strongest static magnets." https://en.m.wikipedia.org/wiki/Holmium

I don't know if we'll see practical atomic storage or if more than one bit per atom is physically possible, but in theory there's enough space in an atom to hold millions of bits. But I think you have to get to black hole density... https://en.m.wikipedia.org/wiki/Bekenstein_bound


Just note that in practice nobody wants to keep their data-storing atoms frozen near absolute zero; they'd rather have them at temperatures close to 300 K.

However, to store 1 bit of information at a given temperature, the energy difference between the state corresponding to 0 and the state corresponding to 1 has to be no less than something of order kT ≈ 0.02; otherwise the information would quickly be erased by thermal motion. But if we take the maximum energy gap in an atom that might be used for storing information to be upper-bounded by the atom's ionization energy [1], it turns out that it can't be larger than something of order 10 eV. So it doesn't seem possible to store more than hundreds or thousands of bits per atom at room temperature.

[1] https://en.wikipedia.org/wiki/Ionization_energies_of_the_ele...
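The estimate above can be sketched in a few lines. This is a back-of-the-envelope version of the argument, with my own illustrative numbers: each independently writable two-level system needs an energy gap of at least ~kT, and the total energy budget per atom is taken to be ~10 eV (of order the ionization energy).

```python
# Rough upper bound on independently writable kT-gapped bits per atom.
# The 10 eV budget is an assumption of order the ionization energy.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def max_bits_per_atom(temp_kelvin: float, energy_budget_ev: float = 10.0) -> int:
    """Each bit needs an energy gap of ~kT; divide the budget by that gap."""
    kT = K_BOLTZMANN_EV * temp_kelvin
    return int(energy_budget_ev / kT)

print(max_bits_per_atom(300))  # room temperature: a few hundred
print(max_bits_per_atom(4))    # liquid-helium temperature: tens of thousands
```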


This is an interesting point because with the kind of matter density needed to even approach the Bekenstein bound, it seems like achieving near zero temperatures would be increasingly difficult.

Said another way, the Bekenstein bound is a limit based on the amount of information contained not just in a volume, but also with a given amount of energy. IANATP (I am not a Theoretical Physicist) but it seems like, according to the Bekenstein bound, lowering the temperature might reduce the theoretical amount of information available.

Anyway, yeah, the Bekenstein bound is purely theoretical, there is not, and probably never will be a practical demonstration of it.


Could you please expand on your kT ~ 0.02 calculation?


Sorry, it was of course 0.02 eV, not just 0.02 :)

k = 1.38e-23 J/K = 8.6e-5 eV/K, so kT = 0.025 eV for T = 300 K.


of course ;)


> I don't know if we'll see practical atomic storage or if more than one bit per atom is physically possible, but in theory there's enough space in an atom to hold millions of bits.

What will you encode the bits with?


I think he means that physical space could accommodate such a number of bits before a black hole forms, not that we can tame an atom specifically to hold that information.


I thought he meant that you could encode more than just two states. For a very basic analogy: instead of just - and | representing 1 and 0, you could have - \ | / representing 00, 01, 10, 11, etc. Wi-Fi does something similar with signal phase.
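A minimal sketch of that analogy, with nothing taken from any real storage scheme: four distinguishable orientations per "cell" carry two bits each, much like constellation points in QAM.

```python
# Four states per cell -> log2(4) = 2 bits per cell.
# The orientation symbols are purely illustrative.
SYMBOLS = ['-', '\\', '|', '/']

def encode(bits: str) -> str:
    """Pack a bit string (length divisible by 2) into 4-state symbols."""
    return ''.join(SYMBOLS[int(bits[i:i+2], 2)] for i in range(0, len(bits), 2))

def decode(symbols: str) -> str:
    """Recover the original bit string from the symbols."""
    return ''.join(format(SYMBOLS.index(s), '02b') for s in symbols)

assert decode(encode('01101100')) == '01101100'
print(encode('01101100'))  # 4 symbols instead of 8 binary marks
```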


While I could (and do) definitely have fun speculating that more than two states might be possible to represent - I can imagine a bunch of armchair physics possibilities - I didn't mean to suggest anything specific. The Bekenstein bound is only an idea, there's no known physical way to get even close.

Maybe ionizing states, or bonds using multiple kinds of atoms, or use of radioactive elements, maybe something like that could be used to represent multiple states... I'm sure IBM & other labs are pushing to find out as fast as funding permits.


I can imagine a bunch of armchair physics possibilities

normal caveats (not a physicist, chemist, lawyer, etc)

Since atoms are made of multiple components, if you can modify and measure those components individually, then it's at least theoretically possible to encode more than two states per atom. All of the following assumes you want to keep the same atomic number for the duration; if you don't care what type of atom you're storing, then there would obviously be many more than two states.

If it were possible to set and count how many neutrons a particular atom has (i.e., which isotope), then it would be possible to encode more. Even hydrogen has three isotopes, and xenon has nine stable isotopes (and many more unstable ones). The same goes for the number of electrons (i.e., ions).

If there are more properties that could be manipulated for each of those individual components, then it would be possible to have even more states. (ex: electron spin).

For example, with a hydrogen atom and its 3 isotopes, it's theoretically possible to encode 3 states, which is already more than a single bit (log2 3 ≈ 1.58 bits).
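Counting bits from distinguishable states is just a log2; the isotope counts below are standard values, while whether those states could actually be set and read in place is pure speculation.

```python
import math

def bits_from_states(n_states: int) -> float:
    """Information capacity of n distinguishable states, in bits."""
    return math.log2(n_states)

print(bits_from_states(3))      # hydrogen's 3 isotopes: ~1.58 bits
print(bits_from_states(9))      # xenon's 9 stable isotopes: ~3.17 bits
# Combining independent properties multiplies states (adds bits):
print(bits_from_states(3 * 2))  # 3 isotopes x 2 spin states: ~2.58 bits
```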


What about quantum effects? I thought at that scale it would be a qubit.


Things like this were partially why I got interested in electrical engineering & physics. Sadly, 15 years later, my career deviated to financial software, but I still find articles & progressions like these fascinating.


It's not too late if you're willing to take a pay cut.


Unfortunately, I've lost a lot of my EE knowledge due to mental atrophy.


You'd be surprised how quickly it comes back. I recently had to design a simple ~50 node circuit after ~10 years of not doing anything of the sort, and I had it simulated in SPICE and prototyped after maybe 10 hours of work over a few days.


We had to learn SPICE in my ECE program. I found that inexplicable: writing a circuit in SPICE is like using a slide rule. I can derive a circuit on paper (just as I can do math), and I can use an actual circuit analyzer (just as I can use a calculator). Knowing the foundations of simulation programs is justifiable, but using and practicing with them is just excessive. As the circuits got more complex, I "cheated" by using a script to generate netlists, and eventually I just used LTspice because, christ, I have better things to do than type until my fingerprints wear smooth.


Atom storage has been done before by others. This example at the University of Wisconsin-Madison from 2002 comes to mind:

http://www.trnmag.com/Stories/2002/080702/Ultimate_memory_de...


Storage medium is just part of the equation. The other important and hard part is the mechanism required to read and write data to that medium. Would you use a CD as a storage medium when the CD reader/writer is the size of a washing machine?


> Would you use a CD as a storage medium when the CD reader/writer is size of a washing machine?

Yes, according to history, if that's all anyone had. :)

IBM's project might be the ENIAC of molecular storage devices. Only time will tell. Keep in mind your example doesn't go far enough to match past history: we used to have much worse than 600MB / washing machine. We used to have 100 words / warehouse.

"By the end of its operation in 1955, ENIAC contained 17,468 vacuum tubes, 7200 crystal diodes, 1500 relays, 70,000 resistors, 10,000 capacitors and approximately 5,000,000 hand-soldered joints. It weighed more than 30 short tons (27 t), was roughly 2.4m × 0.9m × 30m (8 × 3 × 100 feet) in size, occupied 167m2 (1800 ft2) and consumed 150 kW of electricity."

"In 1953, a 100-word magnetic-core memory built by the Burroughs Corporation was added to ENIAC"

https://en.wikipedia.org/wiki/ENIAC

* EDIT: It'd be more fair to use punch cards as ENIAC's storage mechanism to compare against, and punch cards held a lot more than 100 words. Anyway, still, crazy by today's standards, right?


To be fair, there's a difference between long-term storage and working memory. The magnetic-core memory you're talking about is RAM, not storage. Though it is looking like the two may converge some time in the future, until now "working memory" (RAM) in computers has always been far lower-density than storage, but far faster, for use in computations (and also volatile, unlike storage which is non-volatile). It's the equivalent of comparing your brain's short-term memory (when thinking about a problem) to your hand-written notes. Punch cards are indeed the appropriate comparison.

But to get back to your original point, a washing-machine-sized storage machine is perfectly acceptable if that's all your technology allows. In fact, it'd even be acceptable now, if it allowed you to replace what currently takes a whole data center's worth of hard drives. I'm sure Google would be ecstatic if they could store all of YouTube on a single machine the size of a washing machine.


> The other important and hard part is the mechanism that is required to read and write data to that medium.

The smart thing to do nowadays is to locate processing circuitry with the memory in order to reduce transport and maximize parallelism.


Well, there were minicomputer disk drives: http://i.imgur.com/7VVFXEg.jpg


True, but have you seen some of the bigger tape library robots? :-)


Can we stop measuring data storage in units of 'song'?


I think it's a good unit of comparison. Virtually everyone has listened to a song to completion, and a very large number of people have done so in the past 24 hours, or even hour. It's also much easier to quantify the length of a song. Whereas with books: well, it's been a couple months at least since I've finished a new book. And number of pages in a book is not as meaningful to the human experience as amount of time. And the amount of time spent on a book -- i.e. minute per page -- greatly varies per human.

Sure, according to who you ask, a 3-minute Justin Bieber song contains less "data" than a 3-minute Bob Dylan song, but at least the quantifying of time is consistent among different people (um, relatively speaking).

And sure, 26 million songs is still as hard to comprehend as 26 million books. But again, more people can quantify how much of their life a song takes because most people have more recently consumed a song's worth of information.

The variance in data storage per song (e.g., length, kbps) doesn't span enough orders of magnitude to matter.


Raw studio-quality audio and highly compressed mp3s are multiple orders of magnitude apart.


To insinuate that the average user is using uncompressed audio is being obtuse. The social context for measuring by song came from iPod-like devices, based on the average 1MB per minute of MP3/MP4 audio.

Raw is ~10MB per minute; losslessly compressed is ~5MB.

Everyone using "songs" as a metric is talking about 1MB per minute MP3.


And punk rock songs and prog songs can be an order of magnitude apart in length.


When the average reader has to store/port songs in bulk, which of those formats are they using?


An unknown quantity somewhere in the middle.


A “song” is about 4-10MB. Good enough for a colloquial point of reference.


Why is the choice books or songs?

There was a time something of the sort would be needed for the popular audience, but contemporary 'normal people' understand the usual measures of digital storage well.

People don't choose between the '3,000 song' & '6,000 song' iPhone variations - even though Apple previously offered this comparison much more prominently for its iPod range. They choose 8/16/32/64/128 GB (the modern public is even catching up with us in having the powers of 2 memorised!) or whatever is the current lineup.


That's probably got more to do with the prevalence of streaming services. Very few people keep all their music on their portable devices anymore.


It doesn't matter what the cause is. I'm saying that, as a result, the general populace understands gigabytes.


I don't think they do, because family members ask me if they need to replace a 500GB hard drive with a bigger one when it's all Excel and Quicken.


I personally prefer the more correct "Library of Congress" units of measurement.


Why? For those who know how big a song is, you can easily translate. For those who have no idea but roughly know how many songs their music players can hold, it is a far more meaningful measure than units of 'bits'.


Are we talking 2-minute punk songs, 4-minute pop songs, or 20-minute jazz-prog odysseys? And is this CD quality wav, or 128 kbps mp3?


4 minutes. A typical length for a song.


And for those like me, who don't know either of those things, can you please translate into bytes?


about 5 megs.


So 130TB.

Was it really easier for them to say 26 million songs?
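For what it's worth, the arithmetic checks out, assuming the ~5 MB per song figure from the sibling comment:

```python
# 26 million songs at ~5 MB each, converted to terabytes (decimal units).
songs = 26_000_000
mb_per_song = 5
total_tb = songs * mb_per_song / 1_000_000  # MB -> TB
print(total_tb)  # 130.0
```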


Not easier, just more meaningful to more people.


With IBM's technique, you could record everything you have said since birth and until the end of your life onto the same area, Big Blue said.


Yes, keep the old trend and standardize on a certain symphony.


I like the idea of storing information using the two stable isotopes of carbon in a diamond. It would last a long time and be extremely compact.

Has anyone looked into that?


Push something other than carbon there and it will be much easier to read¹, without sacrificing any reasonable² amount of shelf life.

1 - If you do it with a surface, there's commercial tech available for reading it. But you'll still have to develop the entire writing stack.

2 - For a "my info is secure for N times longer than the Universe will take to get into heat death" you'll get a smaller N.


How would you read that information?


I guess that's what they have to work out.

You could do graphene instead and then just read it out row by row?


How is the single atom isolated? And how can you make sure there's only one?


The atoms are manipulated by a scanning tunneling microscope [1], which allows you to both image and manipulate single atoms, as shown in the movie "A Boy and His Atom" [2], also made by IBM. You can make sure there's only one by just taking an image at a resolution high enough to see single atoms.

This is fundamentally a scanning technique. A very sharp tip, down to a few atoms at the point, sometimes capped with a single carbon nanotube, is scanned across the surface of whatever sample you have, which, for a measurement like this, must be almost atomically flat. A bias is applied between the sample and the tip, and quantum tunnelling allows electrons to move between the sample and the tip. This current can then be measured and correlated with sample height or electronic properties of the sample. If your scanning step size is smaller than the size of an atom, you can then image single atoms by detecting the change in current due to a different species of atom, or due to the change in height between your flat surface and tip when an atom is sticking out of the top of the surface.
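The reason the current is such a sensitive probe of height: it falls off exponentially with the tip-sample gap, roughly I ∝ exp(-2κd). A quick sketch with a typical ~4 eV work function (an illustrative value, not from the article):

```python
import math

# Tunnelling current sensitivity to tip-sample distance, I ~ exp(-2*kappa*d),
# with kappa = sqrt(2*m*phi)/hbar. Work function phi = 4 eV is an assumption.
HBAR = 1.0546e-34       # J*s
M_ELECTRON = 9.109e-31  # kg
EV = 1.602e-19          # J

def current_ratio(delta_d_angstrom: float, work_function_ev: float = 4.0) -> float:
    """Factor by which the current drops when the gap grows by delta_d."""
    kappa = math.sqrt(2 * M_ELECTRON * work_function_ev * EV) / HBAR  # 1/m
    return math.exp(-2 * kappa * delta_d_angstrom * 1e-10)

# Moving the tip away by one angstrom (roughly one atomic radius) cuts
# the current by close to an order of magnitude:
print(current_ratio(1.0))
```

That exponential dependence is what lets the tip resolve the single-atom height changes described above.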

To manipulate the atoms, the tip is moved close enough to an adatom that it begins to form a weak bond with the tip. The tip then can move and essentially drag the adatom with it to wherever the researchers want. [3]

[1] https://en.wikipedia.org/wiki/Scanning_tunneling_microscope

[2] https://en.wikipedia.org/wiki/A_Boy_and_His_Atom

[3] https://www.nist.gov/programs-projects/atom-manipulation-sca...


If there's an atom there, it's a 1?


Magnetic orientation, probably. Elsethread, it was mentioned that they used Holmium, which is the most magnetically reactive element.


There's plenty of room at the bottom, after all...


Upvote for Feynman, always


Maybe a few decades from now, our children will look at photos of 10MB HDDs, 8TB HDDs, then 8PiB HDDs, and smile at how technology evolves.


As a child (showing my age here) I remember getting a 20MB HDD for our desktop machine (might have been an Olivetti with a 286 CPU, I don't remember). At the time, that seemed an incredible amount of storage - "how will I ever fill that", I thought!


A couple of years ago a Japanese team used an iodine atom to compute a Fourier transform. The computation was faster than a computer, but the ETL was super slow.

IBM has been pushing stuff around with a tunneling microscope for decades.

It's cool but the press should report the transfer rate.


This very much reminds me of https://www.youtube.com/watch?v=oSCX78-8-q0 which they did a couple of years ago


Finally we can test the limit of ZFS. /s


Only "/s" as this remains a "Boil the oceans" problem ;)

https://blogs.oracle.com/bonwick/entry/128_bit_storage_are_y...
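Just to put a number on it (my arithmetic, not from the linked post), treating the 128-bit limit as 2^128 addressable bytes and measuring it in hypothetical 800-petabyte disks:

```python
# Scale of the ZFS 128-bit limit versus a hypothetical 800 PB disk.
ZFS_LIMIT_BYTES = 2 ** 128      # 128-bit limit taken as 2^128 bytes
DISK_BYTES = 800 * 10 ** 15     # one 800-petabyte disk

disks_needed = ZFS_LIMIT_BYTES // DISK_BYTES
print(f"{ZFS_LIMIT_BYTES:.3e} bytes, or about {disks_needed:.2e} such disks")
```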


800 Petabyte Hard Disks for sale?


Is that like a theoretical limit, or is it just a big number?


the atom was then given 30 days to move to a "core" molecule.


This website is a piece of shit. It autostarts a second video off screen even after I swatted the first one that popped on the right, covering the text of the article.


Is a piece better or worse than a pile, I'm not familiar with the unit conversions.


Yet they can't figure out a way to support remote employees.


Would you please stop posting unsubstantive comments to HN?


I was a team lead for an IBM operations team for 6 months in 2016. Aside from my first two days, I worked from home the entire duration of the role, based in Sydney. Most of my staff were in China, with handoff to the next shift in Europe (who were all remote staff as well).


Why do people do this?

News: "A does X."

You: "They don't do Y."

What is the thought process involved in this behavior?


The thought process is that someone gets badly burned by Y, so the association with A becomes very strong - stronger than any other association. Now every time someone mentions A, they bring up Y.


I think it's this:

News: A does X.

Thought process: A

Thought process: Pop stack about A and post it.


This has really become common while my Facebook friends play politics.


I'm pretty sure you can't work from home at IBM anymore, which is a bummer in today's world.


You definitely can - I know that many of them still do. However, I'm not sure how many people do this 100% of the time.


Nope, they recently changed their policy: your manager has to go through tons of paperwork for you to work from home even one day a month! One of my really good friends works for IBM and told me about it.


Your friend is misinformed. The marketing department is the only group that did this. It is not even close to a blanket policy right now.


IBM's policy seems to have changed recently. https://www.theregister.co.uk/2017/02/08/ibm_no_more_telecom...


From your article: "IBM has spent the past couple of years undertaking a massive turnaround effort to transition from its servers and services business model to one focused more on cloud, security, analytics, and mobile. That turnaround has brought with it thousands of job cuts."

While it's obvious that IBM is trying to cross the same CASM (cloud, security, analytics and mobile[1]) as their "West Coast competitors", I'm glad that IBM is still investing in basic research as per the OP submission.

[1] https://dupress.deloitte.com/dup-us-en/deloitte-review/issue...


People I know there work from home and love it.


Bitter much?


Yeah, actually. I am.




