5D and 13.8 billion years? Phony. I know no sane scientist or engineer who would say they use more than 3D physics. Sounds like catchphrases to sell. By that logic a plain old HDD is at least 4D, because it uses CHS coordinates and magnetic orientation to store data.
To me, such a communication style undermines the real science behind it. Do they have nothing better to brag about than excuses to call the invention 5D? Why introduce such noise?
Turns out the biggest decay factor is nanograting decay, and the biggest contributing physical quantity is temperature. They plotted what the decay would look like on an Arrhenius plot. They both computed estimates and measured to confirm: the measurements are quite accurate. The specific claim is that they computed it would last 13.8 Gy at a reasonably high temperature (462 K). They could've picked any other point on the time/temperature scale, like "here's how long it would last at room temperature" or "here's how hot you could store it if you only cared about it for a billion years".
They did not, however, simply make up a number with no justification, let alone commit straight-up academic fraud, as you're implying.
The paper is here: https://www.researchgate.net/publication/297892219_Eternal_5....
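The Arrhenius extrapolation behind that claim is easy to sketch. A minimal sketch in Python, using an illustrative activation energy and prefactor (hypothetical values for demonstration, not the paper's fitted parameters):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def lifetime_years(temp_k, e_a_ev, tau0_years):
    """Arrhenius model: lifetime shrinks exponentially as temperature rises."""
    return tau0_years * math.exp(e_a_ev / (K_B * temp_k))

# Illustrative parameters only, NOT the paper's fitted values:
E_A = 2.0    # activation energy (eV), hypothetical
TAU0 = 1e-9  # attempt-time prefactor (years), hypothetical

print(f"462 K: {lifetime_years(462.0, E_A, TAU0):.1e} years")
print(f"293 K: {lifetime_years(293.0, E_A, TAU0):.1e} years")
```

The point of an Arrhenius plot is exactly this trade-off: pick any temperature and the model hands you a lifetime, and 462 K is simply the point where their fitted curve crosses 13.8 Gy.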
Choosing a temperature in order to say "this will last as long as the universe's current age", or finding ways to count extra dimensions is the behaviour of marketeers, not academics.
(I'm not quoting the other authors, who are of course also distinguished scholars :-))
A more useful number would be MTBF or a decay rate.
Right, the concept of storing data in crystal-like structures is not new, and there are good technical reasons to believe the technology is a goer. (See my more general comment below giving reasons.)
BTW, this is a truly exciting development, let's hope the practical implementation doesn't stall.
And it looks like that is the idea behind this storage medium. I haven't read the paper though, so I can't really comment on whether or not it is justified to call it 5D.
Also note that 4+ dimensions are relevant even in a 3D world. For example the Lytro camera really outputs 4D light fields (2D position + 2D angle). Of course, there is nothing extradimensional about that camera, it is just a microlens network in front of a regular CCD, with processing that converts a high resolution 2D image into a low resolution 4D light field.
There is a common conceit to treat a vector of possible properties as one (small) dimension. For example, we can make your light field hold an RGB value at each coordinate. We can alternatively interpret this as a 5D field of 3D luminance vectors or a 6D field of luminance scalars.
However, if someone writes out a set of these light-field measurements as individual measurements, they might append the coordinate vector and the properties vector into one longer point (x, y, z, a1, a2, r, g, b) and claim it is 8D. However, that perspective ignores that a real light field cannot store more than one color combination at a given (x, y, z, a1, a2) coordinate. An arbitrary 8D point cloud can represent nonsensical, sparse superpositions of multiple light fields.
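That bookkeeping can be made concrete. A toy sketch in pure Python (sizes are arbitrary): a tiny (x, y, z, a1, a2) light field holding one RGB triple per coordinate can be counted either as a 5D field of 3-vectors or as a 6D scalar field, and the one-color-per-coordinate constraint is exactly what an arbitrary 8D point cloud loses.

```python
from itertools import product

# Toy light field: 2 samples along each of x, y, z, a1, a2; RGB at each point.
SHAPE = (2, 2, 2, 2, 2)
field = {coord: (0.1, 0.2, 0.3)  # exactly one RGB triple per coordinate
         for coord in product(*(range(n) for n in SHAPE))}

vectors = len(field)                               # 5D field of 3-vectors: 32 coordinates
scalars = sum(len(rgb) for rgb in field.values())  # 6D scalar field: 96 scalars

# A dict cannot hold two colors at one coordinate; a raw 8D point cloud could,
# which is why "8D" overstates the degrees of freedom of a physical light field.
print(vectors, scalars)  # 32 96
```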
In ML, we regularly accept 100+d data (much higher for imaging applications). Even simple mechanical systems have large state vectors.
I’m a regular critic of university systems but this isn’t the university making stuff up because it sounds impressive.
(Additional evidence: one of the authors is a Microsoft employee, not an academic employed by the University of Southampton.)
When the number of dimensions is so close to 3, and the subject of discussion is ways to physically store data, this appears misleading.
(I work in the same lab but on different things)
“We have not (yet) built a full storage system and are currently building out (multiple) prototype read and write heads.”
“the prototype decoder achieves an accuracy of 99.47% across ... voxels written at two micron lateral separation, over 10 layers.”
“we anticipate that in a volume equivalent to a DVD-disk we can write about 1 TB. The technology can potentially get to 360 TB”
Dimensions are the conventional term when talking about aspects of data. It doesn't just have to be spatial data, although that's the common usage of dimensions.
If you're using the 3D position plus two additional dimensions (the size and rotation of the object at that position), you're literally processing 5 measurable dimensions of input to extract data.
It's not like they're claiming to record data in time and a wibbly wobbly 5th dimension. It's a literal mathematical dimension describing the properties of the recording medium and method.
It's completely common for scientists to use more than 3D, since D does not have to mean spatial dimensions. Relativity is 4D (space+time), M-Theory is 11 or 26 or 10 depending on context.
Even classical mechanics is formulated in 6D (3 position, 3 momentum). This phase-space view leads to the Hamiltonian and Lagrangian formulations, cornerstones on which all of modern physics is built.
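The 6D phase-space picture is easy to make concrete; a minimal sketch (a harmonic oscillator chosen purely for illustration):

```python
# State of one particle: a point in 6D phase space (3 position + 3 momentum).
q = (1.0, 0.0, 0.0)  # position components
p = (0.0, 2.0, 0.0)  # momentum components
state = q + p        # one 6D point fully determines the future evolution

def hamiltonian(q, p, m=1.0, k=4.0):
    """Total energy of a 3D harmonic oscillator: H = |p|^2 / 2m + k|q|^2 / 2."""
    kinetic = sum(pi ** 2 for pi in p) / (2 * m)
    potential = k * sum(qi ** 2 for qi in q) / 2
    return kinetic + potential

print(len(state), hamiltonian(q, p))  # 6 4.0
```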
As to light, it's easy to find multiple dimensions in which to store light patterns: 3D space, 1D polarization, and 1D frequency, so now we have 5 independent dimensions in which to store data.
5D, if they use 5 independent degrees of freedom to store data, does not undermine the science - it makes it precise.
The title of the paper is "5D Data Storage by Ultrafast Laser Writing in Glass" and is an invited paper at a top conference in their field. I suspect the '5D' being in the title means it is not marketing hype but actual physics.
EDIT: Here is the paper. They do use 5 degrees of freedom, as I expected: 3 of position, plus slow-axis orientation and retardance. So it most certainly is 5D storage.
data = read_bit(x, y, z, yaw, size)
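Another way to flesh that pseudocode out: treat orientation and size as measured outputs that decode to bits, like multi-level cells in flash. A hypothetical sketch, with the level values and names invented for illustration:

```python
# Hypothetical discrete levels, invented for illustration:
ORIENT_LEVELS = [0, 45, 90, 135]  # 4 slow-axis orientations -> 2 bits
SIZE_LEVELS = [1, 2]              # 2 size/retardance levels  -> 1 bit

def decode(yaw, size):
    """Turn one measured (orientation, size) pair into a 3-bit symbol."""
    return ORIENT_LEVELS.index(yaw) * len(SIZE_LEVELS) + SIZE_LEVELS.index(size)

# Each (x, y, z) voxel therefore yields 3 bits per read:
print(decode(0, 1), decode(90, 2), decode(135, 2))  # 0 5 7
```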
But I agree, I would appreciate more information if they're going to use buzzwords.
"In mathematics, the dimension of an object is, roughly speaking, the number of degrees of freedom of a point that moves on this object."
If you have 3 spatial dimensions x, y, z, then you can have 3 degrees of freedom by translation and scaling along those dimensions. You can also have another 3 degrees of freedom by rotation around planes x^y, x^z, and y^z. I'm not sure what degrees of freedom are represented by the dimensionless scalar and pseudoscalar x^y^z, but probably one of them is mana points, and the other is hit points.
Two spatial dimensions give you two translation/scaling axes (x, y), one rotation (x^y), and one stamina points (scalar).
A 4x4 transformation matrix has 16 values, suggesting 4 dimensions, but it doesn't have 16 degrees of freedom, because several values are constrained. Nine values have three degrees of freedom, three other values have three more, and four are fixed, with zero degrees of freedom. Those six degrees of freedom suggest three dimensions, where two of the potential degrees of freedom are ignored.
So a higher-dimensional mathematical space can embed a lower-dimensional model, if the constraints are clever enough, and map the degrees of freedom onto the right dimensions or combination of dimensions. This might be useful for generating smoother animations using linear interpolation, or avoiding gimbal lock, or attempting to unify gravitation with electronuclear forces, or something else.
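That constraint counting can be written out directly. A sketch assuming a rigid transform (rotation plus translation only): 16 matrix entries collapse to 6 free parameters.

```python
import math

def rigid_transform(rx, ry, rz, tx, ty, tz):
    """4x4 rigid transform from 6 parameters: 3 rotation angles, 3 translations.

    16 entries, but only 6 degrees of freedom: the 3x3 block is constrained
    to be a rotation, and the bottom row is fixed at (0, 0, 0, 1).
    """
    cx, sx = math.cos(rx), math.sin(rx)
    cy, sy = math.cos(ry), math.sin(ry)
    cz, sz = math.cos(rz), math.sin(rz)
    # Composition order Rz * Ry * Rx (one common convention)
    return [
        [cz * cy, cz * sy * sx - sz * cx, cz * sy * cx + sz * sx, tx],
        [sz * cy, sz * sy * sx + cz * cx, sz * sy * cx - cz * sx, ty],
        [-sy,     cy * sx,                cy * cx,                tz],
        [0.0,     0.0,                    0.0,                    1.0],
    ]

m = rigid_transform(0.0, 0.0, 0.0, 5.0, 6.0, 7.0)
assert sum(len(row) for row in m) == 16  # 16 entries...
assert m[3] == [0.0, 0.0, 0.0, 1.0]      # ...but the bottom row carries 0 DOF
```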
"Term of art" might be a subjective concept, but the article making it up or not is objective.
That said, the article is heavy on marketing (13.8 billion year claim is not scientific, at least in spirit) and looks like PR department product.
In what sense is this "not scientific"?
What’s special about 462K? Is it a local min/max? Doesn’t seem so.
Given that they had a hypothesis, designed an experiment and validated the result, it seems a bit much to say it "flys[sic] in the face of the foundations of science".
Not on Earth, not in space, once real considerations like impact resistance are factored in.
Neural networks used for AI are represented by N-dimensional spaces of, for example, 1024 dimensions.
People have been storing things in 5D on wood substrate for years. Like clothes in the closet (the 2 extra Ds might be type and color).
The terminology is clearly meant to impress a general audience that would normally not even bother to read a classic "engineering" title.
With 2d I can double the length or width to double the capacity.
With 3d I can double the length width or height to double capacity.
That's why 3d storage is interesting.
A byproduct of public disinvestment in higher ed. Nowadays, you've got to show the taxpayers, and donors, where their money goes. Which is not all bad in my opinion.
Turns out the biggest decay factor is nanograting decay, and the biggest contributing physical quantity is temperature. They plotted what decay would look like on an Arrhenius plot. They both computed estimates and measured to confirm: the measurements are quite accurate. The specific claim is that they computed that at a specific temperature (462 K) the medium will last 13.8 Gy: their point is that it would last as long as the age of the universe at that temperature. They could've picked any other point on the time/temp scale, like "here's how long it would last at room temperature".
They did not, however, simply make up a number with no justification, let alone commit straight-up academic fraud.
The actual paper is here: https://www.researchgate.net/publication/297892219_Eternal_5...
Are they from flatland? If not, how do they calculate motion?
To clarify what I mean when I talk about what I think they mean (yes that sentence is confusing), consider the following example:
The pits on a DVD are subject to one-dimensional analysis because every position targeted by the laser for a read operation can be in one of 2 states (i.e. 2^1)
The pits on a dual-layer DVD are subject to two-dimensional analysis because every position targeted by the laser for a read operation can be in one of four states (i.e. 2^2, because there are two layers and each layer can be in one of 2 states independently of the other)
If I understand the article right, this technology uses 3 physical layers, and adjusts the size and orientation of each air pocket in the glass. So on each individual layer, a dot can be categorized in any one of 8 states (2^3) and there are 3 layers (2^3) of dots. That should give 2^6 possible states per read operation. So why do they say 5 instead of 6? There could be any number of reasons, such as certain adjustments to size and orientation at a higher layer affecting the ability to read from a lower layer, or error correction built into the decoding algorithm.
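The counting in those examples can be sketched directly. A hedged sketch assuming fully independent layers; the gap between this ideal count and the claimed one is presumably where cross-layer interference and error correction live:

```python
import math

def bits_per_read(states_per_position):
    """Information content of one read operation, in bits."""
    return math.log2(states_per_position)

print(bits_per_read(2))       # single-layer DVD pit, 2 states -> 1.0 bit
print(bits_per_read(4))       # dual-layer, 2 independent layers -> 2.0 bits
# 3 layers, each dot in one of 8 states, if fully independent:
print(bits_per_read(8 ** 3))  # -> 9.0 bits per read position
```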
But yeah, that's my lukewarm take on their silly buzzwords.
The 5D in the article is kind of fishy, but in this case size and orientation are more of a low resolution dimension like multi-level cells in a flash drive.
Thanks to lvh, who pushed me to actually look at the paper.
I think your emphatic view is somewhat ill-informed. For years it's been theorized that glass-like crystals could store information indefinitely—even well beyond the age of the universe. Here's the reasoning:
1. The oldest crystals known on earth are already some 4.4 billion years old and they are still intact. They are zircon crystals found in various places across the planet, but the oldest discovered to date were found in Australia.
2. For all intents and purposes these zircon crystals still contain most of their original encoded information from the time they were formed. The reason we know this is that these crystals have remained essentially intact and structurally undamaged over their lifetime of some 4.4 billion years—by definition, if they'd lost all their information they would no longer be intact crystals.
3. Yes, they will have lost a tiny percentage of their information integrity since the time they were formed. This data loss would have been the result of small amounts of internal radiation, external cosmic-ray radiation, heat and other environmental effects; nevertheless, zircon is a very hardy material and the loss has been minimal.
(Essentially, restating the obvious: as zircon has been shown to keep most of its data intact for about one third of the known age of the universe (4.4/13.8 Gy), by extension the storage time mentioned in the paper [13.8 Gy] for this glassy crystalline structure is highly feasible.)
4. If these 4.4 Gy zircon crystals had originally been encoded in ways that allowed lossless data recovery such as Reed-Solomon encoding as used in CDs and DVDs to recover data then today we would be able to recover ALL of the original information from 4.4 Gy ago.
5. We know that under ideal conditions some crystals such as zircon have lifetimes much longer than 4.4 Gy. Why? Well, crystal structures are known to be among the most stable forms of matter in the universe, which goes with the fact that they have the lowest entropy: https://www.livescience.com/50942-third-law-thermodynamics.h... — to quote: “The entropy of a perfect crystal is zero when the temperature of the crystal is equal to absolute zero (0 K).”
6. Theoretically, a perfect crystal with zero entropy in an ideal environment would keep its information for eternity—which is much longer than the age of the universe. Now, as we know zircon and its data/structural information can exist almost intact for at least 4.4 Gy, and a perfect crystal kept under ideal conditions will do so forever, logic dictates that we cannot rule out that crystals can last the age of the universe, which is ≈13.8 Gy.
7. That said however, making crystals that last the age of the universe is one thing but keeping them sufficiently intact to be able to recover all their data after this length of time is another matter altogether. To do so—as mentioned—we would have to encode the crystals with a data recovery algorithm and ensure that they were safe from cosmic radiation etc. (Even with hardy crystals such as zircon we may have to send them to a part of the universe that's very low in ambient cosmic radiation (but we'd have plenty of time to do so given their intrinsically long and hardy life).)
8. The ultimate life of encoded data in near-perfect crystals would be the product of the universe's environment and the degree of how perfect the crystal was (how close to zero its entropy was). A similar problem arises with ancient DNA: no matter how well DNA may be stored against damage, it ultimately succumbs to damage from cosmic radiation, which on current estimates takes about 1 to 2 million years [right, Jurassic Park is not possible, at least not by resurrecting DNA]. Keep in mind, however, that zircon-like crystals are millions of times hardier and more resilient than DNA.
9. As I said, the concept of storing data in stable crystal structures is not new. Seeing the crystal/glass-like photo in this article, I cannot help but be reminded of the remarkable similarity between it and HAL's crystal-like memory in Kubrick and Arthur C. Clarke's 1968 classic science fiction film '2001: A Space Odyssey'. I'm pretty certain that, one way or another, crystal data storage of this type will be the norm before long.
10. Oh, BTW, for those who might worry about the concept of 5D not making much sense, I can assure you it's pretty common these days, it even extends down to what's called 5-axis machine tools: http://www.5-axis.org/
Edit: please stop posting flamebait generally. You've been doing it repeatedly, and we ban that sort of account regardless of its views.
To not use such well known examples on the grounds of a form of political correctness is both absurd and it also denies us our culture and heritage.
I do not believe in the literal truth of the Bible as such, but I certainly do not deny its importance in Western culture. Moreover, the King James Version of 1611 is one of the most remarkable texts ever written [perhaps you should read a few bits of it some time], especially so given that it was written by a committee—committees usually produce lowest common denominators, but that certainly wasn't so for the KJV!
There is nothing inconsistent with what I have said here.
* I was nearly going to say "for Heaven's sake" but I thought better of it. ;-)
What does that tell me about the researchers?
Also, Christianity is the largest religion... https://en.wikipedia.org/wiki/List_of_religious_populations
Sales is an approximation for readership. The second source also claims "figures may incorporate populations of secular/nominal adherents".
So, while I agree that populations that identify with Christianity may be larger, it's pretty obvious that many of those do not observe the rituals their professed religion specifies.
To get a better readership proxy, perhaps, one could use the share in sales of religious texts across the different demographics.
Then we need to also define what "read" means. Does it mean the book was read cover to cover, or that it is read from occasionally? Religious texts are used in very different ways from other forms of literature, to the point that comparison borders on the questionable.
If researchers at KAUST had decided to etch the Quran as a sign of devotion and humility before the unknown, what would have been the problem?
For the decision of whom to honor lies solely with the creator(s), not with some political commissar.
Press releases. Whatta-ya-gonna-do?
Are you suggesting they pulled this out of their backside? The research very much supports that number.
No wonder public discourse is so terrible nowadays, when people are willing to dismiss entire institutions based on an arbitrary image in a press release. But I guess that's just a step or two removed from asking someone to be fired because one time they interviewed or were seen with someone else whose opinions you don't agree with.
Just because they eventually map to three spatial dimensions doesn't mean it's not useful to talk about the different degrees of freedom you've managed to use for storing information.
What were y’all expecting? Spider-Man: Into The Spiderverse?
"WMAP estimated the age of the universe to be 13.772 billion years, with an uncertainty of 59 million years. In 2013, Planck measured the age of the universe at 13.82 billion years."
Turns out the biggest decay factor is nanograting decay, and the biggest contributing physical quantity is temperature. They plotted what decay would look like on an Arrhenius plot. They both computed estimates and measured to confirm: the measurements are quite accurate. The specific claim is that they computed that at a specific temperature (462 K) the medium will last 13.8 Gy -- but their point is that it would last as long as the age of the universe at that temperature, not that they pulled a number out of their backsides.
Most likely, for long-term archival, we need to make copies of the data from time to time. But this technique will likely allow those intervals to be longer than the current age of the universe, and it offers higher density than you can get with techniques like Norsam's planar discs, which is a useful improvement over existing media.
BTW, if you want to go even further back, who else remembers the promises about bubble memory?
No "bad history of claims," just niche technology.
Not everything has to be "fake news" or "phony" just because it doesn't take over the world.
I can't help but think we're cheapening the idea of fraud when we accuse every company or technology of fraud just because they don't wildly succeed.
A medium that really has these kinds of density and survivability traits is indeed a great thing, but "niche" is a bit of an understatement. Adding it to the payload of a multi-million dollar rocket is definitely a publicity stunt, so I think it's entirely fair to point out that it might be good for little else depending on how further development plays out.
Also, such rockets often just use a mass simulator (i.e. block of concrete or metal) on inaugural flights, so there's nothing wrong with giving it a shot.
In that context, describing it as "a publicity stunt" seems short-sighted to the point of self-parody, like a small child who thinks that the main distinguishing feature of money is that you can buy candy with it. In a very short time, it is likely that the only things humanity has done that are even detectable are the launching of satellites, a mass extinction, and the launching of such archival media.
Practicality wasn't the point. Publicity was.
The real issues involved here are the laws of thermodynamics, entropy, and 'glass' (crystals) being the most stable state of matter in the universe. All of these indicate that such longevity is possible (see my main post).
Clearly, the reason that '13.8 Gy' is used here is that it's a well known time interval and it puts the longevity of this technology into perspective in ways that many will understand.
If actually achieved in practical terms then we ought to be hailing this work as a remarkable effort—not quibbling about trivia and silly incidentals.
Since I've been talking about the Rosetta Project and related initiatives for about 15 years, long before Elon Musk was involved, you're also mistaken to assert that "we're [not] talking about it here for any other reason." I mean, I'll take your word that it's true of you, but it's not true of me. Maybe you've got a mouse in your pocket?
Using this exact media was a full test of the technology, to get it out of the lab and into the environment they intend for it long-term.
But I can see you're committed to this narrative, so please don't let me interrupt it with the actual organization responsible for it.
Only in this case you also have to arrange the other objects so they don't occlude the ones behind them.
Imagine trying to lay out a warehouse so you can do inventory without walking through the building. Just binoculars and a gantry crane, or a bicycle to go around the perimeter.
How densely are you gonna be able to pack things, really?
I mean, there is one option I know of. The guy who discovered NiMH battery chemistry had an optical storage format he was pushing where you use one laser to excite phosphors in a 3d matrix and a second to read the state. The bits are all transparent until excited. But I think you're limited by how fast you can activate the cells, so linear scans would be the slowest.
Bubble memory, similarly, does work; it was just too slow to compete with DRAM and not dense enough to compete with spinning rust, except in a few niche applications.
False conclusions hypothetically jumped to by finance-obsessed man-children and brogrammers, in JWZ's immortal phrase, are not my problem. Mere assertion that their perspectives are relevant does not constitute an argument for their relevance. In fact, nobody in this thread has so far suggested storing their replica of the npm archive, their ML training data, or their porn collection on one of these discs, so your concern seems to be entirely without an empirical foundation.
Is it your reading comprehension that's lacking, or just your integrity? I said perspective is relevant. I didn't say their perspective.
More seriously, they don't talk about reading capabilities (retrieval speed etc). And what if it gets scratched? What is the error tolerance? At that density, a single speck of dust could have dramatic implications...
I hope this reaches industrial viability, because we desperately need a digital format that can approximate the lifespan of simple paper. At the moment we are chained to a maintenance nightmare of periodic hops between formats, with deadly consequences any time we miss a single jump.
Either way it'd have to be legible without the need of a microscope or computers. Write the same text down in a few dozen vastly different languages, so you've got a modern day Rosetta Stone.
And of course, build it somewhere remote and seal it. Bonus points for a stable atmosphere, fill it with a heavy noble gas or make it a vacuum.
(I'm not well versed in any related sciences, I'm just sounding off some random ideas)
> The geology of the mountain will allow the MOM archive to fully close itself by a natural phenomenon: the salt “flows” with a speed of 2 cm/year into any void. This will protect the archive from the greatest threat; man himself.
> The pressure which results from the weight of the mountain and a hypothetical ice shield of 5 km thickness is approximately a fifth of the burst pressure of the used materials.
>For the extreme longevity version of the Rosetta database, we have selected a new high density analog storage device as an alternative to the quick obsolescence and fast material decay rate of typical digital storage systems. This technology, developed by Los Alamos Laboratories and Norsam Technologies, can be thought of as a kind of next generation microfiche. However, as an analog storage system, it is far superior. A 2.8 inch diameter nickel disk can be etched at densities of 200,000 page images per disk, and the result is immune to water damage, able to withstand high temperatures, and unaffected by electromagnetic radiation. This makes it an ideal backup for a long-term text image archive. Also, since the encoding is a physical image (no 1's or 0's), there is no platform or format dependency, guaranteeing readability despite changes in digital operating systems, applications, and compression algorithms.
>Reading the disk requires a microscope, either optical or electron, depending on the density of encoding and could be combined with an Optical Character Recognition system to read the text back into digital formats relevant at the time of reading. We are keeping our encoding at a scale readable by a 1000X optical microscope, giving us a total disk storage capacity of around 30,000 pages of text.
I’ve been wondering lately how well you could do with a microfilm strip full of 2d barcodes prefixed by an optical copy of the relevant specifications.
Not really; those future people don't just come out of nowhere. It's a gradual, incremental change: as there are still systems using tape storage, the technology and formats are not lost but migrated.
Find a 3.5" floppy disk from 1995 with WordPerfect files on it and see how much effort it takes you to get the documents back. Now imagine 100 years have passed.
Similarly, I think what matters here is the higher-order redundancy. A single encoding that can be lost, is not straightforward to figure out, and holds the key to a massive store of data isn't the most future-proof decision; it's a single point of failure.
My understanding is that it doesn't matter. The data is embedded three dimensionally in glass. Assuming there was some small margin of glass around the patterned region, you could simply grind / polish off the scratches. Of course if you scratched it deep enough to destroy the data region then you'd be out of luck, but this can be made arbitrarily difficult.
Furthermore, you can arbitrarily scale error tolerance with error correcting encodings; data can be distributed, such that damage in one or a few areas is still available elsewhere, similar to RAID 5 and 6.
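A minimal sketch of the RAID-5-style idea: XOR parity lets any single lost block be rebuilt from the survivors.

```python
def parity(blocks):
    """XOR equal-length blocks together (RAID-5-style parity)."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

data = [b"AAAA", b"BBBB", b"CCCC"]
p = parity(data)  # stored alongside the data blocks

# Lose any one block; XOR of the survivors plus parity restores it:
recovered = parity([data[0], data[2], p])
assert recovered == data[1]
```

Real long-term media would use something stronger, such as the Reed-Solomon codes mentioned elsewhere in the thread for CDs and DVDs, but the recovery principle is the same.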
* Materials. You need something that can be manipulated or deformed at reasonable energy levels into two or more stable states to store data, then maintain that state for a useful lifetime without power. That's already a bit of a Goldilocks act, but it must also be non-reactive, not too brittle, not too expensive, etc.
* Media manufacture. Recording even a single layer at these feature sizes is a non-trivial challenge. As layers increase, yield drops exponentially - even before you consider how the layers interact.
* Read/write mechanisms. Focusing on a single layer on a moving medium, with nothing in the way, is also a challenge. Put other data-bearing layers in the way and it becomes much more difficult. Also, the "blast radius" for a single focus/alignment error becomes much larger, so you're going to need some serious error correction over and above what already exists for 2D.
* Transfer speeds. These are already problematic even at 10TB. 360TB without a corresponding increase in transfer speeds would be a nightmare.
A prototype involving a novel set of materials and/or mechanisms working once under ideal conditions is great. Science advances. Kudos for that. But that's only solving about one fifth of the problems that need to be solved to produce something of actual value in the market. You see a similar thing with battery technology. Everyone knows we need better batteries. There are always several companies claiming to have found the next breakthrough. Almost invariably the new wonder material turns out to be a poor fit for real-world scale, economics, or conditions. Obviously we should keep trying, and keep learning, but nobody should get their hopes up too much too early.
> Spinning Rust – term for conventional hard disk drives with motors and using ferrous-based platters for data storage. Probably derogatory now that Flash Drives are beginning to replace conventional drives.
So far the most viable "solution" is to store data in a cloud service like Backblaze or AWS S3, where it's someone else's headache to maintain high enough duplication and swap the drives as they reach their EOL.
If you really need the lifespan of paper... http://ollydbg.de/Paperbak/
What if SSD memory gets scratched?
What if <any storage> gets <X mechanical damage>?
> What if <any storage> gets <X mechanical damage>?
It's more like "How much X damage can storage Y tolerate before losing Z amount of data?" You evaluate the different tech then you pick the best option.
Maybe some of this could be mitigated with logical techniques (e.g. make data redundant over 2 or 3 different parts of the disc, sacrificing capacity for durability), but it really depends on whether the process allows for this or completely borks out as soon as a scratch happens. They don't say anything about that, and I think it's a pretty important aspect of any long-term storage solution.
...it (appropriately) contained Asimov’s Foundation Trilogy.
> The Arch library that was included on the Falcon Heavy today was created using a new technology, 5D optical storage in quartz, developed by our advisor Dr. Peter Kazansky and his team, at the University of Southampton, Optoelectronics Research Centre.
or did we capture its evolution :)
Had we found a bunch of silicate glass pebbles on Europa, we'd have considered them a curiosity. After this, we have good reason to find a way to ship them back and have another look.
Documents that are meant to last a long time used to be written on vellum because it is a very physically durable material. I understand that this glass method beats existing digital storage methods for resilience, but does it best traditional analog/legacy techniques?
Vellum and other leathers have a lifespan of under 10,000 years under ideal conditions, like those under which Ötzi was preserved. Under such conditions, the researchers extrapolate from accelerated-aging measurements that their medium will last 3×10²⁰ years, which is 3×10¹⁶ times as long. That is, this glass disc will last 30,000,000,000,000,000 times as long as a vellum document will, unless it's subjected to high heat.
They also extrapolate that at 462 K (189 °C, or, in obsolete units, 372 °F) it will last the current age of the universe, some 13.8 billion years. At 189 °C I think vellum's lifetime is a few minutes.
(from a more detailed article)
> The information encoding is realised in five dimensions: the size and orientation in addition to the three dimensional position of these nanostructures.
Is the idea that these nanostructures are themselves hyper-resilient? Or would a significant impact alter them so as to render them unreadable?
The nanogratings do have a certain amount of built-in redundancy; they're holographic phenomena.
How long can we realistically expect glass to last, judging from e.g. beads of volcanic glass embedded in the geological record?
Nothing in the article states what kind of glass is used (which sets off the bullshit detector right there).
If they use crystalline quartz, it should last indefinitely. If it's regular glass, then 50 years if you're lucky, controlled conditions, etc, etc.
If you look at other articles on this website, they seem more like a lame attempt at getting traffic than at providing something useful.
They have articles with titles like "Ladies Get dose of Radiation From Government UFO" and "Hackers, UFO's and Secret Space Programs - Oh My!"
I mean, this does not feel like an information source I'll trust.
Edit: as others are mentioning in this thread, from a researcher's perspective they should have also talked about the read/write capabilities.
The length of time this type of project takes amazes me.
It was recycled for a university press release in 2016: https://www.southampton.ac.uk/news/2016/02/5d-data-storage-u...
Then that press release was recycled for this HN submission.
And even if that was a reason to pursue a technology, the actual storage capacity is meaningless on its own. 360TB of spinning disks isn't very expensive to buy, rack, or run. And you can read and write to them at a decent clip. Managing failures is fairly predictable. What benefit does a magical glass (or, in previous incarnations, quartz or other crystal) disk have? Optical media isn't known for its amazing random access speeds. Write-once media has very limited use and is almost never fast to write to.
So who's buying this? Who has data that needs to last that long, or needs to store lots of permanently immutable data that's read back sequentially? I honestly can't think of a market for this. The article says "museums" but I don't know of any museums that would prefer glass disks for their backups over an S3 bucket. This is "on prem backup" taken to a comical extreme.
I can see the academic value of exploring the technology, but this space has been exhausted many times over. I remember seeing similar articles in Technology Review and Scientific American about identical developments twenty years ago. It's just not a good idea.
For random access, rapidly created and destroyed data, SSD and HDDs will continue to dominate. But for the growing use case of hoarding data forever, this is a good fit.
While this is true, the lesson I draw from this is the opposite from what you seem to be concluding. I conclude that most people are self-obsessed idiots who waste their lives on meaningless and futile things like "market need" and "survival".
Note that 'glass' is a hugely varied class of materials, so without knowing much more we can't make spot judgements about the validity of the claim.
As I say in part of my post the key issue here is the intrinsic property of this "glass" which will be of a crystalline-like nature, to quote:
<...> 5. We know that under ideal conditions some crystals such as zircon and others have lives of much longer than 4.4 Gy. Why? Well crystal structures are known to be the most stable forms of matter in the universe that goes with the fact that they have the lowest entropy: https://www.livescience.com/50942-third-law-thermodynamics.h.... — to quote: “The entropy of a perfect crystal is zero when the temperature of the crystal is equal to absolute zero (0 K).” <...>
They're using fused quartz, as one does. I'm guessing soda-lime glass would be a bit easier to work with and still reasonably stable, though considerably less so.
I would assume the most dense medium possible would be a collection of neutrons, since neutron stars are the most dense object other than a black hole, but retrieving information from them doesn’t seem feasible.
"The novel is regarded as a landmark in hard science fiction. As is typical of the genre, 'Dragon's Egg' attempts to communicate unfamiliar ideas and imaginative scenes while giving adequate attention to the known scientific principles involved."
Great book btw, hope I'm remembering it correctly.
From an old ZFS weblog post on why they chose 128 bits for their data structures:
> Although we'd all like Moore's Law to continue forever, quantum mechanics imposes some fundamental limits on the computation rate and information capacity of any physical device. In particular, it has been shown that 1 kilogram of matter confined to 1 liter of space can perform at most 10^51 operations per second on at most 10^31 bits of information [see Seth Lloyd, "Ultimate physical limits to computation." Nature 406, 1047-1054 (2000)]. A fully-populated 128-bit storage pool would contain 2^128 blocks = 2^137 bytes = 2^140 bits; therefore the minimum mass required to hold the bits would be (2^140 bits) / (10^31 bits/kg) = 136 billion kg.
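The arithmetic in that quote is easy to reproduce; a quick Python check, using only the numbers given in the quote:

```python
# Back-of-envelope check of the quoted bound: a fully-populated 128-bit pool
# is 2**128 blocks = 2**137 bytes = 2**140 bits, and the Lloyd limit is
# 10**31 bits per kilogram.
bits = 2 ** 140
mass_kg = bits / 10 ** 31  # on the order of 1.4e11 kg
```

Which lands in the hundred-billion-kilogram ballpark the post cites.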
Alternatively, information encoded in rocks could've been encrypted. And encrypted information should be indistinguishable from random noise.
"Here we have an extremely rare ritualistic sex device, the male of the species placed this over his member before performing an elaborate dance and inserting his member into the female"
It was a painful loss. I learned my lesson. Now I back everything up into at least 2 different methods: cloud + HDD. Cloud has been the most reliable so far. Some HDDs have failed (humidity killed the circuit boards).
What are you referring to? I assume you're not just talking about "high quality" compact discs, since we have plenty of readers for those.
I can't remember this statement from those times. Only people who said that tape is still the better/safer medium for music and backups. I even had a zip drive for that.
The question is: nearly four years later, are they anywhere close to production?
If you look at glass in medieval glass windows it's heavily distorted by gravity (fatter at the bottom than the top) and that's after ~900 years, so presumably they're actually meaning something more robust than that!
20:20 optical replication ;)
"It's been dubbed five-dimensional (5D) digital data because, in addition to the position of the data, the size and orientation are extremely important too"
Orientation is inherently important anyway; have you ever tried reading a book upside down? And what does size bring, other than lowering density? Position is obviously important too: it's the difference between data and randomness.
The rest of your points... oh my. None of the dimensions count at all, really? So rant away, but it's nice if the sound and fury at least signifies _something_.
Regarding orientation, are you saying they printed the second half of the book over the first half, but at right angles over the top? I.e., you see different data from different orientations?
PS: I've seen old (18th century?) letters where they wrote at 0, 90 and 45 degrees to get more on one page. Postage was charged by the page, so it made sense. Newspaper also used to be taxed by the page (in the UK at least), so you had origami-like folding of one sheet of paper into a newspaper, though I'm not aware of them 'double printing' in this way. I assume that's why 'broadsheets' were so massive up until relatively recently.
Imagine you could print a page containing just a 100x100 grid of digits. If you can orient each digit so its top faces up, down, left or right, those 40 distinct glyphs let you encode in base 40 instead of base 10: four times as many values per cell, which works out to about 1.6x as much data (log₂40 ≈ 5.3 bits per cell instead of log₂10 ≈ 3.3) in the same two-dimensional area.
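For concreteness, here's the per-cell arithmetic as a quick Python sketch, assuming the ten-digit, four-orientation scheme described above:

```python
import math

# Ten digit glyphs, each printable in one of four orientations
# (top facing up, down, left, or right).
states_per_cell_plain = 10
states_per_cell_oriented = 10 * 4

bits_plain = math.log2(states_per_cell_plain)        # ~3.32 bits per cell
bits_oriented = math.log2(states_per_cell_oriented)  # ~5.32 bits per cell

# Four orientations contribute exactly log2(4) = 2 extra bits per cell.
gain = bits_oriented / bits_plain  # ~1.6x as much data per cell
```

Multiplying the state count per cell by k always adds exactly log₂k bits, so the capacity gain is logarithmic in the number of extra orientations, not linear.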
Now, that said, the stream itself is defined by a 3-dimensional dataset: amplitude, phase, and frequency. Every modulation technique is some way of twiddling those 3 dimensions.
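As a toy illustration of that three-knob view, here's a sketch of a passband signal s(t) = A·cos(2πft + φ), with a BPSK-style phase flip chosen purely as the simplest example of "twiddling" one of the three dimensions:

```python
import math

def modulated_sample(t, amplitude, freq_hz, phase_rad):
    """One sample of s(t) = A * cos(2*pi*f*t + phi). Every modulation
    scheme is some rule for varying A, f, and phi with the data."""
    return amplitude * math.cos(2.0 * math.pi * freq_hz * t + phase_rad)

# BPSK-style phase flip: phase 0 encodes a 0 bit, phase pi encodes a 1 bit.
s_zero = modulated_sample(0.0, 1.0, 1000.0, 0.0)      # cos(0)  ->  1.0
s_one = modulated_sample(0.0, 1.0, 1000.0, math.pi)   # cos(pi) -> -1.0
```

ASK varies only the amplitude, FSK only the frequency, PSK only the phase, and schemes like QAM vary amplitude and phase together; all of them are points in this same three-dimensional parameter space.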