The same goes for rogue librarians, or Google Books employees dumping entire libraries onto discs and leaking them out.
I was curious if this statement was true...
So, assuming 4.4 GB per HD film and 2 hours per film, you can fit roughly 232 films per TB, or 232,000 films total (for 1000 TB). That's 464,000 hours of HD content, or 53 years (!) back to back.
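A quick sketch of that arithmetic (assuming 1 TB = 1024 GB, which is what the 232-films figure implies):

```python
# Back-of-envelope capacity check for a 1000 TB disc.
TB_IN_GB = 1024          # 1 TB taken as 1024 GB (matches the 232 figure)
FILM_SIZE_GB = 4.4       # assumed size of one 2-hour HD film
FILM_HOURS = 2

films_per_tb = int(TB_IN_GB / FILM_SIZE_GB)   # 232
total_films = films_per_tb * 1000             # 232,000
total_hours = total_films * FILM_HOURS        # 464,000
years = total_hours / 24 / 365                # ~53
print(f"{films_per_tb} films/TB, {total_hours:,} hours, ~{years:.0f} years")
```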
So, yes, that statement is true. Now the next logical question: Has the human race produced 464,000 hours of video content?
Let's assume there are around 100 cable channels showing HD content 24/7 (with no overlap; there are more, but with a lot of duplication). That's 2,400 hours of content a day, 876,000 hours a year, or 101 years' worth of watchable content shown back-to-back each year.
But realistically at least 70% of that will be duplication (particularly looking at it over several years). So even if we just count the 24/7 news and shopping channels, which always produce original content, you're still talking about easily 200,000+ hours/year.
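A sketch of that estimate (the 70% duplication figure is the comment's own assumption):

```python
CHANNELS = 100
HOURS_PER_DAY = CHANNELS * 24          # 2,400
HOURS_PER_YEAR = HOURS_PER_DAY * 365   # 876,000
DUPLICATION = 0.70                     # assumed share of repeated content

unique_hours = HOURS_PER_YEAR * (1 - DUPLICATION)
print(f"{HOURS_PER_YEAR:,} hours broadcast, ~{unique_hours:,.0f} unique")
# -> 876,000 hours broadcast, ~262,800 unique (comfortably over 200,000)
```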
Far more than that... YouTube alone claims "100 hours of video are uploaded to YouTube every minute":
...which comes out to 144,000 hours/day. In other words, one of these discs would be enough to store roughly 3 days' worth of contributions to YouTube. So a box of these discs - let's say 1,000 of them, which isn't all that big (e.g. http://www.amazon.co.uk/Neo-Aluminium-Storage-1000-sleeves/d... ) - could hold an archive of every single video that has ever been uploaded to YouTube.
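The YouTube numbers, spelled out (using the 100 hours/minute claim):

```python
UPLOAD_RATE_HOURS_PER_MIN = 100                       # YouTube's claim
hours_per_day = UPLOAD_RATE_HOURS_PER_MIN * 60 * 24   # 144,000
disc_capacity_hours = 464_000                         # from the film estimate above

days_per_disc = disc_capacity_hours / hours_per_day
print(f"{hours_per_day:,} hours/day; one disc holds ~{days_per_disc:.1f} days")
# -> 144,000 hours/day; one disc holds ~3.2 days
```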
I am now reminded of this: http://www.youtube.com/watch?v=Y_UmWdcTrrc
Edit: got beaten to mentioning YouTube, but there are many other video-sharing sites on the Internet, and of course there are probably countless hours of one of the types of video the Internet is well known for: porn.
So, at 100 hours uploaded per minute, that would take 4,640 minutes to accumulate - less than a week. Lower quality, and probably with (near-)duplicates, but I guess it is safe to say we have that amount of video.
Another way to look at it: I expect we produce more content than that in unique wedding videos every year. (At an hour each, that takes half a million weddings, or a million people marrying. At an average of one marriage per life and a life expectancy of 75 years, that takes a population of just 75 million people.)
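That estimate, worked through (one hour-long video per wedding and one marriage per lifetime are the comment's assumptions):

```python
TARGET_HOURS = 500_000      # ~the disc's capacity, at 1 hour per wedding
PEOPLE_PER_WEDDING = 2
LIFE_EXPECTANCY = 75        # years; one marriage per lifetime assumed

weddings_needed = TARGET_HOURS                            # 1 hour each
people_marrying = weddings_needed * PEOPLE_PER_WEDDING    # 1,000,000/year
population_needed = people_marrying * LIFE_EXPECTANCY     # 75,000,000
print(f"a population of ~{population_needed:,} suffices")
```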
People bitch constantly, but a typical feature film encoded by YIFY at 720p will clock in somewhere between 650 and 900 MB, leaning towards the lower end of that range.
Not that it matters - the rest of your math makes it look ridiculous either way. Without too many tradeoffs, you could certainly put every American film (Hollywood and indie both) on such a disc.
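At those file sizes the numbers get even sillier (800 MB per film is just a midpoint of the range above):

```python
DISC_GB = 1_000_000        # 1000 TB in GB (decimal, for round numbers)
FILM_GB = 0.8              # ~800 MB per 720p YIFY-style encode

films = DISC_GB / FILM_GB
print(f"~{films:,.0f} films")   # ~1,250,000 films on one disc
```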
> Now the next logical question: Has the human race produced 464,000 hours of video content?
Somewhere Google has a gauge that shows how many hours of videos they store... but I can't find it.
> So even if we just look at 24/7 news and shopping channels which always produce original content, you're still talking about easily 200,000+ hours/year.
Agreed. But I don't think that they have archival policies in place where everything is kept permanently. Even the news channels may have been in the habit of dumping everything but stuff deemed important, well into the 1990s.
I'm thinking that 100 channels provide exactly 100 years worth of content per year instead of 101. Unless you rewind the good parts :-p
The US is supposedly the only other country with a similarly high number of channels, but I don't think it's unreasonable to assume that there are more than 10,000 channels worldwide.
>> Let's assume there are around 100 cable channels showing content in HD video 24/7
Note that this is comparing HD content to films, though I still think you are fine in saying we easily have enough content. :)
Seems to be rather likely.
Especially in math and physics, they are every poor graduate student's dream. Even large libraries have only one copy of some of those books, stuffed away in some underground vault.
At 1X speed, a DVD's data rate is about 11Mbit/s - so at 250,000 times the density, isn't it safe to assume a theoretical data rate of 2.75Tbit/s? Of course, pushing that much data at that rate is another problem...
If my assumptions are incorrect, please let me know - I'm just trying to understand.
The total amount of data the laser passes over in a second is proportional to n - because it's related to how long the laser takes to get from one edge of the square to the other. But the total amount of information stored goes up as n^2 - it's related to the area of the square.
Or: to put it another way, the number of tracks also goes up when data density increases.
(This also happens with hard drives over time. Recent hard drives take a lot longer to read or write the entire drive than older ones. See http://tylermuth.wordpress.com/2011/11/02/a-little-hard-driv... for example.)
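So the back-of-envelope correction, assuming the spindle speed stays fixed and the density increase is spread equally along both dimensions:

```python
import math

DVD_1X_RATE = 11e6              # bits/s at 1x, from the comment above
AREAL_DENSITY_FACTOR = 250_000  # claimed density increase over DVD

# Only along-track (linear) density raises the instantaneous data rate;
# that grows as the square root of the areal factor.
linear_factor = math.sqrt(AREAL_DENSITY_FACTOR)       # = 500
rate = DVD_1X_RATE * linear_factor
print(f"~{rate / 1e9:.1f} Gbit/s (not 2.75 Tbit/s)")  # ~5.5 Gbit/s
```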
The "1000 TB" is just PopSci extrapolation at this point. If it does happen, it'll be neither a CD nor DVD.
Think about that for a moment - it's the index for one of the largest collections of publicly available content in the history of humanity, and it fits in my wallet 80 times over.
Not the P2P content that people download - just the main site itself, containing all of the magnet links, etc. TPB's 'content' is literally just an index (it's not even a tracker).
I don't want to link to TPB from HN, but just search for "pirate bay archive". There's a large archive that contains all of the site data (db backups), but the magnet links are all you really need content-wise.
I think you're confusing different definitions of 'content' here. Also, there are many other sites with publicly available indexes far larger than The Pirate Bay's - google.com, for example.
Furthermore, 8GB is relatively small for a flash drive nowadays; e.g., here is a 128GB microSD card: http://www.amazon.com/dp/B00M562LF4
But the entire site archive is already available as a torrent indexed on TPB - why bother crawling? That's what I'm referring to - search "pirate bay archive" on TPB and you'll find it.
The magnet links are not useless, because a client can use them to fetch the rest of the metadata from a tracker and/or the DHT.
I'm thinking about this from the perspective of 'minimal amount of information needed in order to reproduce TPB', not 'how much information does TPB index' (which is obviously much larger).
(Remember that TPB no longer even operates its own tracker, so if TPB itself were taken offline, the magnet links would still work fine).
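For illustration, a magnet link is essentially just a 20-byte infohash in a URI wrapper; everything else is fetched from peers. A minimal sketch (the link below is a made-up example, not a real torrent):

```python
from urllib.parse import urlparse, parse_qs

# A hypothetical magnet link: the "xt" field carries the infohash that
# a BitTorrent client uses to look up the full metadata via the DHT (BEP 9).
magnet = ("magnet:?xt=urn:btih:0123456789abcdef0123456789abcdef01234567"
          "&dn=example-name")

params = parse_qs(urlparse(magnet).query)
infohash = params["xt"][0].split(":")[-1]
print(infohash)   # 40 hex chars = the 20-byte SHA-1 infohash
```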
$18K seems like a drop in the bucket for powering this kind of research. Dr. Gan needs someone to introduce him to Kickstarter.
$1.8k in Perth http://www.postgraduate.uwa.edu.au/students/funding/travel
$1k to $2.5k at Uni South Australia http://www.unisa.edu.au/student-life/global-opportunities/tr...
$2k to $3k at Ian Potter foundation http://www.ianpotter.org.au/travel
From his other pubs, it looks like a two-photon process is involved as part of writing at the deep-subdiffraction limit for lithography. Will be fun to try and reverse that.
Edit: The paper that this press release refers to is here:
This is pretty cool stuff. Reading through the paper now.
Digital downloads are all the rage right now, but people will still want to archive their downloads.
I don't know how true this is. Even the tech-savvy folks I know rely primarily on Netflix/Spotify/etc, and in the rare instances they buy something, they trust iTunes/Amazon/etc to hold it for them. (I continue to think that is nuts.)
Eventually I want to get a FreeNAS machine up and running, and any important files will be backed up to Google Drive (though, obviously not movies that can simply be downloaded again if desired).
If you store 100 times as much data, all else being equal, the time to write a full disk will be ten times as long.
Ways around that are increasing rotation speed (been there, done that; there is little to gain here without making the disk a lot stronger = heavier, if that is possible at all) and using multiple heads (harder to do, but may eventually be the better solution, certainly if one can do away with head movements entirely).
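The square-root relationship behind that "100x data, 10x write time" claim, sketched out:

```python
import math

def full_disk_write_time_factor(density_factor: float) -> float:
    """How much longer a full-disk write takes when areal density
    grows by density_factor, at a fixed rotation speed."""
    capacity_factor = density_factor             # bits scale with area
    rate_factor = math.sqrt(density_factor)      # bits per track-length
    return capacity_factor / rate_factor

print(full_disk_write_time_factor(100))   # -> 10.0
```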
The parameter of interest is specific strength, which is the material's ultimate tensile strength divided by its density.
I forget what the plastic in DVDs is, but it could probably be reinforced with carbon fibers (chopped, nanotubes, etc.) to give a useful increase in rotation speed.
Also, now that I think a bit more about it: in a 50-speed CD player or a 16-speed DVD player, the edge of the disk already moves at something like 55-65 m/s, a sizable fraction of the speed of sound. Keep multiplying the rotation speed and you run into the speeds at which discs are known to shatter. I guess it would take quite a bit of research to make a CD/DVD/Blu-ray player that works fine at those speeds (a way around that is to make the disk smaller; that would be an option for a disk with the storage density discussed here, but it wouldn't win you more than a factor of four or so, at most).
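A rough check of those edge speeds, assuming nominal constant-linear-velocity figures (1x CD ≈ 1.3 m/s, 1x DVD ≈ 3.5 m/s):

```python
SPEED_OF_SOUND = 343.0                  # m/s, dry air at 20 C
BASE_CLV = {"CD": 1.3, "DVD": 3.5}      # nominal 1x linear velocity, m/s

for fmt, multiplier in [("CD", 50), ("DVD", 16)]:
    v = BASE_CLV[fmt] * multiplier      # speed of the track past the head
    print(f"{multiplier}x {fmt}: ~{v:.0f} m/s "
          f"({v / SPEED_OF_SOUND:.0%} of the speed of sound)")
# -> 50x CD: ~65 m/s (19%); 16x DVD: ~56 m/s (16%)
```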
However, assuming you can read and write at the same rate, that's a 10 GB/second transfer speed, which might be tough for a home PC to pull off anytime soon.
PS: I suspect that if this were ever put into production, they would change the form factor to enable even faster read/write speeds at a significantly reduced capacity.
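Even at that optimistic rate, filling one of these is not quick (using the 10 GB/s figure above):

```python
CAPACITY_BYTES = 1000e12    # 1000 TB
RATE_BYTES_PER_S = 10e9     # the 10 GB/s figure from the comment

hours = CAPACITY_BYTES / RATE_BYTES_PER_S / 3600
print(f"~{hours:.0f} hours to read or write the full disc")  # ~28 hours
```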
They made a fluorescent multilayer DVD and a credit-card-shaped 'ClearCard'; I believe the goal was to store several hundred TB of data.
And this was about 14 years ago!
Certainly, even 1TB CDs would be insanely awesome. It would open up a whole realm of mass storage. I'm just imagining how easy backups would be :)
Steampunk storage FTW.
AND B) Are CDs gonna make a comeback? Can we have cool mini-CDs?
Writing and reading speed has been mentioned. Another point is that as the dots become smaller, the mechanical parts need to be much more precise, moving the device from consumer parts to really expensive parts and assembly.
1000 terabytes = 8,000,000,000 megabits. At a 400 Mbit/s connection, that's 8,000,000,000 / 400 = 20,000,000 seconds (about 7.6 months) to transfer.
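The same calculation, parameterized (400 Mbit/s is the link speed the numbers above imply):

```python
CAPACITY_BITS = 1000e12 * 8     # 1000 TB in bits
LINK_MBPS = 400                 # assumed connection speed, Mbit/s

seconds = CAPACITY_BITS / (LINK_MBPS * 1e6)
months = seconds / (30 * 24 * 3600)
print(f"{seconds:,.0f} s = ~{months:.1f} months")   # 20,000,000 s, ~7.7 months
```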
Is this true? Can anyone point to some kind of study showing convergence of storage capacity and data production over time? I was under the impression that we've got far more storage than we'll need, at least in the near future.
Error correction for these is going to be interesting, even a speck of dust could obscure many megabytes of data.
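For a sense of scale, a rough estimate of what a single dust speck could cover, assuming one bit per 9 nm x 9 nm cell (the feature size from the article):

```python
import math

BIT_AREA = (9e-9) ** 2          # m^2 per bit, assuming 9 nm square cells
SPECK_RADIUS = 50e-6            # a 100-micron dust speck

speck_area = math.pi * SPECK_RADIUS ** 2
megabytes = speck_area / BIT_AREA / 8 / 1e6
print(f"~{megabytes:.0f} MB obscured by one speck")   # ~12 MB
```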
The researchers never used any discs. The 1000TB figure was for a surface area equivalent to a CD or DVD; they're the same physical size, and these descriptions give the reader an idea of what a 9nm feature size means in practical terms.
What they actually developed was a novel resin and a method for etching 9nm dots in it using a specific optical laser setup. If it were to be commercialized, it'd be a new type of disc, and CD/DVD lasers would not be able to read it.
Oops, accidentally implemented as a medium-hash CAMFS in disused packaging-and-optronics 45nm fab. Please write test, mind 5W limit if retaining cyano/pyridene dye in media.
What if I could hand you the totality of Netflix, iTunes, etc.? Not wait for it to download, but transfer large chunks of the Internet in seconds? And no longer have to hope that, say, Netflix won't lose vast sections of their library for contractual reasons (as it has of late)?
Backups: permanent archiving of everything you have, all on one disc.
It makes vast swaths of data yours, in your hand, under your control, in a ridiculously compact & cheap media. No more hoping that you can get X in time, or that it will be there some time hence.
I've seen profound transformations in computing when cheap, fast storage increased by orders of magnitude. This will bring that about again, to similarly disruptive and amazing results.
Totally side-steps last mile monopolies as well. Don't want to work with us? We'll drop an LTE chipset in as well for the admin stuff. Redbox In Your Home.
You cannot buy a three thousand Gbps internet connection.
You would read from that plastic disc just like a CD or DVD, i.e. as you consume the content. That makes the throughput of any link in this chain irrelevant (except, I guess, the read speed of the plastic disc itself).
Who's going to store data on something that will erase portions of itself when dropped? Consumers won't, because they won't know what to do when their 50,000 movies suddenly start skipping ten minute chunks. Scientists won't use it because they want strong guarantees on retrievability, and won't want to bother with stringent protocols on handling the media when other options exist. Logging systems might have a use for it, if they can stand potentially losing big chunks of data.
How often were you going to swap discs? When you got SESSION COUNT EXCEEDED, or UDF timeouts? Every time you give contact info...'and here is a capsule intelligence of things you may wish to contact me about in the next 15 years...but just call.'
Skip most of the spinning and size, put it in a fat SIM card, and have recovery pipelines (cleanroom et al) for failures. If you throw down for the 8TB (not really 8TiB) multiplatter +9 Fondleslab Of Regret maybe you can throw in the service premium to go visit the one retailer the future holds for us, do key exchanges, and have them print out a scan of peak versions of your besties, or let you write Abnormal_brain on the top with a marker?