
I need to copy 2000+ DVDs in 3 days. What are my options? - huhtenberg
https://www.reddit.com/r/DataHoarder/comments/a6fkpm/i_have_temporary_3_days_access_to_more_than_2000/
======
huhtenberg
The context -

> _Maybe you should disclose your city or at least state /general location_

Washington DC, specifically College Park, even more specifically The National
Archives at College Park.

> _What is the content of the discs?_

Years ago in partnership with the National Archives Amazon.com digitized a
tremendous amount of film for the National Archives, with the catch that the
material cannot be freely disseminated online by the National Archives until
Amazon breaks even on their digitization investment. It’s been years now and
for a variety of reasons - many of which are Amazon’s fault - there still
exist a solid number of discs that Amazon hasn’t sold even one of. (In part
because amazon hasn’t even had them all available to sell. Pretty ridiculous).

The flipside of the catch is that the DVDs can be viewed and even copied for
$0 on site by researchers. I am doing research there and I have a
research pass. I was looking at many of these titles yesterday. It’s time to
set these national treasures free.

~~~
veridies
I wonder what this material is. Are these commercial films, historical
footage, or what?

~~~
esonderegger
My guess is they're the films described in the CustomFlix partnership
announced here:

[https://www.archives.gov/press/press-releases/2007/nr07-122.html](https://www.archives.gov/press/press-releases/2007/nr07-122.html)

For all the helpful advice offered both here and on Reddit about how to do
this, I wish more time had been spent asking whether this was something the OP
should be doing in the first place.

I completely understand the frustration if this is indeed the above set of
films and it's been over eleven years since the digitization agreement and
files have not yet been made freely available to the public.

That said, planning to use a researcher's pass to "set these treasures free"
and blaming Amazon for not having generated more revenue from these films
makes it sound like the OP thinks he knows better than the staff of the
National Archives how to best care for these assets and that he knows better
than the folks at Amazon how to turn a profit. To me, it sounds more than a
little arrogant.

A big reason the National Archives enters into agreements like this with
companies is that digitization, especially on their scale, is expensive. If it
weren't for agreements like this, Amazon would only want to digitize the films
for which they knew they could turn a profit and the vast majority of the
collection would sit un-digitized and be at risk of loss. The tradeoff is
between the immediacy of access vs the number of assets digitized and the team
at NARA made the decision that it was better for the American people to have
more assets digitized.

I'm worried that the next time NARA is in talks with someone about a
digitization agreement (for example, if there's a large number of early jazz
audio recordings on 1/4" reels and Spotify is interested in paying the cost
of digitization in exchange for 2-3 years of exclusivity) that the company
will point to this example and say "didn't you just let a researcher publish
the entire collection Amazon digitized? How can you assure us the same thing
won't happen with these recordings?" The result will be the National Archives
clamping down on researcher access. I think that would be a net loss for
everyone.

~~~
setr
A reasoning that doesn't require (too much) arrogance: contracts and business
rules hold despite changes in context. Plans change, people leave, things are
forgotten. But the contract, and its limitations, go on.

Since amazon was only looking to get its investment back, but not a profit,
then clearly the intent of both parties was to eventually have it release to
the public. Whoever on amazon’s side pushed the deal presumably thought they
would be sold, but it clearly didn’t happen. It’s much more likely this has
been sitting on a backburner somewhere, left as some forgotten plan, than some
kind of long-term strategy to induce sales.

Thus, by virtue of an ill-written contract, we’ve entered a situation that no
one wants. Amazon no longer cares about it, the museum presumably prefers
releasing it, and the public can only benefit from it.

By virtue of that same contract, there’s an escape hatch that might bring us
back to a state where everyone is content.

It would make sense to exploit it.

Of course, this is assuming Amazon doesn't care. But Amazon is a company, and
the larger a company is, the less distinguishable it is from a government. And
governments certainly have control over things they have collectively
forgotten about, as a government only rarely operates as a single,
like-minded, cooperative organism.

And Amazon is indeed a very large company. It would hardly be surprising if
Amazon weren't even aware this contract still exists.

------
NoblePublius
I literally did this at a startup. We had eight eight-at-a-time Netflix
(DVD accounts). This was in 2009. We made a search engine for movies. Yeah,
like Mr Skin. We ripped them six at a time. Took us about six months to do
4000 films. The DMCA violations would have been in the trillions.

~~~
Rotten194
Why did a search engine for movies need the actual movie files, and not just
the metadata? Or was it actually making the movies themselves available?

~~~
techdragon
I would guess it’s in order to build the hash/IDs/searchable dataset that can
be used to match the movies. Think Shazam style fingerprints but for movies.

I was looking into this sort of thing recently for a project. Very interesting
technology space with some genuine direct benefits coming out of the
application of machine learning.
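For a rough idea of what such a fingerprint can look like, here is a toy difference-hash ("dHash") over a single video frame. This is a hypothetical sketch for illustration, not any product's actual algorithm; real Shazam-style matchers use far richer features:

```python
# Toy difference-hash ("dHash") for a video frame: downscale the frame to a
# tiny grayscale grid, then record one bit per horizontally adjacent pair of
# cells - is the left cell brighter? Near-identical frames give
# near-identical hashes, so Hamming distance can match clips to a catalog.

def dhash(frame, size=8):
    """frame: 2D list of grayscale ints (rows x cols). Returns a 64-bit int."""
    rows, cols = len(frame), len(frame[0])
    # Downscale to a (size x size+1) grid by block-averaging.
    grid = []
    for r in range(size):
        row = []
        for c in range(size + 1):
            r0, r1 = r * rows // size, (r + 1) * rows // size
            c0, c1 = c * cols // (size + 1), (c + 1) * cols // (size + 1)
            block = [frame[i][j] for i in range(r0, r1) for j in range(c0, c1)]
            row.append(sum(block) / len(block))
        grid.append(row)
    # One bit per horizontally adjacent pair of grid cells.
    bits = 0
    for r in range(size):
        for c in range(size):
            bits = (bits << 1) | (1 if grid[r][c] > grid[r][c + 1] else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")
```

Because each bit only records whether brightness rises or falls between neighboring cells, a uniform brightness shift leaves the hash unchanged, which is the kind of robustness that makes hash-based matching practical.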

~~~
NoblePublius
Correct

------
gpav
Judging from the date on the reddit post, it's a bit late for my comment (two
days after the post), but the real issue is not how to duplicate so many DVDs,
the issue is how to RIP those DVDs onto some cheap storage for later burning
onto DVDs. So you want high data bandwidth between stacks of DVD readers with
lots of memory for caching (or however you cache DVD data streams). Fry's had
a 4TB external hard drive for $89 this weekend. Maybe have a bunch of SSDs as
the intake point for the DVDs, and then offload the ripped copies from the
SSDs onto spinning hard drives while you're changing out the DVDs. You'd want
to use the fastest interfaces available. I've no idea these days what the
cool kids are
using. (My first hard drive was a 5-1/4" full-height 10 MB MFM. Thought I
would never need more storage than that. I think my second HD was 20 MB and
used RLL.)

Organizing the DVDs by length would allow optimizing the loading/ripping
process to assure minimum time lost waiting for the operator's hands to be
free. This kind of planning makes for an interesting project.
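The "organize by length" idea can be sketched as a classic longest-processing-time (LPT) schedule: hand the longest remaining disc to whichever drive frees up soonest. The disc lengths below are made-up minutes, purely for illustration:

```python
import heapq

def schedule(disc_minutes, n_drives):
    """Greedy LPT assignment: returns (busy_minutes, drive_id, discs) per drive."""
    drives = [(0, i, []) for i in range(n_drives)]
    heapq.heapify(drives)
    for length in sorted(disc_minutes, reverse=True):
        # Always give the next-longest disc to the drive that frees up soonest.
        busy, i, discs = heapq.heappop(drives)
        heapq.heappush(drives, (busy + length, i, discs + [length]))
    return sorted(drives, key=lambda d: d[1])

# Hypothetical disc lengths in minutes, spread across 3 drives.
drives = schedule([15, 12, 12, 10, 9, 8, 7, 5], n_drives=3)
makespan = max(busy for busy, _, _ in drives)  # wall-clock time for the batch
```

Sorting longest-first keeps the drives evenly loaded, so no drive sits idle at the end while one long disc finishes.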

It might make an interesting crowd-funded project, if it's reasonably easy to
get a research permit. Plan it out, go in with the hardware, come out with the
images. Use all the error correction opportunities you've got available.

Do a web search for "bulk dvd ripping" (without quotes) and you'll find lots
and lots of discussion and advice, including some about building a dedicated
DVD ripping rig. MakeMKV gets good press, in my very quick read of a few
posts.

And there's always the option of crowd-funding to raise the exact amount
needed to pay off the break-even for Amazon's investment. I can't imagine
they'd fight back too hard when looking at a large check vs. a non-performing
asset, unless Bezos personally never intended to let the footage go free.

~~~
sokoloff
A 16x DVD drive outputs around 21MB/s. Any modern hard drive is likely to
have 5-12x the sequential write performance of a 16x DVD drive. There
shouldn't be any need for SSDs.

~~~
dspillett
> Any modern hard drive is likely to have 5-12x the sequential write
> performance

The data rate falls to the floor as soon as the access pattern isn't
sequential, though, which it won't be if you are using multiple readers.
While an OS might be bright enough to organise data flowing out of write
buffers so it isn't as random as it could be, there is a limit to how far it
will go with this: general-purpose OSs optimising for multiple bulk streams
would punish more interactive activity. If you have a tool that bypasses the
OS cache and works in large enough blocks you might see better results,
except if the write activity from each stream lines up, at which point this
will make things worse.

Pulling the data off multiple DVD drives onto an SSD, then swapping output to
a second SSD once the first is nearly full so its contents can be dumped
sequentially to cheaper-per-GB traditional drives, would probably be the way
I'd suggest.

In fact, you could get away without swapping between two SSDs: the read
activity pulling data from the SSD to a traditional drive is unlikely to have
much effect on the write performance of the data coming off the DVDs unless
you have a great many readers in one machine. If doing this all relatively
manually, to reduce manual steps once a DVD copy is complete, add it to a
queue to be moved using something like
[https://en.wikipedia.org/wiki/TeraCopy](https://en.wikipedia.org/wiki/TeraCopy)
so you don't have to worry about manually coordinating the SSD-to-cheaper-drive
copy operation to keep it sequential.

Assuming 15 minutes to read each disc (it is a long time since I pulled data
off a DVD in bulk, so this is guesswork based on an old memory of it taking a
little more than 10 minutes to read a full DVD9 disc, rounded up to 15 to
allow for manual process inefficiencies and some discs being slower to
extract due to condition causing rereads, etc.), you are looking at wanting
21 or more drives constantly on the go to get the job done in 3 solid 8-hour
days (2,000x15/3/8/60 = 20.8). Five laptops, each with an internal SSD
(128GB+) to extract to, five DVD readers per laptop on USB3 to extract from,
and a 4+TB spinning disk (also external) to finally write to, might do the
job and have the space (2,000x8.5/5 = ~3.5TB output per laptop). You'll need
a powered USB dock for each laptop instead of a passive hub, and you are
going to want to add more of everything to allow for the possibility of
device failures.

Of course, significantly fewer resources are needed (or you get more
contingency time, and/or spare kit to deal with failures, from the same
resources) if most of the media is DVD5 and/or the discs aren't full. I've
assumed the initial three days is just for obtaining the content - I've not
accounted for any other processing (such as indexing and transcoding) or
further distribution.

~~~
StillBored
I've done this kind of stuff, and between OS buffering and making sure the
ripping software is writing large blocks (say 4-32MB at a time), it's
possible to run drives at basically full bandwidth with something less than a
dozen streams. There is going to be more inner/outer-track bandwidth
variation than the perf falloff going from 1 to 6 streams with large blocks.
There are a lot of reasons for this, but a lot of it has to do with data
placement effectively combining multiple streams into writes to the same
sequential track.

More interesting is that even "sequential" reads/writes already have seek
times built in, because HDs aren't spiral-track: head switching and
track-to-track seeks (plus the associated rotational latency and finding the
servo track) are inherent in sequential IO perf. Most filesystem
placement/schedulers aren't going to place 3 files being written at the same
time on opposite sides of a disk, so those head-switch and track-to-track
times see nearly immeasurable increases - the drive itself is also buffering
a large part of a track write, and moving 3 tracks and a head is basically
the same as just moving a head.

------
PakG1
I manage the IT for a school. As we've moved further and further away from
DVDs (thanks, Apple), teachers have been asking me to rip their old DVDs.
Tried doing that, but a lot of the DVDs did not rip perfectly, so the ripped
video ended up having a lot of corruption. I'm not entirely clear why
software could not play the ripped DVDs and just deal with the jitter and
corrupted data, while playing from a DVD drive could. It was weird. Googled a
bit, couldn't find much information other than the fact that CDs and DVDs
can deteriorate over time.

My question is not necessarily related to OP. But can anyone tell me why my
ripped DVDs could not play in DVD software properly when there was jitter or
corrupted data, but it could play properly when in the DVD drive? When I got
lucky, the ripped DVDs would play fine. Otherwise, it was play fine for a few
minutes until it ran into the first bit of corruption. Confused the heck out
of me.

~~~
StillBored
That's generally caused by a combination of a cheap/bad high speed DVD-ROM
drive and/or bad ripping software/options settings. What seems to happen is
the ripping software is running a marginal drive/dvd at full speed and
disabling the drive retry on error. Combined with all the soft ways these
drives fail it results in a bad rip.

This used to happen to me a few years ago, and it was overwhelmingly just a
sign of a marginal drive. For about 10 years, I was replacing drives about
once a year. It doesn't happen as frequently since I started using bluray
drives (usually LG) to rip DVDs and a piece of software designed for "piracy".

Anyway, two software things to try are lowering the ripping speed to 1x-2x
and making sure to leave the drive's retry-on-bad-sectors enabled. Most
decent ripping software will have options to control the error
correction/retry logic. The problem is that some copy protection schemes
leave bad sectors on the discs, and this will hang a lot of the "honest"
ripping software that doesn't know how to deal with it.

~~~
reaperducer
If the OP has been using a low-quality DVD drive, I recommend using an Apple
SuperDrive, since he mentioned Apple.

They may not have all the latest whiz-bang features, but they are rock solid,
and are USB powered. I have two at home, and use them to rip DVDs, and have
never had a failure in I don't know how many years.

New they run $79 from Apple. Or you can get a used one off shopgoodwill.com
usually for around $15.

~~~
PakG1
Problem happened with SuperDrives too though. :(

------
rhplus
The real question here is how the heck does Amazon have a deal with the
National Archives that prevents that public body from distributing public
domain content that it now holds in its archive? Is this typical for digitally
archived content?

~~~
rrix2
Ancestry.com has a similar agreement with the National Archive:
[https://www.archives.gov/files/digitization/pdf/ancestry-201...](https://www.archives.gov/files/digitization/pdf/ancestry-2015-agreement.pdf)

Five-year embargo on the National Archive releasing whatever Ancestry
chooses to digitise and host in their proprietary database. This model of
digitization appears to be the principal way that the records are digitized
now:
[https://www.archives.gov/digitization/principles.html](https://www.archives.gov/digitization/principles.html)

------
social_quotient
I’d call a staffing company or a very active meetup group and ask for help.
Everyone could meet in a conference room and write the contents locally to
their machines. I’d create Pod groups of 5-6 people to assist each other with
the physical parts of the task, organizing and labeling what’s completed etc.

Thinking 15 minutes per disc, you have 30k minutes of work. One person can
get 4 done per hour, so with two 12-hour shifts you need only about 25 people
plus their laptops.

If you need the equipment it could be rented by someone like this that does it
for trade shows:
[https://meetingtomorrow.com/austin-computer-rentals](https://meetingtomorrow.com/austin-computer-rentals)

~~~
mikeryan
There are more and more "active" senior living communities that may well have
a trove of willing and interested volunteers.

------
unqueued
I had a similar challenge many years ago. I worked at a computer lab and had
access to about 25 Dell Optiplex 380s, and I had about 200 DVDs I wanted to
rip. They were for a friend, who was compiling city council meeting videos for
a county.

So, I slightly modified Knoppix and started a netboot server, and made a dvd
ripping cluster.

I didn't end up doing exactly this, but it's worth trying DVD::Rip. It
specifically has support for ripping clusters[1].

But it doesn't even have to be something as complicated as a cluster, just as
long as you parallelize things. How many laptops are you allowed to bring with
you? Could you bring five laptops and 10 dvd drives?

[1]:
[https://ubuntuforums.org/showthread.php?t=1217643](https://ubuntuforums.org/showthread.php?t=1217643)

~~~
reaperducer
I had a similar challenge when the first iPod came out. I got one on launch
day in late October 2001 for my wife as a Christmas present.

With two HP towers, each with two optical drives, I spent nights and weekends
ripping her CD collection. I just barely got it all done before Christmas.
She still has no idea how much work that was.

------
toomuchtodo
You need DVD readers, storage, and the ability to rip to disk images (try to
not reencode or modify the bits on disk for archival purposes).

Maybe put out a call to Jason Scott/ArchiveTeam/ArchiveCorps to saddle up.
ArchiveCorps (run by Jason) has a Slack team for such Call To Arms.

OP: Have you attempted to renew your research card with the National Archive
to get more time?

~~~
rhema
Jason Scott has a nice podcast where he talks about a range of topics of
interest to the HN community.

~~~
pronoiac
Jason Scott's podcast:
[http://ascii.textfiles.com/podcast](http://ascii.textfiles.com/podcast)

------
bane
Give a call to Jason Scott at the internet archive? Get lots of friends to
help?

------
crispyambulance
3 days isn't enough time to do a quality job on all of them given "one shot".

Better to just prioritize some sample of discs and do the best you can--
consider it a pilot run. If you monitor everything you'll be able to use the
pilot run to estimate and propose what it would take in terms of equipment,
time and process to do the whole thing in a reasonable time.

------
nickysielicki
I've done this with an old PC, 6 DVD drive bays and a bootable USB stick,
ripping to network storage, did like 200 movies in a weekend.

That PC sits in a basement now, I think. If anyone lives in Milwaukee and
wants to rip some DVDs over the holidays, get in touch, more than happy to
loan the machine out.

------
baroffoos
Do DVDs offer better error correction than CDs? I know from ripping CDs it can
take over an hour per CD to get an accurate rip because the program has to
read the disk multiple times to get the correct data out.

~~~
jonhohle
Do you have context on this? I’ve been ripping/verifying old CD based games
using raw cdrdao and can rip a CD with error correction in a few minutes. I
often sell games and use the same method to verify the integrity of the game
before sending it off.

~~~
oceanghost
Your rips are full of errors unless the drive you're using has C2 error
reporting (many do), and even then you've got a few errors.

The only way to get an accurate rip of audio data is to compare it to a known
good source. This is why Accurate Rip exists...

Source: I've ripped 4000cds :)

EDIT: This applies only to audio tracks. Data tracks don't suffer from this
issue.

~~~
fireattack
>The only way to get an accurate rip of audio data is to compare it to a known
good source

That's the only way to _confirm_ you got an accurate rip. It says nothing
about how to get the rip in the first place. In my experience, "full of
errors" sounds exaggerated. Most CDs can be easily ripped in 'burst mode' (in
EAC) or the equivalent in other software, and you still get the same (exact
CRC32) result from the AccurateRip DB with no C2 involved (I disabled it), as
long as you set the offset correctly.

Disclaimer: I didn't rip 4000 CDs, only hundreds.
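The checksum-comparison idea in miniature, as a hypothetical sketch: plain CRC32 over the whole rip is used here purely for illustration. AccurateRip's real checksum is its own scheme and also accounts for drive read offsets.

```python
import zlib

def rip_crc32(path, chunk_size=1 << 20):
    """Streaming CRC32 of a ripped file, so two copies can be compared
    without loading either into memory."""
    crc = 0
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk_size)
            if not block:
                break
            crc = zlib.crc32(block, crc)
    return crc & 0xFFFFFFFF
```

Two rips of the same disc should produce identical values; a mismatch tells you at least one rip has errors, though not which one, which is why comparing against an independent known-good source matters.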

~~~
oceanghost
You are technically correct, which is the best kind of correct :-)

------
mgamache
Make sure to test the copies. I used to do bulk copies years ago, and the
media + writing speed would make a difference. Writing at full speed would
technically work, but the discs were sometimes unreadable in some readers.

'Back in my day' I would chain FireWire drives together (better than USB/Hubs
at the time)

~~~
userbinator
With that time constraint I wouldn't bother re-burning to discs, just rip them
to a HDD (or several) to save the data for later.

------
slfnflctd
At 12 hours of labor per day over three days, that's roughly one disc per
minute.

So, at minimum, you need one machine more than the number of minutes it takes
a machine to rip one DVD. If you also have to burn a second set of DVDs, you
will additionally need at least one machine more than the number of minutes
it takes to burn a DVD, on top of the aforementioned ripping machines.

If you're just storing the DVD images after ripping, that part can be at
least partly automated: set up a machine as a central server (about 20
terabytes should do it if nothing goes wrong) and have each 'ripper' run a
script to copy the image up after the rip is done (hopefully while also
starting the next rip; if not, you'll need even more machines).
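The "run a script to copy the image up" step could look something like this hypothetical sketch, with a mounted network share standing in for the central server (the paths and .iso naming are assumptions):

```python
import shutil
from pathlib import Path

def upload_finished(staging: Path, server: Path):
    """Move finished rips from local staging to the central store,
    deleting the local copy only once a size check passes."""
    server.mkdir(parents=True, exist_ok=True)
    moved = []
    for iso in sorted(staging.glob("*.iso")):
        dest = server / iso.name
        shutil.copy2(iso, dest)
        if dest.stat().st_size == iso.stat().st_size:  # cheap sanity check
            iso.unlink()          # free local space for the next rip
            moved.append(dest.name)
    return moved
```

Run from cron or a loop on each ripping machine, this keeps local disks clear while the rips accumulate centrally.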

------
nuguy
I wanted to do this but with Blu-rays. My goal was to build a library of
every movie ever made that was worth seeing. It seemed like 2018 was a good
time to do this, since good movies have apparently stopped being made.
Anyway, if you want to watch tons of good movies, you normally end up paying
tons of money to rent them from iTunes. And even then you only get to see
each one once, and you have to have an internet connection. And streaming
services don't have even a fraction of the selection needed. But Netflix's
mail DVD service seems to have every movie I can think of. So why not open a
few Netflix accounts, order discs in the mail, and just save all the disc
images?

It seemed like a good idea until I looked into Blu-ray copy protection. Of
course, I wanted my library to consist of only the highest quality and
highest fidelity, so Blu-rays were called for. But Blu-ray copy protection is
devious, ingenious, and very effective. Each disc consists of two regions: a
region that holds encrypted movie data and a region that holds a key. It is
illegal to sell players that read the key and forward it to a user-facing
interface like a computer. Players must only read the key in order to use it
internally to decrypt the movie data. This stops all legitimate entities
from selling players that reveal the key to the user.

But what about illegitimate entities that might want to sell modified players
that provide the key? Or just publish keys online? Well, the key on the disc
is itself actually encrypted, and it is encrypted in such a way that multiple
keys can decrypt it. Blu-ray players come with special hardware that is
flashed with a key at the factory, and this hardware uses that key to decrypt
the Blu-ray's key. In the event that a hardware key is compromised and
published online, or used widely in any way, that key is deprecated, and all
Blu-rays from that point onward contain keys that cannot be decrypted with
the compromised hardware key. Instead, a newer key is used. This new key is
still able to decrypt all the old Blu-ray keys as well as all the new ones.
This effectively defeats people publishing keys online. It's ingenious in
that the people who conceived it realized that the only time key compromise
is a problem is when those keys are disseminated widely, and that when keys
are disseminated widely they are easy for authorities to detect.

If you want to get perfect rips of any Blu-ray you might come across, you are
forced to go through the pain of probing the hardware yourself to get that
key, which is quite difficult. There's no way around it.

~~~
zwily
I’ve ripped every Blu-Ray I’ve purchased easily with MakeMKV, probably a
couple hundred. I dunno, but someone is making it seem easy....

~~~
nuguy
Like I said, MakeMKV (which is indeed the idiomatic tool for the job) will
never be a reliable way to rip _any_ Blu-ray. For my purposes, it was very
important to be able to rip absolutely anything I came across. If you've got
some Blu-rays and want to try it out, then fine. It will probably work if
they are older movies and were "pressed" long ago. The Blu-ray copy
protection scheme also has the quality of making legitimate players obsolete
if they were the source of stolen keys. So even a legitimate older player
might not be able to read a new Blu-ray.

~~~
zwily
You may be right, but that doesn't reflect my experience. Most of the BRs I
rip are just barely released (pressed), and I have a 100% success rate with
MakeMKV. They all literally worked on the first try, even new releases. Maybe
I'm just lucky.

The _only_ problem I’ve had is when they try to make it hard by putting a
gazillion titles on the disk to make it hard for you to figure out which is
the right one. That sucks, but is not insurmountable.

~~~
onychomys
This has been my experience too. I've never had makemkv fail to open
something. I suppose that if the dude who runs it ever gets hit by a bus or
something, then it might not be useful going forward from then, but presumably
somebody else would take up the work.

------
lph
My hands are bleeding just thinking about the logistics of opening and closing
2000+ DVD cases, and loading/unloading that many DVDs, over the span of three
days. Ouch.

------
taurath
My tip would be: find a nearby game company, especially a publisher. They
tend to have large amounts of extra burning towers from the pre-USB-drive
days. 2000 one-to-one copies are a bit more difficult than 2000 copies of one
disc, though. I'd bet there's a copy function on some of them.

------
turtlebits
You need to find one of those FireWire Sony 200-disc DVD changers. Used to
have one hooked up to my HTPC. Just a standard DVD writer in there.
Maybe you could reach out and borrow one?

From a quick google, it's the VGP-XL1B/VGP-XL1B2/VGP-XL1B3

------
umvi
There is a company in Utah called "VidAngel" which I believe had built up a
fairly sophisticated DVD ripping/streaming operation until they got sued by
Disney.

They are still around but have since switched to a streaming only model. You
could reach out and see if they still have their DVD ripping infrastructure,
I'm sure they would love to help.

------
PinkMilkshake
Look for a local gaming cafe and rent all of their machines.

------
baldfat
One PC with 8 DVD R/W Drives.

I used to own one in the 90s. I would do work for churches and conferences. I
could get a thousand done in a few days. They used to have automatic feeder
drives for producing more. No one sells them anymore, it appears.

------
z3t4
40 TB of HDD and 10 (high speed) DVD readers with two guys should get the job
done in one day. I wonder, though, whether Amazon used DRM? That would be an
issue.

------
patrickg_zill
700 per day. About 30 per hour. Each DVD is what, 4 GB? Maybe 10 laptops with
SSDs or external hard drives would be enough. And lots of caffeine!

~~~
marak830
Surely that could be pumped up: write a script to auto-grab the video files,
rename them to $dvd_name(n), and place them in a folder. Deal with combining
those files later (when you have more time).

Note: it's been a long time since I tried to pull DVD files, but if my
(rusty) memory serves, certain types of DVD readers will just let you copy
the VOBs.

~~~
toomuchtodo
You’d want a block level image of the DVD for archival purposes, such as an
ISO or BIN format.

Using dd (Linux) or Disk Utility (Mac) makes this straightforward.
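What dd does there, expressed in Python terms (the /dev/sr0 device name is an assumed Linux path; hashing alongside the copy is an addition so the image can be verified later):

```python
import hashlib

def image_device(device="/dev/sr0", out="disc.iso", block_size=1 << 20):
    """Stream a block device into an image file in fixed-size blocks,
    returning the SHA-256 of everything written."""
    sha = hashlib.sha256()
    with open(device, "rb") as src, open(out, "wb") as dst:
        while True:
            chunk = src.read(block_size)
            if not chunk:
                break
            dst.write(chunk)
            sha.update(chunk)
    return sha.hexdigest()
```

Keeping the digest next to each .iso means a later bit-flip on the storage drives can be detected without re-reading the original disc.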

~~~
cookiecaper
Though it's probably not the case here, there are some discs with copy
protection that specifically trips up block-level copies. The drive will
continually error and misread sectors. I'm not sure what open-source tooling
can be used to circumvent; I always end up falling back to AnyDVD.

~~~
jacobush
[https://www.makemkv.com/forum/viewtopic.php?t=17138](https://www.makemkv.com/forum/viewtopic.php?t=17138)
scripted

------
porlune
Does anyone know why they can't just network the file system?

If he has network access, then it might be faster to just upload the entire
set directly to another drive over the internet. Most universities have
fairly good connections, so that shouldn't be any slower than trying to write
them all to DVDs.

------
simplecomplex
Contact archive.org, this is what they do.

------
JustSomeNobody
Get several cheap laptops with DVD drives. Install Ubuntu. Buy a bunch of USB
HDs. Use DVD::Rip to extract the VOBs onto the USB drives. At some point
later, convert the VOB data to ISOs. Burn to DVD as time permits.

------
onion2k
Complete speculation here and I have no idea if it's actually possible, but
couldn't you take a high resolution photograph of the DVD and then use
software to turn the image data back into DVD data? Maybe with a couple of
different colored light sources at obtuse angles to the disc? It's obviously
not a practical solution but it'd be _really_ fast if you could make it work.

It's definitely possible with phonograph records.

~~~
hamandcheese
Maybe if you shot a laser at the DVD and imaged 1 bit at a time. And then
maybe you could rotate the disc while you do this, so you can get a lot of
bits in rapid succession.

~~~
alias_neo
I don't usually come to Hacker News to laugh hysterically, but thank you for
that great start to the day.

------
netrap
There are many old SCSI and Firewire DVD changers available. Just gotta ask
the right people.

------
newname2018
Create a directory and a file on the target machine first, with a name like
/myfile/thisfile. Then, to create a bit-by-bit copy of the disc, use a dd
command something like this:

    dd if=/dev/cdrom bs=1 of=/myfile/thisfile.iso conv=notrunc

~~~
Dylan16807
Why set the block size to 1? Isn't that just going to cause a ridiculous
amount of syscalls?

~~~
ars
Yah, and be incredibly slow as well. It should be at a minimum 32K which is
the DVD block size, but even larger would be better (but may interleave badly
with IO to other devices).

~~~
anticensor
It should be 2048 which is DVD sector size.

~~~
ars
Because of ECC it doesn't read a single sector at a time. And actually even
32K is not large enough for good performance, you should go higher, maybe 1MB.
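The arithmetic behind this subthread: dd issues one read() per block, so the block size directly sets the syscall count for a given image size (a nominal 4.7 GB single-layer DVD is assumed here):

```python
# Number of read() calls dd issues = image size / block size (rounded up).
DVD5_BYTES = 4_700_000_000                 # nominal single-layer capacity

reads_bs1 = DVD5_BYTES                     # bs=1: one syscall per byte
reads_bs2048 = DVD5_BYTES // 2048          # one per 2048-byte sector
reads_bs1m = -(-DVD5_BYTES // (1 << 20))   # ceiling; a few thousand at bs=1M
```

Going from bs=1 to bs=1M cuts the syscall count by roughly a factor of a million, which is why the small block sizes are so painfully slow.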

