What is the “two-drive trick” that can read Amiga disks on a PC? (retrocomputing.stackexchange.com)
151 points by _Microft on Nov 15, 2019 | 43 comments



Maybe I am a bit sentimental here, but I found the low-level tinkering I did with computers during school much more interesting than the high-level things I learned to do later. Things like toggling the keyboard's LEDs for the very first time felt just great. That reminds me how easy it was to get started with it (QBasic!). I think it was the lack of abstraction and the verbosity and 'speaking' names of some statements (SELECT CASE, GOTO) that made it so accessible for someone with no teacher and without any prior experience with programming. Even GOTOs themselves might have helped on that journey, as they made the control flow much more comprehensible. Stuff happens in your program and IF you want it to happen again THEN GOTO the beginning. That it worked out of the box and didn't require any setup might have contributed as well.


Steve Wozniak talks about this in his autobiography: that, basically, today's generation is mostly missing an experience he was lucky to have - a complete understanding of a machine and all of its parts.


I respectfully disagree. An Arduino can be fully understood from the ground up: from voltage levels, to the clock, to the simple assembly language (and compiler). The ATmega328P is an incredibly simple design... rugged enough to withstand a variety of voltages (1.8V to 5V inputs), and it can even be used on a breadboard.
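
For a taste of that, here's roughly all it takes to blink the Uno's onboard LED with nothing but datasheet register names (a sketch assuming avr-gcc/avr-libc and the usual 16 MHz part; PB5 is the pin wired to the LED on an Uno):

    // Bare-metal blink on the ATmega328P (avr-gcc / avr-libc).
    #define F_CPU 16000000UL      // assume the usual 16 MHz clock

    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= _BV(DDB5);        // PB5 (Uno's onboard LED) as output
        for (;;) {
            PINB = _BV(PINB5);    // writing 1 to PINB toggles the output pin
            _delay_ms(500);
        }
    }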

Even ignoring Arduino, the "toy computers" that people make today are Minecraft Redstone, Factorio circuits, and Dwarf Fortress river-contraptions. Complete computer architectures built up from the fundamentals for the video-game computer scientists of the future.

People play games and use toys because it's possible to understand everything.

EDIT: Oh, and the TI-83 calculator. Overpriced, obsolete garbage of a calculator, but since it's the "standard" calculator of high school, plenty of children will get the opportunity to really learn that machine if they so choose. There's something to be said about the TI-83 community keeping that Z80 chip so well documented and alive...


Except you have no reason to use those things, unless you specifically go out of your way. In Woz's day, you HAD to work on that level to work with budget systems.

The same opportunity may exist, but the same motivation does not.


> Except you have no reason to use those things, unless you specifically go out of your way. In Woz's day, you HAD to work on that level to work with budget systems.

In Woz's day, most people were playing outside yelling "NERD" at the people who were working on computers. No one had to stay inside to practice coding or working with the machine back then.

I think people forget how antisocial it was considered to actually work on computers back then. I got the tail end of it as a 90s kid, but it was probably much worse in the 70s and 80s.

--------

In any case, I'd say it's cheaper and easier to know about things today than it was back then. Sure, select communities would be photocopying the Unix source code (e.g. Lions' Commentary on UNIX), but you still had to hope such tomes existed in your local library. It wasn't like today, where you can just download obscure documentation on chips for free, instantly.

Do you want to know how AMD GPUs work? Just read the docs. https://gpuopen.com/wp-content/uploads/2019/08/RDNA_Shader_I...

You used to have to pay money to get printouts of such docs. I'm pretty sure my local library wasn't holding the processor manuals or schematics for the Amiga back then.

Realistically speaking, if it's the 80s, you were getting by on the Commodore 64 User's Guide alone. https://www.lemon64.com/manual/

Frankly: There's far more information available today for Rasp. Pi or Arduinos than the 192-page Commodore 64 book (as influential as the Commodore 64 book was... it was still one beginner-level book).


The C64 Programmer's Reference Guide was available at many bookstores; is the GPU of the Raspberry Pi documented?


https://github.com/hermanhermitage/videocoreiv/wiki/VideoCor...

Documented about as much as Lions' commentary documented Unix. Gotta dance around NDAs and all, but they had to deal with that in the 80s as well.


And I mean, there's no public source for the VideoCore processor. You can't get it even with a wink and a nod like you could with Lions' commentary.


The Lions commentary was basically a work of literature for the 80s developer. Very few people were running DEC PDP-11s in the 80s; you couldn't really run the code.

Don't get me wrong: the data in the Lions book inspired later operating systems and is very important. But it's a very difficult read, ill-suited for the general audience. The code does NOT run outside of a DEC PDP-11 (and almost no one had that computer by the 80s, nor the compilers or tools needed to actually generate the code), so it really was just a work of literature and OS study more so than actual technical documentation.

-------

In contrast, you can buy a complete Rasp. Pi system for $35, with a myriad of books and materials on programming (Python, C, Java, even GPU-coding).

Then you can buy an Arduino for $20, and a breadboard kit (wires, breadboard, resistors, etc. etc.) under $100 and get cracking today. Complete understanding of the machine, and very cheap and accessible.

Even the Commodore 64 was $500+, in 80s money (so inflation adjust as appropriate).

-------

The 2010s are way better for learning programming than the 80s ever were. Not only is computing a respected hobby these days... but it's so cheap, and information is freely available.


Tons of people had access to PDP-11s at the time; they sold half a million or so of them. It was super common for high schools to have one or two, for instance. And the vast majority of them were running UNIX, since you could get it for the cost of the blank tape at the time; AT&T was forbidden from selling it due to antitrust regulations.

And you're acting like it's some impenetrable tome. Give it a read; it's a better intro to OS dev than Tanenbaum, IMO.

And an RPi still has tons of hidden bits, and an Arduino is even less powerful than most PDP-11s.


> And you're acting like it's some impenetrable tome. Give it a read; it's a better intro to OS dev than Tanenbaum, IMO.

That's not my point. Virtually no one was typing in code from the Lions book and compiling it.

Today, if someone has a bunch of code written for the Rasp. Pi, you can download it from GitHub, change a few lines (to blink a different light or something), and then run your own changed version yourself.

The C64 manuals of the 80s were the most similar in this regard: people would faithfully type the code written in the C64 manuals into their C64, and "learn" by tweaking those programs.

In contrast, the Lions book was basically theory. You wouldn't actually change the code; it was there for deeper learning purposes.

> And an RPi still has tons of hidden bits, and an Arduino is even less powerful than most PDP-11s.

Arduino is the beginner microcontroller. You'll quickly graduate to STM32 if you need to push more processing power.

But with that being said: the Arduino / ATmega328P is built like a tank. The lax electrical characteristics make the ATmega328P much better for beginner electronics engineers: you can be quite far off on voltages and still have a working system.

In contrast: send 5V down a 3.3V pin on an STM32, and you'll fry it. The ATmega328P can actually take that kind of abuse in most cases.

The STM32 is fully open and small enough to fully understand as well. But I dare say that the power of an Arduino / ATmega328P is enough for most electronics projects. Having wider tolerances is better, IMO, rather than spec-chasing.

-------

Or hell, buy both. The STM32 is $10 and the ATMega328p boards are also $10.

https://www.digikey.com/product-detail/en/stmicroelectronics...

https://www.digikey.com/product-detail/en/microchip-technolo...

This isn't a $500+ investment like it was back in the 80s. Things are way cheaper and easier to do, with plenty of free documentation. I mean, we have GitHub these days to share code and examples with friends. It's just so much easier to collaborate and learn compared to the past.


There were tons of people recompiling the code in Lions'. You just didn't type it in; you pulled it off of DECtape. Lions' was a huge part of why Unix proliferated so much.

Like, the whole point of xv6 is to recreate that environment.


I would say that I started programming in BASIC, and then later Z80 assembly solely as a result of reading the manual that came with my ZX Spectrum:

https://www.worldofspectrum.org/ZXBasicManual/

One of the things that made computers back then interesting was that they were well-documented. (Of course I only started this because I received the computer for Christmas, and the cassette-player was broken so we couldn't load any of the bundled games for about ten days. The local computer shop didn't open again until the new year!)


> In any case, I'd say it's cheaper and easier to know about things today than it was back then.

That might be true, but it isn't required. It's cheaper and easier to access information today than it's ever been, but that doesn't mean people are more literate and more informed - they aren't. Years ago, using computers and getting everything to work required that you learn at least the basics, tinker with different things, and get a basic understanding of them in order to make them work. These days most people are using tablets and other simple, graphically designed interfaces that don't require (or in many cases allow) any substantial tinkering in order to get them to work or do what you want them to do.

Necessity is the mother of invention, not availability.


Eh, embedded, robotics, mechatronics and the things we do with them are way bigger than the nascent computing industry ever was. More people "have" to play with that stuff than ever had to earlier. And we have better tools, and it's more fun.


I got a chance to ask Woz a question at a speaking engagement. I asked him what he thought about the fact that I don't own my iOS device like I owned my Apple II computers - that I can't tinker with the software and hardware and do what I please with it. His reply, which I don't recall verbatim, was to the effect of: "We have to have locked-down devices today so we can have security."

It made me very, very sad.


Man, that's a lame response. I really have to wonder if we didn't just get incredibly lucky that the initial culture of hackers that built the open source community and internet infrastructure was one that was very non-commercialized. I really have to wonder whether all the companies that are thriving now on locked-down hardware and proprietary systems - the same ones pushing everything to their proprietary cloud platforms - would even be here and able to do half of what they do without that culture of selflessness.

If it didn't already exist and have such momentum, I'm really skeptical that anyone would reproduce the concept of free and open source software in today's day and age.


I mean, your Apple II wasn't constantly connected to a network full of hostile actors and executing code downloaded over the network without explicit confirmation.


I miss having low-level access to video memory.

I've kinda wanted to experiment with graphics in a way that would require me to calculate individual pixels, but AFAIK, unless you use pixel shaders, it is ungodly slow.

Used to be that you could color a pixel blue with a single MOV instruction. Now, you either use a graphics API that has a SetPixel function (which is incredibly slow for drawing full-screen graphics), or create a byte array of some sort that gets translated, which, while faster, still isn't fast enough.
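
For anyone who never saw it, the "single MOV" world looked roughly like this (a sketch assuming real-mode DOS, VGA mode 13h, and an old 16-bit compiler like Turbo C with far pointers):

    /* VGA mode 13h: 320x200, 256 colors, framebuffer at segment A000h. */
    #include <dos.h>

    static unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);

    void set_mode_13h(void)
    {
        union REGS r;
        r.x.ax = 0x0013;              /* INT 10h, AH=00h: set video mode 13h */
        int86(0x10, &r, &r);
    }

    void put_pixel(int x, int y, unsigned char color)
    {
        vga[y * 320 + x] = color;     /* compiles down to little more than a MOV */
    }

    int main(void)
    {
        set_mode_13h();
        put_pixel(160, 100, 1);       /* palette index 1 is blue in the default palette */
        return 0;
    }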


You have ample room to define the hardware you want to see, though. [0] Setting bytes in a buffer isn't some ultimate graphics abstraction: it's just one way to express it at a low level, and only capable of drawing anything of interest in the sense that a Turing Machine can compute anything that can be expressed digitally. You're gonna crave higher level concepts really quickly. And going higher level does come with its own limits, but that's always been the case. Every sufficiently complex production graphics renderer will eventually develop some kind of hybrid approach to the data representation while achieving the necessary flexibility: Simply define line segment drawing, and boom, suddenly you have a vector-to-raster pipeline. Or, maybe you have a sprite all laid out perfectly for fast memory copies. But then you have to clip it against the screen bounds, add a mask so that it can overdraw, and so on...and so you end up carrying geometry and color data around too. And if you desire to do something fancy like scale and rotate it, you will probably define it not in terms of bytes but in terms of linear algebra, so now you have a requirement for some matrix math.
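
To make the line-segment point concrete, this is about all it takes before "vector" endpoints start rasterizing themselves into that byte buffer (a plain C sketch, integer Bresenham; the buffer layout and names are mine, not any particular API's):

    #include <stdlib.h>

    /* Rasterize a line segment into a 1-byte-per-pixel framebuffer
     * using integer Bresenham, clipping each plot to the buffer bounds. */
    void draw_line(unsigned char *fb, int width, int height,
                   int x0, int y0, int x1, int y1, unsigned char color)
    {
        int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
        int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
        int err = dx + dy;

        for (;;) {
            if (x0 >= 0 && x0 < width && y0 >= 0 && y0 < height)
                fb[y0 * width + x0] = color;
            if (x0 == x1 && y0 == y1)
                break;
            int e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; }
            if (e2 <= dx) { err += dx; y0 += sy; }
        }
    }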

And the only way in which we're more limited is when we start talking about distributing the work, creating products. Then we start saying, "well, that API doesn't let me dip into the hardware to create the abstraction I need." But that's because we moved the goalposts from "it can run on a computer" to "it can run on many people's computers, in many different contexts." And back in the day that really was never a guaranteed thing. There were stark differences between platforms, and many ports amounted to remakes.

So, then, does it matter? Does "minimalism" in programming even functionally exist, given our insatiable appetite for dependencies?

[0] https://prog21.dadgum.com/66.html


> or create a byte-array of some sort that gets translated, which while faster, still isn't fast enough.

Except, of course, about 2-3 orders of magnitude faster than the best SVGA era cards were.

I mean, you've got many GB/s of bandwidth now. Yes, that Tseng Labs ET4000 was amazing, with 7 MB/s over ISA, beating most (all?) other cards I bothered to try back then.

Or some basic (but fast) later PCI card like the S3 Trio64+ at 50 MB/s (IIRC).

PCIe 3.0 can do up to 16 GB/s over 16 lanes. Even if it were 7 GB/s in practice, we're still talking about a 1000x improvement compared to the best ISA SVGA card. You also have bus mastering and the ability to map buffers in such a way that the GPU itself has access to the data.

Setting one pixel might be slow. But who cares when you can just have your "framebuffer" in system RAM and "MOV" away!


It's pretty fast to work with a pixel framebuffer and just blit it to the screen with GDI, DirectX, OpenGL, or something. You might not be able to do 1920x1200 at 60fps but you can do something smaller and stretch it to fullscreen. I wrote a minimal framebuffer library for doing this, because I just wanted to work with pixels: https://github.com/samizzo/pixie

I used it for the Windows port of an MS-DOS demo I wrote recently: https://www.pouet.net/prod.php?which=83648
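
For the curious, the GDI path boils down to roughly this (a sketch, not pixie's actual code: a 32-bit top-down DIB stretched over the window's client area each frame with StretchDIBits):

    #include <windows.h>
    #include <stdint.h>

    /* Blit a 32-bit BGRX pixel buffer onto a window, stretched to fit.
     * The negative biHeight makes the DIB top-down, so pixels[0] is top-left. */
    void present(HWND hwnd, const uint32_t *pixels, int w, int h)
    {
        BITMAPINFO bmi = {0};
        bmi.bmiHeader.biSize        = sizeof(bmi.bmiHeader);
        bmi.bmiHeader.biWidth       = w;
        bmi.bmiHeader.biHeight      = -h;
        bmi.bmiHeader.biPlanes      = 1;
        bmi.bmiHeader.biBitCount    = 32;
        bmi.bmiHeader.biCompression = BI_RGB;

        RECT rc;
        GetClientRect(hwnd, &rc);

        HDC dc = GetDC(hwnd);
        StretchDIBits(dc,
                      0, 0, rc.right - rc.left, rc.bottom - rc.top,  /* destination */
                      0, 0, w, h,                                    /* source */
                      pixels, &bmi, DIB_RGB_COLORS, SRCCOPY);
        ReleaseDC(hwnd, dc);
    }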


You could try old systems. The GameBoy Advance is reasonably well-documented[0] (enough for me to write a simple game for it over the summer as my first real C program), and a hacked 3DS[1] can run the game on real GameBoy hardware through Virtual Console injection.

[0] http://coranac.com/tonc/text/ [1] reddit.com/r/3dshacks
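
To give a sense of how approachable it is: in mode 3 the whole "hello pixel" is a couple of memory-mapped writes (a C sketch using the addresses documented in tonc; the macro names are mine):

    /* GBA mode 3: 240x160, 15-bit color, VRAM is a plain framebuffer. */
    #include <stdint.h>

    #define REG_DISPCNT (*(volatile uint16_t *)0x04000000)
    #define VRAM        ((volatile uint16_t *)0x06000000)
    #define MODE3       0x0003
    #define BG2_ON      0x0400

    static inline uint16_t rgb15(int r, int g, int b)   /* 5 bits per channel */
    {
        return (uint16_t)(r | (g << 5) | (b << 10));
    }

    int main(void)
    {
        REG_DISPCNT = MODE3 | BG2_ON;             /* bitmap mode 3, background 2 on */

        VRAM[80 * 240 + 112] = rgb15(31, 0, 0);   /* red pixel near the middle */
        VRAM[80 * 240 + 120] = rgb15(0, 31, 0);   /* green */
        VRAM[80 * 240 + 128] = rgb15(0, 0, 31);   /* blue */

        for (;;) { }                              /* no OS to return to */
    }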


I was always playing with electronics in school - a student project, or my friends' project, or some project that came up in a computer lab class. My mom used to make me play with calculators and other electronic devices. I'd sit down, turn off the lights and turn on a few lights, and play with a little bit of circuitry on a breadboard, until it became my hobby. I was always fascinated by them and always wanted to learn how to program computers.

But I wanted to do something more complex than just play with my calculator: I wanted to learn how to program. I started using the computer in the fourth grade, and my friends and teachers were very supportive. My first year of high school was an amazing experience. In college, I worked as a web designer for about two years. Then I switched to an education department and did research on computers, and also spent some time working on my own, building a web application which I called "Math Tutor". After a few years of designing and developing web apps, I decided to go into education full time.

As a teacher, I've seen the power of computer programs and how they can be used to enhance learning.


Well, it's cool that these days you can learn JavaScript and have your code do something on anything with a browser, which includes almost all computers and phones. Abstraction isn't all bad.


Seriously! I really miss setting IRQs with jumpers (or, on the fancy stuff, DIP switches). It was fascinating having an excuse to learn how interrupts worked.


That's a really cool hack. I wonder if it would work on an 82077AA or compatibles? I might give that a try later this month after I resurrect my A500.

Presumably it should be possible to create a simple drive emulator to spit out that sector/track header without needing an actual disk. Not sure if I'm smart enough to tackle that but it's worth a try!

I'm already working on a 6502 system at the moment that has an 82077AA-compatible FDC and can read PC-formatted floppies, so it should be relatively easy for me to tool around with this hack.
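
On the emulator idea: the part the PC controller actually needs to see is an IBM-format ID field before each "sector". As a very rough sketch of just the payload bytes (the standard IBM/uPD765 layout; this ignores gaps, the 0x00 sync run, and the missing-clock bits on the A1 marks, which all have to be handled at the MFM/flux level - worth double-checking against the 82077AA datasheet):

    #include <stdint.h>
    #include <stdio.h>

    /* CRC-16/CCITT as used for floppy ID and data fields: poly 0x1021, init 0xFFFF. */
    static uint16_t crc16(uint16_t crc, const uint8_t *p, int n)
    {
        while (n--) {
            crc ^= (uint16_t)(*p++) << 8;
            for (int i = 0; i < 8; i++)
                crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                     : (uint16_t)(crc << 1);
        }
        return crc;
    }

    /* Build the ID field the FDC looks for: three A1 sync marks, the 0xFE
     * ID address mark, cylinder/head/sector/size, then a CRC covering
     * everything from the first A1 onward. Size code 2 means 512 bytes. */
    int build_id_field(uint8_t *out, uint8_t cyl, uint8_t head,
                       uint8_t sec, uint8_t size)
    {
        uint8_t *p = out;
        *p++ = 0xA1; *p++ = 0xA1; *p++ = 0xA1;
        *p++ = 0xFE;
        *p++ = cyl; *p++ = head; *p++ = sec; *p++ = size;

        uint16_t crc = crc16(0xFFFF, out, (int)(p - out));
        *p++ = (uint8_t)(crc >> 8);
        *p++ = (uint8_t)(crc & 0xFF);
        return (int)(p - out);
    }

    int main(void)
    {
        uint8_t id[16];
        int n = build_id_field(id, 0, 0, 1, 2);   /* track 0, head 0, sector 1 */
        for (int i = 0; i < n; i++)
            printf("%02X ", id[i]);
        printf("\n");
        return 0;
    }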


There are now quite a few floppy emulators of all kinds of flavors.

One: https://github.com/keirf/FlashFloppy


Yeah, I've got a Gotek for my A500, but I like to take on stupid useless projects, like a uC that does nothing except spit out sector headers lol


I used this trick to back up my 5000 disk Amiga collection in the 90s. Bought a few extra floppy drives for $10 each at the local flea market :)


According to the writeup this trick was invented in December 1999; was it known before?


It was definitely before 1997, because that's when I moved to Japan.


Very clever. Presumably this would be unable to read high density (1.76MB) Amiga disks.


Commodore never upgraded the Paula chip (knowing that company, they probably fired the designer immediately and didn't keep documentation for the chip); every model got the same 1985-state-of-the-art double-density floppy controller all the way to the 1994 bankruptcy. As LIV2 already mentioned, even their top-of-the-line $3.5k A4000 workstation shipped with a gimped floppy subsystem.


The floppy drive could read/write high-density 1.76MB disks. I know, I had one. You're right that the Paula chip wasn't upgraded to handle double the data rate from the drive; the solution instead was that the disk spun at half the speed.


Amiga HD floppy drives spun at 150 RPM rather than the usual 300, from my understanding, because the chipset couldn't handle a higher bitrate, whereas on a PC the FDC switches between 250/500 kbit/s modes. So I wonder if a PC controller could indeed read an Amiga HD floppy at all, or if there are other differences?


Would probably depend on the drive. There were utilities out there in DOS land that let you format higher than normal, usually by playing with the sector/track sizes. But it depended on the drive letting you do it. The best I ever got was ~1.6MB out of my drives/media.


The trick works by using the double sampling rate to inspect a disk surface recorded at the normal data rate. When the disk surface already carries data at the double rate, the drive doesn't have a quadruple-rate mode with which to read it.


Suppose someone still had a box or two of old Amiga floppies but no old hardware - what is the easiest way in 2019 to turn them into .adfs?


Not the easiest way, but interesting. http://amiga.robsmithdev.co.uk/history



The link is right there, under the post: https://www.amigaforever.com/kb/13-118 - Easiest would be finding your local retro computer museum/club.


Life, Uh, Finds a Way.gif



