Turning a £400 BBC Micro (1981) into a $40k disc writer (1987) (scarybeastsecurity.blogspot.com)
34 points by scarybeast 28 days ago | 5 comments



>6522 VIA shift register

>Unfortunately, I was unable to get it to work.

Author blamed this on:

>I did not work out how to get the shift clock running continuously and smoothly. Even with attempts at precise timing for shift register reloading... reloading the shift register incurs a delay before shifting resumes.

This was in fact a hardware bug (http://forum.6502.org/viewtopic.php?t=342#p2310), and rather famously the reason the C64 had cripplingly slow (~300 bytes/s!) floppy drive communication, implemented through bitbanging in software even though the C64 used the fixed 6526 CIA: the earlier VIC-20 used the 6522, and the floppy drives had to stay compatible (https://en.wikipedia.org/wiki/Fast_loader).

Dropping that compatibility and using the 6526 CIA's hardware shift register lets the C128+1571 combo reach ~5 KB/s.


That's an interesting story of the 6522 VIA for sure :) I don't think it is what I was hitting though, for a few reasons:

1) "The 6522 has a bug in mode 011, shifting synchronous serial data in under control of external CB1 clock" -- this is not the mode I tried.

2) The bug appears to be intermittent data loss. What I had trouble with seemed deterministic: the chip is simply too slow to respond.

3) I wonder whether this hardware bug was ever fixed. The BBC Micro doesn't use the MOS 6522 VIA; it tends to use Synertek or Rockwell parts. It's unclear whether those companies were just producing the buggy MOS mask under licence, or something else.


The article links to a demo on the BBC Micro which is simply mindblowing - https://youtu.be/oK2D1EdFXMM


>"This is similar to the description of how the Dungeon Master fuzzy bits are written."

As can be seen in the screenshot, the 0x88 data bytes soon start reading back incorrectly and non-deterministically. But the variance isn't 100% random like weak bits -- the variance is whether the 0x8 bit is written late enough to have a chance of being missed. Even when it is missed, you can still eyeball patterns and themes in the madness.
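That "late enough to have a chance of being missed" behaviour is easy to model. Here's a toy sketch (my own illustration, not the author's setup -- all the timing numbers are made up): a transition written well inside the reader's detection window reads back deterministically, while one written right at the window edge flips between reads depending on timing noise.

```python
def read_bit(write_offset, jitter, window=2.0):
    """Toy model: a flux transition is detected only if, after
    read-to-read timing jitter, it still lands inside the reader's
    detection window for this bit cell (all units arbitrary)."""
    return write_offset + jitter < window

def sample_reads(write_offset, jitters):
    """Re-read the same written bit once per jitter sample."""
    return [read_bit(write_offset, j) for j in jitters]

# deterministic stand-in for read-to-read timing noise
jitters = [-0.3, -0.1, 0.0, 0.1, 0.3] * 4

# a transition written well inside the window always reads back
assert all(sample_reads(1.0, jitters))

# one written at the window edge sometimes reads, sometimes doesn't:
# a "fuzzy" bit, non-deterministic but not uniformly random
edge = sample_reads(2.0, jitters)
assert any(edge) and not all(edge)
```

The point of the sketch is that the variance comes entirely from where the bit was written relative to the window edge, which is why the read-back noise has "patterns and themes" rather than being 50/50 random.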

You know, Dungeon Master and copy protection aside, my intuitive mind says there is some as-yet unobserved or not-understood similarity between the concept of "fuzzy bits" on old-school floppy disks and that of qubits, from relatively new-school quantum physics...

Perhaps qubits could be constructed in the same way fuzzy bits are -- use a device which writes multiple pieces of data at a higher resolution -- then measure that data with a device that reads at a lower resolution.

Applied to qubits then, whatever the particle size, whatever the region of space that the qubit occupies -- attempt to "write" one by writing multiple points in that same space with a device that can modify smaller points within that space (if the qubit is a particle, then this would be sub-particles), and then read it with a device that reads the entire space (which should read probabilistically now, because it lacks the resolution of those smaller particles necessary for a true, repeatable reading...)

I could be completely wrong about this, of course.

But intuitively, I sense something there...

>"The above results are actually the application of fuzzy bit principles to FM encoded data. In FM encoding, every data bit is interleaved with a clock bit. This results in the bleeding of clock bits in to the data stream on occasion (see the 0xFF bytes in the first run above -- they are likely clock bits). The Dungeon Master protection uses fuzzy bits in conjunction with MFM. This leads to a calmer situation where the fuzzy bit drifts between two valid data bit encodings and does not mess up the clock!"
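The quoted FM behaviour is easy to demonstrate. A minimal sketch (my own toy code, not from the article): in FM, every data bit is preceded by an always-1 clock bit, so if the reader's framing slips by one cell it samples the clock bits as data and sees 0xFF -- exactly the "bleeding of clock bits into the data stream" described above.

```python
def fm_encode(byte):
    """FM encoding: each data bit is preceded by a clock bit (always 1)."""
    cells = []
    for i in range(7, -1, -1):
        data = (byte >> i) & 1
        cells += [1, data]  # clock cell, then data cell
    return cells

def decode(cells, offset):
    """Take every second cell starting at `offset` as the data bits."""
    value = 0
    for b in cells[offset::2][:8]:
        value = (value << 1) | b
    return value

cells = fm_encode(0x88)

# correctly framed: the data bits are recovered
assert decode(cells, 1) == 0x88

# framing slipped by one cell: we read the clock bits, all 1s -> 0xFF
assert decode(cells, 0) == 0xFF
```

MFM avoids this dense clocking (a clock bit is written only between two 0 data bits), which is why, as the article notes, a fuzzy bit there can drift between two valid data encodings without disturbing the clock.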

I wonder what could be learned in physics if we tried similar techniques with small particles/small regions of space/small regions of space on substrates...

Anyway, physics aside, a truly fascinating article!


> Perhaps qubits could be constructed in the same way fuzzy bits are -- use a device which writes multiple pieces of data at a higher resolution -- then measure that data with a device that reads at a lower resolution.

https://www.nature.com/articles/s41534-019-0217-0

If I understand correctly, you're suggesting all that as a way to achieve quantum computing more cheaply. As far as I know, the effects used and needed in quantum computing aren't just the "randomness" of reading something, but much more. Physicists do attempt to achieve "quantum simulation" effects as cheaply as possible, but until now they have had to depend on complex setups:

"These quantum devices can be implemented in a large number of ways, for example, using ultracold trapped ions, cavity quantum electrodynamics (QED), photonic circuits, silicon quantum dots, and theoretically even by braiding, as yet unobserved, exotic collective excitations called non-abelian anyons. One of the most promising approaches is using superconducting circuits"

One example:

https://www.nature.com/articles/nphys2253

It doesn't appear to me that the "unreliable bits" read from a magnetic medium could be enough. (Disclaimer: not in that field.) It's always interesting, however, to learn more while trying to make an idea work.



