Good Apple (Bad Apple Remix) on the Soviet BK-0010/11M (pouet.net)
118 points by bane on Sept 12, 2018 | 48 comments

Fantastic, thanks for posting!

Pouet really doesn't make it obvious where to actually watch the demo. If anyone's confused, https://www.youtube.com/watch?v=8Q1vN51o-Dg is the place to be (unless you've somehow got the hardware to run their actual code).

If you, like me, thought it was only a couple of images on repeat - there's a youtube link in the bottom right.

Yes: https://www.youtube.com/watch?v=8Q1vN51o-Dg

Is the music also produced by the computer? I suppose that it is but it's hard to believe given the quality. Impressive work.

I think it's the COVOX 8-bit thing. Not sure what the specs are, but if you can load samples and loops and set them off at specific intervals, you can accomplish a whole lot with almost no burden to the main computer.

Edit: nope. The COVOX was a parallel-port DAC; the computer has to send out each sample itself. I suspect some light compression, pre-rendered audio segments, and an interrupt-driven routine to fetch data from the HDD and send enough samples per second that the sound doesn't get distorted.

> and send enough samples per second that the sound doesn't get distorted

That's kind of the awesome thing here: this machine isn't even capable of 1 MIPS, yet it has to pump audio out of the COVOX and read, decode, and display the video.

From the pouet page from one of the authors: "1. This computer has no DMA. We have to read all data "manually" from HDD registers, word by word. As well as switch heads, cylinders and sectors on fly.

2. This computer is very slow, it spends up to 72 CPU cycles to move a number from one memory cell to another. And it runs on 4 MHz only."
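A quick back-of-envelope check of the quoted figures (a sketch using only the numbers the author gives; 72 cycles is his worst case) shows just how tight the budget is:

```python
# Ceiling on memory-to-memory copy bandwidth for the BK-0010/11M,
# using the figures quoted above (a sketch, not measurements).
CLOCK_HZ = 4_000_000     # "it runs on 4 MHz only"
CYCLES_PER_MOV = 72      # worst-case cycles per memory-to-memory MOV
BYTES_PER_WORD = 2       # it's a 16-bit machine

moves_per_sec = CLOCK_HZ / CYCLES_PER_MOV
copy_bandwidth = moves_per_sec * BYTES_PER_WORD

print(f"{moves_per_sec:.0f} MOVs/s")        # ~55556 MOVs/s
print(f"{copy_bandwidth / 1000:.0f} kB/s")  # ~111 kB/s ceiling
```

Roughly 111 kB/s in the worst case, and that's with the CPU doing nothing but copying: no decoding, no display, no audio.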

> 2. This computer is very slow, it spends up to 72 CPU cycles to move a number from one memory cell to another. And it runs on 4 MHz only.

This is kind of shocking. Was the PDP-11 that bad?

Bloody hell... Covox!!!

I remember making a knock-off and having doubts if something this simple would work. But it did and it was absolutely magnificent. The PC stopped beeping and started making proper sounds. The sunset of the age of tangible computer wonders :)


Awesomeness. So when do we finally get rid of frameworks upon frameworks upon abstractions upon emulations upon frameworks upon abstractions upon emulations upon frameworks upon abstractions upon emulations upon frameworks upon abstractions upon emulations upon frameworks upon abstractions upon emulations upon frameworks upon abstractions upon emulations upon frameworks upon emulation done by the cpu?

Nothing has been stopping you this entire time from working directly with the hardware, so what are you waiting for? A pre-written set of routines for mapping the idiosyncrasies of different execution environments onto a neat mental model? Coupled perhaps with a set of tools for achieving best practices when using those models?

I wonder what we might call such a convenient arrangement...

What are you willing to give up to get there?

The frameworks make everything easier, more portable, more future-proof, more extensible, and quicker to develop. The only cost is that it's all slower and less elegant.

It's unnecessary. We are running a:

  CPU which does abstraction and containers
  OS on top
  a container on top
  an application toolkit
  a cross-platform development framework
  a browser embedded inside
  with containers
  with a javascript vm
  which runs a framework
  which generates its own, non-standard UI widgets.

What am I gaining? Only more layers, and possible flaws.

Oh, and don't get me started about all the languages

> What am I gaining?

"easier, more portable, more future-proof, more extensible, and quicker to develop"? I mean, people aren't doing it this way simply out of incompetence or for the hell of it.

It's also something of a local minimum in the platform wars. We can't have truly cross-platform everything because the platform vendors wouldn't be able to take their cut. The only reason the web survives as a cross-platform platform is it's where they've fought each other to a standstill.

>What am I gaining?

Free time to do something else but coding.

But isn't "coding" fun?

It is, but life is much more than just that.

The problem is not the layers, but the way they compose. Advanced fusion techniques do exist, but they're barely applicable to low-level languages like C.

We'd get a very fast single-program environment that isn't transferable to another, slightly different computer.

If the analogy suits you, look at the car industry: once in a while the bloat is removed and an end-of-the-line car is transformed into an entry-level one, with all the modern essentials there and none of the crap.

Or you get your own team and build the equivalent of a Subaru rally car by investing a similar budget.

Can anyone clarify what's going on here?

This looks like a Demoscene[0] site. It's sort of a contest to create artistic computer programs with the smallest possible executable size. I found a YouTube link in the description: https://www.youtube.com/watch?v=8Q1vN51o-Dg

Bad Apple[1] is a famous Flash animation from ~10 years ago set to a pop song remix of a video game stage theme. Because it's so stylized there's a whole bunch of parodies / homages that imitate it. Here it's parodying old iTunes commercials[2].

There's another famous demoscene program named 8088 Domination that includes a version of the same song: https://youtu.be/_yTsCOy7j0M?t=168

Keep in mind while you're watching these videos that they're showing full-motion video and audio playback on a machine with single-digit MHz of processing power. I like to think about this every time Slack pegs one of my cores to render an emoji.

The OP link says the program runs on a "BK-0010/11M" which Wikipedia says[3] is a Soviet clone of the PDP-11. Looks like it's a popular platform on this demoscene gallery.

[0] https://en.wikipedia.org/wiki/Demoscene

[1] https://www.youtube.com/watch?v=9lNZ_Rnr7Jc

[2] https://www.youtube.com/watch?v=X2HFiwfvsc0

[3] http://en.wikipedia.org/wiki/Electronika_BK

Good summary. A bit more generally, demos are often written to run under constrained conditions. They are also often written for competitions that set those conditions.

Sometimes, like in this case, the constraints are entirely dictated by the hardware, and the objective is just to make the most impressive demo given the particular device’s limitations, very often pushing the hardware way past what was commonly thought to be possible. Modifying the hardware is usually not allowed, but playing “dirty” in software by using undocumented and maybe even unintended, obscure implementation details of the platform is highly encouraged. No stone is left unturned.

8088MPH (by the same makers as 8088 Domination) is a very good example. For decades, nobody knew it was possible to display 1024 colors at the same time on the original PC (which nominally supports only 4 colors at the same time, which could be stretched to 8 with a trick). You can watch the demo here: https://youtu.be/yHXx3orN35Y And read about the methods used and invented here: https://trixter.oldskool.org/2015/04/07/8088-mph-we-break-al...

Sometimes, there are also additional constraints like size limits, which is especially popular on more advanced hardware like modern PCs. For example, the objective could be to make the most impressive demo in 1024 bytes.

>which could be stretched to 8 with a trick

The old tricks of 160x100 pseudo-text mode and composite artifact colors both allowed for 16 colors at once when used in the obvious manner. 8088MPH reached 1024 colors by combining both tricks.


Or 128 bytes:


(Yep, you read that right: 128 bytes. To give you an idea, the tiniest GIF image is 26 bytes long.) This post is 191 bytes long :-)

To add a little more commentary on why this is particularly impressive, they're more or less using the stock hardware, with the only "modifications" being some custom peripherals. But those peripherals still have to move data through the I/O ports of the original machine. The pouet comment by one of the authors says that it isn't even able to read data off of the IDE hard drive at 16kB/frame @ 30 fps (480kB/sec) because it's so slow.

On top of that, the COVOX audio device requires the CPU to generate the sound and pulse each sample out of the parallel port, where the device acts as a DAC. Here it's generating a 22.725 kHz stereo signal alongside the hard drive I/O and the computation required to decode and display the video stream.
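The audio alone eats a large slice of the machine. A rough cycle budget (a sketch using the figures from the comment above, assuming 8-bit samples per channel):

```python
# Rough COVOX playback budget on a 4 MHz machine, using the
# sample rate quoted above (a sketch; 8-bit samples assumed).
CLOCK_HZ = 4_000_000
SAMPLE_RATE = 22_725   # Hz, as quoted in the comment
CHANNELS = 2           # stereo

bytes_per_sec = SAMPLE_RATE * CHANNELS               # raw audio data rate
cycles_per_sample = CLOCK_HZ / (SAMPLE_RATE * CHANNELS)

print(f"{bytes_per_sec} B/s of audio")               # 45450 B/s
print(f"~{cycles_per_sample:.0f} cycles per output byte")
```

About 88 CPU cycles per output byte, on a machine where a single memory-to-memory move can cost 72 cycles, and the video decode and disk I/O have to fit in the same budget.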

This machine is ~0.5 MIPS and is a clone of a PDP-11.

By demoscene conventions, this is actually a controversial production, because it's technically a "video", and demoscene productions are expected to generate their display and run in real time. But many argue that the ability to harness this machine the way they did is computationally impressive enough to justify it.

However, there are some categories of demoscene works that do allow for these kinds of productions, and the debate is really over which category it should compete in, more than over what it is.

For comparison, here's what games on the BK normally look like https://www.youtube.com/watch?v=-IOre6AAcGc


> Bad Apple[1] is a famous Flash animation from ~10 years ago set to a pop song remix of a video game stage theme.

I just realized that the video can be summarized as "a downsampling of a parody of a music video of a remix of a song from a niche Japanese video game."

The internet is weird.

Addendum: I just told some friends in chat that I was watching a downsampling of a parody of a music video of a remix of a song from a niche Japanese video game running on a Soviet personal computer from 1984.

They called me a hipster. :<

It's a demo. Coded in assembler. On a horribly slow (4 MHz) Russian PDP-11 clone with no DMA capabilities. 72 cycles to move a word from one memory location to another.


As such, it's a very impressive feat of coding prowess.

Compare and contrast with web "programming".

The clock might have been 4MHz, but the actual MIPS was only 0.5, IIRC.

The beauty of that cpu/architecture is that you could write in machine code easily, no assembly required. For example, MOV a constant to a pointer in memory was code 012737, with 01 denoting the MOV command, 27 specifying the first operand as a constant residing in memory immediately following the instruction, and 37 specifying the second operand as a pointer, also in memory.

010102 was MOV register 1 to register 2.

I taught myself this in high school, because BASIC was too slow on it.
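The octal fields described above can be pulled apart mechanically. A tiny decoder (a sketch, not any authoritative tool) shows why the encoding reads so cleanly: each octal digit after the opcode is a 3-bit mode or register field.

```python
# Minimal decoder for PDP-11 double-operand instruction words,
# illustrating the octal field layout described above (a sketch).
def decode(word):
    """Split a 16-bit double-operand instruction into its fields."""
    opcode   = (word >> 12) & 0o17  # top 4 bits, e.g. 01 = MOV
    src_mode = (word >> 9)  & 0o7
    src_reg  = (word >> 6)  & 0o7
    dst_mode = (word >> 3)  & 0o7
    dst_reg  =  word        & 0o7
    return opcode, src_mode, src_reg, dst_mode, dst_reg

# MOV #imm, @addr -- the "constant to pointer" example above
print(decode(0o012737))  # (1, 2, 7, 3, 7): mode 2 + PC reads the
                         # word after the instruction (immediate)
# MOV R1, R2
print(decode(0o010102))  # (1, 0, 1, 0, 2): register to register
```

Mode 2 with register 7 (the PC) is what makes "the constant lives right after the instruction" fall out of the general addressing-mode scheme rather than being a special case.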

I was about to correct you but then "mind blown"

I can imagine how inefficient it is to have every command take 3 bytes, which kind of explains its slowness (but other 8-bit machines weren't much better in MIPS terms)

Two bytes: it's octal, and the system is a 16-bit machine. The leading octal digit is always 0 or 1, if I've understood https://en.wikipedia.org/wiki/PDP-11_architecture correctly.

Surprisingly similar orthogonality to ARM.

It's extremely efficient, actually. After a while spent in the monitor on the C=64 (courtesy of the Final Cartridge III), we started just punching in hex opcodes instead of mnemonics; more efficient that way. I still remember that $4c is the hexadecimal opcode for the jmp instruction:

  4c 00 c0 jmp $c000
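Hand-assembling that JMP is just the opcode byte followed by the target address in little-endian order, which is why the bytes read "backwards" next to the mnemonic. A sketch:

```python
# Hand-assembling a 6502 absolute JMP, as in the listing above
# (a sketch; $4c is the JMP-absolute opcode, address is little-endian).
def asm_jmp(target):
    """Encode JMP to a 16-bit address as 3 bytes."""
    return bytes([0x4C, target & 0xFF, (target >> 8) & 0xFF])

print(asm_jmp(0xC000).hex(" "))  # "4c 00 c0", matching the listing
```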

I once poked a bunch of 6502 opcodes into memory to speed up someone else's BASIC code and they looked at me like I was some kind of wizard. They didn't know I couldn't afford an assembler...

You could have used a cracked copy of TurboAssembler like everyone else. The cracked versions had bugs fixed, performance enhanced and many features added, even long after the author abandoned the software and all support for it:



Growing up on the Commodore computers in a country with no access to originals, I've never seen original software for Commodore64 in my life.

I never owned a Commodore64, though I did fix a large number of them, because a computer store in Amsterdam had so many DOAs that there was good money to be made turning five broken C64s into four working ones and a couple of spare bits.

The machine I did most of my 6502 stuff on was a BBC Micro, which had a basic assembler built in, so that really helped. But the KIM-1, which I used before that, was so sparse that even though an assembler existed, I would have first had to expand the memory (big $ back then). So: memorize opcodes and use the handy 'opcode card' you could get for those CPUs back in the day. No big deal, though I was really happy once I moved on to automating that part of the process. Branch calculation in particular is tricky: something with macros could fix it easily, but by hand it is hell, because if a branch can't reach its target you need to invert the branch and then use a JMP, which requires more space, which may cause other branches to no longer reach their targets, and so on.
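The invert-the-branch trick described above can be sketched in a few lines (a toy illustration for 6502-style branches, where the signed 8-bit displacement is measured from the end of the 2-byte branch instruction; the mnemonics and tuple format are just for this sketch):

```python
# Branch relaxation, 6502-style: if the target is out of the signed
# 8-bit range, invert the condition to hop over a 3-byte absolute JMP.
INVERT = {"BEQ": "BNE", "BNE": "BEQ", "BCC": "BCS", "BCS": "BCC"}

def encode_branch(op, addr, target):
    disp = target - (addr + 2)          # relative to end of the branch
    if -128 <= disp <= 127:
        return [(op, disp)]             # short form: 2 bytes
    # Out of range: inverted branch skips the 3-byte JMP that follows.
    return [(INVERT[op], 3), ("JMP", target)]

print(encode_branch("BEQ", 0x1000, 0x1010))  # in range: short branch
print(encode_branch("BEQ", 0x1000, 0x2000))  # relaxed: BNE +3, JMP $2000
```

And this is exactly where the hand-assembly pain comes from: the relaxed form is 3 bytes longer, which shifts everything after it, which can push *other* branches out of range, and the whole dance starts again.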

For a second I was confused why would someone want to jump into the slot 0 ROM of the Apple II... All that's there is memory-mapped IO...

$c000 is free memory for user code on the Commodore64.

This is a hell of a demo, on a hell of a hardware. It's as impressive as 8088MPH if you ask me.

A touhou classic. Cool

Awesome. This reminds me of the stop-motion animation from https://www.youtube.com/watch?v=IOu0DuxFAT0, the way the scenes change from one into another.

Here is the original music video:


2hu hijack lol

Maybe this will finally stop the horribly misguided "no one programs in assembler any more" comments here on "Hacker News" (especially during yet another RISC-V marketing campaign).

Generally people don't unless they really need to; a significant chunk of assembler is a maintainability horror in most commercial contexts. But it's still useful for really weird architectures or for getting the most out of your PC: the compiler is not going to auto-vectorise your use of the GF2P8AFFINEINVQB instruction for you.

This simply is not correct - the scene is huge. If you think that, you're hanging out with the wrong crowd.

So many myths about what computing is and isn't -- I've not experienced anything remotely similar in any other field of human endeavor.

The demoscene has always been awesome and influential, and it may be thriving right now, but it's still a niche hobby. It doesn't change the fact that use of assembly for practical purposes is dwindling, and has been for a long time.

Actually it very much changes that "fact", because most of the people coding demos work in the gaming industry and write commercial software titles.

It's much bigger in Europe. Demoparties in the USA tend to die off quickly. Other than @party, I'm not sure any others have lasted even 5 years.

The demoscene, or the software development "scene" as a whole?

> I've not experienced anything remotely similar in any other field of human endeavor.

Similar to what?

Demo and cracking scene.

Similar to mentality in any other profession.
