Donkey – A computer game included with early versions of PC DOS (github.com)
468 points by mondaine on July 18, 2016 | 173 comments



Playable here: http://www.pcjs.org/devices/pcx86/machine/5150/cga/64kb/donk...

"The above simulation is configured for a clock speed of 4.77Mhz, with 64Kb of RAM and a CGA display, using the original IBM PC Model 5150 ROM BIOS and CGA font ROM. This configuration also includes a predefined state, with PC-DOS 1.0 already booted and DONKEY.BAS ready to run.

And now that PCx86 automatically saves all your changes (subject to the limits of your browser’s local storage), you can even close the browser in the middle of a game of DONKEY, and the next time you load this page, your progress (and the donkey) will be perfectly restored."


I find it hilarious that when you don't hit a donkey, it says "Donkey lost".


seems like a win-win situation to me.


I'm disappointed that my iPhone can't drive the car because Apple didn't put directional keys on the onscreen keyboard.


For a sleepy minute there I thought you meant because you were driving and wanted to play donkey.bas while commuting in a self-driving car.


spacebar works (it's the only key needed).


The "any" key also works, as well as the onscreen keys (e.g. F1)

It's not surprising that this simulation of a 36-year-old 4.77MHz desktop runs fine in JavaScript on my 1.3GHz quad-core $40 phone... but in a way, it is.


This makes me wonder how accurate the timing is. I've been looking at the Gameboy Link Cable, which seems to need very low latency. Are we there yet, in the browser?


It doesn't necessarily need accurate timing under emulation. The emulator is god: it can play with time as it likes.


More interesting to me is to connect a real Gameboy to an emulator, probably with the BGB protocol - http://bgb.bircd.org/bgblink.html


In the browser, I expect we are (like having two emulator instances in a single web page). Emulating a link cable via web sockets, now that gets interesting...


There are a couple of gameboy emulators that can communicate over a TCP connection, I imagine doing the same with Websockets would be fairly trivial.


Using d and h for left and right worked for me.


You only need one key. Any one except Esc will do.


Try the letters l and r.


[flagged]


Please comment civilly and substantively, or not at all.

https://news.ycombinator.com/newsguidelines.html


Maybe I'm just easily impressed, but I find this incredible.


Why are people complaining that the game is bad? Really? Can we see what you wrote in 1982? Games like this were quickly hacked together to show the capabilities of the machine. It was done in BASIC for owners of those machines to play with. If this had been a commercial product, it would have been written in assembly.

If anything, this was likely a demo of Microsoft BASIC. Let's not forget that Bill Gates's first product was not an OS but rather a BASIC interpreter for the Altair 8800, written in 1975 in assembly on paper tape, without the actual hardware! Donkey was written a good 5-7 years later.

While some may not like Bill Gates, in his youth he was probably a better programmer than most of you, and he did it before the Internet made it easy to get access to books, screencasts and lots of sample code via GitHub. Give the man his respect.


This was the progenitor of other such classics as gorilla.bas. I liked that it was "poorly" written, as it made it accessible - and I made the leap from C64/Amiga/Acorn to MS-DOS because I knew there was at least one language (BASIC) on there that I could use.

Promptly after switching I learned C and never looked back, but there's a lot to be said for what these early open-source (ykwim) games unlocked.

Hell, I started coding because I wanted to play games, and could only do so by copying them into a micro from a magazine (storage media was prohibited) - and fairly promptly wanted to cheat - and before I even really realised, was writing my own games and finding holes in the micro's interpreter that could cause entertaining crashes (I still fondly remember the look of panic on the geography (that was the nearest thing to computers, I guess?) teacher's face on coming into a lab (air raid shelter) full of vdus displaying a garbled mess).

So yeah, "bad" code can be great to learn from at an early level.

Comment complete, please proceed to downvote.


Holy nested parenthetical expressions, Batman!


Was Kildall a better programmer than Gates?^U

Who wrote more elegant software?

Who undercut the price of the other's software? And why?

Microsoft's early success was not due to originality.

Gates has no aesthetic. He has no respectable standard of quality.

Does anyone remember "Microsoft Bob"?

He may be worthy of respect by less capable programmers. He may be worthy of respect by the business community. He may be worthy of respect for his philanthropic efforts.

But for software users who have any insight into and appreciation of software quality, he has yet to earn any respect. For these users, he has only impeded progress. And Microsoft continues this tradition to this day.

It is perfectly OK to critique him on this particular point.

[Original title was something like "Gates wrote this beauty at four in the morning."]


Did Bill Gates personally write Microsoft Bob? I got the impression OP was referring to late 70s / early 80s Microsoft when Gates was still very much doing the coding himself, as opposed to the much bigger 90s Microsoft when he was a few layers removed.


http://www.computerworld.com/article/2483957/microsoft-windo...

"'We were just ahead of our time, like most of our mistakes,' Gates says of personal agents like the notorious Clippy character."

"Gates has a personal connection to Microsoft Bob -- more than it being just another product launched on his watch -- as his wife, Melinda French Gates, was a marketing manager on the Bob project."


Andy Hertzfeld recollects how the Macintosh team dissected the new IBM PC and what they thought of donkey.bas:

http://www.folklore.org/StoryView.py?project=Macintosh&story...


Wow, that's a salty article!


I don't think salty is the right word. Salt implies some kind of frustration and I don't think that's accurate here.

I'd call it an honest, judgmental article.


salty is slang for "envious"


"Throwing salt" can also be used to mean "casting dispersions", as it were. I think the article tosses a bunch of salt towards Bill Gates, but who knows what the motivation was. I'm guess when properly confused, you might just throw some salt for luck, and end up looking salty yourself. :)

Edit: I'm talking about the author throwing salt, not any of the posters here.


I have to do this! I believe you mean "casting aspersions."


I always considered salty to mean the act of being bitter or resentful.


That's an interesting piece of history, and certainly demonstrates a little bit of the competitive atmosphere, if nothing else.

Donkey was a 'hello world' of graphics with BASIC, it wasn't a game. Even as a kid, playing it never lasted more than a few seconds. Critiquing it as though it's a serious game gives this code too much credit and makes Hertzfeld look bad too; taking it seriously is almost as funny to me as the game is. For what it is, a code example, having a better game would have detracted significantly from its value.


I agree with the article; the code is nothing to be proud of. One thing about the Macintosh: it was better, but the price was way higher. Most people stayed with the Z80s that cost a fraction and were as good for games.


The Mac folks understood a lot more about personal computers than the IBM folks. For instance, the graphics hardware on the Mac wasn't much to write home about, but the PC's graphics were unbelievable garbage that had no hope of performing well (it probably set back GUI development on the PC by five years). The Mac's sound wasn't great, but the PC had a speaker hooked to a TTL port, with the same engineering sophistication as the backup alarm on a garbage truck. Decent sound on the PC finally became real and relatively well adopted in the early 1990s.

On the PC there were a lot of fights about standard ways to do things that just worked on the Mac: Memory expansion (that was a whole industry right there), graphics, networking, mice. Even the CPU needed a turbo button to work correctly. Just about the only piece of the PC that universally worked and didn't beg for improvement was the keyboard.

So you can argue that the "crappy PC" spawned a bunch of industries to fix things, while the Mac, which was a lot closer to perfection, only spawned small businesses to satisfy niche needs. Apple nearly died in the 1990s because the alternatives finally caught up to them, a decade later, and Apple couldn't satisfy the computing industry alone. Still can't, doesn't try.

But I had a PC (256K, two floppies) and I had a Mac, and I sold the Mac because I couldn't develop software on it without going crazy (early Mac development was pretty painful; at one point I was seriously considering FORTH, and I hate FORTH).


The PC was much more used as a replacement for things previously done by minicomputers, at least in my field of vision. Just the differential in cost made for quite a business model.


PC had a speaker hooked to a TTL port, with the same engineering sophistication as the backup alarm on a garbage truck

While true, the quality of sound achievable this way is a matter of software. There are many programs that achieved excellent sound quality, playing music even with just a speaker and a TTL drive.

Remember, the simplest DAC is PWM into a low-pass filter.


A couple of years back, I burned out a friend's tweeters by hooking up an Arduino's 31.25kHz PWM to his stereo system. Worked fine at first, but it eventually melted his tweeters. I figured the stereo would low-pass-filter the input for me, and I paid the price.

The PC speaker didn't have PWM; it had a square-wave generator with adjustable frequency. Very few 1980s programs achieved anything even approaching decent sound quality using the PC speaker, although by the 90s, CPUs were fast enough that it was reasonably feasible. In the late 90s, I used to patch my Linux kernel with an unofficial driver that provided arbitrary PCM output via the PC speaker. It was glitchy if I left interrupts disabled during IDE disk accesses, which was the default at the time due to some buggy IDE devices that would sometimes corrupt data if you didn't. Fortunately, mine wasn't one of them.

I feel like PDM is just as simple as PWM and often gives better quality, and an R-2R DAC like a Covox is arguably just as simple as PWM and gives dramatically better quality. You need a few more pins, though, or a 74595.


Interesting!

I must be thinking of a different computer then. Perhaps my Radio Shack CoCo. I remember some small computer from the mid-late 80's producing really good music with just a logic output driving the speaker. Was sure it was the IBM/clone.

One of the nice things is that I have original cloth-bound IBM PC 5150 Technical Manuals in my basement so I can actually go look up the schematics of what's driving the speaker :-)


Well, maybe. It's still bang-bang control, and to get decent quality you'd have to dedicate the processor to bit-banging the port. With the Mac you just filled in a buffer that got DMA'd along with the video. Not great, but at least you could do waveforms using a slice of the CPU.

The PC was built to beep, and that's about it. Anything you got out of it past that was painful and used a lot of CPU.


But keep in mind that this was while the Mac was still in development. The Apple ][ was a great game machine. That is what they are probably comparing it to. I was in high school at this time and the library had a whole wall of PCs and one Apple ][. Huge crowd playing Castle Wolfenstein on the Apple, no one touching the PCs as they only had Flight Simulator on them. In a vacuum, Donkey is not too bad, but compared to Castle Wolfenstein and the other good games on the Apple, I can see why it got a meh...


The IBM wasn't cheap when it came out - when compared to a PET or Apple ][.

I recall working on a grey-import IBM PC in the UK - our electronics shop built an inverter to convert 240V to 110V.


[flagged]


This comment violates the HN guidelines. Please don't do this here.

https://news.ycombinator.com/newsguidelines.html


>Andy Hertzfeld sounds like a sore loser.

A proven winner, with significant contributions to the field would be more accurate.

>Why should anyone care what he thinks about games?

It's not what he thinks about games in general, it's what he thinks about a particular historical artifact that happens to be a game. And people care about his opinion on those times (the website linked is quite popular) because he was an important contributor to that era of computing and the early PC industry.

>Macs suck for games and they always have.

Only for people whose appreciation for games is restricted to the latest, full-blast on the GPU, AAA titles.

>Every piece of software he's written has been a second rate failure.

And it's HE who sounds like a "sore loser"? Oh, the irony.


There's something unseemly about ripping on other people.

I know it's Bill Gates, but Hertzfeld comes across as the kind of guy who enjoys making people feel small.


Seriously? I've read almost every story on folklore.org, and this is the only instance I can remember where he puts anyone down -- it sticks out in my memory for that very reason.


I'm simply basing my statement on my reading of that one story. I haven't read anything else on that site and I don't know the person at all.

I think the original poster was simply responding to the "vibe" of that essay.


I worked with Andy for a while. He is one of the nicest people I know, and couldn't hurt a fly.


> Only for people whose appreciation for games is restricted to the latest, full-blast on the GPU, AAA titles.

The one common complaint for all game developers seems to be about Apple's nightmare-inducing OpenGL implementation.


Is it possible to get rid of the mentality that anyone who criticizes anything is a hater, a loser or suffering from the sour grapes syndrome?


> Macs suck for games and they always have

Not in the 80s. PCs were DOS machines for the office, and if you wanted to play a game the Mac, Atari or Amiga were much nicer platforms.


The C64 was vastly superior for games, and the Amiga even more so. Macs never played a role in gaming at this time, because of their price and distribution channels. Only fairly rich professors and graphic artists were able to afford one.

Ataris were never really that good for gaming either, but they were used a lot by pro audio folks. They had superior midi capabilities and programs.


Disagree about the Ataris. They were very popular gaming platforms at the time.


Not the ST, which would be the competition we're talking about vs the C64, etc.


You can't compare the ST and C64.

C64's competition were the Atari 400, 800, XL and XE lines. Most games came out for both platforms. Atari's player-missile system was both an advantage and a disadvantage in some respects.

The ST and Amiga were competitors but completely different platforms than their respective predecessors.

When you're talking Mac - the Apple II would have been the contemporary of the C64 and Atari 8-bit lines. There were games on Apple computers as well, but I suspect not as many. They were completely dysfunctional and the Apple IIgs was a really fun machine.


*weren't - I meant that Apple computers _weren't_ completely dysfunctional..


I was a Commodore kid, but the one thing Apple had that Commodore didn't was The Robot Odyssey.

Hell of a game.


A matter of geography - the ST was very popular in Europe (as was the Amiga).


The ST was competing with the Amiga, not the C64.


The Atari 1040ST was a great gaming platform.


Boy were they unpopular though.

They built a machine to compete with the C64 at the time when Amiga was the new thing.


> Boy were they unpopular though

They were popular in Europe, where they were cheap. In fact, in the UK, the 512K Atari ST with mono screen and mouse was slightly cheaper than upgrading a 128K Mac to 512K (£750 vs £800).

The Atari ST was slightly faster than the Mac, had a better monochrome screen, and could run all Mac software, albeit not legally. (You either bought a plug-in cartridge that took Mac ROMs or used MacBongo, a ripped-off version loaded from disk.)


> Ataris were never really that good for gaming either

Atari practically owned the gaming business with its home games consoles, and its first home computers -- the Atari 400 and 800, launched in 1979 -- were the best games machines around at the time.

The Atari ST was the one the audio folks used, and it did play games, just not as well as the Amiga.

Curiously enough, the Amiga was developed by staff from Atari, while the Atari ST was developed by staff from Commodore. (This was after Jack Tramiel left Commodore and bought Atari, which he ran with his sons.)


Mac with the monochrome screen? Not really. It did have a few games, but the gaming platforms of the time were your C64s, Spectrums, Ataris, Amigas and other more graphically capable TV units.


shufflepuck cafe was uniquely wonderful, even in monochrome! :)


Apple/Mac doesn't belong to the list.


Dark Castle was pretty impressive in 1986.

https://www.youtube.com/watch?v=OjtjVS7VFFY


> Dark Castle was pretty impressive in 1986.

Honestly, I think that it's still impressive. There was a beauty to those old black-and-white bitmaps that modern engines just don't capture. Same with the icons from back in the day. We've gained an awful lot, but we've also lost something.


Meh. Sort of. He mostly sounds like someone with experience on both platforms who is surprised by the dominance of the one that he considers inferior. His prominence as a developer isn't relevant. I trust that his ability is sufficient and that his experience with the platforms is actual. He is a suitable person to comment on the contrast between them.


Many people did dump on the PC as inferior, but it was a marvelous platform for tinkerers of every sort, and tinker they did. The result was a firehose of applications and hardware add-ons of every sort imaginable, including some unimaginably good ones like Doom.

Programmers were utterly amazed by Doom. Nobody had thought that was possible on the PC hardware.

The first PC Flight Simulator was another astonishing piece of engineering.


>The result was a firehose of applications and hardware add-ons of every sort imaginable, including some unimaginably good ones like Doom.

Huh?

Doom was developed on NeXT workstations, under the NEXTSTEP operating system. The Doom game engine was programmed in C, and the editing tools were written in Objective-C. The engine was first compiled with Intel's C compiler for DOS, but later Watcom's C/C++ compiler was used. (Wikipedia)


People were amazed at it running on the PC, and running very well.


Yes, but the appearance of Doom on the PC is not because the PC was a "tinkerer's platform", as the argument in the parent post goes.

The tinkering part leading to the creation of Doom was done on NeXT.


Doom ran under a 32 bit DOS extender, which was created by tinkerers, not Microsoft.


BTW, I would also build the software on a better machine that had memory protection, get it all working correctly, then port it to DOS. That would greatly cut down on the debugging time.


> The first PC Flight Simulator was another astonishing piece of engineering.

True. I remember the feedback a WW2 vet pilot gave after testing PC FS. It went something like 'if flying on a PC got any more realistic, you'd need a pilot's license to do it.' !!


"Microsoft" Flight Simulator on the PC was actually a version of Sublogic's Flight Simulator for the Apple ][ (although it gets complicated because the classic setting of Chicago's Meigs Field was on the PC first and then backported to the Apple ][ version). But if you were impressed that the flight simulator ran on a 4.77 Mhz PC, how much more amazing that it ran on a 1 Mhz Apple ][?


I agree, and I think that is a common problem with purely technical people. They don't include all of the other important factors: affordability, support (by the company and number of people around you familiar with the system), and many others.

The Amiga sound and graphics chips blew other systems out of the water technically, and it did well with consumer offerings like the A500, but aside from a niche in the video editing market (the Toaster on the A1000/A2000), Macs and PCs swamped it in sales numbers.

I have had a lot of systems, starting in 1978 with an all-in-one CPM PET with a cassette drive to load programs, a very small green text terminal, and 8K RAM. I paid $800 for it used, and another $850 for a 32K RAM upgrade the year after. You could 'pop the hood' like an old American muscle car, check the board inside, and install the RAM card. I wrote a horse racing game in 1979 using just ASCII characters, with random trots and an odds and betting system, for my two horse-loving, OTB-visiting parents.

I have only really lusted after three machines: a Lisp machine, a Silicon Graphics workstation, and the NeXT machine. Again, all technically brilliant, but not the biggest sellers, and pricey because of that.

The best tech doesn't always win by default, and that's not always a bad thing (as I once thought).


Funny memory. We bought an SGI machine for when clients were around (doing 3D graphics and video compositing). In the backroom we did it all on Amigas and Lightwave. Clients thought it was all done on SGI/Softimage - so it must be good! It would've been (better) if we could have afforded as many workstations and application seats as we wanted. I still have two SGIs that work. It feels like owning a vintage Ferrari or something, even though nowhere near the monetary value.


Where was this? I worked in NYC, and I landed a non-paying, non-intern learning position at a 3D animation company using SGI with Prisms, which is now Houdini. I loved the design, and Irix was great! I recall thinking that the screens were so hi-res ;)


It was in Croatia, Europe. Same story all over the world though (at least where I was at). And, they were so "high-res". At the time, I believe the only screen I saw on a workstation with such a high resolution was on one HP-UX machine. They were also a great sight. Shame about later models (the Octane) being such loud beasts though. Not really for desks.


Yeah, I would not be able to read the print on my SGI today without my reading glasses! Prisms had so many little windows and parameters. The CHOPs(?), procedural animation stuff, a terminal, etc...


Here's a wiki link[0] for anyone else that didn't know what they were looking at.

0: https://en.wikipedia.org/wiki/DONKEY.BAS


> "We were surprised to see that the comments at the top of the game proudly proclaimed the authors: Bill Gates and Neil Konzen."

It's odd that these comments aren't in the GitHub source code; is it missing something? Why does it start at line 940?

  940 REM The IBM Personal Computer Donkey
  950 REM Version 1.10 (C)Copyright IBM Corp 1981, 1982
  960 REM Licensed Material - Program Property of IBM


The line numbers in BASIC were arbitrary. People often started at higher numbers and jumped by 10 in case they had to put code or comments before or between lines.


Could line numbers be out of order? Like, if I did this:

    10 PRINT "1"
    30 PRINT "2"
    20 GOTO 10
What would print out? Just a bunch of "1"s or "12"s?


It would only print "1"s.

The thing is that BASIC programs were not entered through a 2D text editor but rather line by line. You would type "LIST" to re-read the whole program you had typed so far, with the lines in the correct order. Then, you would enter a line starting with a number. It would overwrite any existing line with the same number.

Typically, a way to correct the program you wrote would be to type:

    30
    15 PRINT "2"

(delete line 30, and insert a new line numbered 15)


Just to add that some BASIC dialects had a RENUMBER or RENUM command that would change all line numbers and references to use a consistent interval. Without that, programs could become hard to modify due to the "gaps" between the lines closing up.
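
In GW-BASIC, for instance, a session might look like this (a sketch; exact defaults vary by dialect):

    10 PRINT "A"
    11 PRINT "B"
    12 GOTO 11
    RENUM
    LIST
    10 PRINT "A"
    20 PRINT "B"
    30 GOTO 20

Note that RENUM rewrites the GOTO target too, which is why it has to be able to find all line-number references in the program.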


Couldn't you just increment by 100? Or even 1000? Why did everyone use 10? Was it some interpreter limit that didn't allow line numbers above a certain amount (like 2^15 on a 16-bit CPU)?


Providex (a BASIC dialect that my company still uses) has a soft-ish limit of around 52000 for line numbers.

You can get around that with some flags, but it's more of a pain than it's worth.

We tend to go by hundreds for programs, and we have a renumber command, but using it is the nuclear option because it will break goto line number references from outside programs (yeah, that's a very common thing in this language)


I'm curious, is there a specific use case for using such a language today? Is it just legacy?


Mostly legacy.

The main application is decades old, and Providex provides a database of sorts, the language/runtime, a GUI toolkit, and the ability to run on Windows, Linux, and as a web app (you can use a "desktop" application written in Providex in the browser, a great idea in principle, but the per-user license cost of the language becomes prohibitive really fast).

So while everyone agrees that Providex needs to go, that would mean replacing just about every single aspect of the company's core application: the GUI, language, database, even simple-ish things like the editor we use (which is built into the language). It's not an easy task.

We've started moving away, but it's going to take a lot of time, and a ton of effort. Right now we are still relying on the Providex DB stuff pretty much across the board (although I'm launching a new node.js+postgresql server in the next month or so), and we are slowly moving our hosted applications to more traditional web languages (we have some PHP, a bunch of javascript, and a little python).

But the language hurts. It is a combination of compiled and interpreted, so unless you jump through hoops, files are saved in a binary format, and you can only use their editor. This also locks us into SVN as our source control, as it's the only VCS that Providex supports (and even then, it's pretty bad support). There is virtually zero tooling, nothing is open sourced, and it's really expensive. It's impossible to remove old code, as there is zero safety: any line can be GOTO'd or GOSUB'd by any other program at any time, and while new programs don't do that, the old programs that do are the ones you want to refactor but can't. There are also programs where we are out of line numbers and need to resort to GOSUB "hacks" to add a line.

But it's not all bad. Being able to use a "drag and drop" visual editor to make a screen that will work on Windows, Linux, and the web is pretty nice, and having the DB so tightly coupled means stuff like upgrades/downgrades is pretty simple and doesn't involve multiple systems. It's also a pretty capable language all things considered (it has classes/objects, it's not as slow as I thought it would be, and it runs on anything without any modification).


For a long time it was still a useful teaching language. In how many languages can you simply type two statements and have a line drawn on the screen?

    SCREEN 9
    LINE (100,100)-(200,200)

I only know of Processing having this kind of ease of use. But any curious kid, given a BASIC interpreter and the LINE instruction, will start drawing stuff around, naturally come to loops, and so on, without having to worry about the details of "real" programming languages.
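
For instance, a few more lines turn that into the classic string-art experiment. A sketch in GW-BASIC (SCREEN 9 assumes an EGA adapter, as above):

    10 SCREEN 9
    20 FOR I = 0 TO 300 STEP 10
    30 LINE (0, I)-(I, 0)    ' endpoints march along the two axes
    40 NEXT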


Certainly, it would in part depend on the support of the interpreter, though. Realistically, incrementing by 10 was easy, and if you really needed more you could use goto to jump down and back (though obviously that's not nice for readability).
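
A sketch of that pattern in GW-BASIC (illustrative only): when there is no room left between two lines, detour to a spare block of high numbers and jump back.

    10 PRINT "FIRST"
    11 GOTO 1000          ' no room left before line 20, so detour
    20 PRINT "LAST"
    30 END
    1000 PRINT "SECOND"   ' the code that wouldn't fit
    1010 PRINT "THIRD"
    1020 GOTO 20          ' ...and jump back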


I remember there being a STEP command to set the increment value. However, when I tested it (on the PCJS sim from the top of the thread) just now, it's throwing a Syntax Error! I am nearly sure that was what the command was, but I don't know what gives right now.


My BASIC memories are rusty, and more oriented toward Applesoft than Microsoft. But I remember STEP being used to give an optional increment to the FOR statement, rather than setting a line numbering increment - for example, you might

    FOR I = 2 TO 10 STEP 2
    PRINT I
    NEXT
and get back 2, 4, 6, 8, 10, each on its own line.

I don't remember a command to set the increment for line numbers, but that may be because I never used a BASIC which didn't either require a number on every line or not care about line numbers except as targets for GOTO and similar.


Which is one cause of the spaghetti:

    17 GOSUB 5500


If you typed those lines, then issued a

    LIST
command, the interpreter would return

    10 PRINT "1"
    20 GOTO 10
    30 PRINT "2"

You don't have to type LIST to reorder the lines; that just shows the reordering to you. If you didn't type LIST, but did type RUN (with the "unordered" listing), the interpreter would run each line in sequence of line number.

If you ran out of line numbers you could sometimes RENUMber the listing.

Here's Stackoverflow:

This question is marked as a duplicate: http://stackoverflow.com/questions/2435488/why-basic-had-num...

Here's the closed question it's marked as a duplicate of: http://stackoverflow.com/questions/541421/why-did-we-bother-...


It gets run in the correct order. Think of each line as a command that stores the instruction immediately at that line number. You could actually "re-type" a line later by reusing the same line number but replacing it with new contents.


"Correct" order is ambiguous to someone asking that question in the first place.


It sounds like he's saying the "line number" is just the index into a table (i.e. line 15 is at `lines[15]`) where unused lines are just nops. So my "program" would be tokenized and the lines stored like:

    char **lines = malloc(...);   /* sparse table indexed by line number */
    lines[10] = line1;
    lines[30] = line2;
    lines[20] = line3;
Then when executing, it starts at `lines[0]`, sees nops, gets to `lines[10]` and runs it, sees more nops, gets to `lines[20]` and jumps to `lines[10]`. All the while ignoring `lines[21]` and up.


Usually in 8-bit BASICs the lines would be stored consecutively (not in separate malloced segments) with each line starting with its line number. Yes, this meant that GOTO required a search. Memory was a really central design constraint on these systems. I don't know how closely MS's BASICA on the PC followed this model, but the PC's starting configuration had 16k bytes of RAM.


Here's a description of the file format used by BBC Basic:

http://xania.org/200711/bbc-basic-v-format

That's Basic V, which was the variant used on the ARM-based Archimedes, but it's the same file format as the 6502 machine Basics.

Note that the line numbers used by GOTO and GOSUB were specially flagged --- this was so the RENUMBER command could find them. It also meant that computed gotos weren't renumbered...

(Of course, BBC Basic had proper named procedures and functions with local variables, but all self-respecting Basics had to support GOTO and GOSUB.)


Spot on - so line 30 would never get executed on that code, and a series of '1's will get printed, in an endless loop, until stopped with a Ctrl-C.


What is "correct order"? You forgot to answer the actual question.


Sorry, it was ambiguous. I assumed the writer of the original code intended to execute them in the order of the line numbers and that was the "correct order". Otherwise, the line numbers are meaningless.


Not an especially valuable comment from me here, but the fact that people who are probably very technically literate can ask this question makes me feel really rather old.


I remember explaining line-number based BASIC to a younger programmer once and he really couldn't get his head around the idea of GOTOs and GOSUBs based on line numbers -- "why wouldn't you just define functions?"


I remember the minor shock I experienced when writing programs without line numbers. I recall a couple of "good at programming whiz kids" never could make the transition and dropped the compsci major.

I escaped the "permanent brain damage" that Dijkstra complained about that era of BASIC producing, but I understand where he's coming from now and don't begrudge him the slight hyperbole. It was a language with so many accidental complexities that there was virtually no prayer of someone penetrating through to the essential issues, using Fred Brooks's definitions of accidental/essential. You're not learning anything useful while you're manually renumbering lines, because you used up everything between 15-20, and fixing up the GOTOs. (I never had anything with RENUM support.)


It would print out a series of 1s. It is the same as if you had entered

10 PRINT "1" 20 GOTO 10 30 PRINT "2"

The reason this is allowed is that it lets you rewrite existing programs; if you later entered

    10 PRINT "Hello World"

It would replace the original line 10 and use this one instead.


Well noted. I recall there was also an AUTO command in the BASIC interpreter (BASICA and GW-BASIC) with which lines could be numbered automatically and the increment set, e.g. AUTO 100,5 starts at line 100 and increments the line numbers by 5. (The default increment was 10.)


Microsoft usually started at 1000

So you can see that they added the copyright and SAMPLES$ logic later.


> The first version of DONKEY.BAS was released in 1981, followed by version 1.10 in 1982

This is the improved 1982 version.


Even in the 1.00 version here it doesn't have Gates in the comments.

http://www.pcjs.org/devices/pcx86/machine/5150/cga/64kb/donk...


I see. I thought they'd have kept the authors' names, unless the code was completely rewritten.


I never played or even saw Donkey. Too young.

But I learned to program by playing and modifying Microsoft Nibbles -- which is still my favourite snake game. And possibly my favourite Microsoft product.

And QBasic is still the only IDE that I ever really liked. Although there are some newer ones that I respect.


Same here, and GORILLAS.BAS too. QBASIC was brilliant.

There's something about an immersive fullscreen coding tool with offline documentation built in.

The other environments I've really loved were Lisp in Emacs and Java in IDEA...


Context here (Jeff Atwood's blog post dated 2007): https://blog.codinghorror.com/bill-gates-and-donkey-bas/


I love that right rear view mirror attached to the left front wheel! ;-)



Now I want to see what Donkey .NET is like ^^


I believe this is the link: http://download.microsoft.com/download/4/b/4/4b400bf9-f71d-4...

Found from some rather more shady-looking sites, but it points back to download.microsoft.com, so I suppose it's legit.


My father bought the first family computer, an IBM PCjr, when I was 4 or 5. Donkey.BAS made a huge impact on me, although it's easy to overlook as a crap game these days. Sure, there were "better" games that I played on that system: my older brother purchased Sierra's Black Cauldron, which I played the shit out of. My uncle wanted me to grow up to be a pilot, so MS Flight Simulator was in order, too. Jordan Mechner's Karateka is still, to this day, mind-blowing to me. And lest we forget some Broderbund and Microprose classics.

But donkey.BAS, unlike all those other things, was my _first_ real introduction to programming. I was too young at the time to really recognize the value in being able to not only consume, but read the code, change it, learn from it. Unlike more polished games, I learned way more about programming from changing, breaking, and subverting donkey.bas than from anything else on the PCjr. Sure, it wasn't _the best_, but it was the first time that someone pulled back the curtain and I was afforded a glimpse at what was possible, and what computers could _do_. Between that, poring through the "Hands On Basic" book[0], and typing in BASIC programs from COMPUTE! magazine, I'm not sure that I'd be a programmer today, as hyperbolic as that sounds...

And some overly nostalgic part of me kind of misses doing PEEKs and POKEs in physical memory, and the summer I spent learning binary math because I didn't understand the relationship between the AH and AL registers of my 286, years later. Then I go back to writing CRUD applications in whatever javascript library is the flavor of the month.

I just hope that kids today have the same access to shitty, but accessible, chunks of code to help inspire them and show them what the machines they interact with everyday are really capable of.

[0]: http://www.brutman.com/BasementCleanout/IBM_Hands-On_BASIC/H...


Ha! I was 9 or 10 when my dad bought us a PC jr. It was awesome. Having the code samples like Donkey.bas was what really allowed you to experiment, see what was possible, and write longer programs. There was no online documentation then.

There's a lot of shitty but accessible examples now, and a lot more ways to get them easily. If anything, it's harder now to pick a platform/device/language because there are so many choices. My kids won't put down their iPhone games long enough to read code examples, even though they talk about wanting to program.


It makes me feel incredibly nostalgic too. In the 80s I used to borrow books from my local library about programming (written for kids) [1], and spend hours typing out source code for computers that used a slightly different version of BASIC. While I was frustrated they wouldn't work, it sparked my imagination and I think the process taught me more about self-learning than anything I was taught at school.

[1] https://drive.google.com/file/d/0Bxv0SsvibDMTYkFJbUswOHFQclE...


This reminds me about my time writing games in Basic on my calculator during math class. I suck at math because of this, but at least I'm an ok programmer.


Yes! TI Basic on my Texas Instruments graphics calculator was how I spent downtime during math class.

So much so, I remember writing a program for solving quadratic equations.

They were supposed to clear the calculator's memory during exams, but the exam invigilators had no idea what calculators could do, so they didn't.

I'd like to think it wasn't cheating... as writing a quadratic solver in TI BASIC is harder than solving simple quadratic equations. And I did do them all by hand anyway in the exams...
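
For flavor, a minimal version of such a solver, here in GW-BASIC rather than the TI dialect (the syntax differs but not the spirit; the real-roots-only check is mine):

    10 INPUT "A,B,C"; A, B, C
    20 D = B * B - 4 * A * C              ' discriminant
    30 IF D < 0 THEN PRINT "NO REAL ROOTS" : END
    40 PRINT "X ="; (-B + SQR(D)) / (2 * A)
    50 PRINT "X ="; (-B - SQR(D)) / (2 * A)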


I have the same story, but my first program was for solving systems of equations using a variety of methods and showing its work each step of the way. Same as you, I learned the material far more thoroughly than had I studied it in a conventional way.


I was rather sad when Dijkstra said anyone who learnt on BASIC was irreparably broken:

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."

http://www.cs.virginia.edu/~evans/cs655/readings/ewd498.html

The second line of code I ever wrote was in BASIC:

  goto 10


That line of code reminded me of the following thing you could do to the Commodore PET[1], one of the first computers to use Microsoft BASIC.

   10 GOTO 10
   RUN
Oops. No way to recover without power cycling the computer. I remember Commodore brought their product to an early computer show, and all the annoying nerds (like me) kept entering that one-line program into the demo computers, much to the annoyance of the nearby sales critters.

And it wasn't possible to patch that quickly, since BASIC was in ROM. They probably had a fix about 12 weeks later.

[1] https://en.wikipedia.org/wiki/Commodore_PET


I wrote a similar (first) version on my Spectrum 48k. For some reason it initiated the same kind of colour cycle that you'd get when loading a program.


I wish I were as driven now as I was back then. I used to sit for HOURS writing little programs with that horrible PRGM navigation menu. Nowadays I get upset when my color scheme isn't perfect.



People forget that the intention of some programming demos is to explain some concepts simply. The intended audience of that code wasn't some expert programmer - it was those who wanted to see how things worked, so that they could see why the game sucked and, once they learned enough, improve upon it.

For expert BASIC code see Nibbles.BAS:

http://stanislavs.org/OldPages/stanislavs/src/nibbles.bas


True enough, and I agree it was just a demo hacked together, but I think the difference you are seeing is a consequence both of Nibbles requiring QBasic and of Nibbles being developed nearly 10 years later, after BASIC had become quite popular.


There are plenty of magazines on Archive.org where one could also find GW-BASIC or BASICA code.


Relevant to the mention of 4 AM: https://www.youtube.com/watch?v=ORYKKNoRcDc


36 years ago they wrote every popular iPhone game ever.


"2 lanes ought to be enough for anybody"


GORILLAS.BAS for the win


My 386 didn't have a color graphics adapter, which required using a program called "Phix" - a terminate-and-stay-resident program that emulated CGA on monochrome. Sadly this completely broke QBASIC (the editor was just a black screen), so I had to reboot the machine (to remove the TSR) in order to hack on GORILLAS.BAS. Compilation times nowadays are still longer than that rigmarole.


But it should have some kind of display adapter, what was it, Hercules? I didn't know of any 386-class computers that didn't carry EGA or higher.


Hercules definitely rings a bell. The Wikipedia page[1] actually explains how the TSR [probably] worked, which seems to not be much, as the Hercules is quite elegantly designed.

[1]: https://en.wikipedia.org/wiki/Hercules_Graphics_Card#CGA_Emu...


Well, I clearly remember using a Hercules emulator with my SuperVGA card so that it could run some old DOS games, so the other way around also worked.


YES


How old was Gates when he wrote this thing??


Ah, programmers. I love how we can show the logic behind 3 correct ages for Bill when he wrote this :)


Then we can don our statistical machine learning hat and make an ensemble model out of the three sub-models.

Unless we do something exotic we're probably going to predict 25 as the point estimate of {24, 25, 26}.


Don't forget that approx 1% of coders are running an OS with a bug in the DST handling around that time frame. That makes his estimated age 24 years, 364 days, 23 hours and 57 minutes.


The reasoning behind two of those answers has resolution of one year, while the third one uses day precision; it seems that all three are correct within their error ranges.


However, since only one age value can be actually correct in this case, the use of logic looks less like a feature of the programmer's mind and more like a bug.


One of them is a case of GIGO, using "36 years ago" as input when the correct value is 35. Fixing that reduces it to two values, and the discrepancy is down to day versus year granularity.


All three results are close to each other and to the true value, so I'd still call it a feature :).


He was born on October 28, 1955. Therefore, in August 1981, when the game was released, he was 25.


Current Bill Gates age: 60. This little game was written 36 years ago.

60 - 36 = 24 <- Bill Gates age when he wrote this.


The game was released in 1981, so Gates was 26 years old.


This reminds me of the first computer games I ever wrote. I remember being able to get something so satisfying together in BASIC, with graphics, user input, and sounds. Then I remember moving to C and asking how I could draw a simple line, only to be told that it would require a few pages of boilerplate to set up, and that all my programs should just be text prompts.


Is anyone else impressed that it's only 131 lines long?? I have to do more work to set up a plain empty window and OpenGL when I create my games these days!


Wow, this is actually one of the first games I remember playing. I was probably 4 or 5 and visiting my Dad at work. To keep me busy, their IT guy showed me this game. I think it was also the only time I played it.

Shortly after that we got a PC at home and it came with a demo copy of EGA Golf which I think only had Pebble Beach. Sound quality was similar.


The first code I ever wrote was modifying this in a few different ways on my parents' IBM XT to say inappropriate things and change the game dynamics.


You could say the frustration of interaction is already apparent. The fact that I cannot cut in right after a cow is so lame, yet so familiar to a Windows user of the 90s.


It might be because it is a donkey.


36 years ago? That would mean 1980, but the code says "1981, 1982" in its copyright info.


It's pretty short


This has as much interaction as many mobile mega hits.


Try hitting the donkey with the side of the car :)


Imagine how many hours you'd need to spend studying BASIC to be able to write that.


https://robhagemans.github.io/pcbasic/ is a GPL3-licensed implementation of the BASIC language this game is written in. PC-BASIC is written in Python. Supposedly it can run this game, although I haven't tried it.

A couple of interesting features of the game (quotidian to those of us around at the time, but...)

1. The "PLAY" statement is barely used; the sounds are mostly done with the SOUND statement, maybe in part because they are being generated randomly. In fact it seems to be used only to verify that the program is being run on a BASIC interpreter that supports "advanced" features like DRAW.

2. The "DRAW" statement, which has its own mini-language similar to that of the "PLAY" statement. Later these were dubbed "Graphics Macro Language" and "Music Macro Language", even though neither one allows you to define macros. This is used to include vector graphics of the donkey and the racecar in the program, in the subroutines on lines 1940 and 1780, respectively. But the interpretive rendering of these vector graphics (and especially the flood fills, lines 1900 and 2010) was too slow to want to do it every frame; instead it's done at program startup (into an on-screen buffer, since that's the only way to do it in GW-BASIC or BASICA; you can see the painting happen briefly before the game starts) and stored in the arrays CAR% and DNK% with GET statements, later to be PUT onto the screen in the right place each frame. There's a bit of sloppiness there: CAR% is DIMmed right there in the subroutine on line 1910, while DNK% is DIMmed up at the top on line 1470. (And the sprites for the halves of the donkey and car are dimmed there too, along with a planned sprite called Q% which is never used.)

3. There's actually an additional sprite, B%, which isn't set up with a GET statement; it's filled in "by hand" to a simple fill pattern on lines 1510–1530. My memory was saying that the data format of this array was undocumented, but it does seem to be documented in http://www.antonis.de/qbebooks/gwbasman/, which I'm pretty sure is the actual GW-BASIC manual from Microsoft. Anyway, B% is a vertical line that's getting XORed into the framebuffer (the default PUT raster op was to XOR into the framebuffer, violating patent 4,197,590 if you use it for a cursor) to make the stripes down the middle of the road "move".
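
Going by the format documented in that manual (element 0 is the width in bits, element 1 the height, then the packed rows), a hand-built sprite looks something like this sketch (dimensions and values are mine):

    10 SCREEN 1
    20 DIM B%(2)
    30 B%(0) = 8           ' width in bits: four CGA pixels at 2 bits each
    40 B%(1) = 2           ' height: two rows
    50 B%(2) = &HFFFF      ' both row bytes all ones: a 4x2 block of color 3
    60 PUT (100, 100), B%  ' XORed into the framebuffer by default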

4. See how AND is being used as a bitwise operator? That's why true was -1 in MBASIC.
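
That is, since relational operators return -1 (all bits set) for true, a single bitwise AND doubles as a logical AND:

    10 PRINT 5 = 5                ' prints -1: "true" is all ones
    20 PRINT 6 AND 3              ' bitwise: 110 AND 011 = 010, prints 2
    30 IF (5 = 5) AND (2 < 3) THEN PRINT "BOTH"   ' -1 AND -1 = -1, still true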

I think there are some important lessons in GW-BASIC/BASICA about how to design user interfaces for end-user programming, and the DRAW and PLAY statements in particular. Also, I can't have been the only person who never figured out how to use the vi-like line editor in BASIC-80 but who edited existing code all the time in Z-BASIC/GW-BASIC/BASICA because I could just use the arrow keys.


PC-BASIC is a great re-implementation of GWBASIC. I tried it on some of my old gems from the late 80s and they still worked. :)


Are you going to put them on Github?


Sorry, that would be deeply embarrassing. I was 10-12. :)


Use a nym! Just be sure to generate a separate ssh key for it.


Could we fix the title?


(I agree.)

If you see a submission with a very bad title in the front page, you can ask the mods to change it with an email to hn@ycombinator.com

It's usually faster to use email because sometimes the comments go unnoticed.


So 3 decades ago the autopilot could already keep the car in the current lane but was not able to avoid the unicorn^W donkey? Drivers need to remain engaged and aware!



