A lot of reminiscing and feeling old, no actual content:
I did a lot of my early programming on BASIC6, since I grew up nearby and my school gave us DTSS accounts. It was basically the same language as the first version of BASIC: variable names could only be two characters long, and the result of an IF/THEN could only be a GOTO. I still loved the language, even though it was objectively garbage compared to BASIC7 and BASIC8, which had already been developed. They had cool things like proper subroutines, and using them would probably have been a lot more fun. The interesting thing was that the language was actually compiled, not interpreted like crummy PC BASIC. This made it (relatively) screaming fast.
I wrote my first big program in BASIC6. It was a chat application called ECHO, a clone of something called Xcalibur that the college kids used. It was awful spaghetti code. I remember hitting line number 10,000 and panicking because I could no longer find free intermediate line numbers in big chunks of the code (I had started counting by 10s and then gradually filled in a lot of the intervening lines, and there was no renumbering procedure I was aware of). All that speed came in handy, since there was no concept of threading in BASIC6 and my program had to handle commands from all the users sequentially. People hated it with a burning passion.
One time I found John Kemeny’s email address and showed him my amazing BASIC6 monstrosity. He wasn’t very happy about that.
It's amazing how much you can do with just a few simple statements.
Creating a computer language simple enough for novices to learn in an afternoon but powerful enough to write numerical code, games, and other software was an extraordinary accomplishment, especially in 1964.
The utopian vision of empowering non-experts to unlock the power of computing by writing their own interactive programs for their own purposes was also groundbreaking. (They also expanded access to the BASIC system by developing a time-sharing OS and deploying terminals in multiple locations, which meant the system could be used for synchronous (chat) and asynchronous (email, bulletin board) text messaging and collaboration.)
Every modern computing device with a web browser includes JavaScript, but I can't help but imagine what the computing landscape might look like if JavaScript in the browser were as visible and easy to use as BASIC systems were, from the 1960s to the 1990s. Why doesn't Firefox come with a button you can press to open up a user-friendly programming tab, e.g. with a REPL, a canvas/textarea, a code editor, and the ability to easily share programs and games over the internet?
I think you can (still) learn the basics (haha) of Python/Smalltalk/Logo/Scratch/Lisp/Matlab/JavaScript/etc. in an afternoon, but it's hard to beat BASIC's straightforward simplicity.
I also miss the days when
print "Hello, world!"
was a valid program in both Python and BASIC.
(Note that John McCarthy started the work that led to LISP when he was at Dartmouth, but then promptly moved to MIT. Imagine if Dartmouth had gone with LISP instead of BASIC...)
Did anyone notice how in the code listings, the O's had slashes through them to distinguish them from zeroes, instead of the other way around? I wonder when it became a standard to slash the zero.
I think I've seen some older content where the convention was reversed like this. Wikipedia's article on slashed zero claims that IBM and some other mainframes put their slash on the letter O instead of the number 0. However, in a bit of circularity, its citation (1) is this same BASIC manual. Checking an IBM manual (2) for punching cards, neither 0 nor O has a slash. The manual is undated, but documents the model 26 keypunch from 1949 and not the model 29 keypunch from 1964, so it's probably older than this BASIC manual.
On IBM punched cards, the letters needn’t look very different, since you could always read the punches themselves (digits have only one punch; compare the patterns for letters A-I, J-R, and S-Z).
And yes, people learned that. Because the primary output of a punch was the holes, not the printed text, maintenance of the ‘print’ part wasn’t always done well (ink smears, or devices that ran out of ink), so printed output often didn’t look remotely as clean as that manual shows.
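For anyone curious, here's a rough sketch (in Python, my own reconstruction from memory of the IBM 026/029-era card code, not anything from the manual) of the patterns being described, showing why the letter O was unambiguous next to the digit 0 on a card:

    # Rough sketch of the Hollerith card code: digits are a single punch in
    # their own row; letters add a "zone" punch. A-I are zone 12 + digits 1-9,
    # J-R are zone 11 + digits 1-9, S-Z are zone 0 + digits 2-9.
    def hollerith_punches(ch):
        """Return the punch rows for one character, e.g. 'A' -> ['12', '1']."""
        if ch.isdigit():
            return [ch]                 # one punch: the digit's own row
        n = ord(ch.upper()) - ord('A')
        if n < 9:                       # A-I
            return ['12', str(n + 1)]
        if n < 18:                      # J-R
            return ['11', str(n - 8)]
        return ['0', str(n - 16)]       # S-Z

    # Letter O is two punches, digit 0 is one, so the holes never confuse them.
    print(hollerith_punches('O'))   # ['11', '6']
    print(hollerith_punches('0'))   # ['0']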
In the 60's there was no consistent standard for this. I remember this being written about in CACM in 1967. Those with access to old copies of the Communications of the ACM can take a look at [1]. Here is the title and abstract:
Towards standards for handwritten zero and oh: much ado about nothing (and a letter), or a partial dossier on distinguishing between handwritten zero and oh, by R.W. Bemer
The Chairman of the ACM Standards Committee, Julien Green, has charged me with making “more effective use of CACM for communication … to get grass-roots opinions from the ACM membership.” This paper is the first attempt.
A partial dossier on distinguishing between handwritten zero and the letter oh is assembled here. This presentation was triggered by a request for guidance in this matter presented by the United Kingdom Delegation to ISO/TC97/SC2, Character Sets and Coding, at the meeting in Paris on 1967 March 13-16. The matter is just now in the province of USASI X3.6, to which comments might be directed.
Comments will be expected within sixty days [by approximately October 1st].
That's a tough question. The slash-0 (for decimal zero) had been in use long before the BASIC and the ASR-33 came along. (In ham radio for one.) Dartmouth or Teletype may have had a reason for trying this ... but it didn't last!
[Edit] OK, think I found the culprit.
"The rule which has the letter O with a slash and the zero without was used at IBM and a few other early mainframe makers; this is even more of a problem for Scandinavians, because it looks like two of their letters at the same time." -- https://simple.wikipedia.org/wiki/Zero#Telling_zero_and_the_...
I remember in 1976 we were using mark sense cards to write programs, and we definitely had to slash the letter O. I think we also had to slash the Z to distinguish it from a 7.
At least as early as 1977 the Commodore PET and the Apple ][ had "slashed zeros", followed by the Atari 400/800 in 1979, the BBC Micro in 1981, and the Sinclair ZX Spectrum and Commodore 64 in 1982.
Many years ago I adopted the personal convention of using a slash for the number zero and a little curlicue at the top of the capital letter O. By using both in my handwriting where they could be confused, I was free to use just an unadorned letter or digit in situations where context made it clear.
That seems like it must be a mistake. How could things have flipped 180 degrees like that? I'll need to investigate other programming books from this era...
EDIT: I do appreciate the Game of Thrones look it gives some of the pages. Especially the error messages like "T00 MANY L00PS"
Seems to be an artifact of the teletype system. From the bottom of page 4:
> The third observation is that we only use capital letters, and that the letter "Oh" is distinguished from the numeral "Zero" by having a diagonal slash through the "Oh". This feature is made necessary by the fact that in a computer program it is not always possible to tell from the context whether the letter or number was intended unless they have different appearance. The distinction is made automatically by the teletype machine, which also has a special key for the number "One" to distinguish it from the letter "Eye" or lower case "L".
> How could things have flipped 180 degrees like that?
Somebody other than IBM came along?
IIRC electronic communication started with just ones & zeros, then later got the rest of the numerals.
Eventually they got (upper case only) alphabetic characters, and by that time a plain zero had already been well established.
Teletypes made sure the letter O always appeared different from the numeral zero.
I never actually used Teletypes until the 1980s, when we still got official cables by Telex. But even then a large number of experienced administrators had learned to type manually on traditional typewriter keyboards, which were also hard to press, and which had no zero or 1 key when they learned. It was important not to reflexively substitute letters for numbers, and if you did, you needed to be able to tell the difference on paper.
The ones they are talking about are some of the earliest ASCII Teletypes, where the numerals were wisely standardized so that the least significant 4 bits of the 7-bit code equal the digit's value. Zero is 0000, and upper case O coincidentally is 1111.
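That claim is easy to sanity-check against the ASCII table; a quick Python snippet (mine, purely for illustration):

    # In 7-bit ASCII the digits' low 4 bits equal the digit's value,
    # while the letter O only coincidentally has all four low bits set.
    for ch in '0123456789O':
        code = ord(ch)
        print(ch, format(code, '07b'), 'low nibble =', format(code & 0b1111, '04b'))
    # '0' is 0110000 (low nibble 0000) ... '9' is 0111001 (low nibble 1001),
    # and 'O' is 1001111 (low nibble 1111).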
IBM had its own standard before ASCII; apparently IBM printers produced a slashed upper case O before that.
At Dartmouth it looks like the Teletype Model 35s they had were combining or overprinting the regular slash and letter-O glyphs to make an IBM-style slashed upper case O, and it appears their keyboards reflected the slashed output. The actual printouts show this.
The main part of the documentation instead looks like the typical output of a manually operated IBM electric typewriter, where they slashed the O by backspacing and overstriking a slash. It looks even funkier compared to the real printouts. Any typos would have been corrected the modern way: painting over them with Liquid Paper, then retyping after it dried.
The original printouts didn't always need to be cut with scissors and pasted into the document the way lots of ordinary graphics were; when it was a whole page, they just manually typed a page number at the bottom and it got Xeroxed into place. Others were probably cut with a paper cutter beforehand, not scissors.
ASCII had early revisions about this time. Within a few years I was a youngster in an office where we got one of the first "luggable" printing terminals and connected to a mainframe through a telephone modem over long-distance AT&T lines. With the suitcase model and acoustic-coupler modems we could log in to the "host" with our "laptop" from clients' offices or residences, serving them with more computing power than anyone else around.
These were some of the first terminals made for computer use, not for the Teletype network: just an early RS-232 connector with a 6-conductor cable to the DB-25 socket on the modem. RS-232B at first, then RS-232C; everyone was still waiting for RS-232D, expected after they landed on the moon. They weren't quick enough, and IBM put RS-232C on the desktop, called it a "COM" port, and it's been that way ever since. Well, if you still want the remaining possibility of fully backward-compatible serial communication, without needing hardware-specific drivers to communicate with any vintage of computer or modem.
This was the first time I saw the slashed zero, never the slashed upper case O. And it was a traditional non-typewheel mechanism, electrically powered, so the keyboard was not mechanical; but otherwise it made as much noise as a really fast, hard-hitting typewriter, making multiple copies on modern carbonless triplicate rolls.
People were amazed with new technology back then.
Eventually, with PCs, a number of office people connected external modems to their standard COM ports to log into servers, usually plugged right into a modular Bell System phone jack rather than acoustically coupled to the standard telephone handset; plus autodialing became available. Their standard serial mouse was on the other COM port, if they had DOS software that supported a mouse, which was not yet common.
As a web began to appear, these were the only people already equipped to partake, but in the long run the net got popular and built-in modems became common expansion cards (later standard equipment, way before RJ45 Ethernet) which connected right to the phone jack without needing a physical COM port. Then the next modems were built into the motherboards, which all just had phone jacks for a while, mostly software modems.
It’s a really nicely written manual, clearly targeted at the novice.
Appendix B (Limitations on BASIC) gives a rule-of-thumb for the maximum length of a program as "in general about 2 feet".
It was evidently possible to create a complex program in 2 feet, since they suggest (on page 46) that a GOTO or IF might jump to a REM statement.
However, some unusual advice is given on page 4 which might have caused confusion: Because “line numbers also serve to specify the order in which the statements are performed by the computer … you could type your program in any order”.
Yes, you could be right there. Perhaps that was why.
I do remember being grateful (in later versions of BASIC) for the RENUMBER command, which does not appear in this early version, but which allowed you to create space between lines to insert new ones.
The other reason it might have seemed pertinent to the writer is CARDBASIC (page 50), in which each line of the program appears on a card. It also requires a line number on each card, so the order of cards in the deck would not have been important.
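That entry-order independence falls out naturally from the design. Here's a minimal sketch (Python, purely illustrative, nothing like the actual Dartmouth implementation) of a program store keyed by line number:

    # Statements live in a table keyed by line number, and the system sorts by
    # number before listing or running, so entry order doesn't matter. Retyping
    # a number replaces that line, which is also why counting by 10s leaves
    # room to insert lines later.
    program = {}

    def enter(line):
        number, _, statement = line.partition(' ')
        program[int(number)] = statement

    # Typed out of order, exactly as the manual permits:
    enter('30 END')
    enter('10 LET X = 7')
    enter('20 PRINT X')

    for number in sorted(program):
        print(number, program[number])
    # 10 LET X = 7
    # 20 PRINT X
    # 30 END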
It's a short document, but it's basically all you need to go forward doing worthwhile programming from there.
And come back to as a reference to help you get the most out of the language within the limitations of that particular version.
There should be lots more pieces of hardware today that come ready to program with at least a BASIC as minimal as this, with concise documentation intended first as a full reference, and with programming tips and examples along the way.
I have this weird problem where bitsavers.org never works for me. I think they've blocked my ISP (Aussie Broadband), or some routing issue, etc. It isn't a new thing, it has consistently been like that for many months. The trailing-edge.com mirror works fine.
I just tested on two different Aussie Broadband networks, didn't load on either of them, so yeah, it seems like it's inaccessible for the entire ISP. Weird.
Reminds me of the first time I visited the Dartmouth campus not so long ago and went to Baker Library: I saw the real one! I then met Petra Bonfert-Taylor, who works at the Thayer School of Engineering, and we produced this award-winning C programming with Linux MOOC (on edx.org). Dartmouth is so inspiring!
I read that many non-computer departments used this BASIC system to provide "online" training for their students. Traditional departments like History and English, etc. It was incredibly liberating when folks got the computer involved in their non-computer specialties.
The tidbit I didn't know was that (like FORTRAN) this BASIC system didn't require spaces on a line. I expect it recognized the keywords directly, rather than splitting on spaces and then comparing each word it found against a list of keywords.
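One plausible way to do that (a hypothetical sketch in Python, not the actual compiler's method) is to strip the spaces entirely and match statement keywords by prefix, which works because a statement's keyword always comes first:

    # Hypothetical space-insensitive keyword recognition: since spaces carry
    # no meaning, remove them all and match known keywords at the start of
    # the (line-number-stripped) statement.
    KEYWORDS = ['LET', 'PRINT', 'READ', 'DATA', 'GOTO', 'GOSUB', 'RETURN',
                'IF', 'FOR', 'NEXT', 'DIM', 'REM', 'END']

    def classify(line):
        """Return (keyword, rest-of-statement) for one statement."""
        text = line.replace(' ', '').upper()
        for kw in KEYWORDS:
            if text.startswith(kw):
                return kw, text[len(kw):]
        return None, text

    print(classify('PRINTX'))      # ('PRINT', 'X')
    print(classify('GO TO 100'))   # ('GOTO', '100')
    print(classify('FORI=1TO10'))  # ('FOR', 'I=1TO10')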
Is the source code for the original BASIC available? What kinds of techniques did they use? I think I read it was an actual compiler. Does anyone know what limitations were attached to BASIC from that compile-only implementation?
My first foray into programming was QBasic, and my "manual" was reverse engineering GORILLA.BAS, because as a curious 11-year-old I just had to make that banana more explosive.
This article, as others have said, was more a good trip down memory lane :)
My first programming "job" (i.e. something I did for someone else, no money was exchanged) was removing all sound code from GORILLA.BAS so that it could be used in the back row of an otherwise boring class.
Funny how everywhere in the manual, the letter 'O' has a strike-through (slash), while the digit '0' does not. I have seen lots of slashed zeros in my experience, but I've never seen the reverse.