I wrote plenty of programs that worked the first time I ran them, something that seldom happened to me in more modern programming environments, which put many mental burdens on you and encourage writing fast and sloppily, then fixing your mistakes afterwards. You don't need any boilerplate: I regularly used QBasic just to plot a graph, for example, in a program written from scratch.
Also, the approach to pointers is very nice, I think. There are no pointers; you PEEK or POKE into memory! Where in C you write
a = *b
in BASIC you write
a = PEEK(b)
Edit: More nostalgia! The help system was absolutely amazing. It was possible to read a keyword for the first time, press F1, and two minutes later you would actually understand the statement. Something I never experienced afterwards.
Turbo Pascal was also an awesome development experience back then.
Ended up being the right bet though. :)
Did the same thing myself. Struggled for years in Slackware and other builds... had to write my own ADSL driver to get broadband working. Had a dual-booting Gateway 2000 PC and always cringed when I selected Linux from the GRUB boot menu... but it served me very well decades later.
That must have been debug.com 
QBasic happened much later, 1991, as part of MS-DOS 5.0. It was considered to be a subset of QuickBasic.
So QuickBasic, as well as its big brother, the Microsoft BASIC Professional Development System (which almost no one remembers), was essentially dead by the time QBasic was released.
I only ever used QuickBasic. I still have the manual. Can't make myself part with it.
I had it on a stack of 3.5" floppies, and the last disk had a bad spot, so some of the framework was impossible to use because I couldn't install it.
That being said, the QuickBasic line and the PDS were amazing -- even the PDS had reams of online documentation on how to use the tooling, straight from the IDE.
I distinctly remember the days my Dad showed me the linker and the library managers. It was like magic: now I could re-use whole chunks of my programs in other programs. Totally world-changing for a little 12-year-old kid on an IBM PS/2 Model 50.
There was Visual Basic for Windows 1.0 and Visual Basic for MS-DOS 1.0. The UI frameworks were completely different – Windows was GUI, MS-DOS was character mode. The Windows versions continued on until VB6, before being replaced by the very different Visual Basic.NET. Visual Basic for MS-DOS was discontinued after 1.0.
You know, I never really thought of programming in this way. You are very correct about this and it makes me sad in a way.
Can't you do something like PEEK(PEEK(a)) or PEEK(PEEK(a)*256+PEEK(a+1))?
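The idea can be sketched in Python by treating memory as a flat bytearray (purely illustrative; the addresses and the low-byte-first order are assumptions, matching the 8086):

```python
# Simulating BASIC's flat memory with a bytearray; the addresses
# and the low-byte-first (8086-style) order are assumptions.
memory = bytearray(65536)

def peek(addr):
    """Read one byte, like BASIC's PEEK."""
    return memory[addr]

def poke(addr, value):
    """Write one byte, like BASIC's POKE."""
    memory[addr] = value & 0xFF

poke(0x1234, 42)    # store a value at some address
poke(0x10, 0x34)    # store a 16-bit "pointer" to it at 0x10,
poke(0x11, 0x12)    # low byte first

# Nested PEEKs act as a pointer dereference: a = *p
value = peek(peek(0x10) + peek(0x11) * 256)
print(value)  # -> 42
```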
I always thought of pointers as indirect addressing mode when I was learning about the 6502.
Other Basic dialects, including QBasic's big brother, had a Varadr function or similar, to take an address.
C is not the absolute truth in pointer manipulation across programming languages.
Of course, it kind of hurts the urban myth that C is a special snowflake among systems programming languages, having languages almost 10 years older with such features.
' a = *addr
a = PEEK (addr%)
' *addr = value
POKE addr%, value
' a = addr
a = addr
' addr += 2
INCR addr%, 2
' a = &variable
DEF SEG = VARSEG(variable)
a = VARPTR(variable)
a = VARPTR$(variable)
Or, to stay with QuickBasic dialects, FreeBASIC,
which also provides the SIZEOF keyword, simplifying the increment operations on memory addresses.
Pointers were invented in PL/I, where they were specifically untyped.
check out pete's qb site for programs and code snippets for one point of reference: http://www.petesqbsite.com/downloads/downloads.shtml
In my opinion it's great because of how accessible it is. It's very easy to experiment with unfamiliar commands and rapidly prototype. I learned QBasic before Pascal and C++, and was disappointed by the absence of equivalents to many of the functions that made programming quicker and easier.
More info and interview:
The dev even released a copy of the source code:
When you're starting out, and even more so when you're young, syntax is the issue. Every paren and sigil you have to type or complicated function name you have to spell is one more bomb that could explode. Keyboarding is tedious. Vocabulary is hard.
And on the TI's, most of that is eliminated by the method of input. While you can turn on an alphabetic mode to type characters, and simple algebraic expressions can be input from the calculator face keys, the environment pushes you to browse and explore the menus of library functions. Browsing is fun at any age. It is not the most efficient thing, but programming was never about how fast you can type.
So your syntax error rate goes way down and while some reference and guidance is needed to know how to use a lot of the functionality, you can focus more on the semantics of, e.g., "what is a subscript error?"
Environments that go heavily graphical like Scratch also have this kind of benefit, but the method used by TI-BASIC, which is mostly an artifact of being designed for a limited interface circa 1990, results in source code that looks and behaves similarly to other languages, making it that extra bit more transferable.
And the functionality is good for drawing things and building up math concepts, although it's not great as a games environment. I remember my older brother making an Asteroids clone on his TI-82. He got it to where it rendered everything, moved and rotated objects, and took input, but he ran out of single-letter variable names to use. I think if he had known to use the built in matrix and list functionality to store object state and do transforms, he would have been able to take it much farther, but there was a lack of guidance on these things in the early 90's.
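For what it's worth, the trick the brother missed, keeping object state in lists and doing the transforms with a bit of math instead of burning one single-letter variable per value, can be sketched in Python (illustrative only; the numbers are made up):

```python
import math

# Each asteroid is [x, y, vx, vy]: one list entry per object
# instead of a pile of single-letter variables.
asteroids = [[10.0, 0.0, 1.0, 0.5],
             [0.0, 20.0, -0.5, 1.0]]

def step(objs):
    """Advance every object by its velocity."""
    for o in objs:
        o[0] += o[2]
        o[1] += o[3]

def rotate(x, y, angle):
    """Rotate a point about the origin: the 'matrix transform'."""
    c, s = math.cos(angle), math.sin(angle)
    return x * c - y * s, x * s + y * c

step(asteroids)
print(asteroids[0][:2])  # -> [11.0, 0.5]
print(rotate(1.0, 0.0, math.pi / 2))  # ~ (0.0, 1.0)
```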
The sheer amount of time that would have had to have gone into that is a testament to the power of kids feeling bored to a level that may not actually exist any more. We have, today, tools that make things less tedious than writing matrix transforms in single-letter variable names, but at the same time we have computer games and more recently social apps raising the tedium floor that anyone is willing to put up with. If there was a way to combine modern tools with 1980s tenacity we'd probably end up with a golden era on our hands.
Not on the same league as Asteroids on a TI calculator, I admit, but squarely in the realm of Computer Science.
And of course you can have this device in your hands during many classes, so that at a moment of enforced boredom you have the opportunity to learn.
I basically learned to code on a TI-83.
The fact that it's a mobile programming platform is a huge plus as well. I haven't seen anything comparable on smart phones. And a touch screen just can't compete with that keyboard. Even if it is a weird layout.
Underrated device!
Canon's BASIC for its 6809-based system was almost on par with FORTRAN in many ways.
It also had fraction, rounding, modulus, min and max functions, and even named program segments, like this example from the July 1982 issue of Byte:
10 FOR I=1 TO 100
30 IF N>2000 GOTO [FINIS]
40 NEXT I
50 [FINIS] PRINT I: END
You could also conserve memory space by keeping portions of your program on disk and it would load them on the fly. Kind of like today's function() calls, except each function would be its own file on the floppy. You could even pass variables between the main loop and floppy-based functions very easily.
This was back in the days where an undergraduate could work alternating quarters and entirely pay for a BS at Georgia Tech, out-of-state. Not all things progress over time.
All that said, I will live in C++17++/cmake/emacs till the day I die. Even though cmake is hideous (but does work) and hey what is that glitch in emacs about?
I'm assuming that this implementation was used for real work, as I doubt the CDC 6700 was used for playing around with.
An excerpt from the article: "Once upon a time, knowing how to use a computer was virtually synonymous with knowing how to program one. And the thing that made it possible was a programming language called BASIC."
By the way, I have preserved an IBM Logo interpreter, a GW-BASIC interpreter, and a QuickBasic compiler from my childhood days here: https://github.com/susam/dosage/tree/master/langs. These three tools have played an important role in my life because they got me interested in programming.
Logo showed me how simple and elegant programming can be. The fact that it produced cool visual effects was a bonus. For example:
REPEAT 10 [REPEAT 360 [FD 1 RT 1] RT 36]
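As an aside, you can check numerically what the inner REPEAT 360 [FD 1 RT 1] loop does: it traces a regular 360-gon, i.e. very nearly a circle of radius 360/(2π) ≈ 57.3. A Python sketch of the same walk (an illustration, not Logo):

```python
import math

# Simulate "REPEAT 360 [FD 1 RT 1]": 360 unit steps, turning
# one degree right each time, traces a regular 360-gon.
x = y = 0.0
heading = 0.0
xs, ys = [], []
for _ in range(360):
    x += math.cos(math.radians(heading))
    y += math.sin(math.radians(heading))
    heading -= 1.0  # RT 1: turn right by one degree
    xs.append(x)
    ys.append(y)

# The turtle ends back (essentially) where it started...
print(abs(x) < 1e-6, abs(y) < 1e-6)  # -> True True

# ...and the circumradius matches circumference/(2*pi) = 360/(2*pi).
cx, cy = sum(xs) / 360, sum(ys) / 360
r = math.hypot(xs[0] - cx, ys[0] - cy)
print(round(r, 1), round(360 / (2 * math.pi), 1))  # -> 57.3 57.3
```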
GW-BASIC introduced me to procedural programming. It was fun, but I was unhappy that I could not produce .EXE files with it. Somehow .EXE files felt more real. Being naive, I would rename .BAS files to .EXE files hoping that it would somehow make them standalone executables. Finally, I found the holy grail when I found the QuickBasic compiler, which could compile .BAS files to .EXEs. This experience taught me the difference between source code and machine code.
Ironically, the way QuickBasic works is that it concatenates the BASIC interpreter with a copy of your .BAS file to form the .EXE file. No machine code is generated.
Nice things about Turing (many of which are similar to QBasic):
- Hello world is `put "Hello World!"`
- Really easy syntax with no semicolons
- Super easy 2D graphics, e.g. `Draw.Box(0,0,100,100,black)`
- The graphics starts easy with immediate mode and named colors but ramps smoothly into double-buffered full RGB drawing while still using the same primitives.
- Easy to load images, turn them into sprites, treat a certain color as transparent so you can author things in MS Paint.
- Built in help with really nice documentation of every function, with a large standard library of helpful things.
Check out http://compsci.ca/holtsoft/IPT.pdf and http://tristan.hume.ca/openturing/
My personal favourite, though, was creating 'look alike' apps of the existing accounting and database systems that ran on the IBM XTs in the offices I interned at back in the day, with fake but world-ending error messages that would pop up when the operator tried to select menu options or fill in fields. (This was in the days of "War Games" too, so it was fun to see a receptionist freak out when she saw a "Connected to NORAD - Commencing missile launch sequence in 10... 9... 8...")
The web page you are trying to access has been blocked.
The URL has been categorized under: Criminal Skills/Hacking for Merlin Entertainment. - LLC Hotel [220.127.116.11]
Please contact your support team if you feel that this is incorrect.
Here’s the code on Glitch: https://led-picture-frame.glitch.me
And here is the finished project:
1. How do I generate random numbers (great for making simple games)?
2. How do I get interactive input, including getting a single keypress without having to press ENTER (again, great for simple games)?
3. How do I draw graphics on the screen? Circles. Lines. Points.
With these three I made really fun programs as a kid about the age of this post's son.
The big languages in those days were C, C++ and Java. Doing these simple things was an incredible pain. I just couldn't bring myself to go beyond (and I had not heard of Perl).
Then I took my first formal programming class in college. While it was a breeze, it was also incredibly unmotivating. I always felt if they started with these three things, an order of magnitude more people would do hobby software development.
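For anyone curious, a rough sketch of what those three starting points look like in modern Python (the single-keypress part is inherently platform-specific; the termios/tty route below is Unix-only, and msvcrt.getch() is the Windows equivalent):

```python
import random

# 1. Random numbers: a six-sided die.
roll = random.randint(1, 6)
print("You rolled a", roll)

# 2. A single keypress without ENTER has no portable one-liner.
#    On Unix the stdlib termios/tty modules do it; on Windows,
#    msvcrt.getch() is the equivalent.
def getkey():
    import sys, termios, tty
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)          # raw mode: keys arrive immediately
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

# 3. Graphics: the stdlib turtle module is the closest thing to
#    QBasic's CIRCLE/LINE/PSET, e.g.:
#      import turtle; turtle.circle(100)
```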
Might as well mention: I have used FreeBASIC before and it's quite good and very mature. It generates C/C++ code (I don't remember the details) and compiles using GCC! If the compiler can generate standard C/C++ code, there is good reason to believe the project could be maintained forever and compete with other languages.
Canvas is also somewhat nice once you figure out how to work with it. I still don't completely understand the ideas behind it. There seems to be a specific paradigm behind it that none of the tutorials I've gone through have shown me.
The editor has a built-in sprite, level, and music editor, and you can also download any game and then enter edit mode to see what code it used, just like I did with QBasic games as a kid. It's not quite as easy as QBasic, since it's based on Lua syntax, but it's close enough that I don't think it should be ruled out.
It's the closest thing since Adobe Flash that gave me a similar feeling to making games in QBasic.
Check out PyGame for instance to do all that on current systems. That’s what I’ve used lately.
Professor Brian Harvey of Berkeley has made freely available the three volumes of his 'Computer Science Logo Style', along with UCBLogo here:
Though I don't know if that counts as a Logo implementation.
The best part was that I was not forced to do it. I just tried it and liked it. Parents hear there is money in programming and try to make their kids do it. It can lead to an unhappy life. Parents, expose your kids to lots of things, let them pick.
10 PRINT "foo"
Initially I was pretty frustrated at how difficult it was to get into 256 color VGA mode 13 with Turbo C++ but remember the language itself seeming much more elegant even knowing the little I did as an early teen. Figuring out how to use inline assembly to generate interrupts to switch video modes, change palettes, etc. started me down the path of low level programming that led me to a career in video games though so it kind of worked out.
I am kind of jealous of you. I tried introducing my kids to programming and they simply do not show any interest.
It was really limited, with no real means to access the system's memory save for a couple roundabout ways including yes, actual PEEKing and POKEing of all things. But it was still extremely fun despite its limitations. I wrote a simple text adventure game, a virtual die (so I could play board games whenever my little brother "lost" their dice) and even a simple poll app (Presidential elections and how they were supposedly rigged with electronic voting was a big topic back then. Nowadays it's just a given where I live). I have no doubt these first approaches tend to be the most important and keep having an influence even way later on.
I just hit F9 and even my biggest programs compiled in less than a second on a 386 :)
And they already know the GOTO 10 loop and love it. Donald Knuth was wrong, IMO.
It was Dijkstra who wrote "Go To Statement Considered Harmful" and it was Knuth in reply who wrote the best analysis (and defence in appropriate contexts) of the GOTO statement (in Structured Programming with Go To Statements: https://pic.plover.com/knuth-GOTO.pdf), and who still uses GOTO cheerfully and liberally in his programs. (https://twitter.com/svat/status/913114286951505920 , https://twitter.com/svat/status/885344334735745024 , https://github.com/shreevatsa/knuth-literate-programs/blob/m... ) — so it seems really hard to understand that a comment that seems to be positive about GOTO is invoking as "wrong" the name of probably the most prominent programmer to still use them. Very puzzling. Of course it might just be a mistake, confusing Dijkstra for Knuth, but worth asking for clarification.
> It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
from EWD498: How do we tell truths that might hurt? (http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EW...)
I agree with a few comments here that:
- Python has the nice syntax
- QBasic was a nice IDE
* Play with it at https://educa.juegos/
The full source code is at https://github.com/somosazucar/Jappy
Play with it! If you make something fun I'd love to hear it.
My professional programming started with a Wang MVP minicomputer using BASIC.
Next came years of MIIS/MUMPS, which is BASIC with a b-tree DB tacked on. After a few years of that I did consulting to help speed up sorting a text array in QBasic/GW-BASIC/BASICA. Also used the TI-99/4A to code some games for my kids. And a Commodore 64 for recreational programming: I remember keying in code listings from computer magazines; one in particular involved POKEs which enabled direct screen addressing. Also developed a program to calculate BTUs for heating/cooling.
And translated an Applesoft BASIC program to MIIS to figure out calculated results for a laboratory.
It enabled my interest and career of almost 40 years as a developer.
I spent so many hours with the yellow-cover copy of Ahl's "BASIC COMPUTER GAMES":
name = input("what's your name? ")
if name == "Noah":
    print("you're the best!")
else:
    print("you're the worst")
Not to mention it does invert many of the things that people coming to it will want.
Consider, you want to tell the computer to ask for something. Once it has an answer, put it in a spot.
name = input("what's your name?")
Contrast with the BASIC.
INPUT "What's your name"; name$
* Editors are not necessary for beginners - just like disk I/O was unnecessary in the times of the C64 BASIC. Create a bookmark to any of the many online jupyter servers (e.g. ), or install Anaconda with a shortcut to a local jupyter notebook.
I installed QBasic on my son’s 11” HP Stream today, having to hack a DOSBox manual installation.
import turtle
t = turtle.Turtle()
t.circle(100)
I also have questions:
What is import? Why do you need to write turtle 3 times? Why do I have to write t.circle?
This is boring, I better play Candy Crush on my phone.
Compare your code with the QBasic equivalent:
CIRCLE (50, 50), 100
Also, I doubt many 7 year olds have a Jupyter notebook they refer to as "mine".
See? Way fewer questions than with Python or JS.
>Also, I doubt many 7 year olds have a Jupyter notebook they refer to as "mine".
That only reinforces what the parent said.
Hey, i remember that! I was actually 7 years old when i first encountered the `CIRCLE (xcenter, ycenter), radius` statement (in GWBasic, but it is the same in QBasic) and... it actually made sense to have the coordinates separate. Although probably a large reason it "made sense" was that i wasn't familiar with any other notation. I quickly jumped to Turbo Pascal and i remember the `Circle` procedure in Turbo Pascal (BGI) looking weird with it being `Circle(xcenter, ycenter, radius);` :-P.
I do not really remember much from that time (it has been almost three decades since then :-P), but this is something i sometimes think back to because of how each statement had a "customized" syntax that i found making sense (e.g. GW/QBasic has `LINE (x1,y1)-(x2,y2)` and this also made sense because i'm drawing a line between two points and again Turbo Pascal's `Line(x1, y1, x2, y2);` felt weird initially) and this sort of statement-specific syntax being something you do not really see in languages nowadays (or even at the time, AFAIK most other BASICs used `LINE x1,y1,x2,y2` - when they offered graphics functionality at all).
I know I started with the ZX Spectrum and I didn't need it.
As for the parenthesis, I was content with being told that's just how you make a circle
I've got a bunch of paper and crayons.
See you in a couple of hours, cause I'm just gonna draw.
(or is childhood that different, now?)
"Daaaaaaad, crayons are boring, can I use the computer?" (I think there's a few minor differences) :)
Which means 9 in 10 don't. Not that wetting the bed is, in any case, anything but a distraction in this discussion.
Owning a personal computer in the 80's wasn't given, even in the middle class white Connecticut suburban neighborhood I grew up in.
I was introduced to Logo at school maybe not at seven, but maybe 11?
Looking back, I would have given anything to just have more play time where I made up the rules, and those rules weren't called "rules" (because that's what playing is, right?)
As an adult, I find that I'm not playing by making computer programs, but by going into the woods, and pretending I'm basically a kid and exploring and getting a bit caught up in my own imagination. It's... it's lovely.
That to me is the ongoing problem with picking up a programming language these days: figuring out all the available libraries and which of them I need to load to get closer to my goal.
The Youtube channel The Coding Train is also great for this.
I learned programming in (Q)Basic because of the LINE and BEEP statements.
Only because of its connection to web display.
Not because of the language's syntax or qualities.
I don't know that QBasic is the best for teaching kids, but I don't have a better answer. OP has a point.
(and you can run this completely in your modern browser - amazing!)
Anyway, I have the feeling you will be interested to read this https://www.apl2bits.net/2015/03/09/john-carmack-doom-reddit... or this https://twitter.com/ID_AA_Carmack/status/569658695832829952
But QBasic was surely a very nice way to introduce high school students to programming back then.
We really need comparable environments on today's systems.
If my family could have afforded a better computer, I would have been playing games instead of tinkering with BASIC.
I'm not a coder, but 20 years ago I still felt somewhat competent in my overview of available platforms and what they are used for.
These days it feels like there's something new every day, which is great don't get me wrong, but I can't even imagine trying to keep up with all that because it looks like quite a Sisyphean task from the outside.
I made a resistor ladder DA convertor, and then wrote data to the parallel port with a few simple BASIC "OUT" instructions.
Some GB emulators (NO$ at least) used this for sound, and it was quite good.
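The arithmetic behind such a program is simple enough to sketch: compute one cycle of sample bytes, then OUT them to the port in a loop. Here is the sample-generation half in Python (illustrative; the actual port write is omitted, since it needs driver-level access and has no portable stdlib equivalent):

```python
import math

# One cycle of an 8-bit sine wave: the byte values a BASIC loop
# would OUT to the parallel port driving the resistor-ladder DAC.
SAMPLES = 64
wave = [int(127.5 + 127.5 * math.sin(2 * math.pi * i / SAMPLES))
        for i in range(SAMPLES)]

# The samples span the DAC's full 8-bit range.
print(min(wave), max(wave))  # -> 0 255
```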
I say this with a tremendous love of Turbo Pascal - it was my first real programming environment as a kid. But I also had a great time with environments like Klik-n-Play and GameMaker.
The point of the article was: make it simple, make it easy (e.g. no compilation), and make it immersive (use it fullscreen without unnecessary distractions).
I’m very fond of it
Luca Novelli published two books in the 80s, “My first book about Basic” and “My first book about Computers”, specifically intended to introduce computers and programming to children. After an introduction to algorithms and flow diagrams, the book dedicates a full page to each (GW)Basic instruction, providing an easy-to-understand explanation and, even more important, a small code example (~10 to ~50 lines) in the form of a game.
This means that the whole book is a compendium of interesting games (guess the animal, guess my age, pick a number) that slowly build up knowledge without being boring or pushy, and gives the child tools to build his own games. It may look outdated, but programming hasn’t changed that much since K&R C, so most of the concepts are still relevant today.
Sadly, the few page scans I could find are in Portuguese or Spanish, but they will serve as an example of the pedagogical approach. I hope anyone interested can find an English copy on Amazon, as it is IMHO an excellent way to teach & learn programming.
 “Two children, a dog, and a personal computer explore the history, concepts, and uses of computers, identifying such aspects as binary systems, computer languages, programming, and memory.”
Instructions, strings and numbers
FLOW DIAGRAMS — AND instruction
IF THEN GOTO instructions
LIST and LOAD instructions
Basic and Binary
I coded many interesting things in QBasic in my childhood, way before the Internet era: a point-and-click GUI like Windows 3.x, a compression format that I thought was marvelous. Later in my life I found I had “invented” Run-Length Encoding, haha.
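For the curious, run-length encoding really is simple enough for a kid to reinvent; a minimal Python sketch of the scheme (illustrative, character-based):

```python
def rle_encode(s):
    """Collapse runs of repeated characters into (char, count) pairs."""
    out = []
    i = 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1                 # extend the current run
        out.append((s[i], j - i))
        i = j
    return out

def rle_decode(pairs):
    """Expand (char, count) pairs back into the original string."""
    return "".join(ch * n for ch, n in pairs)

encoded = rle_encode("AAAABBBCCD")
print(encoded)              # -> [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
print(rle_decode(encoded))  # -> AAAABBBCCD
```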
> I, myself, originally taught myself to code when I was young, using BASIC on machines with BASIC in ROM, and have typed in listings from magazines and bought expensive BASIC reference manuals, read them in detail and used the language to create many utility programs and games. And even I wouldn’t wish BASIC on anyone when modern languages with features like (gasp) functions are readily available.
> I believe that my development and education as a programmer was severely hampered by BASIC’s lack of functions, and I would not wish it on anyone else.
Also, classic 8-bit-era BASIC has functions; they are just limited to a single expression, and constrained by BASIC's limited array of built-in functions and operators that don't require using statements (which I guess makes them the spiritual precursor of Python's lambdas).
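For comparison, a classic BASIC DEF FN definition next to a Python lambda (a sketch; the function name is made up):

```python
# Classic BASIC allowed one-expression functions, e.g.
#   DEF FN DOUBLE(X) = X * 2
# A Python lambda is the same idea: one expression, no statements.
fn_double = lambda x: x * 2

print(fn_double(21))  # -> 42
```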
If you mean “functions” in a strict mathematical sense, sure, I remember those. I remember thinking they were useless and wondering what anyone would use them for – I never used them myself. The way to do any sensible subroutine things (in the BASICs I had access to) was to use GOSUB to line numbers, a method inherently too cumbersome to do anything complex with, which is why I stalled when my programs reached a certain level of complexity. I was then mostly halted in my education and development as a programmer for many years, because the language did not make program composition easy. It was not until I had the opportunity to learn other languages which did have proper functions and other methods of program composition that I could break through the barrier, so to speak.
> BASIC is great because it's basically structurally like simple assembly language
And why would that be a good thing? Classic assembly language is basically living on borrowed time, since the memory and CPU model they assume and expose are long gone, and only emulated in microcode on today’s hardware.
Modern hardware platforms are multi-threaded multi-core systems with multiple levels of cache for memory, none of which is exposed to classic assembly language. Also, disks are no longer nearly as slow, nor is RAM always volatile. Not to mention that even memory mappings and protection rings, as old as they are, weren’t always there in classic hardware, and, when present, were mostly hidden from the programmer, since the programmers of the time were used to systems without them.
Just like classic assembly in a way created C (commonly called a “portable assembler” for its closeness to the old machine model), I’d like to first see an “assembly language” for the modern architecture – something low-level which exposes all the multi-core, cache level, and protection ring complexity as inherent parts of the language. Then, if I dare dream, I’d like to see many high-level languages built directly on top of that low-level one – almost certainly a Lisp, but also maybe an Erlang. All other languages (except maybe Prolog) would probably be too hard (or too like an emulator) to map sensibly to the inherent multi-core model of modern hardware; you might need an entirely new language paradigm. And, sure, also a pony¹, why not.
> which was a method inherently too cumbersome to do anything complex with,
It's really not. (Unless by “complex” you mean specifically “recursive (even indirectly), but not tail recursive”, in which case I'd have trouble disagreeing.)