“Was isolated from 1999 to 2006 with a 486. Built my own late 80s OS” (imgur.com)
1284 points by shalmanese on Dec 14, 2014 | 265 comments

I had a similar, although not as tragic, story of my own. My parents were poor (dad was in the USAF), so we couldn't get a Commodore 64 or Apple IIe/IIc like everyone else had. We got a Commodore Plus/4 because they were literally giving them away. Since I couldn't buy any games for it and there was hardly any software available, I taught myself BASIC and made my own games. Fast-forward 30 years (geez, I'm getting old) and now I'm fairly successful (in my own mind) as a software engineer at a Fortune 100 company. I credit being deprived by my parents for becoming interested in computers the way I am today.

Ha, I had a similar situation. I grew up with a trash-80 well after it was cool in the early 1990s! Though I did get to use an Apple IIe around that time too which was awesome (did not realize how old these computers were, just was so excited to write command line BASIC).

My first computer was a Laser 286 and I marveled at the power it had. It even had the turbo button, going from 8 to 12 MHz, of course :)

Oh, and the video card that took 4 AA batteries to run...

I had a Laser computer with the same specs. Even slowed down to 8 MHz, some games that were meant for an XT weren't playable. I popped the CPU out of the socket and I've had it on my keychain since 1995.


Ha! Never met anyone who had the same model. It was a beautiful machine.

What is the grey material it's encased in? Really cool idea.

It's the resin of the CPU's top surface, just incredibly worn down from being on somebody's keychain for years.

I grew up with a 386SX, newer than yours but again very outdated when I got it.

I remember taking the bus downtown with my brother to a game store, marveling at amazing games in catalogs, and then coming back home with 40 floppy disks just to find out that number 38 was broken and, after a lot of trips back to the game store, that the game didn't work on my hardware anyway.

I learned programming much later on newer hardware, but I learned about files, directories, etc. on this machine.

What were the AA batteries for? Do you know why the card wasn't powered with the rest of the system?

>What were the AA batteries for?

Some motherboards from the 286 era used 4 AAs as the CMOS backup battery.

To be more precise, it was common with motherboards that used the MC146818 RTC chip used in the original PC/AT. The four AAs provided a total of 6V of power.

I actually don't, and I can't find a damn thing about it on Google. Weird.

Ahh I'd completely forgotten about turbo buttons! Thanks for the blast from the past there.

I wish I could upvote this more times. My parents also didn't have a lot and I spent 3-4 years using a 286. I cut my teeth writing code and optimizing like crazy because that's all I had. Couldn't just go get more memory or a better CPU.

20 years later I guarantee you my code reflects the ingrained thinking from those early days.

If I could have afforded a floppy drive for my C64, I would have been a great Loderunner player rather than a coder.

Different generation, but I used a Pentium II Acer laptop with 76 MB of RAM and a 4 GB HDD from 1999 until 2005, when I'd finally cobbled together enough spare parts, free from other people, to build my own AMD-based desktop.

It's what made me learn Linux, as it was so low-spec that Windows XP wouldn't run correctly on it. Slax was an excellent distribution, a Slackware-based Live CD that I forced to boot from the HDD despite it not really supporting it then. Prior to that I ran Damn Small Linux, and Windows 2000. No gaming for me, so I learned to build software instead!

You could have played nethack.

Similar story, although I wouldn't call us "poor" for not being able to afford such things. My dad however managed to get his hands on an obscure failed home computer called COMX 35[1]. Not much software, BASIC, same story.

[1] http://en.wikipedia.org/wiki/Comx-35

I've always wanted to tinker with one of these PCs; something about it using the same microprocessor as the Galileo spacecraft.

What was it like using it?

Taught myself QBasic using the help file. There were tons of examples on how to draw and how IF statements worked. Progress was really slow for the first month or so; I was like a monkey trying to mimic what it saw. Didn't have internet at first, so sometimes at school I would go to the library during lunch and attempt to download example programs onto a floppy disk. This was around 2000.

The QBasic help system was really, really good and I don't think it gets enough love. I was seven years old when I really got started with it and it was perfectly comprehensible and super helpful.

They even translated it to German, I think. Got me started.

So amazing. I wonder if it's possible to manufacture that sort of limitation nowadays. Not sure if I would even want to!

I think the limitations are not the main reason these 8-bit computers produced so many good programmers. It's the fact that they booted straight to BASIC. In many cases, that was all you had. That invited people to start experimenting.

My parents were also poor. I learned BASIC without having access to a computer except for one hour a week at school, and I didn't have my own computer until many years later; I was fixing other people's computers to earn money for components for my own. I didn't have access to the internet until around 2000, so I was learning from outdated books. I sometimes wonder what would have happened if I hadn't had those limitations.

Anyway, Raspberry Pi is definitely the closest thing to the experience now. You boot up the default image and you have access to a lot of programming tools. You even get to peek/poke memory to access the GPIO pins, but you have to do it as root these days. :)

"I think the limitations are not main reason why these 8-bit computers produced many good programmers. It's the fact that they booted straight to BASIC. In many cases, that was all you had. That invited people to start experimenting."

More than half the kids I knew who had computers growing up just used them to play games - there was no experimenting with BASIC or anything else. It was a games console with a keyboard - that was it.

Of those who did program, I think many were forced into getting 'better' because of the limitations. You generally couldn't say "just add another meg of RAM to run this" - there really was only one configuration in many cases - you had to think of interesting ways to compact more info into a program (compression, paging out to disk, etc).

Yes, having a programming environment as the first thing you're greeted with probably did get many people involved, but the limitations also produced a mindset amongst many devs of learning how to do more with less.

I like your hypothesis. Maybe I'll campaign to get BASIC material on Khan Academy ;)

Only semi-kidding.

We're developing a BASIC interpreter called retRUNner that's (initially) Applesoft compatible, but will have modern graphics and sound capabilities so users (kids, mostly) will be able to play games like Woz's original Breakout (we have a collection of over 400 listings we'll be shipping with it) and upgrade them to 3D, add art, music, sound effects and so on.

So we're teaching a bit of history along with some coding skills. Down the road we're going to add additional 'dialects' such as Apple LOGO, and Atari, Sinclair and Commodore BASICs, but the Apple II version will be out soon -- I'll be sure to mention it on here when we release it =)

So cool! Happy to give you early feedback if you so wish. Best of luck with the project.

I like this.

For bonus points, support all the same PEEKs and POKEs and CALLs that Applesoft BASIC did on the II series. Even if it's not strictly part of the BASIC itself but of the integration between it and the hardware/ROM, it would be nice. Technically it might be required to correctly run lots of the BASIC programs from that era; I know most of my own relied on them for certain effects.
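One common way to support those hardware-coupled PEEKs and POKEs in an interpreter is to intercept the well-known addresses and route them to handlers. A hypothetical Python sketch (the hook table and handler names are illustrative, not retRUNner's actual design, though $C030/49200 really was the Apple II speaker toggle):

```python
class EmulatedMemory:
    """Sketch of intercepting hardware-mapped addresses so old BASIC
    programs' PEEKs and POKEs still 'work'. Handlers are stand-ins."""
    def __init__(self):
        self.ram = bytearray(65536)          # 64K address space
        # $C030 (decimal 49200) toggled the Apple II speaker.
        self.poke_hooks = {49200: self._click_speaker}
        self.clicks = 0

    def _click_speaker(self, _value):
        self.clicks += 1                     # a real shell would emit a click

    def poke(self, addr, value):
        hook = self.poke_hooks.get(addr)
        if hook:
            hook(value)                      # soft switch: route to hardware
        else:
            self.ram[addr] = value & 0xFF    # ordinary RAM write

    def peek(self, addr):
        return self.ram[addr]

mem = EmulatedMemory()
mem.poke(768, 42)       # plain RAM write (page 3, a classic free spot)
mem.poke(49200, 0)      # soft switch: routed to the speaker hook
print(mem.peek(768), mem.clicks)   # 42 1
```

The same table-of-hooks approach extends naturally to PEEK addresses (keyboard, paddles) and to per-dialect address maps for the other BASICs mentioned.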

Why not. This is pretty much my game plan if/when I have kids: Start them off with a raspberry pi or similar, a keyboard and a monitor, then gradually explore with them and add more gizmos and software together. Some of my fondest memories as a child were of playing around on my father's old IBM mainframe workstation computer with two 5 1/4 inch floppies as a three- to five-year-old. At that age I really didn't need much; seeing something happen on a screen after some keypresses, and some analog sounds coming from the floppy drive, was enough magic for me. I'm thinking that giving a child an iPad at that age could be really damaging - there's just no sense of wonder and accomplishment outside the prepared paths anymore.

I'm doing pretty much that right now with my 5-year-old daughter with a raspberry pi. She can currently boot it up and check her email by herself (using claws-mail). One of the big things that I think we can do today that wasn't really available when I was a kid is robotics. For a couple hundred bucks and some time invested, you can make a robot platform for them to play with and get them playing with super low-level stuff without any kind of pretense (limiting her to just the r-pi with a BASIC repl would be pretty artificial in today's world, but the state of hobby robotics now is much like the state of computing was 25 years ago). I'm working on a robot repl right now so that she can get the instant feedback like we had with BASIC, but it's slow going for lack of time.

The thing I've found interesting is how she goes back and forth between the r-pi and the iPhone pretty seamlessly. I don't think she realizes that they're fundamentally similar devices. To her the iPhone is just a black box with some fun games on it, but the r-pi is a lot more like an adventure.

This is amazing what you're doing for her. I wish someone did something like that for me but as a girl I was usually discouraged from pursuing science, computers, physics or anything like that. I am sure she will appreciate it all when she's older.

That doesn't add up to me. The children of today will almost surely look back at the iPad the same way you look back at your dad's IBM mainframe and with the same nostalgia as others on this thread. It'll seem quite primitive compared to the contemporary technology of their adulthood.

Here's the big difference: you can't 'play' around with them the same way; they're not built to program on or to change/break things significantly. It's not an empowering device.

I bought a Kano/Raspberry Pi kit for my 5-year-old. We explore the terminal, the Python interpreter, Minecraft, etc., and he is totally pumped, with the iPad lying idly just by his side. What really clicked for him was the fact that he now had a real computer that did stuff like dad's, and he calls himself a 'real hacker'.

I still see tight hardware constraints now. Until '10 I was working at a company that made flash CF cards which used 8-bit processors: 32K of SRAM and 64K of code space with bank switching. Every clock cycle mattered. Hand-coded division operations. 32-bit integers were a luxury. I even wrote some self-modifying code for an inner loop that boosted performance by 25%. Nowadays that company is using ARM processors, but you still hit constraints (usually memory) not seen on PCs and servers.
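For readers who've never had to hand-code division: on a CPU without a divide instruction you typically write a restoring shift-and-subtract loop. A Python sketch of the 8-bit idea (illustrative of the technique, not the firmware in question):

```python
def divide_u8(dividend, divisor):
    """Unsigned 8-bit restoring division via shift-and-subtract --
    the kind of routine hand-coded on CPUs without a DIV instruction."""
    if divisor == 0:
        raise ZeroDivisionError("division by zero")
    quotient = 0
    remainder = 0
    for bit in range(7, -1, -1):          # feed in dividend bits, MSB first
        remainder = (remainder << 1) | ((dividend >> bit) & 1)
        quotient <<= 1
        if remainder >= divisor:          # trial subtraction succeeds
            remainder -= divisor
            quotient |= 1                 # record a 1 bit in the quotient
    return quotient, remainder

print(divide_u8(200, 7))   # (28, 4)
```

On real 8-bit hardware the same loop is a handful of shift, compare, and subtract instructions, one pass per bit, which is exactly why every cycle in it matters.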

As I've gotten older, I've increasingly wondered if I should make the jump (back, technically) to low-level programming, working on those highly constrained devices where every cycle matters, every bit, every watt, etc. That's where I started as a kid, and I think it would give me a market advantage over the younger generation, who've grown up with supercomputers in their pockets, a dozen high-level languages to choose from, frameworks for everything, tutorials and videos for everything, and no real need to worry about performance or resource usage like "we" used to just to make anything work at all.

Yeah, low-level programming is why I've been working on firmware for a long time. But working on truly resource-constrained systems can get old. Instead of really solving the problem at hand, you have to work on quirks and idiosyncrasies. I'm glad that we are now working on 32-bit processors.

> Instead of really solving the problem at hand, you have to work on quirks and idiosyncrasies.

FWIW, I work in Big Data and feel the same way :)

A lot of kids probably get started programming with TI-BASIC on their TI83s/84s or whatever they have these days.

I think today's microcontrollers are more similar at the spec level, so Arduino-style programming is maybe today's equivalent.

Hum, maybe I'm stretching a bit here.

My not-so-tragic story of programming in BASIC as a kid was on a Coleco Adam. I would pick up 1980's computer books/magazines from the library with BASIC code in them that I would type in. Some weren't meant for the version of BASIC the Coleco had and wouldn't run without modifications. Getting them to run was half the fun.

The Coleco Adam: https://en.wikipedia.org/wiki/Coleco_Adam

The sound of the high speed tape drive when booting up (or playing Buck Rogers) still remains fresh in my mind. That, and the daisy wheel printer's rhythmic whir-rat-tat-tat-chunk-whine whenever I printed out a letter asking for another tape drive.

I also had my first generative music experience with the Adam - POKEing numbers into various registers would trigger the waveform generators and if combined with various patterns/functions (sin, cos, and the like) would make some cool sequences.
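The pattern described, POKEing a function of a step counter into a tone register, can be sketched like this; note that the register semantics and value scaling here are purely illustrative, not the Adam's actual sound-chip map:

```python
import math

def tone_sequence(steps, base=200, depth=50):
    """Values one might POKE into a tone-period register, driven by a
    sine pattern -- the generative trick described above. The base/depth
    scaling is illustrative, not a real register map."""
    return [int(base + depth * math.sin(i / 3)) for i in range(steps)]

for i, value in enumerate(tone_sequence(8)):
    print(f"step {i}: POKE sound_reg, {value}")
```

Swapping the sine for other functions (cosine, sawtooth ramps, sums of two waves) is what produces the different "cool sequences" with almost no code.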

Different generation. I couldn't afford a computer until 2008. Got a Celeron clocked at 1.2 GHz with 128 MB of physical memory and a 20 GB HDD running Win98. This was when everybody in the neighbourhood was playing Assassin's Creed on their Core 2 Duos.

If only my machine ran all those amazing games, I wouldn't have been a coder now ;-)

Ah, the "joy" of loading the assembler from the cassette tape recorder from one tape, then switching tapes to load the code I was working on, then changing it and saving it back to tape before running it, since it would be lost if the program crashed or just entered a loop. Then I added the NMI hardware, which allowed me to break out of a loop and keep the upper piece of memory. And like some commenters also mention, that computer also didn't have "real" keys.

But don't get me started on using the switches to input the single bits over the console on the even older computer.

Seems like a lot of us have a somewhat similar story. I grew up in the countryside without a TV. When I was between 10 and 13, we convinced our parents to let us buy a used C64. Since we had no TV, we had to get a monitor as well, which took another few months.

My career possibly started there but I was already infected by a text adventure that a grown up relative had made/adapted. (Or more specifically I was infected by the idea that you could write code and have the computer do stuff for you.)

I'm really shocked to see how many people were inspired by text adventure games. My story is similar to the others here (posted the short version). Scott Adams Pirate Adventure FTW!

Likewise here. In my case it was a Dragon 32 in the 486 era.

That was an absolutely awesome little machine. Did you ever figure out that it really has 64K of usable RAM?

I think the Dragon 32 really did only have 32KB RAM, but yeah, the full 32KB wasn't typically accessible. Part of it went to video RAM, and part of it went elsewhere, so quite a bit less was available to use by default.

However, there was some hackery you could do to get close to usable 32KB for your program, although I never did that. My tape drive never worked properly, so my programs were naturally very short…

When I graduated to a 286 (in the Pentium era), that was a pretty special moment. It had a 42MB HD!

Just poking around in the video chip would enable the second bank of 32K RAM for writing. Then copying the basic ROM by reading the ROM and writing the RAM would make it possible to switch to the next stage, which was to enable the second bank of RAM for reading and writing.

After that a relocation routine took care of moving the ROM code to the upper portion of the RAM which left you with 48K of usable RAM in BASIC and 64K - the video mode you were using for assembler!

I had a similar experience, but what really drove me nuts was the selection of ridiculously good and expensive computers they always had in the PX. Who could afford one of those things on military pay?

At least it had real keys...

Starting on an Atari 400 probably is the reason I have become so picky about keyboards.

I remember having indented fingertips.

Something about this story does not sound quite right to me. If I were essentially locked up in that scenario, I would at least be confident of the names of the 3 games which were installed, since they'd inevitably be played to death, regardless of how good they were. Also, there's this: http://www.reddit.com/r/australia/comments/2p63rv/australian...

OP here. Technically, I had already played them to death over the 1990s. Dad used to bring home multiple shareware games: Xargon, Jill of the Jungle, Wacky Wheels, Skunny Kart, you name it. I played them until I became tired of them, although I never got sick of Rise of the Triad.

I had a few floppy disks with these titles on them, but again, having played them for so long, I chose to start my own little endeavour to create games instead of playing them, so I eventually removed everything but ROTT for as much disk space as I could get (being my first computer and such, I believed it might've helped performance, was still learning at this point). When I wasn't writing crappy interactive textventures, I was writing up stories in general in EDIT. My reasoning was also that when my stepdad would come into the room, he could never say I was "off playing those bloody games" again in an argument, and that I could argue that I was studying the books I'd picked up from the library.

I did play a lot of ROTT in my spare time. It was an early copy (1.0) which had some features removed from the final game, such as the alternatively coloured pushpillars and such; and as my system had no speakers, the PC speaker became more and more annoying over time. That much I definitely remember, but since we went through so many shareware games in the 1990s, I don't remember what was ultimately on the system when it was given to me (nor when it was thrown away). I would give everything I owned to get it back, but unless someone chose to salvage and restore it, that system is long gone and crushed up.

>If I were essentially locked up in that scenario, I would at least be confident of the names of the 3 games which were installed, since they'd inevitably be played to death, regardless of how good they were. Also there's this

You'd be surprised what happens after 10 years or so. There are games I played countless times as a kid, in arcades and on my PC, and yet I couldn't remember their titles at all by the time I was at university age.

I only rediscovered some of their titles later, with luck and Google queries (in my 30s).


"game where a frog tries to cross the road" was apparently Frogger.

"game where you have a crosshair and hit people" is Prohibition.

"pc game where a cat jumps in an neighborhood" is Alley Cat, etc.

This is absolutely true. There's an arcade game from the '80s, one of my favorites, that I spent countless hours on and played to death, and I can't remember the name of it. To this day the quest to find it again continues. It's like an adventure or something.

You should post something here: http://www.reddit.com/r/tipofmytongue

Or try an arcade related forum or subreddit. I'm sure someone out there can help you.

Tried it already a year ago with no luck. Maybe I'll try it again!

You could try it here too, what do you remember about it? What country were you playing it in? What were the controls? Joystick, trackball, one giant button, several buttons? What year did you first play it? Any memories at all about what the gameplay was like or the overall quality of the graphics?

Here is most of what I remember: http://www.reddit.com/r/tipofmytongue/comments/162zpu/tomtvi...

The other game I can't remember was a Contra/Turrican-like game with a robot-like main character, and you could go in all directions on a map. I remember there were Egyptian motifs around the levels, and I vaguely remember the game had a name like Phoenix or something similar. Same time period, same nondescript arcade cabinet. Both in Europe / Croatia.

Very interesting, the reddit thread pretty much went through anything I could immediately think of. Was the game in English?

Yes, it was in English. I went through everything I could find. I'll know it when I see it, though.

Techno Cop?

It's not Techno Cop, though it looks interesting!

Rolling Thunder?

Sadly, no.

Also, "Osborne 486, with about 64k of RAM if memory serves me correctly. "

First, a 486 with 64K RAM is ridiculous as they typically had multiple megabytes of memory.

Second, he worked with that PC for 7 years and isn't quite confident of how much memory it had? This doesn't sound quite right either. I can clearly remember the amounts of RAM my first few PCs had, starting from 1999.

I know it had 640k extended memory via the startup memory check.

I don't know how much processing power nor memory it had. I had never owned a computer, and my dad left it behind for me after the divorce so I had something to use. I had no idea what was inside.

I'd agree that this would be a very relevant factor if I had programmed a "true" operating system (since most true programmers would probably know the amount of memory usage going on, etc.), but to go a bit more in depth, I suppose I haven't actually coded an operating system, but rather a DOS shell of sorts that is a facsimile of an 80s-style operating system. It merely takes the same commands as DOS and interprets them into a visually descriptive feature on the screen (such as a split screen with directory and file listings, for example), along with a few commands of my own that I added in. My terminology is not fantastic, but I suppose it's indicative of what I considered to be an OS. A graphical shell might have been a more accurate term.

I remember the specs of all the systems I've had since then, but with the 486, I didn't spec it out myself, and was merely a kid with a computer at that point.

Incidentally, if you ever find you miss coding, there are plenty of modern languages that are much easier than C++, which is pretty much the most complicated one you could have picked.

I'd like to add: a language that may be more up the OP's alley is Blitz3D, a BASIC dialect for games that's recently become free and open source: http://www.blitzbasic.com

Yeah that looks pretty cool.

A more mainstream option would be python. The free courses at udacity.com would be an easy way to get started.

Why are we trying to insinuate they're lying? Either there's evidence or there's not, and without some pretty strong evidence, this is just depressing. Especially from the point of view of someone who might like to one day present a project they've been working on to "the internet."

I'm close to someone who I've been watching slowly discover their passion for programming over the last six months or so. They never realized they'd care so much, or be any good at it. I've been giving consistent positive reinforcement, and I dread the day that they roll out their first real achievement to the masses. They might just throw their hands up in the air and give up entirely, judging by how people usually act.

I totally hear you and understand. However, two points to consider.

One, nobody's forced to share their project/efforts on the Internet, which is essentially the entire planet, all of humanity, 24x7 for all the rest of time (info wants to live forever, etc.) We are all free to keep our things private, off-line, just among friends, etc. You're free to approach people in real life you know for feedback, tips, encouragement. They're more likely to be polite, and also you're not asking something of a total stranger.

Second, there's been a cultural change that a lot of younger people or newbs may not be aware of. For folks of the pre-Internet, "first with computers" generation, that was precisely the standard operating environment when we were learning, just starting out, taking those early baby steps. When we were first scribbling out our first programs, however naive or full of bugs or flaws, we didn't immediately go push them in the faces of millions of strangers across the world (a la Show HN, GitHub, blogs, etc.), a good portion of whom would be professionals a lot further along, and say "omg look at this isn't it amazing I'm a hacker now please hire me k thx" -- we couldn't, because the Internet didn't really exist and most of these modern ways of publishing and sharing didn't exist. So most of that early crude, naive, sometimes embarrassing stage was done in private, off-line, or only among friends or teachers known personally.

Today we have the equivalent of proud (and healthily, voluntarily deluded) parents posting their kindergartener's first crayon house-and-trees-and-sun drawings onto blogs, GitHub, HN, Reddit, YouTube, etc. and then being shocked or having their feelings hurt when others aren't equally blown away, and are in some cases critical or dismissive. It's just the nature of the beast. Nobody forced them to share those things, and in that public, global, anonymous, mixed (newbs/kids and experts/pros) forum. They chose that. And it makes sense you'll also get back a mix of reactions, some of which you won't like.

What the modern Internet enables today has lots of great qualities (e.g. I can download all this cool software for free, Wikipedia, Kickstarter, etc.), but it also enables lots of sucky stuff (e.g. eternal September, eternal kindergarten mode, arrogant assholes, FAQs, RTFMs, trolls, flamewars, phishing, astroturfing, doxing, etc.). It's hard to enable one without the other.

> just starting out, taking those early baby steps. When we were first scribbling out our first programs, however naive or full of bugs or flaws, we didn't immediately go push it in the faces of millions of strangers across the world (ala Show HN, GitHub, blogs, etc), a good portion of whom would be professionals alot further along, and say "omg look at this isn't it amazing I'm a hacker now please hire me k thx"

No one was asking for anyone to hire anyone. I have no idea why you're saying this or belittling people's work.

If you can't be supportive, don't say anything unless you have strong reason to say it. Doesn't matter how far along the work is or how bad it is. If everyone followed this, the world would be a better place than it is, particularly the internet.

The lack of support is exactly what causes people to switch careers, because they get the impression everybody in the field is a flamer.

I don't think these are lies. I think this speaks to the level of technical understanding of the author. He thought 64k. He was writing everything in QBasic, so a spec like that wouldn't matter. These are some good things to accomplish with BASIC, and it makes sense when you get to the end and understand he is baffled by an IDE. The term DOS "clone" is weird -- I think he meant a sort of shell written in QBasic. This is essentially a "script kiddy" stuck in the outback. Admirable use of his time. I am scared by the level of worship the comments on the page provide.

I think "script kiddy" is overly harsh. Script kiddies can't program, they just run other people's "scripts." He clearly had some aptitude, he just didn't have the books or the right environment to learn what he needed to truly excel.

You must be right.

I learned to code in machine language, punching hex codes into an Amiga, all from one reference manual. It was a painfully slow way to learn. With no IDE or other reference manuals, QBasic would be the only way to achieve any of this.



It's not a question of remembering. It's a question of it being deeply relevant if you're writing an operating system.

I've been there, just as everyone else who learned this stuff pre-internet. Not only did I know how much memory I could address, but the location and behaviour of various i/o ports and hardware interrupts, not because I have a good memory but because of necessity.

I think the 64k RAM perhaps refers to the QBasic module size limits [http://support.microsoft.com/kb/45850].

Quite likely. Many early x86 programming environments had limits around 64K due to segmentation (e.g. early Turbo Pascal), so I'm not at all surprised that someone stuck with crappy old books and little access to information would think the machine had 64K of RAM.

As an illustration to those that don't seem to understand, there was that year when I was a kid when my friends and I decided BASIC on the C64 was not good enough, and we tried to figure out how to get started with assembler....

It started with us trying to poke the character codes for BASIC statements into memory, unaware that we were not only still writing BASIC but had "invented" a storage method substantially less efficient than what BASIC on the C64 already used (programs were stored tokenized). It took a long time before I managed to get hold of an assembler, and months more before I managed to get hold of a tutorial that got me anywhere.
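The tokenized storage mentioned here is why poking keyword characters into memory byte by byte was a step backwards: C64 BASIC collapsed each keyword to a single byte. A toy Python sketch of the idea (the token values for these particular keywords match the real C64 scheme, but the tokenizer itself is a heavy simplification that, e.g., drops inter-word spacing):

```python
# Single-byte tokens for a few keywords, per the C64 BASIC V2 table:
# FOR = 0x81, NEXT = 0x82, GOTO = 0x89, PRINT = 0x99.
TOKENS = {"FOR": 0x81, "NEXT": 0x82, "GOTO": 0x89, "PRINT": 0x99}

def tokenize(line):
    """Toy tokenizer: each known keyword becomes one byte; everything
    else is stored as its ASCII characters."""
    out = bytearray()
    for word in line.split(" "):
        if word in TOKENS:
            out.append(TOKENS[word])      # whole keyword -> one byte
        else:
            out.extend(word.encode("ascii"))
    return bytes(out)

plain = 'PRINT "HI"'
tok = tokenize(plain)
print(len(plain), len(tok))   # the tokenized form is shorter
```

Storing raw character codes for `PRINT` costs five bytes where the interpreter's own format needs one, which is exactly the inefficiency the group had unknowingly reinvented.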

This was with a group of 5-6 of us that all wanted to learn, but none of whom knew where to get the information, nor had parents that knew. My dad was programming, but only BASIC, and didn't know where to get what I wanted - few books were available, even fewer translated.

People really shouldn't underestimate how vital easy access to documentation is.

We're talking about a time when, for example, the precise cycle counts for raster interrupts on the C64 (understanding precise cycle counts in the presence of sprites etc. was vital for many of the more advanced demo effects) only reached me, years later, as a 3rd- or 4th-generation photocopy of a handwritten diagram that someone had partially learned from others and partially reverse-engineered through countless hours of adjusting interrupts a cycle at a time to study the results.

None of the books I'd managed to get hold of contained even a fraction of the amount of detail.

arebop, why aren't you looking for the worst in people like the rest of us?

Frankly I remember playing with qbasic/basic and being put off by how unwieldy it all was. Kudos to any kid that makes anything in basic.

I didn't become interested in programming until I managed to get a copy of Modula-2 off a BBS.

> Kudos to any kid that makes anything in basic.

Relevant: http://www.pcworld.com/article/2033318/black-annex-is-the-be...

Homepage of this (incomplete) game: http://www.blackannex.net/

The confusion likely stems from the various limits which existed when running qbasic in real mode on a 486.

Conventional memory was limited to 640 KB. Access to larger amounts of memory required either switching to protected mode or using XMS.

Additionally, there was the segmentation limit of 64k for pointers, which may have limited things in qbasic.
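For the curious, the arithmetic behind that limit: a real-mode x86 address is formed as segment × 16 + offset, and since the offset is a 16-bit value, any one segment (and hence a default QBasic data block) spans only 64 KiB. A small Python sketch:

```python
def linear_address(segment, offset):
    """Real-mode x86 addressing: physical = segment * 16 + offset.
    The offset is 16 bits, so one segment spans only 64 KiB -- the
    root of the 64K limits in QBasic, early Turbo Pascal, etc."""
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return (segment << 4) + offset

# Span reachable without changing the segment register:
span = linear_address(0x1234, 0xFFFF) - linear_address(0x1234, 0x0000) + 1
print(span)                                   # 65536 bytes = 64 KiB
print(hex(linear_address(0xB800, 0x0000)))    # 0xb8000: CGA text-mode video RAM
```

So no matter how many megabytes the machine physically had, a program confined to one segment saw a 64K world, which fits the OP's recollection.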

That seemed odd to me as well. On the other hand, he was programming in QBASIC and text mode (from what I saw in the screen shots). Nothing he did probably required a lot of memory, and since he never had access to new software, it was probably never an issue. So it's understandable. Unfortunately he wasn't left with a copy of Turbo C++ (or TASM) and a good programming book.

Also, Osborne went bankrupt in 1983, but the 486 didn't debut until 1989.

In Australia, there was a separate company called Osborne that indeed marketed the Osborne I -- but continued to live on and sell PC clones after the American Osborne went out of business.

It's a bit like Australian Woolworth's.

I'm guessing this was about Aussie Osborne: http://en.wikipedia.org/wiki/Osborne_%28computer_retailer%29

Yup, that sounds correct.

640 KiB!! It probably had 1 MiB of total RAM, with the BIOS showing 640 KiB of conventional memory.

He's probably misremembering the 66 MHz clock speed as 64K.

People who get big projects done don't really play lots of games. Carmack, for example, has said he doesn't actually play any.

It's strange how consistently true this has been. I wonder how many games Notch plays, especially during his prime when he was making Minecraft.

Check out Notch's Twitch broadcast history[1]. Since Mojang was sold he has been playing a variety of games, but scroll down a bit and it mostly turns into live programming and Team Fortress 2.

1. http://www.twitch.tv/notch/profile/past_broadcasts

The earliest broadcast is titled "Notch: Creator of Minecraft," from August 2011. Minecraft was released in 2009. So those videos aren't really evidence one way or the other in regards to what I'm saying: people who do big solo projects don't seem to play many games during the time they're hammering out features. And while Notch's LD sprints are impressive, he hasn't really shipped a big project since Minecraft.

Whoops, I just noticed I phrased my original comment like "I wonder how many games Notch plays" (present tense) which is exactly what you showed. I meant to phrase it in past tense. Sorry, that was my bad.

Notch had handed over the vast majority of Minecraft development to Jeb at that point, so it's not surprising that he was playing so much TF2. I doubt he was playing as much when he was the sole developer.

It sounds quite reasonable to me. I did morally similar things on an 8088 machine, 10 years previous. Eg, no ray-tracing, but I did an animated movie. No operating system, but I wrote my own BASIC in BASIC. Everything described is well within the skills of a hobbyist.

While I played a bunch of games on it, they weren't games to play to death. Once I finished the levels of Lode Runner, for example, it looped back to level 1 and I stopped playing. The game instead was to figure out neat things to do, then do them.

The comments on the reddit link are often uninformed, both the negative ones (few hobbyists of the era used source control) and the positive (the projects did not make real use of the hardware, and show no special 'knack' - I certainly am from that era but have no knack for the demoscene).

I'll remember visual images from tons of places, people, games, music, street names and books. But I'd not have a clue what they were called.

I'd know where my school was, I'd just 'go there', I didn't know the street name.

I'd turn on the TV, I didn't know what brand it was.

I'd just turn on the computer and play Escape Velocity, didn't know it was called that.

There are lots of exceptions, like I wouldn't miss the name 'teenage mutant ninja turtles' or ever forget it. But especially for things like computer games when I was 8, I'd just 'doubleclick the game icon with the picture of a car' not having a clue what that racing game was called.

Thank god it's not just me! I feel I went about my days the same way you did. I didn't "Get on the 486DX2", I got on "the computer".

I've had so many questions about my story with people saying "I call bullshit because he wasn't able to remember (minor detail X)" etc. It's only through absolutely racking my brains to bits and insane amounts of googling, facebook messages to old friends and such that I've managed to answer a few of these. It's good to know that my memory is not just deteriorating rapidly as I'd initially thought.

Some things stand out more than others. I was suspected of having ADD for a while, then the diagnosis was shifted to Aspergers Syndrome (something they apparently didn't know much about back then, or was not as commonly diagnosed/referred to as it is nowadays), but a potential giveaway for this was that I had an eidetic memory for some things, and a lot of other stuff that should've been important but I didn't care about was instantly discarded. I can remember down to a millimetre what few toys and possessions I owned, and what they looked like, yet I can't remember what my own BEDROOM looked like because it wasn't the bedroom itself, it was what interested me within it. The same went for most of my house, I barely remember what it looks like or what the furniture placement was. I put this down to perhaps some kind of mental block for traumatic memories. The old house appears in my dreams as a very very very large building and often in pitch black darkness so that I cannot really see what it looks like save for an insanely dim and flickering flashlight. I guess the mind has a funny way of dealing with inabilities to recall events.

Yet I can recreate in my head in incredible detail the structure of the school I attended, as I spent quite a considerable amount of my time every day there, and I found the absolutely enormous building to be very interesting and different compared to the bland 1970s brick and tan buildings of my primary school years (I actually went on to recreate it in 3D from memory recently and was able to source some photos of it from the school itself).

http://i.imgur.com/eXb430f.jpg http://i.imgur.com/BDpt6wt.jpg

And yet, I can't remember how much RAM my system had, because it wasn't of any interest to me. It was just "the computer" to me at the time, not "the 486 DX2". I don't recall if I even used the MEM command (assuming it existed in that version of DOS), but if I had, it would've been "Oh, I have X amount, okay" and then I probably never checked it again.

I also don't remember exactly which games I had installed on the system, as I barely ever played most of them. I do remember I definitely had the Rise of the Triad shareware on there, as it served as a key inspiration for writing a "3d" raycaster/FPS-style engine. Interesting how selective memory can be, and how it will remember utterly useless details and points of information (or worse, remind you of the moments you did something embarrassing or stupid), yet when it comes to crucial or vital events, it can draw a blank on something so striking to witness that you would think the image would be burned into your head for an eternity.


> old Apple Powermacs from the 1980s (the tiny ones)

Except that Power Macs were available from 1994 until 2006.

Obviously he didn't (and doesn't) know the name of the product. That makes it even more believable, not less.

I agree.

The only reason I knew anything about the hardware of the classic Macs at the time was that I was an Amiga user, and Macs were occasionally dragged in on the periphery of the war of the 68k based machines that was primarily an Amiga vs Atari thing.

Beyond knowing it too had a 68k, that it had a laughably small monochrome screen, and that it cost a ridiculous amount, that was it.

That was despite hanging out at the local computer store every few days, and reading every computer magazine I could get my hands on (I can thank Commodore Computing International for a lot of the earliest English I learned).

Most PC users of the era certainly had no reason to know much more about Apple than the name Mac, and know it was some other computer.

>Most PC users of the era certainly had no reason to know much more about Apple than the name Mac, and know it was some other computer.

Yep, especially in Australia...

I, and I assume anyone else that has tried, get much more enjoyment out of writing a game than playing games. And writing an OS is probably even more fun than writing a game.

Edit: Also just wanted to point out that writing a few games has in a way ruined playing games for me. I still load up a new or classic game every now and then but it's more to admire what the developers have done and not so much as to play the game. I don't really 'play' games any more.

That's interesting. After getting into game modding, I lost most interest in single-player games other than modding them/playing with mods. Multiplayer FPS was still fun, but playing Crysis after tweaking some Lua files so you were invincible and could punch trucks into the air was way more fun to me.

Two other things ring similar to me. I was big into haunted houses when I was younger, volunteering at them and hanging out around the online "haunting" community every season, and to this day visiting haunted houses is more about admiring what is being done than being scared. I recognize the animatronics from the catalog, know to expect that panel to drop, and know how that illusion is done, but I still enjoy visiting a good haunted house.

The other really sad thing is that, living in downtown Atlanta where a movie/TV show is being filmed within a block of my place every other week it seems, I've seen the sausage being made. Knowing how long it takes to set up and shoot a scene that may only be 30 seconds in the movie has "ruined" watching TV shows and movies, as I can't help but think about the filming process. I was blissfully unaware how much work went into filming a simple shot of two people walking down a city street.

Doesn't it make you appreciate movies in another way though?

Knowing how it's done and how difficult it is, don't you start looking at things such as lighting, cutting techniques, etc. in a more informed manner?

It's not usually the case that the same movies (or music, or games) that are popular with professionals in the industry are popular with regular folks, for this very reason.

Yep. I think that applies to most things in life. It's like a separate category of emotion opens up of appreciation instead of things being pure enjoyment/entertainment after you know the work that goes into something.

I still play Doom at least weekly, and by play I mean really play. And it inspires me to work on my own stuff.

A handful of maps to go and I'll have completed Plutonia on UV. Next, I'm probably playing Back to Saturn X ep1. It's amazing there's so much new, inspiring stuff for this old game.

I'd love to go back to those days... 486 with DOS and QBasic, maybe a Borland C compiler if you're lucky enough to have a local computer store that sold copies.

A friend and I tried to create our own version of Windows 3.1 in QBasic when we were in middle school. I wish we had saved our progress. I don't think it was as far along as I remember. I know it read INI files, and we could drag windows around. Not sure how much else.

OP got to learn how to program the fun way, like a lot of us did in the 90's. Grinding away at crazy side projects on old DOS systems with lots of limitations. That opportunity just isn't available to kids anymore. High powered computers are everywhere and accessible, no reason to make an old PC do new tricks. But I guess they have the web to do crazy things with, so it's not so bad.

> I took another shot at coding this year to see if I could build anything interesting in C++, but the difference between something like QBASIC and a full blown C++ IDE just ended up baffling me more than anything else, and I struggled for a bit.

That's a really damning statement about our modern software ecosystem. You see what this guy was able to put together as a kid, and it's impressive. Sure, it was close to 20 years out of date, but it was good-looking and (appears) usable.

Why isn't it as easy for him, with the knowledge he's accumulated, to build software with modern languages, modern tools and and modern libraries as it was for him to build software with BASIC?

I would definitely put that down to experience, C++ is a much different beast to QBasic. Qbasic being geared towards beginners, C++ being geared towards professionals.

I mean, it's the difference between a toolbox from 70 years ago (software lives in dog years) and a toolbox today: the tools you'll find are vastly different. And even though you can build a house with the tools from 70 years ago, that doesn't necessarily mean you could travel forward in time and build a house with today's tools, because you'd have zero experience with them.

The truth. Take web development: if you look objectively at the state of HTML5 and the patchy feature support across browsers (which stick around), yes, we've established many good specs and communities, but for the end user, compared with Flash back in the day, nothing is different. At least Flash development was easy.

goes back to building JS and CSS files with his frontend build system, and testing in IE8

I think that's a great question and I could list some contributing factors but I think a thorough answer could upset a lot of people. I lost 100+ karma just for some criticism of Apple ;-) Yeah, let's critique all the modern languages, tools and libraries--that will go over real well here.

It was incredibly frustrating for me as a young child. During the DOS days you had a fairly complete, if somewhat primitive, development environment with complete documentation available. I could easily do silly things like compose "melodies" or draw arbitrary shapes on the screen. In retrospect, almost any programming language other than QBasic would have been more educational, if it had similarly good documentation and integrated support for running programs. But it was still better than Windows, where there really wasn't anything comparable that I knew of, and Visual Studio (which I got as a Christmas present when I was 10 years old :)) seemed rather incomprehensible in comparison.

Who says it isn't? Maybe C++ just wasn't a good choice for someone coming from BASIC.

I don't think C++ is anywhere near the state of the art in terms of usability. I wonder what he would do with something like Python + PyGame.

I'm hearing people recommend I try Python quite a bit. I may very well give it a try. I'm not sure why other than a school friend recommending it, but whenever I think "programming", I always think of C files and CPP files, so my first option was to try the C programming language and then C++.

I heard that the Battlefield 2 engine used Python (or perhaps I'm mistaking it for another engine), but it seemed pretty cool knowing of something that was finally written in something else.

Even if that would offer some usability improvements, we've still lost something in how quickly you can draw a circle and play a sound. Don't forget your header files; can you use hardware acceleration?

I evangelize a lot for Processing [1] but it is the best experience I've had for getting to draw a circle as quickly as possible. It's a limited subset of Java included in a very basic IDE, that was developed at the MIT Media Lab. Its original purpose was bringing computation and graphics to artists and designers.

  void setup() {
    ellipse(10, 10, 20, 20);
  }

[1] processing.org

I guess it was for two reasons. Looong post ahead but it describes in a fair amount of detail why I eventually was turned off of C programming, and it was not necessarily the language itself.

1) I opted to try to self-teach using tutorials that were highly inconsistent with one another's practices, instead of learning at an accredited college or doing some type of course. It's also possible the tutorials I used were not that brilliant. I did find some fantastic ones later on, around the time I stopped (as I started working full time).

2) I believe my folly also lies in jumping the gun, a lot. With BASIC I was able to just skip around in the commands list and experiment with each command in an afternoon, figure out how it worked, and see if it was what I needed. If so, I had learnt what I needed; if not, I had learnt something for another day and another piece of code. It mightn't be the most optimal way to learn, but for me it allowed me to create exactly what I had intended; even if the mechanics inside were smoking and banging around, they still did what they needed to.

C was just not quite the same, and I wouldn't expect it to be. It's a much more powerful language from what I've observed.

One of the issues I frequently stumbled into would be a tutorial telling you to use an external library/header that was never explained nor included on the tutorial page. One might state that I needed to include SDL.H, but at no point did the tutorial describe its contents, its purpose, or where to get it. Now sure enough, you can google it and find that it refers to the Simple DirectMedia Layer, but am I wrong to think it's unprofessional not to write references and sources into a tutorial intended for beginners? "Figure it out / google it" doesn't always strike me as the best way to teach someone, versus a clear explanation of what is required and where it can be obtained.

I also had stumbled upon tutorials where the author had written their own custom headers for use with the code, but the headers themselves were not included for download or reference in that tutorial, nor the contents of them explained, so I'd often have no idea of what to substitute in place of that header or how to recreate that header accordingly.

I also became frequently discouraged by the responses I got when asking for help. It seemed that if I (as well as a LOT of other people; I saw this happen to a tonne of others) was stuck or unsure and made threads anywhere asking for a bit of advice or critique as to where I'd gone wrong, I was constantly met with replies of either "If you don't even know the answer then you shouldn't be programming" or "The answer is obvious, programming is not for you", etc. I made a post at one point asking for a bit of advice on building a simple calculator, one of my first few applications, and was met with responses of "This is a HUGE waste of time, there are already calculator sources out there available for download", which sort of missed the point; I was trying to make one, rather than plagiarize someone else's. Getting questioned as to why you're learning to code when you could just steal someone else's seems the wrong way to go.

Perhaps I was just in the wrong communities, although interestingly enough there was a tech journal article I read not too long ago where a senior programmer slammed these sadly all-too-frequent types of responses, because they discourage a lot of potentially good novice programmers, with people arguing for and against negative reinforcement. Personally, being told "your code is shit, you're definitely not made to be a coder" isn't really inspiration to me. Being told "You've made an error here, this is what's wrong, this is why it's not working, and this is how you fix it. Have another go at it." seems more appropriate. Sure, a few people react to being told they're not good enough by thinking "Well fine, I'm going to beat that statement and write the BEST DAMN APPLICATION EVER", but I don't like to think of programming as a boot camp. It's learning, just as it would be if it were Photoshop or 3ds Max, and I sure as hell didn't hear people in any 3ds Max class I've attended saying "You're really bad at modeling, you should quit before you waste too much time on this."

i) In 1990 I was using a VAX computer in high school, shared by a whole class, and programming in Pascal. The compilation process (executed in a compilation queue) could take from a few minutes to half an hour, depending on how many people and processes were running that day.

I wanted to imitate this on my Amiga 500, so I joined up with a friend who knew electronics; he took a Commodore 64 keyboard and built an interface to connect it to the Amiga as a second keyboard. I wrote the software in assembler to let two people use the same computer on a single monitor, in two windows, with two keyboards. A bizarre but fun project.

ii) In 1992 I wanted to make my own graphical operating system. Not in the real sense, but in the Windows 3.0 sense. I started doing this in Turbo Pascal, and at some point I stopped because my multitasking was only cooperative, realizing a few years later that Windows (and Apple!) used this basic strategy too. I felt ashamed for them.

you were spoilt by the Amiga's pre-emptive multitasking. It's funny I recall "pre-emptive multitasking" being a big selling point of the Amiga's OS, but I never understood it at the time, I just knew it was better.

In which project (i) or (ii) ?

On (i) I took advantage of the preemptive multitasking, which let two people use the computer on the same monitor.

On (ii): I knew that the Amiga was superior to the PC, but I had access to more documentation and better programming languages (Turbo Pascal, Turbo C++) on the PC. On the Amiga you ended up doing multimedia stuff that required a lot of hardware knowledge about the coprocessors and DMA, while on the PC it was more straightforward.

Astonishing story. I'd like to meet this guy and have a coffee with him.

I have a similar story, but not as harsh as his. I started programming BASIC (not sure which variant) on a small handheld system, the Casio PB-700. On this system I created a Paranoid clone and a 4-frame ASCII animation. When my father saw my aptitude for programming, he introduced me to his more advanced systems. He was a computer engineer and had a habit of buying new systems in the 1980s. On the newer systems (I don't recall the name) I learned Turbo Pascal. Back then those systems didn't have any storage: MS-DOS had to be booted from a 5.25-inch floppy drive, and a new disk had to be inserted to run different programs.

Every time you compiled and ran a program, it would take a couple of minutes to read the compiler from the 5.25-inch floppy disk. Those floppy disks were fragile and had a tendency to develop bad sectors after frequent read-write sessions. To circumvent this, my father deployed a RAM drive on the system. A drive would be created inside RAM every time the system booted, and it would copy the Turbo Pascal compiler and all relevant libraries to that RAM drive. So the next time you hit "run", it would take seconds instead of minutes. This solution still blows my mind.
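The same trick still works today. Here's a minimal sketch in Python of staging a toolchain into a RAM-backed directory; the paths and file names are hypothetical, and /dev/shm is assumed to be a tmpfs (RAM-backed) filesystem, as it is on most Linux systems, playing the role of the DOS RAM drive:

```python
import pathlib
import shutil
import tempfile

# Stand-in for the on-disk compiler directory (hypothetical contents).
disk_dir = pathlib.Path(tempfile.mkdtemp()) / "turbo"
disk_dir.mkdir()
(disk_dir / "tpc.exe").write_bytes(b"\x00")  # pretend compiler binary

# Pick a RAM-backed location: /dev/shm is tmpfs on most Linux systems;
# fall back to the ordinary temp dir elsewhere.
shm = pathlib.Path("/dev/shm")
ram_root = shm if shm.is_dir() else pathlib.Path(tempfile.gettempdir())

# Stage the toolchain into RAM once at "boot"; afterwards every compile
# reads the compiler from memory instead of the slow medium.
staged = pathlib.Path(tempfile.mkdtemp(dir=ram_root)) / "turbo"
shutil.copytree(disk_dir, staged)

print(staged / "tpc.exe")  # invoke this copy instead of the on-disk one
```

The one-time copy at startup trades a slow boot for fast repeated access, exactly the bargain the original RAM drive made.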

Another thing about those old machines: they all came with monochrome monitors. Everything was green, and the phosphor faded slowly. Every time there was a large change on the display, you could still see the previous characters fading away. Like the Matrix.

Later I had access to better machines (a Pentium at 133 MHz), but I haven't forgotten those days. I wish I had more information about those systems to post here, but I'm a long way from home. Back home those systems are still all packed up and stacked on top of one another. The funny thing: all of them work as if they were "brand new".

I've finally stumbled upon your post, after reading through all the comments this has gotten.

They definitely don't build 'em like they used to! The Pentium 133 MHz system I used for a while after leaving is still in top-notch shape today. I opened it up last year and there's no dust, rust or corrosion at all (and this is after it sat in a heavily tropical atmosphere for a few years recently). The hard disk is in perfect shape too; the whole thing runs like new.

I wonder how well the Osborne 486 would be running if I still had it, or if someone ever salvaged it out there and is running it today. Kind of a crazy thought but it's a shame to think that it's very likely become part of a landfill.

Absolutely. I still have my old machines. Some of them may be dusty but all of them are as clean as new.

I'm happy that you're taking the time to reply to this thread. How are you doing these days? What are you working on?

My first programming platform was BASIC on the Casio FX-702P. I remember reading the manual and realising what programming was all about... oh, the sense of wonder and possibility in a 9-year-old. The peak of my achievement was a version of E.T. for it (my cousin had an Atari 2600, and to my little brain that was a pretty cool game). I saved my programs by writing them down in a notebook, as my dad never got any peripherals for it...

Going through the wiki page of Casio FX-702P reminds me how far we've come. Fascinating.

Nitpick: It's essentially impossible that a 486 would come with 64KB of RAM. My 8088 XT clone in '88 came with 640KB. 4-16MB would be more common for the time.

from another comment...

"The confusion likely stems from the various limits that existed when running QBasic in real mode on a 486. Conventional memory was limited to 640 KB. Accessing more memory required either switching to protected mode or using XMS. Additionally, there was the 64 KB segment limit for pointers, which may have constrained things in QBasic."

Was it a PC-compatible? If so he might have had more memory but been unable to address it because he was running in Real Mode.

I'm not aware of a non-PC-compatible that had an 80486. In any event, real mode can address up to 1 MB of memory, with 640 KB available to programs. But I do think you're right: if all he had was DOS and QBASIC, he had no way to go into protected mode. He probably just dropped a zero by mistake.

No, he means 64 KB segments, which is what he had to deal with in the menu-based file manager he wrote in QBASIC. That much should be clear if you read between the lines (the GOTO-based text adventures etc.).

Fascinating story. If it's any consolation to the author, some of us still use Vi/Emacs, do batch processing in shell scripts, run complicated commands from the command prompt. And when it's time to take a break for some gaming, launch NetHack or some other roguelike -- in a terminal.

To hazard a guess, the biggest "wow" programming languages for the author now would be Python and JavaScript.

I'm saddened that most of the comments here are of the "me too, time for my story" variety. The amount of self references is disheartening.

I'm saddened that you're saddened by nerds relating to each other, talking about things they actually liked and feeling nostalgia.

Everyone, let's be sure we stick to the real business of HN: gossip about companies which were started for no reason other than to get sold, fighting over the brands of fetishized personal electronics, hagiographies of people like Steve Jobs and Edward Snowden, flamewars about women in tech, and preening about how we personally are the top 1% of software engineers and only hire people like ourselves.

Otherwise, someone might feel saddened.

My point is that they are not relating to each other.

Modern conversation seems to be morphing into the art of using others' comments as triggers for one's own monologue. Instead of having the discipline of quieting ourselves in order to comprehend what others are saying, we isolate ourselves into narcissism by placing mirrors onto their heads.

An awareness of this trend may surprise you about how little we actually relate to each other. Rather, we use others to relate to ourselves.

"People listened instead of just waiting for their turn to speak" [0] - Chuck Palahniuk

I have tried, probably too hard, not to be this person. As a result, I've had literally hundreds of people tell me I'm a great listener. I don't order food at my usual haunts; they just make it. Letting people talk, and actually hearing them and giving one flying fuck, is more impactful than I ever imagined.

[0] http://www.quotationspage.com/quote/37182.html

You sure it isn't just the limitations of the medium? Threaded discussion lends itself to longer messages. And since the author of the post isn't guaranteed to be here, we can't ask questions and expect an answer. If we were on IRC, I'd expect more of a two-way conversation. Instead we have people sharing their comments (Hacker News often reminds me of blog comments.)

> Modern conversation seems to be morphing into the art of using others' comments as triggers for one's own monologue

I'm pretty sure it happens in real life too: at a gathering, somebody has a spark and tells a story for five minutes. This is in no way different.

By contrast, I'm delighted every time I come across another "me too, time for my story" comment in this thread, because those are the interesting ones.

Ditto, I've noticed this as an extremely common style of communication on this site, and I often have difficulty reading through the comment threads for this exact reason.

All of those "me too" stories are really about saying one thing: You Are Not Alone.

As the original author of the Imgur post, I agree with this.

Reading about people who wrote PHP code on entry-level mobiles in 2005, or other Australians who tapped away on 15-year-old systems at their school, has definitely put me a lot more at ease. It has made me realize that my situation was not as bad as I'd initially felt; rather, many other people had the same deal growing up.

Contrary to the shock and rejection of belief expressed by a lot of people who have obviously grown up in areas with well funded schools and financially secure parents, this sort of thing happened to quite a lot of people, and it is STILL happening today.

I'd say there is still some kid out in some house where I used to live, sitting in his room, hopefully not under the same oppressive circumstances as me, working with his dad's "old Windows 98 system" etc. because it's all he has. There's no "right" in Australia to be up to date with the latest and greatest. No kid here has a guaranteed right to a mobile phone and a high-powered computer in his bedroom; he's very fortunate to have them.

Thankfully, technology is much more affordable nowadays. In 2004, laptops were multi-thousand dollar items, but you can get a used system with Windows XP (or even 7) for less than a few hundred if you shop around now. I find that incredible.

I'm amused that a comment stating this point would (seemingly unironically) begin with the word "I'm".

For what it's worth I don't disagree that there is some cosmic absurdity and truth in what you are pointing out, but it's just such an innate component of human communication (in all of its forms, since long before there was an Internet and long before this post existed) that complaining about it is akin to saying "I'm saddened by the way these people are always breathing".

...and yes, I began this post with my own "I'm" to show I'm a proper software coder who appreciates the humor in meta-level recursion and thus I'm ever so clever!

Very happy to have found your comment. Some of them even start with "While not as interesting as OP's, [here's my boring suburban story]"

Guess this is Generation Me.

"The opposite of talking isn't listening. The opposite of talking is waiting." - Fran Lebowitz

I have a somewhat similar, though not as sad, story.

I was ~12 years old (around 2004) and really wanted to learn programming, but all I had was a cell phone (one of the first budget-class phones with GPRS, a Nokia 3510i). I had very little access to the internet or a PC (about 30 min/week) and couldn't speak English, Russian, or any other major language.

So I tried to get my hands on any PHP code examples, type them in using my phone, and upload them to FTP using some online forms.

Eh, funny times. I wrote a bit more about it http://blog.gedrap.me/blog/2013/07/30/writing-code-with-a-ce...

This quote hurt my head: "Windows 2000..., which was basically like 98, but crappier." But it provided context for the entire thing.

Agreed, basically the most incorrect thing I've ever read on the Internet. Windows 2000 was amazing.

Windows 2000 is one of my favorite OSes. I barely used XP; I liked 2000 so much I almost went straight from it to Windows 7.

Mine too - All the stability of Windows NT 4, but with the addition of USB so I could use "modern stuff".

Of all the versions of Windows I've ever used 2000 was my favourite.

It all depends on which hardware it ran on.

Suggesting Windows 2000 gave him "no idea what modern operating systems were like" was pretty jarring too. Far from being "obsolete", it was only four years old at the time, and from the perspective of the average user not stuck on a 486 it lacked only a bit of UI gloss compared with the state-of-the-art XP. If nothing else, you'd expect he might be impressed at the progress the internet had made since 1999.

That quote definitely doesn't make sense. Maybe he's thinking of Windows ME, widely regarded as being worse than 98? It's an odd thing to mix up for anyone familiar with the two.

It's quite possible he was not familiar with 2000 at the time (consumer vs enterprise), and conflated the two afterwards, because no one really talks about ME anymore.

Maybe it was because it was easier for the college to lock down the Windows 2000 workstations as opposed to the Windows 98 computers with Novell NetWare. I don't know, though, and I can't speak for the author. The story has more holes than Swiss cheese. I honestly don't believe any of the account, but it is useful for bringing up nostalgia.

Indeed, after hearing about Windows 2000, I'm now of the belief that it wasn't the OS at fault, but rather the horridly outdated and poorly maintained systems that the school was running it on. They never saw a defrag in their life, and took at least 10-15 minutes to boot up, and even then, doing anything was painfully slow. Should you accidentally hit the start button or run something, you were in for another painfully long wait.

I apologise to the Win 2000 fans. I've never actually tried it on a system in much greater health, but from what I've heard in the last few days, I'm sure it's not that bad, and I guess I was just a bit misled by the ridiculous systems they ran it on.

I'm actually very willing to bet the school is still running those systems today.

It shouldn't hurt your head. Windows 2000 marked the switch from the 95/98 line to the NT line, and that created some compatibility problems in the early days. The rest of this is subjective.

I relate somewhat to poster's situation, as one of my half-brothers had a somewhat related experience in rural NT, sans the happy ending. If the poster is reading HN, I'd be happy to catch you up on the state of Australian tech. My e-mail is in my profile.

I'm certainly reading! And with great interest at that, it's very interesting hearing of the other Australians who were in similar situations. I find it interesting too how a lot of those from America and other countries are quick to criticize the story and are shocked at how "you didn't have a mobile phone at age 13!?" or "I don't believe for a second that he didn't have an ADSL connection in rural Australia in 2002", or even the assumption that all our public schools in the outback of Australia were equipped with the latest in Pentium technology when in fact, some were as far back as 15 years out of date.

Thanks for the invite too. I don't know where I'd start with modern tech. I've been building up more knowledge over the years and started by purchasing each individual part to build my own system in 2009. I'm no whiz but I'd say I have a reasonable degree of literacy nowadays. The one thing I could use help with is getting started in modern programming. My head appears to still be stuck in the idea of using a simple text-editor-style compiler with no additional libraries, headers, (linkers?) etc required. I had a bit of trouble finding good tutorials that actually supplied the relevant libraries, headers etc for the code they were explaining, so it became difficult when they would expect that every new programmer knows straight away what the SDL header is for example, and where it can be acquired from.

Outback public schools are lucky if they're equipped with more than one toilet.

There are many good tutorials on the Internet for a lot of things, but many make assumptions for things that aren't specifically code-related (libraries, linking, etc) that can be frustrating when you're starting. Consider it a rite of passage.

If you have any questions on anything (there's no such thing as a dumb question), again feel free to e-mail.

This isn't an OS - it's closer to a DOS shell, like Central Point's PC Tools.

Yep, it's more or less a shell. I probably should've clarified it a bit better in the imgur description. It's basically a DOS shell representative of my rendition of a "late 1980s OS".

Would just like to say thanks for your really fascinating story :) This is the intellectual highlight of my day. Don't let the naysayers get to you. Not sure what you're doing at the moment, but I'm sure you've directed your technical skills towards something that pays well and acquired new ones in the meantime :)

The projects you describe here are obviously works of love that show great technical talent. Too bad you didn't have access to better learning resources at an earlier point in time, but learning is thankfully a multi-decade project. (I also hope your P.O.S. stepfather got what was coming to him in the end).

Thanks for your comments.

Technically, the naysayers haven't bothered me too much, but a lot of people seem irked (and rightfully so) that it was initially mistermed an Operating System when in fact it's a simple shell of sorts. I have since amended the imgur description, however.

As for what I'm doing at the moment, I'm a Lead Quality Assurance Analyst for a video games company. I love what I do, but as for pay, it's less than half the minimum wage. Goes to show that I REALLY love what I do I suppose, but I would really like to do something more technical someday. Perhaps special effects or something of the like.

I'd say the works were definitely a very important part of my life, although they were not necessarily a technical triumph by any means. I still claim that they're nothing more than a series of cheap hackery and tricks, but hey, even a magician's work will entertain and amaze, even if it's not "really" happening, you know?

My stepfather is still out there, but both time and emphysema will catch him in the end, so I try not to concern myself too much with it. Nature and tobacco will do its work.

> Don't let the naysayers get to you.

Clarification of facts is not naysaying.

I think what they did was impressive and interesting.

I wasn't referring to you; I just found a plausible place to reply to OP. There were plenty of real naysayers in the thread, though. And even more that were just very insensitive. (Again, I'm not referring to you).

While I don't have a story quite like OP's, I wanted to share my experiences with old-school programming in more recent history. In middle and high school (2008-2014), I had access to modern hardware, the internet and all those lovely things at home. At school, though, I didn't have access to a computer most of the time, and yet I wanted to program. What I did have though, was a TI-84+SE graphing calculator, with a z80 cpu clocked at 15MHz and 24/128K accessible/total RAM, and 1.5/2.0MB of ROM. Most teachers didn't really care if I just stuck my nose into that thing for the entire class, so often I did just that.

For those not familiar with the calculator, it ships with an OS that provides a shell where the user can input mathematical expressions, as well as a graphing interface. It also provides a program editor where the user can write what are essentially shell scripts for the main interface in TI-BASIC, or run machine code written in hex if the program starts with the AsmPrgm token. It also lets you transfer files from a PC using a USB port on the calculator. You could download and install "Apps" for the calculator. When I was 12-14 I mostly wrote small programs, like Pong and a quadratic factorization utility, in TI-BASIC, and played games that I downloaded in class.

After a while I tired of TI-BASIC because it was slow, and that limited the possibilities. I downloaded Axe, an app that repurposes the built-in TI-BASIC program editor to allow for programs written in a different language, which was basically assembly plus a lot of macros and a standard library useful for writing games, or like C without a type system. This code was compiled to machine code on the calculator, and therefore ran at a reasonable speed (keep in mind it's still just a 15MHz cpu from the 70s).

With Axe I made some more interesting programs. Unlike TI-BASIC, where learning the language was easy since the calculator's main interface was basically a REPL for the programming language, Axe was hard to learn. I needed to look at external documentation to know what I was doing in Axe, but even though I had a smartphone at this point, I couldn't use it in class, so I printed out the manual to reference it in class. I wrote a bunch of programs in Axe, like a cellular automata simulator, a game of life program, and a 4-level grayscale sprite editor (which worked by flickering the pixels of the 1-bit display, which had a slow enough response time to create a static gray color if alternated quickly enough). I also learned z80 assembly and used an on-calc assembler and editor called Mimas, but editing assembly on-calc was too bothersome with a 64x96 display, so I mostly stuck with Axe.
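The flicker trick can be sketched like this; the 3-frame duty-cycle scheme below is my assumption about one common way to do it, not necessarily how my editor actually worked:

```python
# Sketch: 4 apparent gray levels on a 1-bit display via temporal
# dithering. A pixel at level k (0-3) is lit on k of every 3 frames,
# so the slow LCD response averages it to roughly k/3 brightness.

CYCLE = 3  # frames per dither cycle; duty cycles 0/3, 1/3, 2/3, 3/3

def frame_bit(level, frame):
    """1 if a pixel at this gray level (0-3) is lit on this frame."""
    return 1 if (frame % CYCLE) < level else 0

def perceived(level, frames=300):
    """Time-averaged brightness the slow LCD response approximates."""
    return sum(frame_bit(level, f) for f in range(frames)) / frames

for level in range(4):
    print(level, round(perceived(level), 3))
```

The key constraint is that the two intermediate levels flicker at the frame rate, which is why the display's slow response time mattered: a faster panel would show visible blinking instead of gray.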

Eventually I stopped being the nerdy kid who played with his calculator all day, but I had fallen in love with the platform, so even though I wasn't writing code in class, I wrote code for the calculator at home, sometimes on my PC and sometimes on the calculator just because I liked that experience more.

I'll be sad to see these ancient calculators finally phased out of the math curriculum in the US, so that kids no longer get access to that last stronghold of 80s computing. Raspberry Pi is cool, but it's still so much more modern than the calculators, which hurts its coolness factor for me, but more importantly, kids won't have access to it unless their parents want them to get into computing. Soon I see kids only having hardware that seems impenetrably locked down, which is a real shame.

> I realized that my work was technically obsolete as hell, and not really of any value.

For all of the benefits of the internet, I see this as a tragic side effect. Rather than just trying something and seeing if it works, the first instinct is to Google it to see if anyone else has done something similar. Even if you manage to start on something, eventually you'll be exposed to someone who's done it so much better and more thoroughly than you and get discouraged and give up.

Of course, plenty of innovation still happens and people can find inspiration in superior implementations, but the loss feels similar to the loss of biodiversity, or language diversity resulting from globalization -- that is, isolated populations create vastly different solutions to the same set of problems in ways that aren't possible without those barriers.

I actually see that as an advantage: instead of reinventing the wheel, you're using others' work as the basis for something that probably has more functionality and is more useful.

I do agree that this does mess with tinkering with the fundamentals and stuff. But here's my thesis: when you work with a technology for a long time, after a while you will naturally be curious about how the lower-level stuff works. I think this is why a lot of application developers still find systems programming exciting. And also why the Hacker News audience is keenly interested in stuff beyond the very narrow focus of CS/EE.

> I actually see that as an advantage: instead of reinventing the wheel, you're using others' work as the basis for something that probably has more functionality and is more useful.

It's really both. The trouble is you want e.g. a protocol to secure your communications, and designing one yourself while making it secure is a huge ordeal, whereas you could just use TLS for free even though it's full of warts. So everybody uses TLS. Which then snowballs into everyone using OpenSSL for TLS even though it's ancient and full of cruft.

The problem is you do want everyone using the well-reviewed long-tested thing, but then that thing becomes a decaying designed-by-committee monstrosity which nobody can fix because it would break compatibility. See also HTML, HTTP, DNSSEC, javascript, etc.

Experience is worth something, even if it comes from toy projects.

The downside of "lazy learning" is you can't know what tools you'll need in the future. I was able to win at one of my favorite work projects because I'd played with signal processing and machine learning for fun.

Had I not, I would've concluded I couldn't beat the terrible solution we had—I wouldn't have had the tools to know better.

And of course, your reinvented wheel might turn out to be an actual improvement!

I would say that lazy learning is a pretty big boon overall for both you as a person and society at large. On one hand, you have niche skills that can be applied professionally, and on the other, you can find people with knowledge in some niche subject that they can develop on the job pretty quickly. You will never not find an application for some bit of knowledge.

Apologies for the late reply, but that's precisely my point: if I only did lazy learning, I would've missed awesome opportunities.

Of course, I can never know everything. I do both: I learn things as needed to get something done, and I learn things just because they fascinate me and might prove useful later. That latter part—gaining knowledge for the sake of knowledge—has been an amazing benefit, specifically by giving me a better map of what's possible, and how the seemingly impossible might be done.

IOW, I consider lazy learning necessary, because no one can know everything, but not something I'd encourage for its own sake, because sometimes you have to know something before you can know when it'll be useful. I absolutely encourage playing around just to learn more about your craft—you'll learn things that will prove useful later.

I also first learned to program in TI-BASIC, on a TI-83+ in 6th grade, and I did it prolifically for the next four years. Learning programming at that age was a huge advantage when I "officially" learned later: even though TI-BASIC only has 26 variables and no subroutines, flow control is basically the same, and it gave me an intuitive understanding to how programming languages work.

Also, TI-BASIC made it so easy to display graphics, just by using the 'output' function or plotting to the graph. I think a huge turn off to beginner programmers is the amount of material you have to learn before you can do simple stuff like that (I remember looking for a "draw pixel" function in Java, only to learn to my dismay that I had to choose a GUI toolkit and create a window first). The ease of using the TI API made it addictive, and it lured me into programming for hours at a time in school.

I was too young to remember much TI-BASIC but as for "draw pixel" I really enjoyed VGA's "Mode 13" which was very fast, great for making games, even without tricks like page flipping.

You can get infinite variables by storing things in the "list" variables. This was the breakthrough that allowed me to make various Snake-like games in 8th grade :)

Haha yeah I forgot about that. Also, if you write the programs on your computer, you can use lowercase a-z and get 26 more. And there's theta and a couple other weird ones too.

Aw, when I was an exchange student in Japan, all I had was my Casio graphing calculator (ironically bought in Norway, as it was one of the few required/approved for use on exams). The limited programming language was rather awful, but I still managed to implement plotting a Sierpinski gasket, and a Mandelbrot set (in black and white). The Mandelbrot took roughly three hours to render, as I recall. I'd just start it, leave it in my bag, and check it after a few periods... it was a little bit like self-mutilation -- but also a great feeling when you manage to make something like you intended, despite the limitations of the system... I had bought a copy of "The Mathematical Tourist: Snapshots of Modern Mathematics" by Peterson. A great and fun update on math, with pretty pictures. And enough detail that I could implement e.g. the Mandelbrot generator :-)

I'm not too sad to see old calculators go -- I'm just hoping we'll get something like Python notebooks on smartphones used in the classroom. Might have some problems with the networking... maybe there's a market for an Android pad that doesn't have any wireless hardware?

In Norway those Casio calculators were also approved for use at the college level -- but they were stupidly banned, because you could flash the rom, and store all sorts of useful things on them. Really stupid IMNHO, because if you can solve the (college level/sen. high school level) exam with the help of some utility or other -- or just a large book -- the exam is probably structured very poorly.

There was an initiative to move to "open book exams" -- where you were allowed to basically bring any books/notes you wanted. I think it was great -- the questions were a little harder -- and you still had to know the subject matter or you'd run out of time before completing -- but I think it encouraged better tests than when you could essentially administer a glorified spelling test for any subject.

I still keep the Casio fx-7700GE on my desk and I still use it for quick calculations. Hasn't failed me since ~1990. Faster than using my tablet. The programming language was useful for some tests that involved using a formula. For example, I took a business statistics class a few years ago and stuff like this is still in memory: NCX x (P^X) x (1-P)^(N-X)
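That memorized formula is the binomial probability mass function: the chance of exactly X successes in N independent trials, each succeeding with probability P. A quick sketch of what the calculator program computes (the function name `binom_pmf` is mine):

```python
# nCx * p^x * (1-p)^(n-x): probability of exactly x successes
# in n independent trials with per-trial success probability p.
from math import comb

def binom_pmf(n, x, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# e.g. exactly 2 heads in 4 fair coin flips
print(binom_pmf(4, 2, 0.5))  # 0.375
```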

You youngsters. I started with the fx-7200 with a whopping 422 bytes of memory and it was awe-so-me.

As the original Author of Axe, that's an extremely inspiring story.

I also, as you can guess, first got into programming on my trusty TI-83+ when I started high school. Without really knowing the TI-BASIC language and lacking the link cable, I remember some time when I was in the library typing a Mandelbrot set program line for line from the computer, trying to figure out where all those random tokens were located. Once I finally got that working, I was hooked and started making all sort of different programs.

Some of the early programs I remember creating were mostly math programs and algorithms I learned from classes. But very soon I started getting into gaming. I had always wanted to make really cool games, but I always ended up limited by the capabilities of the calculator, and ended up mostly producing puzzle games; Minesweeper, Sudoku, Skyscrapers, etc. I also managed to make a text adventure and an 'AI' chat bot.

I wondered how people were able to make some of the more advanced games and eventually stumbled upon assembly. It took me a really long time to grasp all the concepts and it was very frustrating. Every instruction has a bunch of side effects, you have to remember what every register is holding at every point in time, and you have to memorize all the mnemonics. I was also annoyed that I couldn't program on the go (Mimas didn't exist at the time) but that certainly didn't stop me! With nothing better to do during the bus ride to and from school, I actually printed out all the op-codes, RAM locations, bcalls, ASCII tables, and token tables, and would literally type the hexadecimal into the calculator to program! It was insane and really dangerous, constantly deleting the memory every time I typed a wrong character or forgot about the endianness. I eventually just gave up on that and stuck with programming assembly games on the computer in my free time.

By the time I learned enough assembly to produce my first full game, a PacMan clone, it was around the end of high school and it was then that I decided that I really wanted to make it easier for people like myself to be able to make actually good games on the go. It wasn't until a year later that I started creating Axe Parser for that purpose. Although I didn't have any formal experience with compilers, languages, or other relevant theory I started anyway, just doing what I figured would work best. It actually managed to succeed and after only a month of work, I was able to program a full game in about an hour! It was exactly what I wished had existed when I was in school, and it's super exciting to see there are students out there now that actually have this power and are making awesome stuff! It's even more thrilling when it influences people to get into coding more seriously. Warms my heart every time I see things like this :)

As for me now, I haven't touched z80 programming for a few years now. I'm working at an amazing start-up where I get to combine programming and mathematics to do some really interesting things.

I never heard about Axe, I wish I'd had access to it when I was in high school. I spent a lot of time in high school writing games in TI BASIC.

Good job!

> 4-level grayscale sprite editor (which worked by flickering the pixels of the 1-bit display which had a slow enough response time to create a static gray color if alternated quickly enough)

I have heard of games (like Doom) on the TI calculators, that also had sound effects. The program was sending pulses to the linkport, and the user needed a 3.5 mm to 2.5 mm adapter to connect a headphone plug into it, but then you got sound.

Reminds me of sound sampling on the C64 via the tape port: the C64 tape port is connected to the GPIO pins on the 6510 CPU. You get 1-bit amplitude data, at roughly whatever speed you manage to sample the register.

The problem there of course is that you're dealing with a 1MHz CPU, and 64K memory, and for real-time sampling you need to make tradeoffs between sacrificing sample rate for cycles to somewhat compress the data vs. running out of RAM in seconds.

For playback, it's equally primitive: With some settings, flipping the volume to max and down creates a sufficient click that doing that for 1, and keeping it off for 0 gives you recognisable sampled music output.

It's really quite amazing how low bitrate you can get to and still have recognisable music.

If you do pre-processing you can get quite tolerable, substantially longer samples (if you do lots of pre-processing you can end up with this: https://www.youtube.com/watch?v=D6MSbdUMFZ4 )

Yeah you can do that. I've messed around with emitting square waves but not much else. There's also a problem with that where plugging something into the link port triggered an interrupt that diverted control back to the kernel which would then try to initiate communication with another calculator over the link. Not the desired behavior. In the best case this would cause serious lag. In the worst case, a crash and hard ram reset. You had to patch the kernel in order to output sound. Fun stuff.

A kid in my year at school did the same in 2002/2003. He programmed multiplayer snake, pong and a little top-down tank game that was pretty basic. He'd write these up in various classes and we'd play them multiplayer over the link cables.

He was a really talented programmer but eventually became really weird and drove everybody away, so I have no idea what he's doing now.

Ah the memories of the TI range of calculators.

We had the TI-85 in our year. I wrote a chat program which used the link cable. With an "enhanced and extended" link cable, it was possible to chat with a classmate sitting several meters away. Teachers knew nothing, as it was a calculator and just a calculator for them.

1998 - chat on a handheld device without data or SMS costs :)

Countless hours wasted on playing ZTETRIS with classmates on a TI-86...

It was the time I first learned https://en.wikipedia.org/wiki/Tetris_effect is a thing. :D

Heh. Dreaming Tetris was standard in those days...

Lots of nostalgia here. I had an HP 48G and it had its own legitimate programming language, of which I barely scratched the surface. I programmed and downloaded all sorts of games on there. Wrote basic quadratic equation solvers and graphers.

One of my favorite things about it was that it had an internal speaker. You could specify the frequency and duration of notes. I was in concert band at the time and was into "The Planets" suite by Holst. I ended up writing two programs that played the main tracks for Jupiter. I had another friend who had the calculator so I gave him one part and retained one for myself. We started both programs at the same time and listened to beautiful tinny classical music.

It was stupid easy to output sound on that thing. Like someone else mentioned it was also very easy to output to the screen. I know applications are much more complicated than they were back then but I do feel bad for kids that they might not have a sandbox environment to try to push the boundaries for their knowledge. This kind of stuff got me hooked on coding.

15MHz is not from the 70s ;-) (at least outside of labs). The IBM PC shipped with a 4.77MHz CPU in the early 80s, the 386 came out in 1986 at 12 MHz, the Mac II was 16MHz in 1987.

I got hooked on programming by the HP-65, an early and very expensive programmable calculator (with magnetic data storage!) that a family friend let me play with for an afternoon. Eventually I got my own Sinclair calculator (36-instruction memory!) and programmed it obsessively. If that thing ran at 1MHz I'd be very, very surprised.

Incidentally, around 1983-85 there was a brief explosion of electronic typewriters with small (usually 2-4-line) LCD displays and oversized calculators with qwerty keyboards, a decent amount of memory (16 kB say), and some form of BASIC. They looked interesting, but I don't think I know anyone who bought one (and my friends and I were all programming geeks).

Computation-nostalgia is endless joy :)

In the mid-80s I did have access to several models of those pocket Japanese computers that ran BASIC, including the CASIO FX-750P. I was definitely a loner in this regard. I just got lucky -- my father was in Japan doing film work and probably spent his entire paycheck several times at the electronics malls. He bought these because he wanted to write software to help camera operators with setting up their shots, anywhere in the world.


I had dabbled in Apple IIs at school but this was my go-to machine at home. I could not do very much with a 1-line LED display but I wrote some simple games and animations.

Every kid learning engineering would benefit from working an old-school piece of equipment, even if simulated.

This is how I got into programming as well, creating programs for my TI-84 that helped me cheat on math tests. The watershed point for me was figuring out how to archive programs so that a "2nd+7+1+2" reset wouldn't delete what I'd created.

Frustrated the teachers to no end as well, I'd never do my homework but get A's on all of my tests/quizzes. I guess I was too busy learning the material to cheat, instead of doing the homework...

I eventually got caught when I started giving the programs to popular kids so they would include me in stuff, luckily I had written a failsafe that deleted the cheats, but left some games I had (even at 14 I knew how confusing false positives can be). I got in trouble for playing games, the popular kids got in trouble for cheating. I wasn't included in much for a while after that...

486s typically had 4-16MB of RAM. To access this memory in DOS a special driver needed to be loaded, HIMEM.SYS (often alongside EMM386.EXE), from what I remember.
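For what it's worth, the usual way this looked in CONFIG.SYS (paths are the common DOS defaults, so treat this as a sketch from memory rather than a guaranteed setup):

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
```

HIMEM.SYS exposes extended memory through the XMS interface; EMM386.EXE additionally provides upper memory blocks (NOEMS disables EMS page-frame emulation).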

This article really is somewhat misleading, as that is not an OS, rather it is an application running on DOS.

Did the same sort of thing as a kid on an Atari 130 XE. Was a lot more graphical though. It's something you do when your computer is essentially useless :)

In the late 1990s, my parents divorced, and my mother took my brother and myself and had us go live in a very rural area of Australia with a psychopath who was wanted in 3 states. This was our new stepfather, so we were to remain in isolation so that he wouldn't be found. This being said, we were not allowed to leave the house after school hours, nor use the internet, nor own mobile phones.

Truly shocking that this could happen in Australia. Where were the child support services? Why didn't the teachers notice anything?

The rest, as they say, is history. Mum finally ditched the guy who made our life hell. I was allowed to move back to civilization and had my mind absolutely BLOWN by Crysis screenshots.

I wonder how old the author was when this happened? Was he still being kept captive as an adult?

This story raises serious questions about the adequacy of child protection and law enforcement in Australia.

Consider that in the late 90's cellphone penetration was still low enough that not being allowed cellphones was fairly common, and certainly not something anyone would react to.

Not being allowed to leave the house after school hours, or use the internet is still a fairly common situation for kids in many sects that are not considered problematic enough to interfere with.

What you describe as "being kept captive", others would consider protecting their children from corrupting outside influences (I'd like to emphasise that I absolutely don't think that is right).

That the stepfather in this case seems to have been motivated by hiding from the authorities doesn't make much difference - the point is most schools have students that are in similar situations, so it's not something most schools would see as exceptional and something they can interfere with.

It's a difficult tradeoff in terms of child protection, because the traumatic experience of being taken into care or disrupted by lots of interference can easily become worse than what the interference is meant to fix, and because the moment you start passing moral judgement on people's parenting, you're in for a crazy struggle over how it is acceptable for people to raise their kids.

That's a very very logical perspective of it, considering it from both sides.

A lot of times, my parents (mother and stepfather) played it off as "he needs to be disciplined" etc., although they omitted certain truths when it came time to talk to the police. A punch to the side of the head became a "clip over the ear", and so on. The police, however, seemed not to care; I will never forget their "Ah yeah" reactions, but again they face the same thing as mentioned above. Even though you're not legally allowed to hit a child now, in some places here it's still treated as an unwritten exemption that you may discipline your child if it is "reasonable", such as a smack on the bum.

Indeed, being kept inside the home wasn't really the bad part of it all. I have always been much of a loner, and it was very likely I wouldn't have done much outside of the home anyway. I did sometimes "run away" for nights, and I would walk through the city or out into the country for a very, very long time at night, wondering where to go or what to do. Now that I look back on it, I'm actually very surprised, albeit grateful, that I was never abducted or robbed. But at that point, I didn't really care where I went or what happened, because anything seemed better than what was going on at home.

The teachers noticed, and they tried to help at first, but this only resulted in aggressive confrontations with my stepfather. They then wanted absolutely nothing to do with either himself or me.


The Department of Community Services (or DOCS as we called it) was responsible for overseeing cases such as child abuse during that time. My counsellor spoke in great depth to me about it, but he was not confident that much would be done (as seen above, many, many cases are left behind or ignored). He put in a lot of requests to DOCS to look into my situation, and they did only a few of those times, but their methods were shocking and they did absolutely nothing. They interviewed my stepfather once and it more or less consisted of "Have you mistreated a minor in your household?" "No?" "Oh okay, well thanks anyway, bye". On another occasion, they spoke to me about it. I honestly can't remember what we talked about, as I was a bit too young to fully understand what was really going on. I spoke to so many people about it and no one was able to help, including the police.

(I'm wondering if the department was understaffed or suffering some other internal crisis? They still are apparently.)

In 1999 I would've been 10 years old, so when I finally left, I was 17 nearing 18 as the new year approached. Goodbye teen years, but it would've been a lot worse if I'd still been a kid. A lot of the physical nasties stopped as I grew older, perhaps because my stepfather was aware I'd be a lot wiser about how to tackle it (ie I could more easily convince police of something going on). He was very careful to get as close to the borderline of the arrest threshold as he could, without ever actually going over it. He never hit my mother, but he hit us, and we were raised by him to believe that being choked, kicked in the back etc was "discipline" rather than abuse.

I'd have to agree on the inadequacy of both child protection and law enforcement here. As shown in the last couple of days with the "terrorist" in Sydney, the justice system here is seriously broken. The same man who deprived us of liberty was walking free on around 40-50 other charges, including break and enter, sexual assault of a minor, assault etc, much like the fellow who held up the café. I don't know where the breakdown is, but the justice system always seems to "let people off with a warning" or a light sentence. If you assault someone, expect to walk free. If you break into someone's house and molest a child as he did, you walk free with a big scary "warning". If you continue, it's another big scary "warning" etc.

He's still out today, and I haven't pursued it, because I know there is no way he would ever end up behind bars; the justice system completely let us down and I wouldn't trust it with my life. One of the first men I shared my first rented apartment with was exactly the same: multiple charges that should have landed him behind bars, yet he was walking free, with police coming to the door from time to time to "check on him". Of course that wouldn't stop any crook from doing something stupid if they're clever enough not to be caught.

I also have a similar, but definitely not tragic, story. In the late '90s my dad got us a ZX Spectrum, so me and my brother learned BASIC and played Deathchase.

Around 2000 we got a Pentium 75 - that is, 75 MHz, 16 MB RAM, a 500 MB HDD, and no CD-ROM or sound card. I think it had a 2 MB S3 video card, and came with Windows 95.

At some point, after some tweaks, you could play Mortal Kombat 4 in a very small window, installed from multi-part RAR archives spanning around 20 floppies.

We had a lot of CDs with games from gaming magazines, with "cool" (or so I thought at the time) HTML/JavaScript autoruns that ran in Internet Explorer 4. Since most of these games (eg. NFS 3) wouldn't run on my PC, I found out how to "View Source", and I basically learned HTML/CSS/JS from them. In 2003 I sold my first "DHTML" menu widget.

Throwing my story in the mix: I owned a 386 and a 486 when I was around 9 - hand-me-down computers. My family was poor and we didn't have the internet, and I'm not sure it would have mattered if we did back then. I learned QBASIC and DOS from library books and help files. Made lots of text adventure games, and later games with minor graphics. I'm a professional developer now (have been for 12 years). I know multiple languages including Python, and I still make games on the side as a hobby.

I keep seeing stories similar to mine. I really think we (humans) have lost something with new tech. Simple UI breeds users whereas hard seems to breed developers.

Edit: typed that out on my phone - made some typo fixes.

Also forgot to mention - I never went to school for CS and I dropped out of community college which I attended for graphic design.

I can relate too, though my situation was also not that bad. Too poor to buy a computer, I learned programming (mostly BASIC for the C64) and electronics from library books, and general details about PC hardware from magazines - all without having a computer. Then I managed to repair and salvage enough old PC parts to piece together my first 8086 PC, eventually upgraded to a 286. We were a long-distance call away from any city, so dial-up connections were impossibly expensive, so I learned to hack my way around some online services and phone phreaking to get myself onto the internet in 1993. I then got hooked on Unix and the internet, and it became my ticket out of our middle-of-nowhere town, all the way to living in the developed world.

QBasic... those were the days.

I actually started programming around 2002/2003. At that time QBasic was of course already horribly outdated, but that was what my school teacher was using. We used it for direct PEEK/POKE-style access to the LPT port (the big, bulky printer port!). So we connected LEDs and other stuff to the LPT pins and made them blink with QBasic (and Windows 2000).
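For anyone curious what that kind of classroom exercise looked like: QBasic actually exposes port I/O through the OUT and INP statements (PEEK/POKE address memory, not ports). A hypothetical sketch, assuming the common LPT1 data port address of &H378 - real hardware may map it elsewhere:

```basic
' Hypothetical sketch (not the commenter's actual code): blink LEDs
' wired to the eight LPT1 data pins. Assumes LPT1 at the usual &H378.
CONST LPT1DATA = &H378

DO
    OUT LPT1DATA, &HFF          ' drive all data pins high (LEDs on)
    FOR i = 1 TO 10000: NEXT i  ' crude busy-wait delay
    OUT LPT1DATA, &H0           ' drive all pins low (LEDs off)
    FOR i = 1 TO 10000: NEXT i
LOOP UNTIL INKEY$ <> ""         ' any keypress exits
```

This worked on Windows 2000 in their class presumably because the system allowed direct port access to the DOS subsystem; on locked-down NT systems it can fail.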

I still stuck with QBasic for a while because I refused to learn anything else, and C-style languages just looked scary to me. By the way, if you are looking for a cool, modern, cross-platform BASIC compiler, take a look at FreeBASIC. It started out as a QBasic-compatible compiler but is now a project of its own.

Now that would've been more interesting. I never got to experiment with printing or connecting anything to my own system, but it would've been fun to make something like what you've described.

Indeed C was a bit intimidating to me at first. I took up a TAFE games programming course in 2008 to learn it, but our teacher was an ex-banker and didn't seem to really have any enthusiasm for games, and there wasn't really any "games programming" in the course. I left after getting mugged in broad daylight on a busy road (Mount Druitt is absolutely atrocious, never going back there again!)

This OS is interesting enough on its own, but I want to hear more of the story of how and why it was created.

I wrote it up as best as I could in the imgur description, but I'll see if I can summarize it here.

Basically, I was without any modern resources for quite some time. I had a 486, QBASIC and EDIT.com to play with; the rest of the DOS exes etc I didn't want to touch out of fear of buggering up my system as I wasn't very computer literate.

My library had two 1970s books on BASIC which I borrowed (Basic BASIC and Advanced BASIC by James S. Coan) and, with nothing else to do in the hours after school, I decided to try my hand at it. I played around with it for a bit and built a really primitive "DOS clone", which relied on a simple series of INPUTs, PRINTs, IF/THEN statements and GOTOs. I then began experimenting with the ASCII character map included in the QBASIC help file, and realized that with the right ASCII codes I could form visualizations on the screen, simple GUIs etc. I combined the two and kept adding more and more functionality until I eventually named it "OSCI" ("Aussie"). That's pretty much all there was to it.
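To make the technique concrete, here's a hypothetical reconstruction of the kind of loop described above - not the OP's actual code, just the same ingredients: a prompt built from INPUT, IF/THEN and GOTO, plus a title bar drawn with CP437 box-drawing characters (CHR$(205) is a horizontal double line, CHR$(186) a vertical one):

```basic
' Hypothetical sketch of a QBASIC "DOS clone" in the style described.
CLS
PRINT CHR$(201); STRING$(20, CHR$(205)); CHR$(187)   ' top border
PRINT CHR$(186); " OSCI  v0.1         "; CHR$(186)   ' title bar
PRINT CHR$(200); STRING$(20, CHR$(205)); CHR$(188)   ' bottom border

prompt:
INPUT "A:\> ", cmd$
cmd$ = UCASE$(cmd$)                     ' make the comparison case-insensitive
IF cmd$ = "DIR" THEN PRINT "README   TXT": GOTO prompt
IF cmd$ = "EXIT" THEN END
PRINT "Bad command or file name"
GOTO prompt
```

Everything here is available from the QBASIC help file alone, which is presumably why this pattern was reachable without any other resources.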

How: An old 486, QBASIC, and lots of free time

Why: Boredom, curiosity.

Where did he get the books? When I was a kid, the DOS interrupt bible was something I turned into a dog-eared nightmare. I wonder if he had access to technical books through his school maybe?

I had access to two programming books; Basic BASIC and Advanced BASIC by James S Coan which I loaned out (and frequently over-loaned) from my high school library. I searched intensely during lunch periods at the library for other books on anything related to computers and programming, but I don't recall any other notable computer books.

The books I had borrowed, however, were dated 1977-1978 according to a recent Google search - very obsolete even for the time. I'm tempted to call the high school library and see if I can buy them, since they'd be even more useless now for any student who wants to get into programming.

Better yet, it's just given me an idea, if they still haven't updated their books on programming, I might even see if I can buy a few good modern books on C / Python etc and donate them to the library, along with software. Who knows, it might stop someone else from ending up the same way I did.

The story sounds like it has some significant parallels to that of Julian Assange in his teenage years (at least as depicted in the movie Underground). Is he an inspiration to you?

I haven't seen the movie, but I did once read that he used to sleep under orange lights, as he believed humans would sleep better under them after being accustomed to sleeping by firelight for many, many years.

I'll have to look into his early years and see, although I was probably nowhere near that. I was a very emotionally unstable and unusual kid after the move to the country. I had zero social skills and a lot of post-traumatic stress from events at home. My approach to socializing and living in general was very odd.

I hoarded food and possessions as the cupboards at home were often (but not always) padlocked to keep me from picking my own food out. I would hunt around in bins for discarded food or 5 cent coins to save for canteen food and was referred to as a "scab" quite a bit, but I tended to ignore this, even though I was probably seen as one of "Those oddball kids". I had very little value placed on my reputation.

A lot of my possessions were often confiscated, and anything of special value (ie gifts from my dad) was often smashed and broken in front of me. I had a beautiful Carramar lava lamp once that my dad purchased for me, which was destroyed by my stepfather to stop me from staying up at night and using it to read. The trait still sticks with me now; I don't hoard general junk, but I am extremely protective of my current possessions.

Not sure if this ties in with Julian Assange's story. Again, I'll have to check it out. He sounds like a very intriguing person.

This is amazing. This took me back to the 90's when I was learning BASIC by scrounging for books in our school library. I grew up in Oman and there weren't too many programming resources. So I had to make do with random BASIC books that were often in different dialects (Apple BASIC for example).

I also reinvented the wheel many times and wrote a bunch of silly little programs. It was a great and fun experience and this story really took me back.

Two oddities. First, there's a technical difference between a GUI desktop environment and an operating system, even if the commercial vendors insist they're conjoined twins and alternatives to that worldview are a thoughtcrime. So this is an "OS" of the GNOME variety, not an OS of the RetroBSD variety.

The other oddity: having lived through a slightly earlier era in high school, we did TONS of sneakernet peer-to-peer filesharing using floppies. The stuff he'd be getting would be kinda obsolete; on the other hand, the school had stuff lying around, so if you wanted a "Turbo C" it was literally sitting in the library waiting for you to copy it. Anyway, I'm surprised he never found anyone to trade "warez" with.

In '99, when the story started, I already had 5 or 6 years of experience with a Linux desktop at home - first with SLS floppy-based installs off a BBS, but by '99 I had a couple of years' experience with Debian. Makes you wonder what would have happened if the guy had been on an abandoned isle with a Linux distro instead of QuickBASIC...

Looks like the typical apps we all created during the early 90s. I taught my little brother Pascal in what seemed like a few minutes (I think he was in 5th grade), and within a weekend he created a file manager that would blow this one away. My brother is very smart, but my point is, coding some DOS games or a file manager is not anything remarkable, even if you only had the manual to work from. Most developers during that time period worked from a manual. We did not have Stack Exchange, haha - where do you think RTFM comes from? I'm surprised this is trending. If you want to see what was typical of the early 90s, take a look at the Graphics Gems books http://www.amazon.com/Graphics-Gems-Andrew-S-Glassner/dp/012... - every bookstore in the US had Graphics Gems.

What was your experience level before you started this?

I'm about to graduate with a degree in computer science, and I would love to work on a long-term project like this, but I feel like I could never accomplish something remotely close to the complexity involved in your project, especially without outside help like the internet.

Note: I am not the person in the imgur. The title was changed after I submitted to one that is, IMHO, much more confusing.

(original title was "An Australian, isolated from 1999-2006 with a 486. Built his own late 80s OS").

My experience level with computers was minimal; I had basic knowledge of running games from the DOS prompt or a particular type of DOS Shell that my dad had set up on our home computer.

Other than that, I had no knowledge. I didn't know anything about processor power, RAM, etc.

I'm not sure if it's used for anything anymore, but QBASIC is, I believe, freely available on Microsoft's legacy FTP. It's a very safe and easy-to-use environment for learning to put together some basic programs (hence the name BASIC?). If you learned to, say, build a calculator in QBASIC, it wouldn't be hard to take that programming knowledge and move across to a modern language like C.
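A minimal sketch of the kind of starter program meant here - a four-function calculator in QBASIC (my own illustration, not anything from the original post):

```basic
' Hypothetical beginner exercise: a four-function calculator.
INPUT "First number: ", a
INPUT "Operator (+ - * /): ", op$
INPUT "Second number: ", b

SELECT CASE op$
    CASE "+": PRINT a + b
    CASE "-": PRINT a - b
    CASE "*": PRINT a * b
    CASE "/"
        IF b = 0 THEN PRINT "Divide by zero" ELSE PRINT a / b
    CASE ELSE
        PRINT "Unknown operator"
END SELECT
```

The same structure (read input, branch, print) maps almost one-to-one onto scanf/switch/printf in C, which is the sense in which the knowledge transfers.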

Reminds me of the story recently about the guy who built a game over what, ten years or so? (Can't find a link right now.)

"Babylonian Twins"? Made by an Iraqi starting in the 90s.


Tobias and the Dark Sceptres? http://www.tobiasgame.co.uk/

Ha, no. IIRC he began in college in the '90s, developing for the Windows platform, and finished it many years later; his post documented its evolution and had the same sort of feel of working with outdated technologies, but seeing it through to the end.

Dwarf Fortress? Whose development is ongoing?

Reminded me of the classic 'stuck in a room' reddit thread


I guess I've answered the thread's question in a sense.

Though I wish I'd at least had a basic copy of Windows XP, and a good C++ compiler + some reference books.

I find it painful to know that there are people like this who live like this every day:

* people with old computers because they cannot get new ones,
* people with old, crappy Android phones because they cannot get the new kind of phones,
* people who are poor and are working on some rundown computer that does not work very nicely because it is worn down,
* and other people who are just stuck with whatever they have, whose story we will never hear because ... well.

And when I say people, I really mean kids. Heartbreak!

"Was isolated from 1999"

So why label it (c) Copyright 1995? http://i.imgur.com/XSDOXEZ.png

That's a very good question. I can't actually remember any particular inspiration for using 1995 as a start date, but it was probably just a random year to give it an "authentic" look, as if it had been some old shell system for the TAFE that had been in use for ages (ie much like how most software will have "copyright 1996-2014" to show when the copyright was filed and what the current year was). It was, however, never meant to be an authentic piece of software or to have any significant numbers; it was merely written as a brief demonstration that used a few simple commands to list file and directory structures. 2006 was the year that code would've been made, though.

He should learn Python - after wbasic, it was the first language I used that was as easy or as fun (after years of Java).

I had a 486 in 1991. If this person had one from 1999 to 2006, I'm amazed it didn't suffer repeated hardware failures - hard drives die often. I want to believe this isn't true because it's too depressing; I want to believe it's true because people deserve the benefit of the doubt.

Today I still have a working Pentium 1 with hard disks in full operating order.

As for the 486, I brought it down to my dad's when I moved, and switched to using his much more modern system. It did eventually "die" in 2007-2008; my dad threw it out after the clock chip failed on the motherboard. I was pretty bummed, because it sounds like a fairly straightforward fix / such a minor issue, but he's not as sentimental as I am, and referred to it as "an old clunker".

I just read that part and was slightly horrified. It's extremely unlikely that there was anything wrong with the RTC chip. What's much more likely is that the CMOS battery was flat. Usually, that's a CR2032 "button cell" that you can buy for pennies. It takes about 5 minutes to change, most of which is unscrewing and re-closing the case.

At least you have your Pentium. I've still got a bunch of my old code from the 90s and I'd have a very hard time getting any of it to run on a modern PC.

Fair enough, pardon my previous skepticism, I hope things are going better for you now!

Completely understandable; my modern system has suffered enough hard disk failures to put it in the two-digit range, and I've gone through around 7 graphics cards - likely the result of having a computer constantly exposed to extremely tropical humidity and sea air. The last graphics card to die (only a few nights ago, as a matter of fact) has rust on it, and is only a year old.

They don't build 'em like they used to! The Pentium 1's interior blew me away upon opening it - not a bit of corrosion, rust, or even dust! The disks are also still in top-notch shape and seem to read just fine.

I've seen a 486 machine with a QBasic program to read values from a parallel port, working perfectly at a university (in a physics lab) in 2000.

He has been lucky that his 486 didn't break, but sometimes old hardware is more resilient than you think. The only hard drives I've seen die were modern drives of >100 GiB. Meanwhile, I have here an Amiga 1200 with a little 200 MiB hard disk that keeps working fine.

Yeah. It's just amazing luck. Hope he can share more of his story. I still have a running clamshell iBook :)

I had a Mac Classic II (1991 vintage) that eventually stopped working around 2004-2005 (IIRC). Sometimes old computers live longer than average. Before that I also had a PC XT clone from 1987 that I'm pretty sure lived more than 10 years.

Why wouldn't it be true?

The terms used in the article are misleading, though, as many people have already pointed out. It's not an OS but a shell. I don't even know if you can build an OS with QBASIC alone - maybe with some assembler support, but I don't recall QBasic supporting assembler, at least without binary tricks?

I've amended the original imgur post to reflect this. You're 100% correct: it's not an actual OS, but rather a primitive GUI of sorts - an imitation/representation of what I perceive an 80s OS to be.

heyya OP, I love CLI, so I really dig what you did here :D for now it runs under QBasic only?

Yep, so far it's just for QBASIC. I can't see much point in resurrecting it or attempting to port it, but if I ever get into C#/C++/another modern language, I will likely focus on writing new and more interesting apps.

Great story. Is it fair to say that if he'd had the internet, television, cinemas and tons of friends around, he would never have achieved this?

I see our kids every day staring at various screens, miles away from any creative impulse, and getting really mad if we discuss (or enforce) internet access limits.

It's very hard to say.

Right before leaving for the country, circa 1997-1998, my best friend was trying to get me into C++, as he was a games dev / games fanatic like I was. I had always wanted to get my hands into C++ and start working with it. It never really transpired sadly as I had no idea where to start.

This guy's mother made a terrible and selfish choice :(, I feel terrible for him

If there is one story that changed me today, it's this. Kudos to you, sir. With no internet or help, you still pursued your passion and built this from the ground up.

Thank you for the inspiration, and have a great career ahead.

I can only imagine what would've happened if the OP had been stuck with a UNIX, or even better, Linux with the source code included.

It would be a really different story I guess...

I really wish I'd had Linux around that time. It's always piqued my curiosity, but I fear I wouldn't have been able to think of a practical purpose for it (though that probably also stems from a lack of knowledge about Linux itself and what it can do). I have about as much knowledge of Linux and its capabilities as a 3-year-old has of driving a car.

Totally impressed. Yes, not an os but a shell but still impressive.

Boredom can easily lead to creativity, it seems. And this guy is smart.

Indeed, I've re-worked the title to put quotes around the "OS" parts, as everyone who states that it is a shell is correct. I should've termed it better.

Indeed, having no inspiration, or a lot of boredom, leaves the mind to create its own entertainment and innovation. However, I'd disagree that I'm smart: my code is merely a series of hackery and visual tricks. It technically -does- do what it is supposed to, but it is nowhere near as maintainable, optimized and well written as the code of a talented or seasoned programmer.

Thanks for your kind words!
