I just love this kind of thorough, well-researched digital archeology delving into the whys and hows of 70s and 80s computers. Having lived through that era as a fledgling teenage computer hobbyist when clock speeds were sub-megahertz and 4K of RAM seemed like an impossible amount of space to fill, we had no idea we were living in the Cambrian explosion of personal computing.
Every platform represented a unique branch in an evolutionary tree undergoing incredibly rapid evolutionary permutations. Each with their own clubs, offset-printed zines and software libraries, none compatible with the others. Few people outside some universities and corporations had ever touched a self-contained computer and no one in my family's extended social circles even had a computer at home! I remember it striking most people as simply being weird, prompting questions with a tone akin to "I heard you have a personal dirigible at your house. Um... what would you even do with that?"
No one knew what the future of computing would look like and we certainly never imagined that by the late 90s, the wild explosion of early diversity would have encountered an asteroid-scale die-off, shrinking to a single major survivor with a distant second place in household computers - leaving behind a landscape which felt shockingly empty compared to the wildly diverse frontier only a decade earlier.
> 4K of RAM seemed like an impossible amount of space to fill
It seems impossible to imagine the impossible these days. The first home computer I remember my family owning is a VIC-20 (5 kB RAM), where the screen could contain about 1/2 kB of text. Granted, getting anything onto and off of the computer was a lot more difficult/expensive. I can barely even imagine earlier home computers.
And that Cambrian Explosion is no joke. The 80's were crazy. The 90's stepped it up even further.
> getting anything onto and off of the computer was a lot more difficult/expensive.
Yes, that's why 4k seemed so huge (really about 3.5k after zero page and some buffers were allocated by the BASIC interpreter). Initially, I didn't have any method of storage. The tape cassette recorder was another $50 I had to save up for. Having to retype BASIC programs from hand-written notes every time the power was turned off strongly incentivized short programs!
Even with tapes, the user did a fair amount of the work: pushing buttons to read, record, or rewind. Many personal computers used standard cassette recorders with no motor control - effectively a fire hose of data you couldn't pause - so the file size was limited by the computer's memory. If they had anything remotely resembling a filesystem, it was an identifier the computer could look for as it marched forward, and only forward, through the tape. Commodores could start and stop the drive's motor, so maybe they could handle larger files. Yet (if I recall correctly) they too could only march ever forward, short of the user hitting the rewind button while monitoring the counter, so you didn't have the utility of random access files.
It's hard to fathom today just how slow the cassette recorders were. You would routinely spend 15 minutes trying to load Temple of Apshai just to have it stall while you sat there for another 10 minutes hoping it didn't actually crash (until you reloaded it).
Indeed, a lot of 8-bit computer audio cassette recording rates were as low as 300 baud. I remember feeling spoiled that my Radio Shack Color Computer (based on the Motorola 6809) was 1500 baud. Also, depending on the quality of your audio cassette recorder and tape media, errors could occur. If something was really important, you saved it twice in a row and once again on another cassette tape.
Since tapes were relatively expensive (at least to us teenagers) we bought longer tapes since it worked out to be cheaper per minute. We then tried to maximize each cassette by recording a bunch of programs on each. Unfortunately, the only way to know where anything was on a tape was by zeroing the mechanical counter on the cassette recorder and scribbling the counter numbers down. Of course, these counters were only tied to the motor run time and thus quite inaccurate and, at best, fast forward and rewind could only get you in the approximate ballpark of a program's location. If we lost our scribbled tape counter notes, the only alternative was to play the tape at 1x and wait for the named program segment to be seen by the computer (on tapes up to an hour per side).
I also recall someone made a hack that patched the Coco ROM to enable 3000 baud recording and reading, however this did make read errors somewhat more likely.
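Those rates also explain the painful load times mentioned upthread. A quick back-of-the-envelope in Python (the "10 bits of tape per stored byte" figure is an assumption standing in for start/stop bits and leader tones, which varied by machine, so treat the numbers as order-of-magnitude only):

    # Rough cassette load times at common 8-bit tape rates.
    # bits_per_byte=10 is an assumed overhead factor, not a spec.
    def load_minutes(program_bytes, baud, bits_per_byte=10):
        return program_bytes * bits_per_byte / baud / 60

    for kb in (4, 16, 32):
        for baud in (300, 1500, 3000):
            mins = load_minutes(kb * 1024, baud)
            print(f"{kb:2d} KB at {baud:4d} baud: ~{mins:4.1f} min")

A 32 KB program at 300 baud comes out to roughly 18 minutes, which lines up with the Temple of Apshai experience described above.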
They only had 8K of RAM. Naturally, most of my time was spent creating all sorts of games. Free memory was rarely an issue.
A key factor was the combination of the PETSCII character set, with all of its nice graphic characters, and simply using the screen as your combined playfield and data structure.
You could do a lot simply by PEEKing and POKE-ing your way around the screen. Very handy dual-use memory.
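For anyone who never saw the trick in action, here's a tiny Python sketch of the idea. The "screen" is just a simulated array here, and the 32768 base address and screen codes are quoted from memory for a 40x25 PET, so double-check them before poking at real hardware:

    # Sketch of "the screen is both your playfield and your data structure".
    # memory{} stands in for the PET's RAM; poke/peek mirror the BASIC keywords.
    SCREEN_BASE, COLS, ROWS = 32768, 40, 25
    SPACE, WALL, BALL = 32, 160, 81            # PETSCII screen codes (approx.)

    memory = {SCREEN_BASE + i: SPACE for i in range(COLS * ROWS)}
    def poke(addr, val): memory[addr] = val
    def peek(addr): return memory[addr]
    def cell(x, y): return SCREEN_BASE + y * COLS + x

    for x in range(COLS):                      # draw a wall across row 10;
        poke(cell(x, 10), WALL)                # it doubles as the collision map

    def try_move(x, y, dx, dy):
        # PEEKing the target cell *is* the collision test - no separate
        # game-state array needed, which is exactly the dual use described.
        nx, ny = x + dx, y + dy
        if peek(cell(nx, ny)) == SPACE:
            poke(cell(x, y), SPACE)
            poke(cell(nx, ny), BALL)
            return nx, ny
        return x, y

    x, y = 20, 5
    poke(cell(x, y), BALL)
    for _ in range(20):
        x, y = try_move(x, y, 0, 1)            # fall until blocked by the wall
    print(x, y)                                # -> 20 9 (stopped above row 10)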
Parts of this reminded me of an issue I ran into around the same time. I worked on porting A/UX, the Unix port for the Mac II. We reportedly got half of the early production run; the machines came without plastics, in cardboard boxes, with schematics and PAL equations (PALs were relatively new then). I found, and they fixed, a bug in the PAL equations (they didn't handle the 24-bit writes some new 68020 instructions could make), so I had some cred with them when we ran into a weird bug: very occasionally, when you worked with the IWM chip (which drove the floppies), the keyboard would freeze up - but only on some machines.
I managed to figure out that the machines that worked had an ADB (keyboard) chip with markings on it, while the ones that didn't work had an unmarked chip. Apple swore they were the same, and eventually told us the ones that didn't work were the ones they were doing the manufacturing run with ..... bug from hell ..... It turned out there was some circuitry in there they were quite proud of: when you accessed the parallel I/O (VIA) chip, it tweaked the clock phase a little to give you faster access. Part of working with the IWM chip involved setting up the timer in the VIA chip to time when the next sector would go by; we'd poll the timer fast enough to tweak the clock on every clock cycle, resulting in the ADB chip, which was connected to the VIA, being clocked too fast. We replaced that code with a simple delay loop.
> Acorn did ship a computer with the 65816—the Acorn Communicator—but when Sophie Wilson visited WDC in 1983 and saw Mensch and crew laying out the 65816 by hand, it struck her: if this motley crew on a card table could design a CPU, so could Acorn. Thus Wilson and Steve Furber forged their own CPU: the Acorn RISC Machine.
The first commercially available Acorn RISC processor was released as a co-processor for the BBC Micro. Acorn always had co-processors in mind, it seems, as the Tube interface and protocol [1] exists solely for them.
There's an excellent Raspberry Pi based project, PiTubeDirect, which emulates the ARM and many other co-processors on original Acorn 6502-based hardware: Atom, Electron, Micro and Master [2]. The original expansion hardware is, as expected, incredibly rare and valuable.
I still have my childhood IIGS. It sat in my folks’ poorly insulated attic for two decades before I rescued it, and it booted up on the first try! They don’t make ‘em like they used to.
Their power supplies have RIFA brand filter capacitors in them. At this age the caps' cases are 100% guaranteed to be cracked from moisture ingress, and they will fail catastrophically at some point when you plug the machine in. If you want to keep it running, you need to replace the supply or recap it (just the X & Y RIFAs).
Replace the 3.7v lithium backup battery for the RTC before it blows up and spews acid all over your board! ROM01 versions have it soldered in; on ROM03 boards it's in a battery holder. Replacements for both are readily available.
NP - BTW, I just checked my Mouser order history: if you want to add a removable battery holder to a ROM01 board you need battery holder #108 and retainer #108C. The same Keystone brand holder is used on the 03 boards.
As someone else said, google Apple RIFA caps and replace them. They are 100% going to fail on you and damage the machine. It's the first thing you do when purchasing an old Apple.
I remember our school computer lab had a whole bunch of Apple IIe's, a single Mac, and a single IIGS. The GS was by far the most coveted because (unlike most Macs of that era) it had a COLOUR screen and could play relatively advanced games. Eventually they upgraded to mostly Macs.
Dad ended up buying a 386 PC, which was probably for the best. Those SVGA graphics!
I rocked the IIe but wanted a IIgs like no other. I'd 'visit' it at our local 'Apple store' before the Apple store (It was Tokamac in Palo Alto if I remember correctly and I may not). I also collected an Amiga A500 and an Atari 1040st. Love. Those. Times.
To me the most fascinating nugget of history was how close a young Tony Fadell (later General Magic, iPod, Nest) came to supplying high-speed 65816 chips!
This was a fascinating and ... detailed ... story. I appreciate that it went a little further into the history of Apple's involvement with ARM than the recent spate of blog posts that didn't go back past the Newton.
I remember being surprised to see someone with an Apple IIgs when I was in undergrad (late 80s) and discovering the platform still lived. The fact that they were still selling the Apple II line in 1993 (1995 if you count the Mac add-in card) seems pretty damn miraculous to me.
In the early 1990s, the IIgs community went through this sort of "faux workstation" phase where we had 16MHz hotrodded accelerator cards, high(er) resolution graphics cards like SecondSight, and a pre-emptive multitasking operating system (GNO/ME) along with the beginnings of TCP/IP support (gstcp). If you squinted one eye, we could almost hold our own against something like a SPARCstation 1, as long as you didn't want to run Mosaic. You could fire up a desktop with a few shell windows, compile C programs, and telnet out while playing SoundSmith (not-quite-.MOD) songs. That is to say, we were pushing the whole mess beyond anything a sane person would even fever-dream of attempting.
The community today is putting out more releases more steadily than any other era I can recall since then, but we seem to have mostly backed off of the whole "workstation" application that was in a few dorm rooms at the time.
Edit: FWIW I just noticed the date today. Happy 8/16!
I remember being at a TUG conference (I think Texas A&M/1990) and speculating about whether it might be possible to get a working TeX implementation running on the //e. It would have required hand-rewriting the whole thing in 6502 assembly and some insane use of bank-switching, but I think it could have been possible. But I had neither the time nor the hardware to do it, which is, in the end, probably a good thing.
This seems to have been a pattern with all the 80s home platforms, especially in Europe.
I came from the Commodore Amiga, and was still using mine in the early 00's, all hacked up with cottage-industry expansions and software hacks, as a mini-workstation for writing LaTeX and C/C++ for uni, with a somewhat functional browser.
The Atari ST had a similar trajectory post-Atari with FreeMiNT, and on an even smaller scale, so did the Archimedes and the Sinclair QL!
There is modern new QL-compatible hardware on sale today, which still amazes me, 4 decades later.
But saying that... If one were so inclined, which I am, one could argue that the entirety of the Arm platform, including the modern Mac I'm typing this on, is an extension and development of the Acorn Archimedes.
It doesn't run the Acorn OS, but neither do modern x86 PCs. And anyway, the original OS is alive and well and runs on modern hardware:
It doesn't use the original GPU, but nor do PCs. It doesn't use the same RAM layout or anything, but nor do PCs.
But it's a compatible, extended, modernised version of the self-same CPU architecture, just like PCs are. It can emulate the old environment so you can run the old OS, just like PCs can.
Any current Arm64 device, phone or tablet or laptop or server, is every bit as much an Acorn Archimedes as a modern multicore x86-64 computer is an IBM PC.
K-12 schools are obviously their own whole situation, but in a well-funded, well-regarded public school system we were using Apple IIs in the mid-90s. Of course this was elementary school and not a college.
By middle school it was PowerMacs and by high school it was Windows 2000.
I think the built-in keyboard idea had run its course, so any possible Apple IV would probably end up looking a lot like an Apple IIgs. The little keyboard extension on the IIgs case is kind of reminiscent of the Apple ///.
It would have been good to end the line with what was planned for the Mark Twain: internal floppy and hard disks would have made the whole system a lot better. And of course, if we could have got a 14 MHz 65C816 as well, then it would have been a really interesting system.
It can still look like a /// with a detached keyboard that’s a continuation of the main case. Commodore did that with their office machines (the rounded ones).
I think the beige is important as part of the apology ;-)
Frankly, I’d accept a beige Mac Mini with a rainbow Apple logo on top and call it a day.
Given how there's plenty of market for retro nostalgia kit, I'm surprised nobody is making a modern desktop PC that pushes the same aesthetic buttons.
Beige paint is no more expensive than white or black. A desktop form factor would be good for today's super-heavy video cards. Since we no longer design around internal optical drives, etc, you could probably whip up something micro-ATX that looked a lot like an IIgs or the IIsi mockup, maybe with some USB ports in the front "lip" for convenience sake.
It's kind of crazy; I have no idea why anybody would think NeXT was worth $400 million. Who except Apple was going to buy them?
A user-focused OS that is mostly custom is more of a liability than anything else. Keeping it maintained and up to date with technological progress is a huge effort.
And all the big companies already had their own. The up-and-coming companies didn't need user-first OSes.
It’s about how much money it would take for them to want to sell instead of continuing as an independent business. In 1996 WebObjects was starting to get traction. After a decade, it seemed like there was perhaps a viable future for the company.
There is a difference between a 'viable future' and $400 million.
The reality is that most proprietary software companies of that era didn't make it. Especially companies with their own programming languages, compilers and so on.
First a devastating economic crisis and then the rise of open source.
So maybe they could have survived, but likely nowhere close to that value.
> The reality is that most proprietary software companies of that era didn't make it.
You're using 20/20 hindsight instead of putting yourself in the shoes of someone in 1996. The term "open source" was not even in wide use in 1996.
But actually, even with hindsight your analysis is flawed. 1996 is right at the start of the .com bubble. A company whose software (WebObjects) was being used to power real-world .coms (it was even being used by large companies like UPS and Dell) almost certainly would have risen tremendously in value (even as a private company). Now, whether they would have been acquired before the 2000 crash is another discussion.
No, that's what you are doing. Apple didn't buy NeXT for $400 million to get their hands on WebObjects. In fact, that played very little role.
They valued the OS and Jobs. The reality is Apple had spent 10 years trying to make an OS and failed. NeXT really only had one potential buyer, and they had been losing value and losing money for a long time. So their value shouldn't have been so high.
That said, one could make the argument in the other direction: Apple needed an OS. BeOS was much, much cheaper but also less complete; arguably, just in terms of the OS, still the better value. Switching to something like Solaris is arguably what they could have done instead for far less money.
In the end they had to put a huge amount of work in anyway, so it's not even really clear what NeXT added that they couldn't have done based on Be, Copland or Solaris.
And in terms of WebObjects, the problem with the whole 'WebObjects is going to drive the value' argument is that literally everybody was trying to sell something like that. Every established company and many new ones were getting into that game, and few of them were successful. And even those that were successful aren't worth $400 million.
A company that started using some web framework in 1996 didn't necessarily stick with it forever. In fact, historically, many didn't.
Let's not forget that the World Wide Web, as it was known at the time, was developed on NeXT machines. For the record, I was in my 12th year of my career in 1996 - so I know this time well. Like you say, 'WebObjects' was called 'programmer's heroin' at the time. GUI development on NeXT was lightyears ahead of Windows and OS/2. Oh, I hope you hadn't forgotten about OS/2!
Also, I think everyone is forgetting about a certain client NeXT had that then became Apple's client and gave Apple considerable cachet, especially in the late 90's. Do you remember the entertainment company Steve Jobs started when he started NeXT and who famously used NeXT computers in making their blockbuster animated films? A company so big that Disney bought them in 2006? Of course I'm talking about Pixar. Don't forget Apple was also buying Pixar's business and the public perception that went with that.
There's a reason Apple went from a stumbling, bumbling company that was all but forgotten to being one of the most valuable companies in the world and one of the most recognized brands in the world. They bought NeXT.
In case you thought the reason was because Steve Jobs came back, consider that Jobs has been out of the picture at Apple for 14 years now. Hindsight informs us it was their acquisition of NeXT that made all the difference. Well, that and Apple's involvement in creating ARM, but that's a tale for another day!
> Let's not forget that the World Wide Web, as it was known at the time, was developed on NeXT machines.
Pretty irrelevant. The same could have been done on any Unix Workstation from the time period. NeXT machines weren't magic.
> Like you say, 'WebObjects' was called 'programmer's heroin' at the time.
That's a vast overestimation.
> I hope you hadn't forgotten about OS/2
Not sure how that's relevant. Just another example of an OS that by itself wasn't all that valuable. In fact it was a money loser.
> Also, I think everyone is forgetting
None of the things you say are relevant to the discussion. NeXT wasn't making hardware anymore, and Pixar used other Unix workstations as well.
> Hindsight informs us it was their acquisition of NeXT that made all the difference.
Yes, 'hindsight'. What we are talking about is whether NeXT was worth $400 million at the time.
To say 'they bought NeXT and then 20 years later they are super valuable' isn't an argument about whether NeXT was worth $400 million back then.
They were lucky that Jobs despite not being that successful for the last 20 years had the right combination of ideas and luck to save Apple. This was very unlikely to happen based on Jobs' record.
Were you even working in technology in the early 90's? Your comments are off. Also, what you dismiss as "irrelevant" are things businesses pay attention to, which is relevant to this discussion. Your final remark "They were lucky that Jobs despite not being that successful for the last 20 years had the right combination of ideas and luck to save Apple" is laughably ignorant, and as such, I'm not going to put much stock in anything else you've said.
> Were you even working in technology in the early 90's?
No, as a young child I didn't work in tech. It's a matter of historical record for me. Your feelings from the time aren't relevant. Did you, back in the early 90s, have detailed financial insight into these matters that is better than what we have now, now that books have been written on these topics?
Did you work at NeXT?
> Also, what you dismiss as "irrelevant" are things businesses pay attention to, which is relevant to this discussion.
OK. So please explain it to me, because just saying 'X happened' isn't an argument. Yes, the web was created on a NeXT machine. NeXT wasn't selling many more machines because of that; apparently nobody cared. They closed their hardware operation completely.
In the years after, NeXT didn't become massively successful as a software company either. They lost more money in that time period. So apparently people didn't buy their software because of it.
So please explain why the valuation in 1996 should be massively impacted by what Tim Berners-Lee did years earlier. By that time browsers existed on other platforms. Nothing about NeXT specifically magically created the Web.
I get that it's culturally significant, but in valuation terms I really don't see it. That's why I said 'irrelevant'. Please show me how the first browser being written on a NeXT machine made them a valuable company. Tim using a NeXT machine impacts the valuation in 1996 how, exactly?
> Your final remark "They were lucky that Jobs despite not being that successful for the last 20 years had the right combination of ideas and luck to save Apple" is laughably ignorant, and as such, I'm not going to put much stock in anything else you've said.
Again, you simply make an assertion without an argument. What is your argument?
It's unquestionable that luck is involved in becoming the largest corporation in the world. Are you denying that?
Are you denying that Apple in 1996 was in a lot of trouble, and that it wasn't at all clear they would continue to be an important company?
Are you denying that what really changed Apple was not the NeXT-based operating system, but rather Jobs changing the company's strategy? In fact, the most important part of making Apple work happened before OS X was even released.
Are you really going to pretend that the NeXT operating system is what turned Apple into one of the largest companies in the world?
They weren't my feelings - they were anecdotes. Anecdotes from my experience and the experience of others who were actually there and highly engaged in the developer community at the time.
Go peddle your garbage somewhere else. You don't have a clue what you're talking about.
It's actually hilarious that your argument is 'they're not my feelings, they're my anecdotes', as if that were better.
Have you actually read books about this time period? Books that actually look at these companies, how much money they made, and what problems they had? Did you go back and read about the larger trends that were happening?
Because from everything you are arguing, you clearly haven't. You have some memories of this time and you think that's all you need to understand the world.
And again, you haven't made a single argument about anything I actually said being FACTUALLY wrong. If I got any facts wrong, please tell me what I got wrong.
I even pointed out that NeXT was culturally important and how that doesn't magically result in a fantastic growth company. It seems you can't see beyond that cultural impact.
You don't appear to understand how companies are valued. And, you haven't stated what NeXT should have been worth and provided the evidence to back up your claim. You've only stated Apple paid too much. History shows they didn't. You claim that's 20/20 hindsight - but you bring nothing to the table other than an assertion that they paid too much.
Meanwhile, I don't need to read books - I was there. You have an eyewitness account from someone who was deeply involved in software development and the developer community at a national level - and you cast it away as "feelings."
> No, that's what you are doing. Apple didn't buy NeXT for $400 million to get their hands on WebObjects. In fact, that played very little role.
I never said Apple bought NeXT for WebObjects. I said that NeXT had value outside of the deal in the marketplace because WebObjects was starting to get traction.
My point was that Apple paid too much, and I do stand by that.
Yes, during the biggest bubble in computing history they might have been able to IPO, and that might have given them enough money to survive. Sure, great, but that doesn't mean there was inherent long-term value in a web framework that made the company worth almost $900 million in today's money.
Yeah, but Apple didn’t care about WebObjects; it needed an operating system. And OpenStep on PC hardware was slow, crashy, unpolished, and generally much less pleasant to use than the fastest Macs at the time. Apple bought it because it had a fully working graphics architecture (BeOS couldn’t even print!), and vastly overpaid for it because Gil Amelio was an idiot.
Irony: the graphics architecture had to be fully rewritten from scratch for Mac OS X because Adobe didn’t want to support Display PostScript anymore. Have I mentioned that Gil Amelio was an idiot?
Do you remember how unstable the original Mac OS was? I remember working with a guy doing web development work on his circa-1996 PowerPC Mac. It would crash at least daily, resulting in a reboot. Perhaps it was Netscape's fault.
> Yeah, but Apple didn’t care about WebObjects; it needed an operating system.
I never said Apple bought NeXT for WebObjects. I said that NeXT had value outside of the deal in the marketplace because WebObjects was starting to get traction.
It most certainly wouldn’t. Would be a very tight window, and it’d be going against the likes of BEA and soon to be killed by open source solutions. The best bet was to vertically integrate with a large hardware manufacturer that’d bundle their OS. WebObjects was, however, doomed anyway.
And this, kids, is why everything now is a subclass of NSObject.
Rockwell and WDC had 65C02's up to 4 MHz relatively early on, but the 4 MHz versions seemed to be quite rare. WDC now has 65C02's rated at 14 MHz, but they go quite a bit higher than that if you've got fast enough RAM.
There are some technical details on why a 4MHz Z80 is roughly equivalent to a 1MHz 6502. As always with processor design there are tradeoffs in every decision. The Z80 had a 4-bit ALU, but I'm not sure if that slows it down.
The Z80 has a more complex architecture than the 6502. A 6502 clock cycle is one bus cycle, and the simplest instructions execute in just two clock cycles. For the Z80 a clock cycle is called a T-state, and one machine cycle consists of multiple T-states; a simple instruction like INC A takes 4 T-states.
There is also some tiny pipelining at play: the 6502 needs to access (read or write) memory on every clock cycle. A 2-cycle instruction reads the opcode on the first and the actual work is done on the second. This leaves the second cycle's memory access open for doing something cool like fetching the next instruction's opcode.
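To put rough numbers on the "why a 4MHz Z80 is roughly equivalent to a 1MHz 6502" question above, here's a small time-per-instruction comparison at those clocks. The cycle counts are the usual datasheet figures quoted from memory, and real code mixes instructions in very different proportions, so this is an illustration rather than a benchmark:

    # Time per instruction for a few comparable memory-touching operations.
    # Cycle counts (6502 cycles vs Z80 T-states) quoted from memory.
    MHZ_6502, MHZ_Z80 = 1.0, 4.0

    pairs = [
        ("load A from absolute address", 4, 13),  # LDA abs         vs LD A,(nn)
        ("store A to absolute address",  4, 13),  # STA abs         vs LD (nn),A
        ("taken conditional branch",     3, 12),  # BNE (same page) vs JR cc,e
        ("subroutine call",              6, 17),  # JSR abs         vs CALL nn
    ]

    for name, c6502, tz80 in pairs:
        print(f"{name:30s} 1 MHz 6502: {c6502 / MHZ_6502:5.2f} us"
              f"   4 MHz Z80: {tz80 / MHZ_Z80:5.2f} us")

The 6502 gets each of these done in roughly a third of the clock ticks, which is why quadrupling the Z80's clock only lands the two in the same ballpark.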
But so much of the Apple II design was focused around minimizing chip usage and counted on exact timings that I don’t know that it would have been possible to change the clock speed. Add in the fact that there was no real-time clock, so timing delays amounted to "run this empty loop 100 times"; that the graphics memory layout was tied to how the electron beam on the monitor refreshed pixels; and the reliance on a quirk of the 6502's timing in the Disk II controller hardware, and I don't know that a 6502 at a different speed could possibly have worked.
> But so much of the Apple II design was focused around minimizing chip usage and counted on exact timings that I don’t know that it would have been possible to change the clock speed.
A recent Adrian's Digital Basement video <https://www.youtube.com/watch?v=dt1eSXpo1SA> covers this topic. While showing how an 80286 system runs at 1MHz, he discusses how the PC architecture allows (most) software to run at that clock speed, while the Apple II architecture and software are inherently tied to the 1MHz clock.
In retrospect it seems so sensible to have had the IIe and IIc move to, say, 2MHz in 1983 and 1984 that I'm sure fears of breaking software compatibility are what kept it from happening. (That almost certainly would have been a short-term problem. Given how quickly Apple II software moved en masse to 128K/80 columns by the mid-1980s, developers would have accounted for a faster clock speed too.)
The III's II compatibility mode ramps that down to 1MHz, and has other restrictions to make sure that II software cannot use any III-only features.
Without III sucking up all of Apple's R&D budget and attention c. 1979-1980, the Apple II would surely have seen earlier enhancements. The II+ (1979) would likely have had lowercase and better keyboard (which did not occur until IIe in 1983), and a new model in, say, 1981 might have shipped with an optional Apple 80-column card (again, with the IIe in actuality). Built-in 128K RAM probably would not have occurred until 1984, akin to the IIc's introduction, but earlier support for RAM expansion alongside 80 columns is possible. One of these models would likely have had the 2MHz clock, too, while no II in actuality shipped with a faster clock until IIgs in 1986.
I never felt the IIgs to be an actual Apple II. It feels like a different computer that can almost accurately emulate a //e or //c, but has that inelegant impedance mismatch in between. I also get the same impression from the Commodore 128, but it feels even worse that it needs to drop to 1MHz to use the C64 video modes - and don’t even get me started about its Z-80 side.
There was a 4 MHz 65C02 model .. but not until 1988: The Apple IIc+.
You could lower the clock speed to 1 MHz for programs that relied on it, but the 65C02 did not support the unofficial/undocumented instructions of the original 6502 so there were still some programs that did not work.
The IIc+ was such a refined form of the II. Motorized 3.5" floppy, integrated power supply for maximum portability (I'd take it to do basketball stats for the high school team), platinum color that carried through to Macs until the iMac.
I need to fill out the missing pieces of my Apple II collection. Most of it is still in Brazil, and they are all clones. At least a //e platinum and a //c, preferably a plus, but I can’t be picky.
A higher priority is a graphics-capable VT terminal to act as a reference for a promise I made to the VTE team. Unfortunately, they are incredibly rare this side of the Atlantic.
The big deal is video and RAM timing. If the memory is twice as fast, you’ll still be able to read from it for the video refresh at times the CPU won’t be able to. Also, I think DRAM refresh was done on the video timing. You’d need to rework all the logic on the board, but that was doable when they built the Mega 2 chip (and done in the IIgs FPI). The trickiest part would be the Disk ][ interface, and disk I/O done on it, as everything was timing-critical. A different disk controller that doesn’t rely on timings would be able to function.
As for program timings, you don’t need an RTC. You want timers and interrupt generators. I believe the //e and the //c could generate interrupts on vertical blanking. For a game you could run all logic and drawing and set up the interrupt vector so that the next interrupt starts the next game cycle. To count time, a cycling timer that increments on vblank would be quite enough. IIRC, MSX had one of those.
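As a structural sketch of that "the next interrupt starts the next game cycle" idea, here's a Python mock-up with the hardware entirely faked: wait_for_vblank() stands in for either a vblank interrupt handler or polling a status bit, and the sleep is only there to make it runnable anywhere:

    # Frame-paced game loop driven by a vblank counter. Everything
    # hardware-specific is a stand-in; time.sleep() fakes the ~60 Hz blank.
    import time

    FRAME_RATE = 60                     # NTSC field rate; 50 on PAL machines
    frame_counter = 0                   # the cycling timer bumped each vblank

    def wait_for_vblank():
        global frame_counter
        time.sleep(1 / FRAME_RATE)      # real hardware: interrupt or status poll
        frame_counter += 1

    def update_and_draw(state):
        state["x"] += state["dx"]       # game logic; drawing would follow here,
                                        # right after the blank to avoid tearing

    state = {"x": 0, "dx": 1}
    while frame_counter < FRAME_RATE:   # run one second of game time
        wait_for_vblank()               # every pass is one consistent frame
        update_and_draw(state)

    print(frame_counter, state["x"])    # -> 60 60: time measured in frames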
> I believe the //e and the //c could generate interrupts on vertical blanking
VBlank interrupts were only available through the Mouse Firmware, which was built into the IIc but a rarely-installed option on the IIe. As a result, there were no interrupt-driven games for the 8-bit Apple II machines.
The complete lack of consistent frame rates and timing is a hallmark of Apple II gaming.
Conversely, screen interrupts were at the core of Commodore 64 games, and of the demoscene. And also of arcade machines, NES and the whole next generation of consoles and non-IBM computers.
So I wonder why it wasn't added. It wouldn't have been hard, exactly (enthusiasts for the TIKI-100, a Norwegian educational 8 bit, have gotten into the habit of repurposing the printer interrupt by means of a dongle in the printer port).
Was the idea that educational machines shouldn't be too game-friendly?
> So I wonder why it wasn't added. It wouldn't have been hard, exactly
The answer to the question "Why didn't Apple add X/Y/Z to the Apple II?" is that they did add those features, starting with the Apple III in 1980, and continuing with the IIc and IIgs.
The problem is that there was a 2-year window, between the release of the III and the explosion of the home/education market, during which Apple ignored the II and assumed sales of that quirky, obsolete system would dry up.
The IIe was designed within that window, and the skeleton crew of engineers who worked on the IIe did not have the green light to add significant new features. The only goal was to reduce manufacturing costs and maintain compatibility.
It wasn't until after the IIe was locked in that the Apple leadership began to realize the importance of the II within the suddenly booming home/education market, and only then did they put any significant resources back into the platform.
The IIc (1984) and IIgs (1986) were the result of those renewed efforts, but by that time the cat was already out of the bag. The IIe remained the most popular machine of the platform, and the "modern" features added to the IIc and IIgs were left unused by most developers and users.
The idea was that the original Apple II was made in 1977 as a game machine (Woz wanted to play Breakout in software), but there really wasn't much of a concept of what a "game machine" was back then; it was mostly a huge hack trying to get a minimal chip count.
Years later, when the C64 and IBM PC came out, the IIe was released, which did have vblank support, but Apple II devs were reluctant to break backwards compatibility.
You can still do a lot of cool games w/o vblank support. I'd say it barely makes the top 5 list of most annoying things about programming games on the Apple II.
IIe and IIc have a vsync status bit that you can poll, but IIRC the polarity was flipped between the two models and there are other quirks that can interfere with using it, so it was neither recommended nor popular.
It's more complex than that. I think the IIe and IIgs have the same register but with the polarity reversed; the IIc has a weirder interface that generally involves setting up an interrupt through the mouse firmware.
I don’t know how you remember that, considering it doesn’t... I’m as rabid a 6502 fanboy as you might meet (and have been programming them for some 40 years or so), but that’s not a competition it wins; all 6502 coders know that ;)
We think it does because usually the 6502 machines have oodles of (ab)usable hardware ;)
Per clock cycle, the 6502 could do a lot more than the Z-80. For instance, a 6502 can read from memory on every cycle, so it can execute one instruction while fetching the next. A NOP takes four cycles to complete on a Z-80 and only two on a 6502. The Z-80 also has a 4-bit ALU and has to operate on 8 bits in two passes. Finally, another very clever thing about the 6502 is its single-byte (zero page) addressing, which kind of gives it 256 “registers”. Of course, it has a single-byte stack pointer, which makes languages like C or Pascal a terrible match for it. Z-80s, on the other hand, have much more complex instructions, which makes Z-80 code denser - something very important in those days.
It slowed down to 1 MHz for I/O and Apple ][ compatibility.
I wouldn't call it a disaster - mainly a sales and marketing one, and that had a lot to do with the IBM PC coming out around the same time.
It was probably the most complex 6502 design, and it mainly consisted of discrete logic chips rather than the custom chips other manufacturers were starting to use. It had advanced features like an additional addressing mode to access up to 512K of RAM without bank switching. (Plus two-speed arrow keys.)
It was a disaster for a lot of reasons but not because it was a bad architecture.
It overheated, unseated chips, had a non-functional clock chip, and suffered other kinds of terrible quality control. It also had to compete against the IBM PC while Apple still hadn't even added lowercase input to the II+.
I love the mention of Mobius - I remember seeing a few caseless ones on folks’ desks when I was there in ‘88-‘90, and worked a bit with the OS designed for it, but there’s little concrete info about it on the web.