IMHO Commodore was right to kill the C65 when they were already making the Amiga 1000. The end of all the 8-bit systems was pretty obvious by '89, when rumors of the C65 started going around - the Mac had been out for five years, the Amiga and Atari ST for four. It would have suffered a fate almost as ignominious as the Sega Saturn.
The Amiga would have been more responsive and really just as powerful with the same custom chipset tied to something like a 65816 clocked at 8 MHz. It would have been able to address just as much RAM but with higher interrupt responsiveness, and Commodore could have made use of its existing 6502 expertise, thrown in a couple of SID chips, and maybe even a VIC-II for C64 compatibility. What a machine that would have been!
Likewise, I think Tramiel could have done something similar instead of diving down the 68000 path with the Atari ST. They could have improved on the excellent Jay Miner-designed A8 chipset but tied it to something like the 816, which was just coming out in '84 when they started the ST project.
Just some fantasy alternative history :-)
The Atari ST was built from scratch in less than 8 months after the deal to license the Amiga chipset fell through. The ST is an amazing engineering feat considering the insane schedule.
I agree with you that the 68k is a terrible CPU for gaming, but it got better once some cache was added, starting with the 68030.
I had an Acorn Archimedes with an ARM2 clocked at 8 MHz (the same speed as the Atari ST, and 1 MHz faster than the Amiga 1000/500), and it was so much faster than my ST.
Are you sure? With memory costing hundreds of dollars per MB in the mid-80's, I remember thinking a 24-bit address bus was plenty. (I actually don't think I personally had a computer with enough RAM to need more than 24-bits of physical addressing until the mid-90's.)
Of course that was probably pretty unlikely, considering that Acorn/Olivetti bet so heavily on the education market.
According to https://en.wikipedia.org/wiki/Instructions_per_second#Millio...
the ARM2 at 8MHz was 4MIPS. The 386DX at 33MHz was 4.3MIPS. The 8MHz 68000 was 1.4MIPS.
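Normalising those figures per clock makes the gap clearer. A quick back-of-the-envelope using the MIPS numbers quoted above:

```python
# MIPS-per-MHz from the figures quoted above (Wikipedia's
# instructions-per-second table). Higher = more work per clock tick.
cpus = {
    "ARM2 @ 8 MHz":   (4.0, 8),
    "386DX @ 33 MHz": (4.3, 33),
    "68000 @ 8 MHz":  (1.4, 8),
}

for name, (mips, mhz) in cpus.items():
    print(f"{name}: {mips / mhz:.3f} MIPS/MHz")
# ARM2: 0.500, 386DX: 0.130, 68000: 0.175
```

Per clock, the ARM2 got roughly 2.9x the 68000's throughput; interestingly, the 386DX actually comes out below the 68000 per clock and won on raw clock speed.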
Of course there could be other things that made it hard, such as too few bus cycles when running the 8-bit chunky graphics mode? Or that sound did not use DMA (I don't know if the Archimedes had sound DMA), or that the CPU had to mix samples instead of the sound hardware doing it?
This was the gold standard at the time.
I wonder what the connection is.
OTOH, the square grid terrain can be traced to at least https://en.wikipedia.org/wiki/Return_Fire although that was a sequel to a completely different game
You may be overstating the death of the 68000... the 68060 was introduced in 1994, and I remember doing embedded work on 68000 based designs in 1999-2000. (In fact, some of the processors are still available, although not recommended for new designs.)
I still like the 68k. It has a very nice instruction set. It makes a great target for C compilers. I still enjoy writing assembler for it here and there. I like my Atari STs.
But for the market that Atari and Amiga were going for, I am still not convinced it was appropriate. I think Apple made the right choices with the IIgs. Except they underclocked it, overpriced it, and handicapped it in favour of promoting the Macintosh despite the II line being a big profit centre for them.
What was the problem with Amiga responsiveness? Never heard about this. Interrupt response? What?
For several years, it had the best games. The bitplane-based graphics system became a hindrance with FPS-games but that was never about the CPU.
Also, as far as 68000-based machines goes, the X68000 kicks the Amiga's ass.
Which games do you think were constrained by the CPU? I can think of other factors first:
- sloppy ports
- it was a more complex system to tame than most of those that preceded it
- the sprites were simply too limited, so you had to use the Blitter a lot
- unlike a lot of chips of the era, Denise had no tile or character generator so, again, you had to use the Blitter
- the CPU was clocked at 7.14 MHz to align with the PAL/NTSC clock even if it could run at 8 MHz
- if you used more than four bitplanes in low resolution, the Agnus chip would let Denise starve the CPU even on cycles where the latter would normally have access (if you set the right bit, the Blitter could be allowed to win, too)
I don't see how a different CPU would have helped with any of the above. In a time period during which memory wasn't typically the limit, the Amiga was one of the first systems where the CPU starved for it.
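On the clock point above: the CPU clock was derived from the video colour clock, so the exact figure differs between NTSC and PAL machines, and the commonly quoted ~7.14 MHz is really a round number sitting between the two. A quick check of the arithmetic (assuming the standard colourburst subcarrier frequencies):

```python
# Amiga CPU clock is derived from the video colour subcarrier:
# 2x the NTSC colourburst, or 1.6x the PAL colourburst.
ntsc_burst = 3.579545    # MHz, NTSC colour subcarrier
pal_burst  = 4.43361875  # MHz, PAL colour subcarrier

ntsc_cpu = 2.0 * ntsc_burst   # ~7.159 MHz on NTSC machines
pal_cpu  = 1.6 * pal_burst    # ~7.094 MHz on PAL machines
print(f"NTSC: {ntsc_cpu:.5f} MHz, PAL: {pal_cpu:.5f} MHz")
```

Neither figure is exactly 7.14 MHz, but both are close to it, which is presumably where the usual shorthand comes from.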
The X68000 was better, sure, but it came out later and, being in the same mold as the arcade systems of the era, it had dedicated video RAM, tiles and more powerful, actually useful sprites.
As for your statement on "many of the 8-bit machines": of the 8-bit home computers, only the C64 had good 50/60fps shooters. The 8-bit consoles had better graphics hardware than the C64 and sometimes had the better games.
Properly implemented Amiga games were not that many though. It was hard to utilize the machine and it required good graphic artists to make it look fresh and sharp with the limitations of dual playfields.
But take a look at these games; I think they all run on a 512 KB chip-mem-only OCS Amiga.
Mega Typhoon: https://www.youtube.com/watch?v=zk3LNdPTnMw
And of course the R-Type and Turrican series. Those were 12 or 25fps on the ST but silky smooth 50fps on the Amiga 500. Especially Mega Typhoon looks like a Raiden clone. It's very hectic, lots of bullets and big enemies and snake formations. The only thing that stands out is the lack of colors. I suspect this is because of using dual playfield and that limits the number of colors to 8+8+sprite colors.
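The colour suspicion checks out, at least arithmetically: OCS dual playfield splits the six low-res bitplanes into two groups of three, so each playfield gets 2^3 colours. A quick check:

```python
# OCS low-res allows up to 6 bitplanes; dual playfield splits
# them evenly between the two playfields.
bitplanes = 6
per_playfield = bitplanes // 2   # 3 bitplanes each
colours = 2 ** per_playfield     # colours per playfield

print(f"{colours}+{colours} colours, plus sprite colours")  # 8+8
```

Versus 2^5 = 32 colours for a single 5-bitplane playfield, so the trade for two independently scrolling layers is steep.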
I'm an old demo coder (C64 and Amiga) and was very picky about framerate back then. The only shoot-em-ups I liked that were not 50fps were Battle Squadron and Wings of Death.
And then you talk vaguely about responsiveness. Do you mean the number of cycles spent by the machinery to process an IRQ? I never saw that as a problem. On the Atari ST they use timer IRQs to make rasterbars in demos and games. Seems to me they would not do that if the cost was too high.
The Amiga mouse pointer was a sprite. I never saw a delay on that. It felt like a game character. At most 1 frame delay. And dragging Amiga "screens" up and down was usually only one frame after the sprite movement. But that's logical, since the system copperlist has to be updated, and that necessarily adds a frame of delay.
Again, what "responsiveness"? There doesn't seem to be any problem with that on Amiga other than what you say.
Two different computers from two different visions...
My take on it, in a nutshell, is really to come up with a bridge between the C128 and the Amiga 500, within the design restrictions of the time. At least as much as I can with the resources I have. I won't go about creating a new mold... Cheers!
While it might not have been as easy to see /in/ '89, in hindsight the seeds for the rise of the Intel x86 chips (and the crowding out of other architectures) were already well sown and growing rapidly by then:
i386: released Oct 1985 (https://en.wikipedia.org/wiki/Intel_80386)
i486: released April 1989 (https://en.wikipedia.org/wiki/Intel_80486)
And the consensus was, probably it would have looked like the Acorn Archimedes. Acorn really got it right, and they got it right shockingly early.
I'm going to steal that phrasing.
- Igor Stravinsky
I know this comes up a bit too often on HN but it's pretty, um, glaring, in this case.
Naïvely, I would assume that something like Forth or Occam would be much better for this, and both are also age-appropriate.
Not even close.
BASIC was a very powerful language. Not "powerful" in the way we think of languages today with OOP and all that, but powerful in its flexibility. Each machine had its own version of BASIC that made the most of each machine's unique capabilities.
Games were probably the minority of BASIC programs. BASIC was a serious language for serious business programs, especially in sales and accounting.
If your needs were scientific, you went with FORTRAN. If your needs were hardcore business, you went with COBOL. If your needs were academic, there was Lisp and a bunch of others. But BASIC was the common language that almost every computer had available.
Huge companies managed inventory with BASIC. Transit timetables were calculated in BASIC. Machine control, non-mainframe astronomy, specialized journalism applications, record-keeping, and dozens of other needs were handled well by programs written in BASIC.
The first program I ever sold commercially was essentially a single-user Salesforce for the Commodore 64 tailored for limousine companies. I wrote it in BASIC.
If you think BASIC was only used for games, that's a reflection of your limited experience, not of the limitations of BASIC.
Seriously, there was no support for what made the C64 the C64 in C64 BASIC! You wanted graphics? PEEK and POKE. Sound? PEEK and POKE.
That said, I didn't have any trouble doing graphics on the 64 with PEEKs and POKEs that was good enough to get one of my screens on the cover of Run Magazine. It wasn't easy, but once you got your brain around it, I remember it being pretty fun.
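For anyone who never did it: PEEK and POKE were just reads and writes into the 6502's flat 64 KB address space, where the VIC-II and SID registers were memory-mapped. Here's a toy Python sketch of the idea (the register addresses are the real C64 ones; the byte array obviously stands in for the actual hardware):

```python
# Toy model: the C64's address space as a flat 64 KB array.
# On real hardware, writing these addresses hit chip registers.
memory = bytearray(64 * 1024)

def poke(addr, value):
    memory[addr] = value & 0xFF

def peek(addr):
    return memory[addr]

# Real C64 memory-mapped registers:
BORDER_COLOUR = 53280  # $D020, VIC-II border colour
BG_COLOUR     = 53281  # $D021, VIC-II background colour
SID_VOLUME    = 54296  # $D418, SID master volume (low nibble)

poke(BORDER_COLOUR, 0)   # black border
poke(BG_COLOUR, 6)       # blue background
poke(SID_VOLUME, 15)     # master volume to max
print(peek(BORDER_COLOUR), peek(BG_COLOUR))  # 0 6
```

In BASIC it was the same thing one statement at a time: POKE 53280,0 and so on, with every magic number looked up from the Programmer's Reference Guide.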
Sound, however... no argument there. But that could be because I've never been musical in any way whatsoever. Never understood notes and scales and such.
You might not know it, but that is where Microsoft made its initial $
That's the book I learned to program from, a compilation of games in BASIC published in 1973. I don't know the truth of the GP's claim but reportedly that book was very influential so there might be something to it.
(I didn't downvote you, and I have upvoted your post.)
I learned C on my C64, in fact. A skill I still use 32 years later.
you can download the manual courtesy of archive.org: https://archive.org/details/Super-C_1986_Abacus
It didn't support the entire K&R standard though. No function pointers.
mine came from a small software store that was a bed store the last time I drove by.
BASIC had a couple huge advantages in that it ran in minimal memory and had a reasonable infix syntax. The infix syntax was a particular advantage in educational scenarios where it matched (roughly) with the way math was taught. (Which was never in pre or postfix notation.)
The cross-platform nature of BASIC was also useful. If you knew one microcomputer, it made it more likely that you could sit down at another and get it to do useful things.
It's easy to second-guess, but even in hindsight I think BASIC was a reasonable choice for the time, goals, and hardware limitations.
Also, Occam?! That is a concurrent programming language... I'm not sure it would have done too well on the machines of the day.