
A Graphical OS for the Atari 8-bit computers - rbanffy
http://atari8.co.uk/gui/
======
nickpsecurity
One of my research areas is security or trust enhancements in compilation.
Synthesis, too. Almost all of that work focuses on 32-64-bit targets due to
the limitations of smaller ones. Yet I keep seeing projects such as this and
SymbOS that blow my mind in what can be done with so little hardware. I think
the implementation techniques used in them probably contain plenty of wisdom
that could lead to improvements in embedded compiler and metaprogramming
research. Maybe it could even improve 32-64-bit systems, just by making better
use of resources.

We need a whole collection of articles on implementing HLL constructs and
algorithms effectively in 8-16-bit systems. Maybe 4-bit while we're at it, to
give these 8-bit show-offs a challenge. ;)

EDIT: Adding the interesting link below, which I found on 4-bit computing
courtesy of Wikipedia's article on 4-bit. The interesting part was how
manufacturers made the offering available to developers by invitation only. I
wonder if they've since transitioned to 8-bit.

[http://www.embeddedinsights.com/channels/2010/12/10/consider...](http://www.embeddedinsights.com/channels/2010/12/10/considerations-for-4-bit-processing/)

~~~
seibelj
That sounds like an interesting subject. Could you please link me to your
research or other relevant information?

~~~
nickpsecurity
I'll email you some of the work I've posted. My background is high-assurance
security engineering; dealing with the software lifecycle, compilers, etc. is
part of it. Meanwhile, I just posted an essay with references relevant to the topic
you're interested in:

[https://www.schneier.com/blog/archives/2015/09/understanding...](https://www.schneier.com/blog/archives/2015/09/understanding_s.html#c6705056)

------
beloch
Interesting observation: despite its graphical resemblance to the early Mac
OS, preemptive multitasking actually makes this OS substantially more advanced.

~~~
gress
With 30 years of additional development time, you'd expect some advances.

~~~
nickpsecurity
Haha yeah. This and SymbOS are what we might see in an alternative future if
all the chip fabs announced: "Ok. We tried to scale. The physics were just
impossible. You have to keep using the same hardware and just do more with
it."

Much more interesting than I thought they'd be, though. :)

~~~
zantana
Another example is here:
[https://www.youtube.com/watch?v=Z3-J2-VeoH8](https://www.youtube.com/watch?v=Z3-J2-VeoH8)
a port of Wolfenstein 3D. I wonder what people will be putting on Raspberry
Pis in 30 years? :)

~~~
nickpsecurity
If they had ported Wolf3D, that would be awesome. That was just a guy walking
around in a Wolf3D room. Lamest vid ever. Is there one of them actually
playing the game, with sound FX, shooting bad guys, etc.?

Nonetheless, the 8-bit stuff described here does make one wonder what people
will do with Raspberry Pis in 30 years. Might be really cool stuff. Two teams
give me a hint at where it might go, based on prior lightweight work:

[http://www.menuetos.net/](http://www.menuetos.net/)

[http://web.syllable.org/Syllable/index.html](http://web.syllable.org/Syllable/index.html)

They both run on limited hardware. Moving things like graphics and audio to
dedicated hardware, as many SoCs do today, will only expand what you can do
with minimal processors.

------
mietek
> _After a further six months of nagging doubts about the proprietary “window
> mask” technology I had designed, I decided to take the plunge and do what
> Jörn had suggested might lead to a considerable increase in rendering speed:
> namely, to abandon the window masks (which, it turned out, might as well
> have been called “regions”), and use a traditional dirty-rectangle window
> management system, as used in SymbOS._

This is pretty funny when you recall regions were a key part of the original
Macintosh system software — as described in “I Still Remember Regions”:

[http://www.folklore.org/StoryView.py?story=I_Still_Remember_...](http://www.folklore.org/StoryView.py?story=I_Still_Remember_Regions.txt)

See also “Busy Being Born”, currently on the front page:

[http://www.folklore.org/StoryView.py?story=Busy_Being_Born.t...](http://www.folklore.org/StoryView.py?story=Busy_Being_Born.txt)

[https://news.ycombinator.com/item?id=10188952](https://news.ycombinator.com/item?id=10188952)
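For what it's worth, the dirty-rectangle scheme the article's author switched
to can be sketched in a few lines of Python (the names and the `(x, y, w, h)`
rectangle format here are made up for illustration): record the screen areas
that changed and repaint only those, coalescing overlapping damage so one
redraw covers it.

```python
# Minimal dirty-rectangle bookkeeping: instead of masking each window,
# track which screen areas changed and redraw only those.

def union(a, b):
    """Smallest rectangle covering both a and b; rects are (x, y, w, h)."""
    x = min(a[0], b[0])
    y = min(a[1], b[1])
    right = max(a[0] + a[2], b[0] + b[2])
    bottom = max(a[1] + a[3], b[1] + b[3])
    return (x, y, right - x, bottom - y)

def overlaps(a, b):
    return (a[0] < b[0] + b[2] and b[0] < a[0] + a[2] and
            a[1] < b[1] + b[3] and b[1] < a[1] + a[3])

class DirtyList:
    def __init__(self):
        self.rects = []

    def mark(self, rect):
        # Coalesce with overlapping rects (a single pass is enough
        # for this sketch; a real manager might iterate to a fixpoint).
        merged = rect
        keep = []
        for r in self.rects:
            if overlaps(merged, r):
                merged = union(merged, r)
            else:
                keep.append(r)
        self.rects = keep + [merged]

    def flush(self):
        """Return the regions a renderer must repaint, then reset."""
        out, self.rects = self.rects, []
        return out

screen = DirtyList()
screen.mark((10, 10, 20, 20))   # a window moved, exposing this area
screen.mark((15, 15, 20, 20))   # overlapping damage merges with the first
screen.mark((100, 100, 5, 5))   # separate damage stays separate
```

The appeal over regions/masks is that the bookkeeping is a short list of
rectangles rather than per-window bitmaps, which matters on a 6502.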

------
z2
The site might be down right now...

Google text-only cache:
[http://webcache.googleusercontent.com/search?q=cache:http://...](http://webcache.googleusercontent.com/search?q=cache:http://atari8.co.uk/gui/&strip=1&vwsrc=0)

------
terinjokes
I just started getting into programming on the 6809, which looks to have been
designed by the same developers as the 6502. I wonder if in a few years I'll
be able to do the same :)

Edit: I'm reminded of OS-9, but I don't believe that matches the bullet points
from the article.

~~~
orionblastar
6809 was Motorola.

MOS hired engineers from Motorola who had made the 6800 chip and cloned it.
The MOS 6501 was pin-compatible with the 6800, and the 6502 changed two pins.
Motorola sued and won over the 6501 but not the 6502.

The 6809 was a 16-bit chip, IIRC. The TRS-80 CoCo used it. It was the original
chip to be used in the $600 Macintosh project until Steve Jobs took it over
and changed it to the 68000 CPU.

The 6800 cost $300 and the 6502 cost $30; guess which one the 8-bit video game
consoles and computers wanted to use?

~~~
protomyth
I'm pretty sure the 6809 is still considered an 8-bit processor. You could
treat two of the 8-bit registers as a single 16-bit one, but they were still
8-bit registers.

~~~
orionblastar
[https://en.wikipedia.org/wiki/Motorola_6809](https://en.wikipedia.org/wiki/Motorola_6809)

Apparently it is 8-bit with 16-bit features.

I had a friend in high school who upgraded from the TRS-80 COCO to the TRS-80
COCO2 and offered me his old computer. I already had a Commodore 64 and I told
him no, to give it to another friend who didn't have a computer yet. I should
have taken it so I could run OS-9 on it.

------
protomyth
I guess my Atari 400 won't run it (only upgraded to 48K; yeah, soldering), but
my 130XE looks good for it. I didn't know about the flash cartridge (or
Ultimate 1MB/Incognito). If the mouse uses the joystick port, how does it work
(as a paddle?)?

~~~
Turing_Machine
Doing preemptive multitasking (with hardware memory protection, more or less)
was actually not too hard on the 130XE (and the variants that added more
RAM... I had mine up to 320K at one point, and some people even had 1 or 2
MB).

The extra RAM on those machines appeared in 16K banks that could be switched
in and out by setting a control register.

Code in one 16K bank was totally isolated from code in other banks. Obviously
code that wasn't located in that part of the address space could access
anything that was in the currently switched-in bank, but there was no way for
one bank to directly access another.
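That banking scheme can be sketched as a Python toy model (the window address,
bank count, and class names here are illustrative; the real 130XE selects
banks through bits of a PIA control register rather than a method call):

```python
# Toy model of 16K bank switching: a 64K address space with a 16K
# window that can be pointed at one of several extended RAM banks.

WINDOW_BASE = 0x4000      # start of the switchable window (illustrative)
WINDOW_SIZE = 0x4000      # 16K

class BankedMemory:
    def __init__(self, num_banks=4):
        self.main = bytearray(0x10000)                 # 64K main RAM
        self.banks = [bytearray(WINDOW_SIZE) for _ in range(num_banks)]
        self.current = None                            # None = main RAM visible

    def select_bank(self, n):
        """Point the window at extended bank n (None = main RAM)."""
        self.current = n

    def _in_window(self, addr):
        return WINDOW_BASE <= addr < WINDOW_BASE + WINDOW_SIZE

    def read(self, addr):
        if self.current is not None and self._in_window(addr):
            return self.banks[self.current][addr - WINDOW_BASE]
        return self.main[addr]

    def write(self, addr, value):
        if self.current is not None and self._in_window(addr):
            self.banks[self.current][addr - WINDOW_BASE] = value
        else:
            self.main[addr] = value

mem = BankedMemory()
mem.select_bank(0)
mem.write(0x4000, 0xAA)   # visible only while bank 0 is switched in
mem.select_bank(1)
# Bank 1 starts zeroed: bank 0's data is unreachable from here, which
# is the isolation between banks described above.
```

Code running outside the window sees whichever bank is currently selected,
but code confined to one bank has no way to address another: the "memory
protection, more or less" of the parent comment.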

I wrote a primitive multitasking kernel (nowhere near as elaborate as this)
using a C package that simulated a 16-bit machine by using multiple bytes of
Page 0 as pseudo-registers. You just had to save those off before switching
banks and jumping back to the other task.

It actually worked reasonably well, but not fast enough to be practical.

Yeah, I'm old.

------
ddingus
Here is the development thread, starting in 2009.

[http://atariage.com/forums/topic/154520-new-gui-for-the-
atar...](http://atariage.com/forums/topic/154520-new-gui-for-the-atari-8-bit/)

------
yitchelle
I wonder if this is a stock 6502 running this OS? Anyone know?

~~~
dsp1234
_In addition to this, Jörn convinced me that pre-emptive multitasking was
possible on the 6502, so in May 2014, I began the arduous task of converting
the existing code to run from a bank-switched cartridge, while simultaneously
designing the pre-emptively multitasking kernel._

 _What is especially heartening at this stage is that the 8-bit Atari can
actually run a pre-emptive scheduler, and run it well – in spite of the 6502’s
fixed stack._

Looks like it is.

~~~
alain94040
In 1987 I published a preemptive multitasking system for the 6502[1] (Apple
II). If I remember properly (it's only been 30 years or so), the stack is
fixed at $100, so all you need to do on a context switch is save the stack
somewhere else and restore the other task's. That's 256 bytes to move around.
I used the mouse card as an
interrupt source (video refresh) to get a regular source of interrupts.

[1] I can't find any trace online, except for some weird Word document of what
must have been the original README file.
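The context-switch trick described above can be simulated as a Python toy
model (the `Task`/`Machine` structure is invented for illustration; on real
hardware this would be a copy loop in the interrupt handler). The 6502's
hardware stack always lives in page 1, so swapping tasks means swapping that
256-byte page along with the stack pointer:

```python
# Toy model of a 6502-style context switch: the hardware stack lives at
# $0100-$01FF, so switching tasks means copying that 256-byte page (plus
# the stack pointer) out to the old task and a saved copy back in.

STACK_BASE = 0x0100
STACK_SIZE = 0x0100

class Task:
    def __init__(self):
        self.saved_stack = bytes(STACK_SIZE)  # saved copy of page 1
        self.saved_sp = 0xFF                  # 6502 SP starts at top of page

class Machine:
    def __init__(self):
        self.ram = bytearray(0x10000)
        self.sp = 0xFF

    def push(self, value):
        # 6502 push: store at $0100+SP, then decrement SP (wraps in page 1).
        self.ram[STACK_BASE + self.sp] = value
        self.sp = (self.sp - 1) & 0xFF

    def context_switch(self, old, new):
        # Save the outgoing task's stack page and stack pointer...
        old.saved_stack = bytes(self.ram[STACK_BASE:STACK_BASE + STACK_SIZE])
        old.saved_sp = self.sp
        # ...and restore the incoming task's.
        self.ram[STACK_BASE:STACK_BASE + STACK_SIZE] = new.saved_stack
        self.sp = new.saved_sp

m = Machine()
a, b = Task(), Task()
m.push(0x42)               # task A pushes something
m.context_switch(a, b)     # switch to B: A's stack page is tucked away
m.push(0x99)               # B's pushes don't disturb A's saved stack
m.context_switch(b, a)     # back to A: its stack page reappears intact
```

The 256-byte copy is the whole cost; with a regular interrupt source driving
it, that is enough for a simple round-robin preemptive scheduler.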

------
analognoise
Is the code up somewhere? It sounds very innovative and I'd like to check it
out.

