
“No Man’s Sky” Displayed on the Amiga 1000 - doener
http://www.bytecellar.com/2018/03/14/a-planetary-anachronism-no-mans-sky-beautifully-rendered-on-the-amiga-1000/
======
unwind
I agree that this is not really "rendering" the image on the Amiga; it's
merely using the Amiga's display hardware, and no processing happens on
the machine. Converting the image to HAM is a bit like (re-)rendering, but
that's a one-time operation done elsewhere.

Anyway, the HAM converter used sounds interesting (ham_convert [1]); it does
various optimization/search passes to try to generate good HAM.

There's similar code in gimpilbm, the ILBM plugin for the GIMP [2], but not
quite as advanced. Perhaps something worth looking at and adding. Disclosure:
I happen to be the maintainer of gimpilbm.

1:
[http://mrsebe.bplaced.net/blog/wordpress/?page_id=374](http://mrsebe.bplaced.net/blog/wordpress/?page_id=374)
2: [https://github.com/unwind/gimpilbm](https://github.com/unwind/gimpilbm)

~~~
paulmd
HackerNews: may contain up to 25% "posts whining about titles" by weight.

------
exogeny
The OCS Amigas were amazing, way-ahead-of-their-era machines. I can't tell you
how many times, in need of inspiration or just to be amazed for five minutes,
I'd load up a demo from Sanity or Phenomena and just let it all sink in - the
code, the graphics, the music, all in Assembly and all from pure imagination.

I'm being nostalgic - and super preferential to the things that I enjoy - but
nascent and emerging technology for me seems to have always generated the most
exciting art. The demoscene from the early days of the PC, the ANSI/artzine
scene from the early days of BBSes and telecommunication, the personal home
pages from the early days of the web.

I'm going back and forth about whether or not I see the same art in today's
emerging tech. I think I see where the tech is going and I'm excited by it,
but I don't think I see the art as much. Happy to be convinced otherwise!

~~~
sundvor
Ah, Phenomena - that reminded me of one of my favourite demos back then:

[https://www.youtube.com/watch?v=iGpU3DicbLQ](https://www.youtube.com/watch?v=iGpU3DicbLQ)

I was a bit involved in the Norwegian Amiga scene back then. Nothing
major, but very formative - and those were good times.

~~~
blakespot
Enigma is one of the very best demos I've ever experienced on any platform.
Pygmy Project's Extension is another Amiga favorite of mine. I've never lost
my love for the demoscene, but I far prefer new demos on retro systems over,
say, Windows 10-based 3D demos. Very much so.

------
fake-name
What part of this is rendering? It's just file format conversion, with an
oddball output format.

If they had actually run the No Man's Sky engine on the Amiga, even at 1
frame per hour, /that/ would actually be impressive.

~~~
jacobush
OK.

 _"No Man's Sky" viewed through the lens of old technology._

Now go enjoy the beauty.

~~~
ainiriand
I find it a bit sad that the beauty has to be handed to you directly. I will
always remember the first Amiga games as the most beautiful graphics
I've ever seen; imagination completed the part that the technology missed.

This 'lens of old technology' as you call it made the dreams of many come
true, without 4k or anything. The beauty is in your mind, not in the amount of
pixels that reach your corneas.

~~~
jacobush
Oh, I am with you 100%. My go-to example was flight simulators. The plain
colors, filled vectors, no shading, no textures, fixed palette were the most
beautiful to me. I don't think even the best graphics now beat the
imagination of that. The same goes for 8-bit NES Mario. The pixelness is
great, even compared to the latest Mario, which is a really, really good game.

------
bhaak
I was surprised to see that the images were 320x400. I completely forgot that
this resolution existed and supported HAM6.

I converted the images with imagemagick and it really achieves a retro SF
look, e.g.
[https://bhaak.net/nms/3_output.png](https://bhaak.net/nms/3_output.png),
[https://bhaak.net/nms/9_output.png](https://bhaak.net/nms/9_output.png)

Although I think these images might be a bit better than how they would be
shown on an original Amiga. I don't see any of the typical HAM6 artefacts one
would expect.

~~~
jacobush
It's because 400 is interlaced, and thus it's really the same as 200, only
with every other line shifted half a line's width to the left (or right, I
don't think it makes a difference).

~~~
dzdt
Interlace displays every other field shifted half a raster line _down_ -- it
doubles the vertical resolution not horizontal.

~~~
jacobush
... down, right, whatever. "Delaying" it half a line might have been better
worded to describe what's happening.

------
continuational
This article is rendering No Man's Sky on the Amiga 1000 in the sense that
looking at a photo of a beautiful landscape is the same as visiting it.

I.e. it just displays a slideshow of screenshots on the Amiga 1000, which
might as well have been vacation photos.

------
nateguchi
Could someone explain why there is such a cult following behind these old
Amiga machines? I see them mentioned a lot on HN and would like to understand
the interest...

~~~
drblast
Imagine that tomorrow a company started selling a game console that does VR
perfectly. No lag, fully immersive, it just blows everything else out of the
water. And it's cheaper than a PlayStation, although sadly few people buy the
thing and the company completely mismanages the sales and marketing and it's
ten years before another consumer device comes close.

That's what the Amiga was like in the 80's.

~~~
pavlov
Just to clarify for those not familiar with the Amiga, it wasn't a game
console but a fully featured computer with a custom OS that supported
preemptive multitasking a decade before Windows and Mac did. Graphics hardware
was just one part of what made the total package so innovative.

~~~
Flow
I must say I like the static priority of the threads in the AmigaOS.

On the fat iMac I use, doing some video conversions in the background makes it
impossible to play TF2 at the same time.

------
brador
Could you do this in real-time with a post-process shader on PC?
shadertoy.com?

~~~
boomlinde
I don't think fragment shaders are suitable for this, since the HAM modes are
very stateful. In HAM6 for example, a screen is composed of a bunch of 6-bit
instructions encoding one of four operations each with four bits of data. You
can either set the color directly to any of the colors from a 16-color
palette, or you can use the color of the previous pixel, modifying one of its
RGB channels.

It's similar with HAM8, but in order to use more of the 24-bit color space
the instructions are 8 bits wide with six bits of data; you can only modify
the six most significant bits of a color channel, and the direct-set
operation uses a 64-color palette instead.
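Put concretely, the HAM6 scheme described above amounts to a small state
machine. Here's a minimal decoder sketch in Python (the function name and
the 4-bit-per-channel RGB tuples are my own illustration, not from any real
codebase):

```python
def decode_ham6(pixels, palette):
    """Decode a scanline of 6-bit HAM6 values into (r, g, b) tuples.

    Each 6-bit value is a 2-bit control field plus 4 bits of data:
      00 -> set the color from the 16-entry base palette
      01 -> keep previous color, replace the blue channel
      10 -> keep previous color, replace the red channel
      11 -> keep previous color, replace the green channel
    """
    out = []
    r, g, b = palette[0]  # each line starts from the background color
    for p in pixels:
        ctrl, data = p >> 4, p & 0xF
        if ctrl == 0b00:
            r, g, b = palette[data]  # direct set from the palette
        elif ctrl == 0b01:
            b = data                 # hold r and g, modify blue
        elif ctrl == 0b10:
            r = data                 # hold g and b, modify red
        else:
            g = data                 # hold r and b, modify green
        out.append((r, g, b))
    return out
```

Each output pixel depends on the one before it, which is exactly the
statefulness that makes a per-pixel fragment shader awkward here.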

~~~
dahart
FWIW, you can do plenty of stateful things on shadertoy, as long as feedback
and multiple buffers are allowed. There are complete interactive games on
shadertoy. I think it wouldn't be hard to render HAM in a fragment shader. If
you're okay with very non-optimized colors, of course, you can use the direct
set instructions only, and skip the neighbor modify instructions. But I've
done several shaders on shadertoy with neighbor-modify type functionality,
including a Floyd-Steinberg like dither, it just takes multiple frames.

~~~
boomlinde
Sure, you can maintain state in a texture and iteratively re-apply the shader
to it. The problem with HAM is that it potentially requires many iterations.
Consider the worst-case scenario of an input resulting in a single set-color
operation followed by (width × height − 1) modify operations. Not too bad for a
simple stateful loop on the CPU, always keeping the last color in a register,
but with a shader re-applied once per frame to 320x200 pixels you'd be waiting
for ~20 minutes for the complete result at 60 fps.
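The ~20 minute figure checks out as a back-of-the-envelope calculation
(Python, purely illustrative):

```python
# Worst case: one set-color op, then a dependent modify op for every
# remaining pixel, resolved at one shader iteration per displayed frame.
width, height, fps = 320, 200, 60
iterations = width * height - 1      # 63,999 dependent modify ops
minutes = iterations / fps / 60
print(f"{minutes:.1f} minutes")      # prints 17.8 minutes
```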

Then there's the problem of selecting a useful base palette, if you want
consistently decent results regardless of the input. That's maybe the kind of
optimization problem that OpenCL can be applied to.

So I don't mean to say unsuitable as in impossible, just in the sense that
it's exactly the kind of problem that the shader model wasn't designed to
solve.

~~~
dahart
I don't disagree with the idea that ShaderToy wasn't designed for this, or
with the idea that it'd be harder to do in a fragment shader than a CPU
program. But fragment shaders weren't designed for most of the things you see
on ShaderToy. ;) The question is whether it was possible in real-time, and I
still think it is.

I'm not very interested in the worst case; I think a HAM image with width *
height modify ops is _very_ unlikely. I'd be surprised if such a HAM image
has ever existed, outside of perhaps a contrived programmer's example.

The average case for a direct HAM implementation would probably be more like
hundreds of frames, or a couple of seconds, not tens of thousands.

If a modified HAM is on the table, the worst and average cases could be
improved considerably. I modified Floyd-Steinberg (which is worst case) so
that it runs in more like 10-20 frames, by making it radially symmetric and
limiting the propagation distance, which results in negligible differences but
vastly improved run times.

The palette optimization is also possible in a fragment shader, not quite as
easily as OpenCL, but possible.

If the question were about transcoding a video stream in real time, that
would be harder. If the optimized palette is fixed, it will be easier. Either
way, I wouldn't rule it out. But I would agree it's "hacking" in the sense of
abusing things they weren't designed for.

~~~
boomlinde
> I'm not very interested in the worst case, I think a HAM image with width *
> height modify ops is very unlikely.

Sure, but maybe not as far off in some real-world cases. Say,
black-but-not-perfectly-black letterbox bars, where the nuances could result
in long sequences of small alterations to the base black color. This becomes
less likely with an optimized palette, and of course you could set a hard
limit on the length of modify sequences in conversion, though that would
compromise the quality of the conversion.

> The average case for a direct HAM implementation would probably be more like
> hundreds of frames, or a couple of seconds, not tens of thousands.

Then the problem, with respect to the original question, is the definition of
real-time. For the purpose of a game like No Man's Sky for example I'd think
hundreds or even tens of frames would be unacceptable.

Don't get me wrong, I love technology applied to the "wrong problem" and see
that as an end in itself sometimes :). I'm just considering the feasibility of
accurate HAM in realtime using shaders, and am really enjoying the discussion
on what compromises have to be made for it to perform adequately in a shader.
But with respect to the original question, I can't quite bring myself to say
"yes".

~~~
dahart
> Then the problem, with respect to the original question, is the definition
> of real-time.

Yeah, true. But there are two factors here that still meet a reasonable
definition of real-time, which is >= 30fps in my book. First, a predictable
number of frames of latency still allows 30fps throughput. Second, this
encoding can be done in multi-pass batches in a single frame. One sweep of
modify ops doesn't need to wait for a full frame before the next sweep can
run. This is very common on ShaderToy, where people build dependent buffer
chains. It's perfectly reasonable (outside ShaderToy) to do 10 sweeps in one
frame cycle. I don't know what the limit would be on a standard GPU
these days; it might be much higher. Consider that shaders on ShaderToy
routinely do several-hundred-tap texture samples per pixel, and/or several
hundred ray-marching iterations per pixel. The amount of work the GPU can do
in real time now is incredible.

You can combine the in-frame sweeps with cross-frame passes, and you only
lose latency, not throughput. There's a good chance you can also unroll once
or twice within a single pass. If you unrolled once, had 10 passes per frame,
and a 10-frame latency, you could in theory get a modify radius of 200
pixels, while maintaining 30fps throughput with 10 frames of latency,
assuming I haven't taxed out the GPU at this point.

So, you're right, it would be silly & hard to do this, but I still think HAM
encoding could probably be done in real time, with perhaps some compromises or
quality limitations to keep the encoding predictable, or at least have a known
reasonable upper bound.

------
tombert
I really wish that the Amiga had caught on in the US. Things like this never
cease to impress me. It feels like the Amiga was light-years ahead of its PC
counterparts; my dad's 1985 IBM PCjr could never display a picture that
well (as far as I know).

------
AJRF
Thanks for sharing, OP. This is a lovely little blog in general, not just the
main post; glad it popped up on my radar, as I hadn't heard of it before.

~~~
blakespot
Thank you, and you're welcome.

------
jlebrech
We should all be on Amigas right now; Commodore just sucked at marketing.

~~~
KozmoNau7
The Amiga architecture was very limited in regards to 3D (or pseudo-3D), so
was pretty much left in the dust when Doom came out.

Very awesome at 2D, though.

~~~
jacobush
Doom was not very 3D; it just needed to address 1 byte = 1 pixel, which VGA
allowed. That would have been trivial to fix in the mythical AAA chipset,
which was scrapped, then replaced years later with the slightly better AGA
chipset in the A4000.

Commodore was in a (brief) position to destroy Apple and ridicule IBM, but
the execs preferred to "Toys-R-Us" it and take the bonuses.

~~~
DerekL
You've identified a major reason the Amiga failed. The chipset was originally
great, but Commodore should have spent the money to keep up with Moore's Law.
Something like AGA should have arrived in 1988, not 1992, and with a blitter
four times as powerful as OCS's, not just twice.

------
tnolet
This has nothing to do with rendering a game engine on an Amiga.

~~~
TuringTest
Rendering the game engine was not mentioned in the title.

