“No Man’s Sky” Displayed on the Amiga 1000 (bytecellar.com)
142 points by doener on March 16, 2018 | 55 comments



I agree that this is not really "rendering" the image on the Amiga; it's merely using the Amiga's display hardware, and no processing is happening on the machine. Converting the image to HAM is a bit like (re-)rendering, but that's a one-time operation and done elsewhere.

Anyway, the HAM converter used sounded interesting (ham_convert [1]); it does various optimization/search passes to try to generate good HAM.

There's similar code in gimpilbm, the ILBM plugin for the GIMP [2], but not quite as advanced. Perhaps something worth looking at and adding. Disclosure: I happen to be the maintainer of gimpilbm.

1: http://mrsebe.bplaced.net/blog/wordpress/?page_id=374

2: https://github.com/unwind/gimpilbm


HackerNews: may contain up to 25% "posts whining about titles" by weight.


The OCS Amigas were amazing, way-ahead-of-their-era machines. I can't tell you how many times, in need of inspiration or just to be amazed for five minutes, I'd load up a demo from Sanity or Phenomena and just let it all sink in - the code, the graphics, the music, all in Assembly and all from pure imagination.

I'm being nostalgic - and super preferential to the things that I enjoy - but nascent and emerging technology for me seems to have always generated the most exciting art. The demoscene from the early days of the PC, the ANSI/artzine scene from the early days of BBSes and telecommunication, the personal home pages from the early days of the web.

I'm going back and forth about whether or not I see the same art in today's emerging tech. I think I see where the tech is going and I'm excited by it, but I don't think I see the art as much. Happy to be convinced otherwise!


I'm with you - and on the subject, having been exposed to a hell of a lot of iOS apps, I kind of think the walled garden has, in many ways, both promoted the 'weird-art' aspect and limited it.

Promoted: there are now more apps out there than ever before, and more app developers, and more app artists. And they're doing interesting and cool things - most of the top 10 games of the last few years have been artistically appealing pieces, which draw you in with a screenshot and keep you engaged with mechanics.

Hindered: I really, really don't want to see yet another color palette chosen to burn itself into my retina and shock me into clicking the one icon, among hundreds of others, that stands out the most. I can't bear to see another candy coating or amber/cyan highlight again. But, yet .. still they come. Endless streams of cutesy dreck.

That said, I still think the most interesting game/developer art is coming from the indie gaming/open-source world. There's something about, for example, the benign rag-doll look and feel of a game like "Human, Fall Flat" that draws me and the kids back into it. It feels like the good ol' days of gaming, and it's something I'm sure has an Amiga heritage, somehow. I don't know if there were really opportunities made for the average mobile App Store to entertain such titles ..


Ah, Phenomena - that reminded me of one of my favourite demos back then:

https://www.youtube.com/watch?v=iGpU3DicbLQ

I was a bit involved in the Norwegian Amiga scene back then. Nothing major, but very formative - and those were good times.


Enigma is one of the very best demos I've ever experienced on any platform. Pygmy Project's Extension is another Amiga favorite of mine. I've never lost my love for the demoscene, but I far prefer new demos on retro systems over, say, Windows 10-based 3D demos. Very much so.


The demoscene is the one true free art form of the digital age. Not beholden to corporate interests, not generally responsive to attempts to enact prior restraint, not riddled with people who feel it is their duty to police the content of others.


I think you might enjoy this: http://www.brutalistwebsites.com/


Here's a quotation you might like from Andy Hertzfeld, who helped write the original MacOS (official employee title: "Software Wizard").

"The Mac team had a complicated set of motivations, but the most unique ingredient was a strong dose of artistic values. First and foremost, Steve Jobs thought of himself as an artist, and he encouraged the design team to think of ourselves that way, too. The goal was never to beat the competition, or to make a lot of money; it was to do the greatest thing possible, or even a little greater. Steve often reinforced the artistic theme; for example, he took the entire team on a field trip in the spring of 1982 to the Louis Comfort Tiffany museum, because Tiffany was an artist who learned how to mass produce his work."

Source: https://www.folklore.org/StoryView.py?project=Macintosh&stor...


I remember there being some demos burned onto a CD in the early 90's. They were really impressive. I didn't grasp at the time how much work went into them.

I think one of the things that makes this stuff work is that they have constraints and hold themselves to being creative within those constraints. Contrast that with the processing power an average developer has today, which seems nearly unbounded. That's one of the things I'm nostalgic about in games and demos from yesteryear.


You might enjoy http://codetapper.com/amiga/sprite-tricks/, which talks through how some games use the Amiga's graphics hardware to achieve interesting effects - I found it really interesting!


re: Art in emerging tech. Though "VR" has arguably been emerging for many years, I do occasionally see glimmers of what we once saw in the past when I load up random apps for the HTC Vive. That community still feels like it produces some very artistic work from time to time.


What part of this is rendering? It's just file format conversion, with an oddball output format.

If they actually ran the No Man's Sky engine on the Amiga, even if it was 1 frame per hour, /that/ would actually be impressive.


The Amiga had a rather interesting program called Vista Pro. All it did was take a seed value and some tweakable values and use that to render a semi-realistic looking landscape. It did indeed take quite a while to render a single frame, but you could batch them up and bring them together to make a fly-over animation.

https://en.wikipedia.org/wiki/VistaPro

I spent ages playing with this piece of software. Trivial by today's standards perhaps, but a window into what was possible.


OK.

"No Man's Sky" viewed through the lens of old technology.

Now go enjoy the beauty.


I find it a bit sad that the beauty has to be handed to you directly. I will always remember the first Amiga games as having the most beautiful graphics I've ever seen; imagination completed the part that the technology missed.

This 'lens of old technology', as you call it, made the dreams of many come true, without 4K or anything. The beauty is in your mind, not in the number of pixels that reach your corneas.


Oh, I am with you 100%. My go-to example was flight simulators. The plain colors, filled vectors, no shading, no textures, and fixed palette were the most beautiful thing to me. Even now, I don't think the best graphics beat the imagination of that. The same goes for 8-bit NES Mario. The pixelness is great, even compared to the latest Mario, which is a really, really good game.


I was surprised to see that the images were 320x400. I completely forgot that this resolution existed and supported HAM6.

I converted the images with imagemagick and it really achieves a retro SF look, e.g. https://bhaak.net/nms/3_output.png, https://bhaak.net/nms/9_output.png

Although I think these images might be a bit better than how they would be shown on an original Amiga. I don't see any of the typical HAM6 artefacts one would expect.


It's because 400 is interlaced, and thus it's really the same as 200, only with every other line shifted half a line's width to the left (or right, I don't think it makes a difference).


Interlace displays every other field shifted half a raster line down -- it doubles the vertical resolution, not the horizontal.
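A tiny sketch of the weave (my own illustration, nothing Amiga-specific assumed beyond the field layout): the odd field's lines slot between the even field's, doubling the number of vertical lines.

    # Hypothetical illustration: weave two 200-line interlaced fields
    # into one 400-line frame. The odd field is displayed half a raster
    # line lower, so its lines land between the even field's lines.
    def weave_fields(even_field, odd_field):
        frame = []
        for even_line, odd_line in zip(even_field, odd_field):
            frame.append(even_line)  # shown during the even field
            frame.append(odd_line)   # shown half a line down, next field
        return frame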


... down, right, whatever. "Delaying" it half a line might have been a better way to describe what's happening.


I chose 400-line interlaced as it is twice the resolution of 320x200 mode. Quite a difference in clarity.


This article is "rendering" No Man's Sky on the Amiga 1000 in the sense that looking at a photo of a beautiful landscape is the same as visiting it.

I.e., it just displays a slideshow of screenshots on the Amiga 1000, which might as well have been vacation photos.


Could someone explain why there is such a cult following behind these old Amiga machines? I see them mentioned a lot on HN and would like to understand the interest...


Imagine that tomorrow a company started selling a game console that does VR perfectly. No lag, fully immersive, it just blows everything else out of the water. And it's cheaper than a PlayStation, although sadly few people buy the thing and the company completely mismanages the sales and marketing and it's ten years before another consumer device comes close.

That's what the Amiga was like in the 80's.


Just to clarify for those not familiar with the Amiga, it wasn't a game console but a fully featured computer with a custom OS that supported preemptive multitasking a decade before Windows and Mac did. Graphics hardware was just one part of what made the total package so innovative.


I must say I like the static priority of the threads in the AmigaOS.

On the fat iMac I use, doing some video conversions in the background makes it impossible to play TF2 at the same time.


Try these:

Ars Technica's History of the Amiga - https://arstechnica.com/gadgets/2007/07/a-history-of-the-ami...

"The Future Was Here" - http://amiga.filfre.net


You had all these 8-bit computers at the time, limited to fixed palettes of 8 or 16 colors (except the Atari and MSX2 stuff). Then you had the Macintosh with its monochrome flat framebuffer, or the IBM with what, four to 16 colors?

The Amiga chipset was originally designed as a video game system; its predecessors were really the Atari 8-bit computer family and the Atari 2600. It had a bunch of coprocessors to accelerate 2D graphics, offloading the CPU for other crap while the graphics chipset handled sprites, video memory copying and merging, and even a simple "line shader" that could update graphics registers on a line-per-line basis independently of the CPU.

Add similarly powerful audio hardware, with four channels of arbitrary sample playback and per-channel frequency and volume independent of the sample data, plus a pre-emptively multitasking operating system in many ways ahead of its time, and you have a hell of a computer that can do things with the CPU running cold. (The boing ball demo on the Amiga was famously implemented using the blitter hardware to draw the checkered ball and changing the palette once per frame to "rotate" it.)
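To give a feel for that "line shader" (the Copper), here's a rough sketch of what a copper list boils down to. This is my own simplified Python model, not real copper encoding; only the COLOR00 register offset is the genuine article.

    # A copper list is a tiny program for the display coprocessor, made of
    # WAIT (for a beam position) and MOVE (write a value into a chipset
    # register) instructions. This fakes the classic "copper bars" effect
    # by rewriting the background color once per scanline, no CPU involved.
    COLOR00 = 0x180  # background color register in the custom chip space

    def copper_gradient(first_line, colors):
        ops = []
        for i, rgb444 in enumerate(colors):
            ops.append(("WAIT", first_line + i))   # beam reaches this line...
            ops.append(("MOVE", COLOR00, rgb444))  # ...then recolor the background
        return ops

    # e.g. a 16-line blue-to-white sweep, 4-bit channels packed as 0xRGB
    bars = copper_gradient(0x40, [0x00F + i * 0x110 for i in range(16)])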

Of course, these features became increasingly irrelevant as CPUs got cheaper and faster and could perform the graphics duties of the Amiga adequately in software, and the chipset didn't keep up. Particularly devastating IMO is that the Amiga uses a bit-plane model for graphics. Essentially, this means that a graphics screen is composed of several one-bit-per-pixel planes that are interleaved to index a palette. This is great for memory consumption, because if you don't need 4-bit graphics for a particular screen, you can use two or three planes. It's awful when it comes to rendering raycasting columns, because it makes each pixel in a column-based renderer take several writes and reads. So a popular opinion is that Doom killed the Amiga; IMO it was just the final nail in the coffin.
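To make the bit-plane pain concrete, a sketch of mine (simplified to a 16-color screen; function names are hypothetical): one chunky pixel write becomes four read-modify-writes on planar hardware.

    # Setting ONE pixel to a 4-bit color on a planar screen touches four
    # separate bitplanes, each needing a read-modify-write, where a chunky
    # VGA-style (mode 13h) framebuffer needs a single byte store.
    def set_pixel_planar(planes, x, y, color, row_bytes):
        byte, bit = y * row_bytes + x // 8, 7 - (x % 8)
        for p in range(4):                      # one bitplane per color bit
            if (color >> p) & 1:
                planes[p][byte] |= 1 << bit     # read-modify-write: set the bit
            else:
                planes[p][byte] &= ~(1 << bit)  # read-modify-write: clear it

    def set_pixel_chunky(framebuffer, x, y, color, width):
        framebuffer[y * width + x] = color      # one store, done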

I got myself an Amiga 1200 with an accelerator rather late, trading it for a Pentium PC when those went out of fashion, just out of interest. It's a great machine and I still sometimes use it for games, basic internet duties like IRC and... Doom!


Look at the "Reception" section of the Wikipedia article for one of the Amiga's famous early games, Defender of the Crown. There's a quote from Brian Bagnall's Commodore book that explains to a degree the significance of experiencing the Amiga's graphics way back when. I remember the feeling he describes well.

https://en.wikipedia.org/wiki/Defender_of_the_Crown

Also, I saw this article pop up yesterday that takes a look at "The Amiga Consciousness." It may help here.

http://countingvirtualsheep.com/the-amiga-consciousness/


Same way there's a cult following behind Lisp, Prolog, and APL: they were way ahead of their time. In a different timeline, had they been marketed well and adopted, the entire industry would be far ahead and better off. It makes us sad that things didn't turn out well and the entire industry/world was delayed by years, if not decades. We keep talking about this as a reminder that popular is not necessarily best. Those who forget the past are bound to repeat the same mistakes.


Could you do this in real-time with a post-process shader on PC? shadertoy.com?


I don't think fragment shaders are suitable for this, since the HAM modes are very stateful. In HAM6, for example, a screen is composed of a bunch of 6-bit instructions, each encoding one of four operations with four bits of data. You can either set the color directly to any of the colors from a 16-color palette, or you can take the color of the previous pixel and modify one of its RGB channels.

It's similar with HAM8, but in order to use more of the 24-bit color space, the instructions are 8 bits wide with six bits of data; you can only modify the six most significant bits of a color channel, and the direct-set operation uses a 64-color palette instead.
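For the curious, a minimal HAM6 decoder sketch (mine; the control-bit assignments are the real ones, everything else is simplified):

    # Each 6-bit pixel = 2 control bits + 4 data bits. Control 00 sets the
    # pixel from a 16-entry palette; 01/10/11 copy the previous pixel's
    # color and replace its blue/red/green channel (4 bits per channel).
    def decode_ham6_row(pixels, palette):
        r, g, b = palette[0]  # a row starts from the background color
        out = []
        for p in pixels:
            ctrl, data = (p >> 4) & 0b11, p & 0b1111
            if ctrl == 0b00:
                r, g, b = palette[data]  # set: direct palette color
            elif ctrl == 0b01:
                b = data                 # hold red/green, modify blue
            elif ctrl == 0b10:
                r = data                 # hold green/blue, modify red
            else:
                g = data                 # hold red/blue, modify green
            out.append((r, g, b))
        return out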


FWIW, you can do plenty of stateful things on shadertoy, as long as feedback and multiple buffers are allowed. There are complete interactive games on shadertoy. I think it wouldn't be hard to render HAM in a fragment shader. If you're okay with very non-optimized colors, of course, you can use the direct-set instructions only and skip the neighbor-modify instructions. But I've done several shaders on shadertoy with neighbor-modify-type functionality, including a Floyd-Steinberg-like dither; it just takes multiple frames.
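To illustrate, here's a sketch of mine (plain Python standing in for the shader passes): each sweep reads only the previous sweep's output, the way a re-applied fragment shader would, so a modify chain extends by one pixel per sweep, and typical images settle in far fewer sweeps than the worst case.

    # ops has one entry per pixel: ('set', palette_index) or
    # ('mod', channel, value) with channel 0/1/2 for R/G/B.
    def ham_sweeps(ops, palette):
        colors = [None] * len(ops)
        for n_pass in range(1, len(ops) + 1):
            prev = list(colors)  # each sweep sees only the last sweep's output
            for i, op in enumerate(ops):
                if colors[i] is not None:
                    continue
                if op[0] == 'set':
                    colors[i] = palette[op[1]]        # resolves in sweep 1
                else:
                    base = palette[0] if i == 0 else prev[i - 1]
                    if base is not None:              # left neighbor resolved?
                        c = list(base)
                        c[op[1]] = op[2]              # modify one channel
                        colors[i] = tuple(c)
            if all(c is not None for c in colors):
                return colors, n_pass                 # sweeps actually needed
        return colors, len(ops)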


Sure, you can maintain state in a texture and iteratively re-apply the shader to it. The problem with HAM is that it potentially requires many iterations. Consider the worst-case scenario of an input resulting in a single set-color operation followed by (width × height − 1) modify operations. Not too bad for a simple stateful loop on the CPU, always keeping the last color in a register, but with a shader re-applied once per frame to 320x200 pixels, you'd be waiting ~20 minutes for the complete result at 60 fps.

Then there's the problem of selecting a useful base palette, if you want consistently decent results regardless of the input. That's maybe the kind of optimization problem that OpenCL can be applied to.

So I don't mean to say unsuitable as in impossible, just in the sense that it's exactly the kind of problem that the shader model wasn't designed to solve.


I don't disagree with the idea that ShaderToy wasn't designed for this, or with the idea that it'd be harder to do in a fragment shader than a CPU program. But fragment shaders weren't designed for most of the things you see on ShaderToy. ;) The question is whether it was possible in real-time, and I still think it is.

I'm not very interested in the worst case; I think a HAM image with width × height modify ops is very unlikely. I might be surprised if such a HAM image has ever existed, outside of perhaps a contrived programmer's example.

The average case for a direct HAM implementation would probably be more like hundreds of frames, or a couple of seconds, not tens of thousands.

If a modified HAM is on the table, the worst and average cases could be improved considerably. I modified Floyd-Steinberg (which is worst case) so that it runs in more like 10-20 frames, by making it radially symmetric and limiting the propagation distance, which results in negligible differences but vastly improved run times.

The palette optimization is also possible in a fragment shader, not quite as easily as OpenCL, but possible.

If the question were about transcoding a video stream in real time, that would be harder. If the optimized palette is fixed, it will be easier. Either way, I wouldn't rule it out. But I would agree it's "hacking" in the sense of abusing things they weren't designed for.


> I'm not very interested in the worst case, I think a HAM image with width * height modify ops is very unlikely.

Sure, but maybe not as far off in some real-world cases. Say, black-but-not-perfectly-black letterbox bars, where the nuances could result in long sequences of small alterations to the base black color. This becomes less likely with an optimized palette, and of course you could set a hard limit on the length of modify sequences during conversion, though that would compromise the quality of the conversion.

> The average case for a direct HAM implementation would probably be more like hundreds of frames, or a couple of seconds, not tens of thousands.

Then the problem, with respect to the original question, is the definition of real-time. For the purpose of a game like No Man's Sky for example I'd think hundreds or even tens of frames would be unacceptable.

Don't get me wrong, I love technology applied to the "wrong problem" and see that as an end in itself sometimes :). I'm just considering the feasibility of accurate HAM in realtime using shaders, and am really enjoying the discussion of what compromises have to be made for it to perform adequately in a shader. But with respect to the original question, I can't quite bring myself to say "yes".


> Then the problem, with respect to the original question, is the definition of real-time.

Yeah, true. But there are two factors here that still meet a reasonable definition of real-time, which is >= 30fps in my book. First, a predictable number of frames of latency still allows 30fps throughput. Second, this encoding can be done in multi-pass batches in a single frame. One sweep of modify ops doesn't need to wait for a full frame before the next sweep. This is very common on ShaderToy, where people build dependent buffer chains. It's perfectly reasonable (outside ShaderToy) to do 10 sweeps in one frame cycle. I don't know what the limit would be on a standard GPU these days; it might be much higher. Consider that shaders on ShaderToy routinely do several hundred texture-sample taps per pixel, and/or several hundred ray-marching iterations per pixel. The amount of work the GPU can do in real time now is incredible.

You can combine the inter-frame sweeps with cross-frame passes, and you only lose latency but not throughput. There's a likely chance you can also unroll once or twice for a single pass. If you unrolled once, and had 10 passes inter-frame, and had a 10 frame latency, you could in theory get a modify radius of 200 pixels, while maintaining 30fps throughput with 10 frames of latency, assuming I haven't taxed out the GPU at this point.

So, you're right, it would be silly & hard to do this, but I still think HAM encoding could probably be done in real time, with perhaps some compromises or quality limitations to keep the encoding predictable, or at least have a known reasonable upper bound.


I really wish that the Amiga had caught on in the US. Things like this never cease to impress me. It feels like the Amiga was light-years ahead of its PC counterparts; my dad's 1985 IBM PCjr could never display a picture that well (as far as I know).


Thanks for sharing, OP. This is a lovely little blog in general, not just the main post; glad it's popped up on my radar, as I hadn't heard of it before.


Thank you, and you're welcome.


We should all be on Amigas right now; Commodore just sucked at marketing.


The Amiga architecture was very limited with regard to 3D (or pseudo-3D), so it was pretty much left in the dust when Doom came out.

Very awesome at 2D, though.


Doom was not very 3D; it could just address 1 byte = 1 pixel on VGA. That would have been trivial to fix in the mythical AAA chipset, which was scrapped, then replaced years later with the slightly better AGA chipset in the A4000 computer.

Commodore was in a (brief) position to destroy Apple and ridicule IBM, but the execs preferred to "Toys-R-Us"-it and take the bonuses.


You've identified a major reason the Amiga failed. The chipset was originally great, but Commodore should have spent the money to keep up with Moore's law. Something like AGA should have arrived in 1988, not 1992, and with a blitter four times the power of OCS, not just twice.


AGA being slightly better than the original chipset, not slightly better than AAA, of course.


Yes, but they could have caught up if they hadn't been releasing products that made no sense and ate into each other's lunch. They went bust, or were as good as dead, before Doom dealt the final blow.


I feel that Alien Breed 3D was pretty OK; it didn't perform great, but it was certainly playable. Weren't there a fair number of Doom clones on the system?


Definitely, but too little, too late. Alien Breed 3D is barely playable without an accelerator. IMO Gloom ran fine without an accelerator but is a rather boring game compared to Doom, just like AB3D.


Or Acorn Archimedes.


Unfortunately they were a UK phenomenon.

I could see Amigas and Ataris everywhere back in Portugal, while the Archimedes was only known to me thanks to the UK edition of Computer Shopper and its Alternative Platforms section.


Very similar, except Amigas had major market adoption as well as great tech and cultural high points. But tech-wise, the Archimedes was a beast.


Well, Archimedes didn't really go away, being the first real ARM-based computing device. The legacy is in everyone's pocket.


This has nothing to do with rendering a game engine on an Amiga.


Rendering the game engine was not mentioned in the title.



