I still remember the day I read the article on the Mandelbrot set and went, "Whoa, I understand the math!"
I went on to write my first Mandelbrot renderer in Python. I was delighted, in a way, that it worked, but badly disappointed at the performance.
I rewrote the thing in C, and then rewrote it again to make use of multiple CPU cores. I am certain there are Mandelbrot renderers out there that put mine to shame. But still, it was fun, and I got some pretty desktop wallpapers out of it. And I learned a thing or two along the way. Also, I rewrote it again using PVM (Parallel Virtual Machine) to spread the work out across several computers. So I actually learned two or three things along the way. ;-)
In a previous job, I had a supervisor who had learned programming in the mid-80s, and he told me about rendering the Mandelbrot set, then bit-twiddling the VGA memory to cycle the color palette on the display. I was not sure whether I should be envious of having missed those times, or happy that I did.
Yeah, once z = z^2 + c was explained, most of the Mandelbrot set was easy for me too.
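For anyone who hasn't seen it spelled out: the whole escape-time algorithm really is just that formula in a loop. A minimal Python sketch (the iteration cap and names are my own choices):

    def mandelbrot(c, max_iter=100):
        """Iteration count at which z = z^2 + c escapes, or max_iter if it never does."""
        z = 0j
        for i in range(max_iter):
            z = z * z + c
            if abs(z) > 2.0:      # once |z| > 2, the orbit provably diverges
                return i
        return max_iter           # treated as "inside the set"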
I tried it in BASIC on an Apple IIe. It was the first time I was confronted with the question of whether it was safe to run the computer overnight, or whether it would overheat.
The next morning, the computer was fine, but it still hadn't reached an interesting part of the set.
So once I got a PC, I ended up using Fractint. In those days, Intel CPUs had slow floating point, so the authors ported the algorithm to use integer math. That was much more effective than just speeding up the machine clock (my first lesson about performance).
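The integer port is essentially fixed-point arithmetic: scale everything by a power of two and shift after each multiply. A rough sketch of the idea (the 28 fraction bits are my assumption; Fractint's actual formats differed):

    FRAC_BITS = 28                       # assumed precision, not Fractint's real format
    ONE = 1 << FRAC_BITS

    def to_fixed(x):
        return int(x * ONE)

    def mandel_fixed(cx, cy, max_iter=100):
        """Escape-time iteration using only integer adds, multiplies, and shifts."""
        zx = zy = 0
        for i in range(max_iter):
            zx2 = (zx * zx) >> FRAC_BITS
            zy2 = (zy * zy) >> FRAC_BITS
            if zx2 + zy2 > 4 * ONE:                  # |z|^2 > 4 means escape
                return i
            zy = ((2 * zx * zy) >> FRAC_BITS) + cy   # imaginary part first (uses old zx)
            zx = zx2 - zy2 + cx
        return max_iter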
Palette cycling was "cool", but by the time I got to college, although I had thought I'd spend all my time generating fractals, I found the real world more interesting...
The first Mandelbrot generator I played with took more than 3 hours on a BBC Micro to render a 160x256 image. That was 30 years ago and I still try out variations on this from time to time (larger exponents, fractional exponents, complex exponents, animations, etc).
And "Fractint" (a fast viewer for many fractal types) was my introduction to open source, although they called it "stone soup". https://en.wikipedia.org/wiki/Fractint
Fractint was, if not a killer app, really popular for EGA on PCs. Compute was getting fast enough at about the same time that the computations weren't too excruciating, and EGA cards were the first mainstream graphics hardware on which Mandelbrot sets and the like looked decent.
I was there too. Plus, at the time, the Mandelbrot set was new, as new as DeepDream images are today.
So the Alto must pre-date the Mandelbrot set, and therefore this demo is non-canonical, like playing Tetris on it: not possible, or even imagined, at the time.
This reminded me of an interview w/ Carl Sagan, Stephen Hawking, and Arthur C Clarke from 1988. Clarke demos the Mandelbrot set and is clearly very excited. If only they knew...
The Alto was introduced in 1973 but was still available in the early 80s, so there is every possibility that someone wrote a Mandelbrot generator on it after the set was discovered in 1980. However, it is more likely that this happened on the Xerox Star, which was around from 1981, and also after 1985, when the first popular Scientific American article about the set appeared...
I worked in a book warehouse before college, and two of their books had a little accident and ended up in the rejects bin along with some of my course texts. I lent the P and R books to someone and forgot who, so I actually bought them again later on. They are amazing.
I also remember how short things like Iterated Function System plotters could be: e.g. a few dozen lines of BASIC could get you a complicated fern piccy in a few seconds on an 80286.
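It still is that short. A Python sketch of the chaos game with the classic Barnsley fern coefficients (plotting left out):

    import random

    # Affine maps x' = a*x + b*y + e, y' = c*x + d*y + f, chosen with the
    # weights below -- the classic Barnsley fern.
    MAPS = [
        (0.00,  0.00,  0.00, 0.16, 0.0, 0.00),   # stem
        (0.85,  0.04, -0.04, 0.85, 0.0, 1.60),   # successively smaller fronds
        (0.20, -0.26,  0.23, 0.22, 0.0, 1.60),   # largest left leaflet
        (-0.15, 0.28,  0.26, 0.24, 0.0, 0.44),   # largest right leaflet
    ]
    WEIGHTS = [0.01, 0.85, 0.07, 0.07]

    def fern_points(n=100_000):
        x = y = 0.0
        points = []
        for _ in range(n):
            a, b, c, d, e, f = random.choices(MAPS, weights=WEIGHTS)[0]
            x, y = a * x + b * y + e, c * x + d * y + f
            points.append((x, y))
        return points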
I suspect they had access to some decent Unix workstations with expensive displays. The quality of the plates was breathtaking: you can barely see the gradations in many of the images, and they are totally smooth in most of them.
I've just had a massive hit of nostalgia - thanks!
I recall seeing full-colour real-time Mandelbrot set zooming and fractal landscape rendering on a Meiko Computing Surface (containing 32 or 64 Transputers, IIRC) at university, circa 1990. I studied Occam at the time but we weren't allowed to actually program the Transputers ourselves.
They reference IBM labs. The Beauty of Fractal Images talks about an IBM 4361-5 with extended precision, and an IBM 4250 printer for one of the monochrome plates near the end of the book.
The Science of Fractal Images lists, as credits for the colour plates, these people and institutions: http://imgur.com/a/KSnDZ
A cousin, the 4341, was my first mainframe. That was the first time I was exposed to the concept behind pg's "Beating the Averages" article: at a time when most of my colleagues were writing dBase or COBOL code, I interned with people using Cincom's Mantis. It was the Rails of the 3270.
One simple way is to start with a largish square, say 8x8 pixels, and evaluate the Mandelbrot set at the corners. If you get the same color at all four, paint the square in a single color. Otherwise, split it into 4 smaller blocks, evaluating the set at a 3x3 grid of points (which is not strictly correct, since the four sub-squares don't actually share corner pixels, but it's roughly twice as fast as evaluating each sub-square's corners separately). Again, if any small square has the same color at all corners, fill that block with one color.
One improvement for accuracy: if you find all four corners have the same color (no matter whether it's an 8x8 or 4x4 block), pick another pixel inside the block and iterate that one too; if the color doesn't match, skip the optimization. That is more expensive, up to 25% more point evaluations, but it can avoid mistakes in the chaotic areas of the set.
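A sketch of that subdivision in Python, structured my own way (a real version would cache the per-pixel results, e.g. with functools.lru_cache, so the shared grid points are only iterated once):

    def fill_block(img, x, y, size, shade, min_size=4):
        """Fill the size-by-size square of img at (x, y) using the corner trick.

        shade(px, py) returns the escape-time color of one pixel.
        """
        corners = {shade(x, y), shade(x + size - 1, y),
                   shade(x, y + size - 1), shade(x + size - 1, y + size - 1)}
        if len(corners) == 1:
            color = corners.pop()
            for j in range(y, y + size):
                for i in range(x, x + size):
                    img[j][i] = color
        elif size <= min_size:
            for j in range(y, y + size):          # too small to split: do it exactly
                for i in range(x, x + size):
                    img[j][i] = shade(i, j)
        else:
            half = size // 2
            for dy in (0, half):
                for dx in (0, half):
                    fill_block(img, x + dx, y + dy, half, shade, min_size)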
And of course, the large cardioid in the center has an analytic expression, and you can skip pixels inside that area, or a conservative approximation of the area.
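The usual interior test for that, plus the period-2 bulb to its left (not mentioned above, but it's the same kind of shortcut):

    def in_main_bulbs(cx, cy):
        """True if c = cx + i*cy lies in the main cardioid or the period-2 bulb."""
        q = (cx - 0.25) ** 2 + cy ** 2
        if q * (q + (cx - 0.25)) < 0.25 * cy ** 2:   # main cardioid
            return True
        if (cx + 1.0) ** 2 + cy ** 2 < 0.0625:       # period-2 bulb, radius 1/4 at -1
            return True
        return False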
I remember trying to do this on a 286, and it would take the better part of a day to do an EGA-quality image. A friend at the time had a 386, and it seemed blazing fast: it would take only a few hours to do a 640x480 VGA image.
Frax is a masterpiece. It's one of those extremely rare apps that manages to effortlessly combine extreme technical competency with a creative, intuitive and inspiring user experience.
I just got a new 12.9" iPad Pro this weekend, and Frax absolutely pushes the boundaries in terms of what the iPad can do. Just the maths involved to render the light, colour, texture and movement - in real-time, at 2732x2048 resolution - blows me away.
EDIT: I just made this[1], and a video exploration of the fractal too[2].
If you're not feeling so artistically inclined - or maybe just burned out on the Mandelbrot set - you can get a "flame" fractal image or video every month from a guy I know who does them professionally. https://www.patreon.com/QuinnPalmer
Super late edit - it's several wallpapers per month and one video. Also you can get physical objects with the designs on them through his kinda-defunct old site http://fractal.world/
Reminds me of writing programs to map the Mandelbrot and Julia sets onto the display of my Kaypro 2X. Didn't even have real bitmapped graphics -- just a way of packing... I think it was 4-pixel squares, into a character representation and then writing that.
Similar to packing bits into characters sent to some sort of dot matrix printer attached to a VAX, to print out such images, a couple of years earlier. At least that generated the images more quickly.
The Kaypro 2X was long in the tooth, by that point.
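The packing trick is simple enough to sketch. Assuming it really was 2x2 "quadrant" glyphs (my guess at the 4-pixel squares), something like:

    def pack_quadrants(bitmap, glyphs):
        """Turn a 0/1 pixel bitmap into rows of characters, 2x2 pixels per cell.

        glyphs is a 16-entry table mapping each 4-bit quadrant pattern to a
        character; the actual table is machine-specific (Kaypro, printer, ...).
        """
        rows = []
        for y in range(0, len(bitmap), 2):
            row = []
            for x in range(0, len(bitmap[0]), 2):
                code = (bitmap[y][x]
                        | bitmap[y][x + 1] << 1
                        | bitmap[y + 1][x] << 2
                        | bitmap[y + 1][x + 1] << 3)
                row.append(glyphs[code])
            rows.append("".join(row))
        return rows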
I've come to understand that many CPUs were not single-chip, well into the integrated circuit era. Today we might assume that a microprocessor will come in a single package, but that wasn't a reasonable assumption in other decades, where the computer designer might even be creating the CPU as well, and not have access to a semiconductor fab.
Are you trying to make me feel old, simply because I once had the 74Fxxx TTL catalog nearly memorized? Or that I can do an instruction decoder and conflict interlock unit in 100K ECL?
Now get off my lawn or I shall be forced to shake my cane at you quite vigorously.
It's surprising to me how in the 1970s, if someone needed a processor for something (even a video game or CNC controller), it was normal for them to invent a new instruction set and put together a processor from TTL chips. The barrier to entry was very low.
>...put together a processor from TTL chips. The barrier to entry was very low.
It was mostly because of the clock speeds, wasn't it? I seem to remember once hearing that below 20 MHz you could play "TTL Lego", but above that you ran into problems with things like propagation delay.
The stated propagation delay for TTL chips was usually about 10 or 20 nanoseconds (sometimes larger for more complex logical operations, and frequently smaller in subsequent logic generations, like 5 nanoseconds or so). I just checked this in a late 1970s databook.
So, 20 MHz seems pretty credible for a region where you would see problems due to propagation delay—maybe even somewhere below that with the original 7400 series.
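Back-of-the-envelope, that limit checks out, taking the databook's 10-20 ns per gate at face value:

    # timing budget at 20 MHz, assuming 10-20 ns per TTL gate level
    period_ns = 1e9 / 20e6          # 50 ns clock period
    levels_best = period_ns / 10    # ~5 gate levels per cycle with fast gates
    levels_worst = period_ns / 20   # ~2-3 levels with slow ones

Only a handful of logic levels fit between clock edges, and that's before counting wire delay and flip-flop setup time.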
That's also a reason, one I hadn't thought about much, why putting the whole CPU on a single die would look more and more practical over time: propagation delay in the wires!
If you had the money. Those chips weren't cheap back then (even the 74xx branch), and you still needed quite a few to do anything useful when it came to building a CPU.
The Alto has up to 3K RAM to hold custom microcode, and some programs used their own microcode for performance. So, yes it would be quite possible to speed up the Mandelbrot set with microcode. But writing Alto microcode is very difficult, so I'm not planning to do that...
Even though its BASIC interpreter used 40-bit floats, I was downsampling the resolution to get 64-ish dithered colors, and everything compiled down to mostly ROM calls, it still took a very long time to render regions, especially those closer to the set itself.
Ok. As clock-efficient as the 6502 was, it still ran at 1 MHz.
I remember writing Mandelbrot renderers on the C64 and Amigas. Depending on the depth, you could wait quite some time, so it often ran overnight, with a pleasant surprise the next morning (or some ugly-looking picture).
> On a modern computer, a Javascript program can generate the Mandelbrot set in a fraction of a second, while the Alto took an hour.
Hm... a "fraction" could mean anything. If it's exactly a second, then the speedup factor would be about 3600x.
The Alto was built about 40 years ago, right? That would be about 23% performance improvement per year. Is that about right for single-CPU performance acceleration over the last 40 years?
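For what it's worth, the compound rate works out like this, assuming exactly 3600x over 40 years:

    rate = 3600 ** (1 / 40) - 1   # ≈ 0.227, i.e. about 23% per year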
Well, with this[0] implementation[1] my laptop can render 3840x2400 pixels at 60fps, so, ignoring the larger resolution, how about 1/60th of a second?
I'm working on getting Mesa and Smalltalk running on the Alto, so stay tuned...
But BCPL is actually an interesting language. I knew that it influenced C (as did Algol and other languages), but until using BCPL I didn't realize how much of C was really the same as BCPL.