
Making a 4K Fractal Movie with Fractal EXtreme - dmit
https://randomascii.wordpress.com/2016/01/20/making-a-4k-fractal-movie-with-fractal-extreme/
======
pdkl95
If you like this kind of zoom into the Mandelbrot set, check out this[1] zoom
to 2^1116 from a few years ago (which is also available in 4k). The final ~2/5
of the zoom is probably the trippiest part of the Mandelbrot set I've ever
seen, with a repeating "evil eye (nazar)"/"eye of sauron" motif[2] of
incredible complexity.

[1]
[https://www.youtube.com/watch?v=PbwaFQ2r2c4](https://www.youtube.com/watch?v=PbwaFQ2r2c4)

[2] [http://imgur.com/a/cwVHZ](http://imgur.com/a/cwVHZ)

~~~
brucedawson
Very nice. Thanks for sharing.

------
tetraodonpuffer
Like many others here, I'm sure, I was quite obsessed with fractals way back.
Fractint was awesome, and I spent quite some time writing an integer-math
fractal generator in assembly on my Atari ST back then. (I remember enjoying
just how many registers I could use on the M68k compared to my friends who
were working with x86, and how much simpler that made things; I really enjoyed
M68k assembly.)

When I first found XaoS years later it was amazing to see that you could
actually zoom in real time (!), and it's nice that some programs are still
around that let you play with fractals.

It would be interesting to know what the fastest way to calculate them is
nowadays, on the CPU or on the GPU (I think in either case one would have to
roll their own high-precision math for deep zooms).
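As a sketch of what "rolling your own" looks like, here is a minimal fixed-point Mandelbrot iteration using Python's big integers (illustrative only; a real deep-zoom renderer would hand-code the same limb arithmetic, and the view/precision constants here are arbitrary):

```python
# Illustrative sketch: escape-time iteration count using fixed-point
# arithmetic on Python's arbitrary-precision integers.

FRAC_BITS = 256  # fractional bits; deep zooms need hundreds of bits

def to_fixed(x: float) -> int:
    return int(x * (1 << FRAC_BITS))

def fmul(a: int, b: int) -> int:
    # fixed-point multiply: shift the 2*FRAC_BITS product back down
    return (a * b) >> FRAC_BITS

def mandelbrot_iters(cx: int, cy: int, max_iter: int = 100) -> int:
    zx = zy = 0
    four = to_fixed(4.0)
    for i in range(max_iter):
        zx2, zy2 = fmul(zx, zx), fmul(zy, zy)
        if zx2 + zy2 > four:
            return i  # escaped
        zx, zy = zx2 - zy2 + cx, 2 * fmul(zx, zy) + cy
    return max_iter  # treated as inside the set

print(mandelbrot_iters(to_fixed(-1.0), to_fixed(0.0)))  # never escapes
print(mandelbrot_iters(to_fixed(1.0), to_fixed(1.0)))   # escapes quickly
```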

(edit) I found a post on the Fractal eXtreme blog discussing GPU math; it is
from 2012, though, so I wonder whether the intervening four years have changed
the author's opinion on this.

[https://randomascii.wordpress.com/2012/03/28/fractal-and-crypto-performance/](https://randomascii.wordpress.com/2012/03/28/fractal-and-crypto-performance/)

~~~
RamshackleJ
I think there is no question that GPUs handle fractal renders significantly
better than the CPU does. If you are rendering a fractal properly, each pixel
can be computed separately from all the others, a problem that is perfect for
a massively parallel multiprocessor.
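A minimal sketch of that per-pixel independence (plain Python, not GPU code, with an assumed view window): the iteration count is a pure function of the pixel's own coordinates, so each call could be one GPU thread.

```python
# Each pixel's escape-time count depends only on that pixel's coordinates,
# so there is no shared state to synchronize between pixels.

def pixel(px, py, width, height, max_iter=50):
    # map the pixel to a point c in the complex plane (assumed view window)
    c = complex(-2.5 + 3.5 * px / width, -1.25 + 2.5 * py / height)
    z = 0j
    for i in range(max_iter):
        if abs(z) > 2.0:
            return i  # escaped after i iterations
        z = z * z + c
    return max_iter  # treated as inside the set

# Every call below is independent of every other one.
image = [[pixel(x, y, 80, 40) for x in range(80)] for y in range(40)]
print(len(image), len(image[0]))  # 40 80
```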

I would even go so far as to say that with current GPU hardware, rendering
fractals is "solved" in terms of realtime capability. See:
[https://www.shadertoy.com/results?query=fractal](https://www.shadertoy.com/results?query=fractal)

I think fractals are still awesome despite being pretty simple these days.
They are an excellent gateway into programming on the GPU, and exactly the
type of awe-inspiring program that younger students should be shown.

~~~
ghusbands
A lack of arbitrary precision makes GPUs unsuitable for deeper Mandelbrot zoom
levels, so "solved" is overstating it.

~~~
RamshackleJ
How many levels are you talking about?

I agree that the lower floating-point precision of GPUs puts them at a
disadvantage, but they still do better on massively parallel problems like
fractal renders.
([http://www.bealto.com/mp-mandelbrot_benchmarks.html](http://www.bealto.com/mp-mandelbrot_benchmarks.html))

~~~
brucedawson
GPUs are happiest when doing float precision, which gives 24 bits of mantissa.

Most GPUs can, at reduced speed, do double precision, which gives 53 bits.

The maximum precision for this zoom movie is 960 bits.

So, unless you write code to compose high-precision math operations out of
double-precision math, the GPU cannot even participate. And once you do that,
you will find that the GPU's speed advantage is mostly or entirely lost.
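A sketch of the kind of building block involved (the standard two_sum error-free transformation used in double-double / "float-float" tricks, not Fractal eXtreme's actual code):

```python
# Error-free transformation: the basic trick for composing higher precision
# out of hardware doubles (Python floats are IEEE doubles).

def two_sum(a: float, b: float):
    """Return (s, err) with s = fl(a + b) and s + err == a + b exactly."""
    s = a + b
    bb = s - a
    err = (a - (s - bb)) + (b - bb)
    return s, err

s, err = two_sum(1.0, 1e-17)
print(s, err)  # the tiny addend is not lost: it survives in err
```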

~~~
Houshalter
CPUs can't do 960 bit operations either.

~~~
brucedawson
- CPUs can do 64x64 multiplies that give 128-bit results. GPUs cannot.
- CPUs can do add-with-carry. GPUs cannot.

That means that GPUs are not very good at doing high-precision math. They lose
a lot more efficiency implementing high-precision math than CPUs do. And GPUs
are also worse at doing parallel calculations that end up following different
paths, which inevitably happens when one pixel diverges before another.

My back-of-the-envelope estimates show that GPUs are not worth it for this
use-case. I will change my mind when I see a GPU-based deep zoom fractal
program.
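As a concrete illustration of the add-with-carry point, here is a sketch (Python, 64-bit limbs, least significant first) of the carry chain that a CPU runs as one ADC instruction per limb:

```python
# Multi-limb addition: on a CPU each loop step is a single ADC instruction;
# without a hardware carry flag, each step costs extra instructions.

MASK64 = (1 << 64) - 1

def add_limbs(a, b):
    out, carry = [], 0
    for x, y in zip(a, b):
        s = x + y + carry
        out.append(s & MASK64)
        carry = s >> 64  # the carry that ADC propagates to the next limb
    return out, carry

# a = 2^64 + (2^64 - 1), b = 1: the low limb overflows into the high limb
result, carry = add_limbs([MASK64, 1], [1, 0])
print(result, carry)  # [0, 2] 0
```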

~~~
WheretIB
While AMD GPUs can't do 64x64 multiplies that give 128-bit results, they can
still do a 32x32 multiply with a 64-bit result (the MULHI_INT and MULLO_INT
instructions).

And you can do add-with-carry manually on the AMD GPU using two instructions:
ADDC_UINT to get the carry flag from an addition followed by ADD_PREV to add
it.
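For what it's worth, the MULLO/MULHI split is easy to sketch in ordinary code (Python stand-ins for the instructions, not actual GPU code):

```python
# MULLO_INT / MULHI_INT idea: two 32-bit results that together recover the
# full 64-bit product of two 32-bit operands.

MASK32 = (1 << 32) - 1

def mullo(a, b):
    return (a * b) & MASK32  # low 32 bits of the product

def mulhi(a, b):
    return (a * b) >> 32     # high 32 bits of the product

a, b = 0xDEADBEEF, 0xCAFEBABE
full = (mulhi(a, b) << 32) | mullo(a, b)
print(full == a * b)  # True: the two halves reassemble the 64-bit product
```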

~~~
brucedawson
Good to know. I was not aware of those instructions.

It would be interesting to see how well a GPU can do compared to a CPU. Doing
64-bit math gives the CPU an automatic four-to-one advantage (it takes four
32x32 multiplies to replicate a single 64x64 multiply), so the GPU would be
starting out at a disadvantage. GPUs also run at lower clock rates and have
other handicaps, but perhaps their many, many cores would be enough to
overcome this.
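That four-to-one decomposition is easy to sketch (Python stand-in; each multiply of 32-bit halves below represents one hardware 32x32 multiply):

```python
# 64x64 -> 128-bit multiply rebuilt from four 32x32 -> 64-bit partial products.

MASK32 = (1 << 32) - 1

def mul64x64(a, b):
    """128-bit product of two 64-bit values via four 32x32 multiplies."""
    a_lo, a_hi = a & MASK32, a >> 32
    b_lo, b_hi = b & MASK32, b >> 32
    # the four partial products (each one hardware 32x32 -> 64 multiply)
    ll, lh = a_lo * b_lo, a_lo * b_hi
    hl, hh = a_hi * b_lo, a_hi * b_hi
    # combine at the right bit offsets; carries propagate in the additions
    return ll + ((lh + hl) << 32) + (hh << 64)

x, y = 0xFFFFFFFFFFFFFFFF, 0xFEDCBA9876543210
print(mul64x64(x, y) == x * y)  # True: matches a single 64x64 multiply
```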

------
Animats
Fractals in the browser, in WebGL, in real time.[1] Pan and zoom around. Much
fun.

[1]
[http://hirnsohle.de/test/fractalLab/](http://hirnsohle.de/test/fractalLab/)

~~~
exodust
Absolutely. And it's not obvious, but you can move around with WSAD keys, and
QE for up/down.

It would be nice to make rendered animations with this; I don't think that's a
feature of the web app. Maybe there's some other way to make animations? It
would make a great background for an alien ship flying through an alien
cityscape.

------
pantalaimon
Ever since I've discovered 3D fractals, 2D ones simply don't cut it for me
anymore.

Unfortunately they are much harder to calculate.

[http://blog.hvidtfeldts.net/index.php/category/mandelbulb/](http://blog.hvidtfeldts.net/index.php/category/mandelbulb/)

------
kristiandupont
It's funny, I always found fractal zooms to be deeply unsatisfying. Although I
know it won't happen, I keep waiting for some kind of resolution or goal. In a
philosophical way, I think I have a bit of the same feeling towards life
itself.. :-)

~~~
inDigiNeous
Yeah, I always get the same feeling too. Traditional Mandelbrot/Julia/whatever
fractals are nice, but really boring IMO.

------
Houshalter
What's with all the software and even file formats not supporting 4K? Yes,
it's a fairly uncommon resolution, but why fix the set of possible resolutions
at all?

~~~
brucedawson
An excellent question. I was surprised to realize that all three video
editing programs had hard-coded limits. Odd.

------
ytdht
If you plan to watch most of your movies in 4K, you should probably wait about
10 years before buying a new TV. The movie industry doesn't adapt very
quickly, and cable compresses everything so much that you can't even get true
1080p, even when they say they have HD.

~~~
ploxiln
Really, for video, 1080p is absolutely fine if the video is not
over-compressed. Even at around 20 Mbps, 4K won't look better than 1080p for
full-motion, full-color video content.

Typical movie theaters use 4K today, and used 2K (very close to 1080p) less
than 10 years ago, for their huge screens. Of course the bitrate is very high:
"up to 250 Mbit/s"
([https://en.wikipedia.org/wiki/Digital_cinema#Technology_and_standards](https://en.wikipedia.org/wiki/Digital_cinema#Technology_and_standards))

For small text and fine lines, such as on a computer monitor viewed up close,
4K+ can be very beneficial. But for video it's marketing, and a costly waste.

~~~
ytdht
One good example: most Blu-ray movie discs are just DVDs converted to take
more space (the quality was not increased).

~~~
arthurfm
I have rented lots of Blu-rays and never found one that was upscaled from DVD
to 1080p. Can you provide more details, please? I'm having a hard time
believing that's true.

~~~
ytdht
I did a quick Google search and found this article [1]. I recall seeing a
website that listed Blu-ray titles with true 1080p, but I can't find it right
now. Cable does something similar by highly compressing HD content.

1. [http://conversation.which.co.uk/technology/many-blu-rays-no-better-than-upscaled-dvds/](http://conversation.which.co.uk/technology/many-blu-rays-no-better-than-upscaled-dvds/)

