Hi! I'm a college freshman at Northeastern, and this is my first project for a major hackathon (MIT Media Lab). It won the Unconventional Computing track prize, so I'm pretty happy! I made this because I'm a big conlanger, and I was wondering whether it would be possible to think in terms of sound when looking at an image. Fractals have easy-to-map parameters, so I created SHFLA, a language that takes in music and generates fractals from 0.1-second chunks (the chunk length is configurable). It's Turing-complete, so you can technically encode a ridiculous amount of information and computation in the system, although writing it out as music might take a while.
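To give a rough idea of the pipeline, here's a simplified toy sketch (not the actual SHFLA code; the feature-to-parameter mapping below is just an illustration, not SHFLA's real rules). It chops samples into 0.1-second windows and turns each window's loudness and zero-crossing rate into a Julia-set constant:

    // Toy sketch only: split mono samples into 0.1 s windows and derive a
    // Julia-set constant c = cr + ci*i from two simple per-window features.
    // The actual SHFLA mapping is different; this just shows the shape of the idea.

    function chunkSamples(samples: Float32Array, sampleRate = 44100, chunkSec = 0.1): Float32Array[] {
      const size = Math.floor(sampleRate * chunkSec);
      const chunks: Float32Array[] = [];
      for (let i = 0; i + size <= samples.length; i += size) {
        chunks.push(samples.subarray(i, i + size));
      }
      return chunks;
    }

    function chunkToJuliaConstant(chunk: Float32Array): { cr: number; ci: number } {
      let sumSq = 0;
      let crossings = 0;
      for (let i = 0; i < chunk.length; i++) {
        sumSq += chunk[i] * chunk[i];
        if (i > 0 && Math.sign(chunk[i]) !== Math.sign(chunk[i - 1])) crossings++;
      }
      const rms = Math.sqrt(sumSq / chunk.length); // loudness, roughly 0..1
      const zcr = crossings / chunk.length;        // crude pitch proxy, 0..1
      // Squeeze both features into a range that gives visually interesting Julia sets.
      return { cr: -0.8 + 1.6 * zcr, ci: -0.8 + 1.6 * rms };
    }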
SHFLA is Turing-complete, though, so I'm hoping people see it has more potential than just being a music visualizer as I keep improving it! I'm currently rewriting it in Nim using SDL2; once it's in a performant state, I'm going to implement an information-as-music encoder and all that.
Upvoted for the shoegaze reference. Suggest hosting a demo with My Bloody Valentine's "Loveless" as the default track. I will also echo the other commenters who believe your communication needs to improve; it is unlikely that I will download and install your code, because the risk/reward ratio is poor, as it is for most open-source software.
Note: distribution wins in software. Suggest you rewrite this to sit at the intersection of distributable and local, which basically means running in the browser. Consider porting to JavaScript and distributing it either as a simple static webpage or as a CodePen. This could be an interesting use of two underused browser APIs, Web Audio and WebGL (or perhaps WebGPU). For simplicity I'd target Chrome first.
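Something like the following would get you per-frame spectrum data to drive the visuals. It's a rough sketch in TypeScript, assuming an <audio id="track"> element on the page; drawFractal is a stand-in for whatever WebGL/WebGPU renderer you end up writing:

    // Rough sketch of the browser plumbing: Web Audio's AnalyserNode gives you a
    // spectrum snapshot each animation frame, which you reduce to a few fractal
    // parameters. drawFractal() is a placeholder, not a real renderer.
    const audioEl = document.getElementById('track') as HTMLAudioElement;
    const ctx = new AudioContext();
    const source = ctx.createMediaElementSource(audioEl);
    const analyser = ctx.createAnalyser();
    analyser.fftSize = 2048;

    source.connect(analyser);
    analyser.connect(ctx.destination); // keep the music audible

    const spectrum = new Uint8Array(analyser.frequencyBinCount);

    function drawFractal(energy: number, centroid: number): void {
      // Placeholder: in a real port this would update shader uniforms and redraw.
    }

    function frame(): void {
      analyser.getByteFrequencyData(spectrum);
      // Reduce the spectrum to two toy parameters: overall energy and a crude
      // normalized spectral centroid.
      let total = 0, weighted = 0;
      for (let i = 0; i < spectrum.length; i++) {
        total += spectrum[i];
        weighted += i * spectrum[i];
      }
      const energy = total / (spectrum.length * 255);
      const centroid = total > 0 ? weighted / (total * spectrum.length) : 0;
      drawFractal(energy, centroid);
      requestAnimationFrame(frame);
    }

    audioEl.addEventListener('play', () => {
      ctx.resume(); // Chrome requires a user gesture before audio can start
      requestAnimationFrame(frame);
    });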
Hope you find the project cool :)