Distributed systems are hard. I like the idea of "semantic locality." I think it can be achieved to some degree via abstraction. The code that runs across many machines does a lot of stuff, but only a small fraction of it is actually involved in coordination. If you can abstract away those details, you should end up with a much simpler protocol that can be modeled succinctly, and then verified much more easily. Formal methods folks have used tools such as SPIN (Promela) or guarded commands (Murphi) for modeling these kinds of systems. I'm sure you could do something similar with the Lean theorem prover. The tricky part is mapping back and forth between your abstract system and the real one. Perhaps LLMs could help here.
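To make that concrete, here's a minimal sketch in Python of the kind of exhaustive explicit-state check that tools like SPIN or Murphi perform, run over a deliberately buggy two-node test-then-set lock I made up for illustration (nothing here comes from a real protocol or tool):

```python
# Explicit-state model checking sketch: exhaustively explore the abstract
# state space of a toy "test-then-set" lock between two nodes. Each node
# reads the shared flag in one step and sets it in a separate step, so two
# nodes can both observe it free -- the BFS finds that bad interleaving.
from collections import deque

IDLE, SAW_FREE, CRITICAL = "idle", "saw_free", "critical"

def successors(state):
    busy, pcs = state
    for i in (0, 1):
        pc = pcs[i]
        if pc == IDLE and not busy:      # step 1: read the flag, see it free
            yield busy, tuple(SAW_FREE if j == i else pcs[j] for j in (0, 1))
        elif pc == SAW_FREE:             # step 2: set the flag, enter the CS
            yield True, tuple(CRITICAL if j == i else pcs[j] for j in (0, 1))
        elif pc == CRITICAL:             # step 3: leave the CS, clear the flag
            yield False, tuple(IDLE if j == i else pcs[j] for j in (0, 1))

def mutual_exclusion(state):
    _, pcs = state
    return pcs.count(CRITICAL) <= 1

def check():
    init = (False, (IDLE, IDLE))
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not mutual_exclusion(s):
            print("counterexample state:", s)
            return
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    print("invariant holds over", len(seen), "states")

check()
```

Once the coordination logic is abstracted down to a handful of state variables like this, the state space is tiny and the bad interleaving falls out of a plain BFS; the hard part, as noted above, is convincing yourself the abstraction faithfully reflects the real code.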
I work on hardware and concurrency is a constant problem even at that low level. We use model checking tools which can help.
In practice, the Mandelbulb is usually only computed to a few iterations (e.g. 20) in order to keep the surface smooth and prevent much of it from dissolving into ~disconnected "froth".
So deep zooms and deep iterations aren't really done for it.
Also, it's generally rendered using signed distance functions, which is a little bit more complicated. I haven't looked at the equations to figure out whether perturbation theory is easy to apply -- I'm guessing it would be, as the general principle would seem to apply.
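For reference, here's a minimal sketch of the standard distance estimator that ray marchers typically use for the power-8 Mandelbulb (the function name and defaults are mine, not from any particular renderer):

```python
import numpy as np

def mandelbulb_de(pos, power=8, max_iter=20, bailout=2.0):
    """Signed-distance estimate to the power-8 Mandelbulb at point `pos`."""
    c = np.array(pos, dtype=float)
    z = c.copy()
    dr = 1.0                          # running derivative |dz/dc|
    r = np.linalg.norm(z)
    if r == 0.0:
        return 0.0                    # the origin is well inside the set
    for _ in range(max_iter):
        if r > bailout:
            break
        # convert to spherical coordinates
        theta = np.arccos(z[2] / r)
        phi = np.arctan2(z[1], z[0])
        dr = power * r ** (power - 1) * dr + 1.0
        # "z = z^power + c" expressed in spherical form
        zr = r ** power
        theta *= power
        phi *= power
        z = zr * np.array([np.sin(theta) * np.cos(phi),
                           np.sin(theta) * np.sin(phi),
                           np.cos(theta)]) + c
        r = np.linalg.norm(z)
    # distance bound the ray marcher steps by
    return 0.5 * np.log(r) * r / dr
```

Note the `max_iter=20` default, which matches the low iteration counts mentioned above: more iterations sharpen detail but start eating the surface away.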
Step one should be having an edtech ecosystem that doesn't allow students to use the web. There are just too many distractions online. I think no one has really invested the kind of capital required to do a good job with this. Most of the software my kids have to use for school is pretty bad.
Until you consider that almost all children in N.A. who are learning rely on Khan Academy and YouTubers to make up for the fact that their schools can't figure out how to replicate a great learning experience that is available for free.
Single player is a different kind of experience, and no less valuable. You might as well say
> It's a lot harder to make a book have depth and complexity than it is to go to a party, since you don't have human conversation partners.
It depends on the book and the party. Similarly, maybe there is more depth and complexity to Dwarf Fortress than there is to Rocket League. (Not to pick on RL in particular, it is just the first thing that came to mind.)
I'd say Breath of the Wild and Tears of the Kingdom had a lot of depth. There was a ton of stuff to do. I have 340 hours in Tears of the Kingdom and hit 100% on the map, but there are still things I haven't done and stuff to explore and try. I find they also have high replay value, since they are so open and there are nearly infinite ways to solve the various puzzles, traverse the world, or engage in the various battles... or don't. I once started up a new BotW game to see how far I could get without actually fighting anything.
I think it's highly dependent on the type of game. Games that involve planning and strategy like Slay the Spire or Factorio have enormous depth despite being single player. But I think that it's hard to make the actual execution of mechanics as fun or deep against computer opponents.
We certainly intend to add semantic zoom levels, but for now I think zooming out is mainly helpful for navigating when you have multiple connected components on the canvas at once and need to pan between them.
It is interesting to see the type of analysis he does and the visualizations are impressive, but the conclusions don't really seem too surprising. To me, it seems the most efficient learning algorithm will not be simpler but rather much more complex, likely some kind of hybrid involving a multitude of approaches. An analogy here would be looking at modern microprocessors -- although they have evolved from some relatively simple machines, they involve many layers of optimizations for executing various types of programs.
I don't know how to express my thoughts coherently in such a small space and time, but I will try. There isn't "one" example.
----------
Almost all the code and its display is some form of meta-programming. Stephen Wolfram is literally brute-forcing/fuzzing all combinations of "code" (a toy sketch of the idea follows the list below):
- Permuting all the different rules/functions in a given scope
- Evolutionarily adapting/modifying them
- Graphing and analyzing those structures
- Producing the HTML for display
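Here's that toy sketch. It's my own reconstruction in Python, not anything from the article (Wolfram's code is in Wolfram Language): enumerate every small "program" in a space and keep the ones that compute a target function, XOR in this case.

```python
# Brute-force every candidate "program": enumerate all 2-input Boolean gates,
# then every 3-gate composition with a fixed wiring, and keep the circuits
# whose overall behaviour matches the target function (XOR).
from itertools import product

INPUTS = list(product([0, 1], repeat=2))
# all 16 possible 2-input Boolean functions, identified by their truth tables
GATES = [tuple((n >> i) & 1 for i in range(4)) for n in range(16)]

def apply(gate, a, b):
    return gate[INPUTS.index((a, b))]

def target(a, b):                       # the function we are trying to "learn"
    return a ^ b

def circuit(g1, g2, g3, a, b):
    # fixed wiring: out = g3(g1(a, b), g2(a, b))
    return apply(g3, apply(g1, a, b), apply(g2, a, b))

solutions = [
    (g1, g2, g3)
    for g1, g2, g3 in product(GATES, repeat=3)
    if all(circuit(g1, g2, g3, a, b) == target(a, b) for a, b in INPUTS)
]
print(f"{len(solutions)} of {len(GATES) ** 3} candidate circuits compute XOR")
```

In Wolfram Language the candidates, the search, the plots, and the prose all live in the same symbolic expressions, which is the point made below.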
I get that "normal machine learning" is also permuting different programs. But it's more special when you are using the same language for the whole stack. There is a canyon that you have to cross without homoiconicity (granted, I don't know exactly how Wolfram generated and analyzed everything here, but I have used his language before, and I see the hallmarks of it).
I can't really copy and paste an example for you, because plain text struggles. Here is an excerpt with some of that fanciness in it:
> And as an example, here are the results of the forward and backward methods for the problem of learning the function f[x] = <graph of the function>, for the "breakthrough" configurations that we showed above:
You might see "just" a small .png interspersed in the plain text. The language and runtime itself has deep support for interacting with graphics like this.
The only other systems I see that can juggle the same computations/patterns around like this are pure object-oriented systems like Smalltalk/Pharo. You necessarily need first-class functions to come even close to the capability, but as soon as you want to start messing with the rules themselves, you need some sort of term rewriting, Lisp macros, or fexprs (or something similar?).
Don't get me wrong, you can do it all "by hand" (with compiler or interpreter help): you can generate the strings or opcodes for a processor or use reflection libraries, generate the graphs, and use some HTML generator library to stitch it all together. But in the case of this article, you can clearly see that he has direct command over the contents of these computations in his Wolfram Language compared to other systems, because it's injected right into his prose. The outcome here can look like JupyterLab or other notebooks. But in homoiconic languages there is a lot more "first-class citizenry" than you get with notebooks. The notebook format is just something that can "pop out" of certain workflows.
If you try to do this with C++ templates, Python attribute hacking, Java bytecode magic... like... you can, but it's too hard and confusing, so most people don't do it. People just end up creating specific DSLs or libraries for different forms of media/computation, with templating smeared on top. Export to a renderer and call it a day -> remember to have fun designing a tight feedback loop here. /s
Nothing is composable, and it makes for very brittle systems as soon as you want to inject some part of a computation into another area of the system. It's way, way overspecified.
Taking the importance of homoiconicity further: when I read this article I just start extrapolating, moving past xor or "rule 12" and applying these techniques to symbolic logic, like the Tsetlin machine referenced in another part of this thread: https://en.wikipedia.org/wiki/Tsetlin_machine
It seems to me that training AI on these kinds of systems will give them far more capability in producing useful code that is compatible with our systems because, for starters, you have to dedicate fewer neuronal connections to parsing syntax with a grammar that is fundamentally broken and ad hoc. But I think there are far deeper reasons than just this.
----------
I think it's so hard to express this idea because it's like trying to explain why having arms and legs is better than not. It's applied to every part of the process of getting from point A to point B.
Also, as an addendum, I'm not 100% sure homoiconicity is "required" per se. I suppose any structured and reversible form of "upleveling" or "downleveling" logic that remains accessible from all layers of the system would work. Even good ol' Lisp macros have hygiene problems, which can be solved, e.g., by Racket's syntax-parse.
Gaussian Process Regression (the surrogate model typically used in Bayesian Optimisation to try to get to the right "answer"/parameter region sooner) - explained in some context here...
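In case it helps to see the shape of it, here's a minimal sketch of a Bayesian-optimisation loop with a Gaussian-process surrogate using scikit-learn; the objective function, kernel choice, and UCB acquisition rule are placeholders I picked for illustration, not from the linked material.

```python
# Bayesian optimisation sketch: fit a GP to the points evaluated so far,
# then pick the next point by maximising an upper-confidence-bound score.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                     # stand-in for an expensive black-box function
    return -(x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

grid = np.linspace(0, 1, 200).reshape(-1, 1)
X = np.array([[0.1], [0.9]])          # a couple of initial evaluations
y = objective(X).ravel()

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6, normalize_y=True)
for _ in range(10):
    gp.fit(X, y)
    mean, std = gp.predict(grid, return_std=True)
    ucb = mean + 2.0 * std            # favour high predicted value and high uncertainty
    x_next = grid[np.argmax(ucb)]
    X = np.vstack([X, [x_next]])
    y = np.append(y, objective(x_next[0]))

print("best x found:", X[np.argmax(y)][0], "value:", y.max())
```

The GP's posterior uncertainty is what lets the loop trade off exploring unvisited regions against exploiting promising ones, which is why it tends to need far fewer evaluations than a plain grid or random search.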
For me, Joyce is the pinnacle of the English language. I can't say I understand too much of what is happening, but no one writes more beautifully. I just love the sound of his words and the images he conjures.
What's this obsession with beauty in language among some English-first speakers? Aren't meaning, insight, and import of more consequence than beauty? Every single time I hear someone wax poetic about beauty and elegance in things, it immediately sets off my bs meter. If you haven't got much of anything substantive to say, you use flowery artifices to mask it.
Also, non-English-first speakers: do you see this as a very English thing, or is this sort of fixation, if not fetish, with beauty in language and other things present in your own language too?
This is a bizarrely anti-English take. There's appreciation for the beauty of the language in literature/poetry in every language, as far as I'm aware. Look at Japanese poetry for one obvious example that takes appreciation of beauty in language to its absolute extremes.
Same thing in all languages I know well enough to read books in.
I can't at all explain how or why it works, but certain kinds of writing style have an almost magical effect on me. This feeling of well-rounded beauty, even when the content is barely relevant, is just amazing. One could maybe describe it as a kind of brain hacking, which is also what drugs do.
That’s completely fine, but hopefully you can take people’s word on that it can be very beautiful to them.
It feels a bit like saying “if you don’t have meaningful lyrics, why even sing a song”: Different people can appreciate different layers of literature differently.
> Aren't meaning, insight and import of more consequence than beauty?
You are close to setting up a false dichotomy here. It isn’t those qualities or beauty, it’s those qualities and beauty.
I first experienced it when I read Michael Ondaatje’s The English Patient. I was able to enjoy the book on the usual axes of plot, character, etc… but there was another level that had me rereading pages because every chapter, paragraph, and sentence felt perfectly constructed. Some of it I read out loud to myself because the rhythm of the words and sounds were musical. It is a great story, beautifully told.
That said, I’m not a fan of Ulysses, and I’m sure a lot of people here would call me an uncultured rube for enjoying Ondaatje’s writing like I did.
This happens in every language. I'm Norwegian, and there's an old Danish translation of Whitman I much prefer over the English original, not for ease of reading (Norwegian and Danish are close to mutually intelligible without much effort) but because the translation is beautiful.
I wish I could remember the edition.
There are books I prefer in one language or other. English tends to feel like it has a "darker" texture to me (no, it makes no logical sense) and so the same book - Lord of the Rings is an example I've read in both English and Norwegian - will hit me very differently emotionally depending on language.
Sometimes I'll read something just for the beauty of it. Other times I just want to get at the ideas.
For me, reading fiction is all about beauty. I liken it to listening to good music. It isn't really to learn anything "substantive." It is to experience a feeling or be transported to another place. In fact, I like to listen to (typically instrumental) music while I read as a sort of "soundtrack." I would liken reading a good book to watching an epic movie. I guess you might occasionally gain some insight into the human condition, but it isn't primarily an informational medium.
I feel the same way about reading beautiful language that I feel about reading a beautiful mathematical proof. It's not that it doesn't have substance, it's that the substance is put together in such an elegant way that you don't just marvel at the content but also at its presentation.
It is very different, indeed possibly the opposite of a show off of cleverness. It is beautiful because it feels that it's just the right way to do it.
As a writer, I think I can speak to this. I can certainly understand a non-native speaker's frustrations with the complexity of English grammar, the enormous number of synonyms and colloquialisms, and the variety of 'codes' and kinds of 'jargon' that must make learning and reading English profoundly difficult, especially where the non-native speaker or reader's mother tongue isn't a Romance language. I get that it must make certain forms of English - from the dense AAVE of The Wire to Elizabethan sonnets - all but impenetrable.
However, I see this 'pragmatism in all things, including art' perspective quite a bit on hackernews, and rarely enough anywhere else. Most often concerning fiction, but also contemporary and modern art. I'm not sure if it's a neurodiverse perspective, or a philistine one, but I can confirm that it's missing the aesthetic function of art. i.e.: the pleasure many people obtain from creating and experiencing it. There seems to be a frustration that some people who don't or can't engage in producing or enjoying certain kinds of art have - that becomes a denial of the value of the work altogether. 'I don't get it, so there's nothing there to get'.
Specific to Joyce and the modernists is an absolute mastery of the complexity and nuance of a wide breadth of kinds of English (and in Joyce's case numerous other European languages). To a native speaker with a strong grasp of language and a love of words, reading Joyce or T.S. Eliot or Yeats etc. is like listening to a complex piece of classical music. The use of reference, of meter, of onomatopoeia, the play with homonyms and antonyms, and at a higher level with the structure of stories and narrative traditions etc. - all give the reader pleasure. In the hands of a truly great writer, like those above, they also serve to create layers of meaning in the way a koan or painting can contain complex fractal patterns of meaning. Reading a great writer working with the nuances of language and narrative can literally lift the reader into a state of heightened consciousness. A place where new realisations about society, the self, the emotional depths and nuances of other people are elucidated in a way that's genuinely mind expanding.
It's absolutely fine if you don't find this in literature - whether in a second language or your own. It's naive to assume that it doesn't exist because you personally can't perceive it. Aptly enough - that's a contradiction many writers have explored. Our tendency to diminish the inner lives of others, or the worth of things we cannot appreciate. One piece that springs to mind is David Foster Wallace's essay 'This is Water' - https://fs.blog/david-foster-wallace-this-is-water/
Far too many of these supposed great works of literature get an easy pass from uncritical also-rans of the world, who just want to move on in the name of different-strokes-for-different-folks without ever calling out the bs for what it really is. I'm not saying there aren't valid detractors of these works - there are - but far too often they're drowned out.
Far too many of these works hide behind the crutch of 'fiction' to spew utter hogwash without making an ounce of sense to the regular, impartial, non-dyed-in-the-wool reader.
Far too many of these books - when coupled with a lackadaisical populace in general who are more concerned about seeming non-fussy - get that stellar mythical hallowed status and lore.
I'm not saying that there are not enough people who genuinely get entranced with these works (although if you run that through a fine comb your results may vary) - it's that the gatekeepers of education seem to be entirely made up of these uncritical clowns who will nod away in affirmation, decade after decade, cementing the undeserved status of these works.
I agree with your take on the literature. My point wasn’t in response to Ulysses or criticism of it, but of your statement regarding over-obsession with the English language.