While both MJ and Turnstyle are Turing-complete, Turnstyle shares more similarities with Piet (mentioned by the author), Wireworld, and, of course, Conway's Game of Life.
Two years ago I got carried away and built a tool code-named "notespub". It is the very same idea, albeit less polished. I documented my journey of extracting content out of Apple Notes, which, naturally, was itself published with Notes as well [1].
I thought this was a cute idea; however, as I started using it more, I inevitably learned that there are features that are hard to shoehorn into Notes alone, e.g. custom themes, metadata, an automatically generated home page, top/side navigation, cross-references, etc. These features could be implemented with an external system that helps you manage a blog, yet that ruined the idea: I wanted a tool that puts local-first & privacy above everything else, a tool with as few dependencies as possible.
Long story short, I started working on a second iteration that I thought wouldn't sacrifice privacy, but eventually shelved it without finishing. I guess it is time to open-source it.
Thanks for sharing; I did in fact read your post to dig into the tech details, thank you for your work. Actually, Quotion sites can be customized in Apple Notes directly, check it out: https://docs.quotion.co/features/customize-layout . Other things like colors and fonts can all be achieved in a similar way. Looking forward to your launch :)
Neat trick. I found that I had to wrestle with Notes more than necessary, but if I were to pick up the project again, I would approach it differently this time:
- Notes function similarly to tweets and, by design, cannot be customized much. They are like cute little memos or a river of news where less is more.
- I thought it would also be cool if, within Notes, I could subscribe to memos from my friends. Whoa, Apple Notes as a social network! As you are already familiar with the Apple Notes structure, it won't be a surprise that it is not difficult to add content to special folders automatically (I built a prototype which, sadly, was never published). However, this topic raises a bunch of important questions, and I don't remember whether I found all the answers. It's been a while.
Not long ago, I went down the rabbit hole of space-filling curves and learned about an obscure paper by Rolf Niedermeier, Klaus Reinhardt, and Peter Sanders that introduced a rather peculiar curve with an unfortunate name: the H-curve [1]. The paper shows that the H-curve preserves locality better than the Hilbert curve does. It fills space with H-like shapes, hence the name. Also, like the Moore curve, it forms a loop.
Space-filling curves are ridiculously easy to implement with L-system rules, and I spent a few days developing a set of production rules to express this curve as a rewrite system. It was a fun puzzle [2].
A => BF-F-BFFFC-F-FC+F+BF-F-BFFFC-F-FC
B => BFFFC-F-FC+F+B
C => C+F+BF-F-BFFFC
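For anyone who wants to poke at these rules outside a dedicated L-system tool, here is a minimal Python sketch (my own code, not from the paper) that expands them and traces the turtle path. It assumes the usual L-system conventions, which the rules above don't state explicitly: F draws one unit forward, '+' and '-' turn 90 degrees left and right, and A/B/C are rewriting symbols that draw nothing.

```python
# Rules as given above; non-drawing symbols A/B/C carry the recursion.
RULES = {
    'A': 'BF-F-BFFFC-F-FC+F+BF-F-BFFFC-F-FC',
    'B': 'BFFFC-F-FC+F+B',
    'C': 'C+F+BF-F-BFFFC',
}

def expand(axiom, generations, rules=RULES):
    """Apply the rewrite rules to the axiom the given number of times."""
    s = axiom
    for _ in range(generations):
        s = ''.join(rules.get(c, c) for c in s)
    return s

def trace(commands):
    """Return the list of grid points the turtle visits."""
    x, y, dx, dy = 0, 0, 1, 0
    points = [(x, y)]
    for c in commands:
        if c == 'F':
            x, y = x + dx, y + dy
            points.append((x, y))
        elif c == '+':              # 90-degree left turn (assumed)
            dx, dy = -dy, dx
        elif c == '-':              # 90-degree right turn (assumed)
            dx, dy = dy, -dx
    return points

points = trace(expand('A', 1))
```

At order 1 the turtle already visits all 16 vertices of a 4x4 block exactly once and ends one grid step away from where it started, consistent with the loop property mentioned above.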
hmm, i tried those rules in xfractint with 90-degree angles, and it doesn't seem to be a space-filling curve; http://canonical.org/~kragen/nothcurve.png is what i get at order 3
maybe my translation is buggy?
hcurve {
Angle 4
Axiom A
A=BF-F-BFFFD-F-FD+F+BF-F-BFFFD-F-FD
B=BFFFD-F-FD+F+B
D=D+F+BF-F-BFFFD ; C is reserved in Fractint
}
Fascinating. These rules are identical to mine, but I don't have xfractint locally to reproduce/troubleshoot the issue. I think I was using https://dmitrykandalov.com/lsystem/ as a playground, and the rules appear to work fine there.
Can you try generating the 1/8 or 1/4 curve to check the partial generation? Or try these rules, which should produce the triangular shape:
and accordingly this works on Kandalov's site with the same axiom:
A => AFFFB-F-FB+F+A
B => B+F+AF-F-AFFFB
oh, now i know what the problem is, D is a drawing command in fractint; i had too much D, the opposite of the usual problem. so this works:
hcurve { ; by oneearedrabbit.net. See
; <https://news.ycombinator.com/item?id=38029945>. Based on "an obscure
; paper by Rolf Niedermeier, Klaus Reinhardt, and Peter Sanders that
; introduced a rather peculiar curve with an unfortunate name: H-curve
; [1]. The paper mentions that H-curve preserves better locality
; properties compared to Hilbert curve. It fills the space with H-like
; shapes, hence the name. Also, like the Moore curve, it generates a
; loop."
; <https://www.sciencedirect.com/science/article/pii/S0166218X00003267>
Angle 4
Axiom A
A=BF-F-BFFFX-F-FX+F+BF-F-BFFFX-F-FX
B=BFFFX-F-FX+F+B
X=X+F+BF-F-BFFFX ; C and D are reserved in Fractint
}
but i like `simplified` above better
incidentally the paper seems to call it 'h-indexing'
incidentally, this slight variant l-system, which is probably what you meant
a where
a -> afffb-f-fb+f+a
b -> b+f+af-f-afffb
has the property that the axiom is a proper prefix of the axiom's expansion,
which turns out to be equivalent to the property
that every generation is a proper prefix of the following generation;
this means that in a sense each one is "approaching a limit"
of a single infinite string,
in the sense that it's a successively longer prefix of that string.
this infinite string,
called a 'morphic word',
is a fixed point of the mapping
the l-system does each generation
this is literally a numerical approximation if you treat the string
as a fractional number in some base, e.g., base 10 with a=1, b=2, f=3, +=4, -=5
with that interpretation, the first approximation 'a' is 0.1, the second approximation 'afffb-f-fb+f+a' is 0.13332535324341, the third approximation 'afffb-f-fb+f+afffb+f+af-f-afffb-f-fb+f+af-f-afffb+f+afffb-f-fb+f+a' is 0.133325353243413332434135351333253532434135351333243413332535324341, and so on.
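a quick python sketch of this digit encoding (my own, just to make the prefix property concrete) is

```python
# encode each symbol as a digit (a=1, b=2, f=3, '+'=4, '-'=5) and watch
# successive generations extend the same decimal prefix
RULES = {'a': 'afffb-f-fb+f+a', 'b': 'b+f+af-f-afffb'}
DIGITS = {'a': '1', 'b': '2', 'f': '3', '+': '4', '-': '5'}

def generation(n):
    """nth rewrite of the axiom 'a'."""
    s = 'a'
    for _ in range(n):
        s = ''.join(RULES.get(c, c) for c in s)
    return s

def as_digits(s):
    return ''.join(DIGITS[c] for c in s)

g1, g2 = as_digits(generation(1)), as_digits(generation(2))
# each generation is a proper prefix of the next
assert g2.startswith(g1)
```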
the thue-morse sequence can be generated in the same way with the l-system
0 where 0 -> 01 and 1 -> 10
although the so-called fibonacci word is slightly simpler
a where a -> ab and b -> a
all of the above morphic words are aperiodic, though it's trivial to design a periodic morphic word
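for concreteness, a few generations of both systems with the same plain rewriting loop (my own sketch):

```python
def generations(axiom, rules, n):
    """return [axiom, rewrite(axiom), rewrite^2(axiom), ...], n rewrites."""
    words = [axiom]
    for _ in range(n):
        words.append(''.join(rules.get(c, c) for c in words[-1]))
    return words

thue_morse = generations('0', {'0': '01', '1': '10'}, 4)
fibonacci = generations('a', {'a': 'ab', 'b': 'a'}, 5)
```

in both cases the axiom is a proper prefix of its expansion, so each generation extends the previous one.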
a program to output the infinite morphic word of movement commands for the h-curve of a single triangle is

    def hcurve():
        d = dict(a='afffb-f-fb+f+a', b='b+f+af-f-afffb')
        buf = list(d['a'])  # the word starts with the expansion of 'a'
        i = 0
        while True:
            yield buf[i]
            # every emitted character contributes its own expansion to the
            # tail of the buffer, so the buffer converges to the fixed point;
            # skip i == 0 because d['a'] is already in the buffer
            if i:
                buf.extend(d.get(buf[i], buf[i]))
            i += 1

this is amortized constant time per character, at the cost of memory that grows linearly with the number of characters emitted
Not directly related to the post, but still feels somewhat relevant.
Back in March I used somewhat more elaborate multi-step prompts for GPT-3.5 to generate amusing pictures and published a gallery [1]. However, I eventually reached a point where changing prompts did not consistently improve the final results. At the end of the day, the quality of the images is only as good as the training dataset, and GPT is a black box.
For something different, I ran another experiment to test whether it is possible to "compress" visual content specifically for GPT. SVG, being a verbose format, takes time to generate for a detailed image, and it also becomes expensive over time. I translated a subset of SVG elements into Forth words [2], which have a nice synergy with GPT tokens; this allowed me to progressively render pictures and produce smaller outputs without sacrificing much quality.
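To illustrate the flavor of the idea (the word names and the element subset here are my own guesses for this comment, not the actual vocabulary from the experiment), the translation turns verbose XML attributes into a compact postfix program:

```python
# Illustrative sketch only: map a couple of SVG elements to hypothetical
# postfix "words", so attribute values become stack arguments.
import xml.etree.ElementTree as ET

def to_words(svg_text):
    words = []
    for el in ET.fromstring(svg_text):
        tag = el.tag.split('}')[-1]          # drop any namespace prefix
        if tag == 'circle':
            words += [el.get('cx'), el.get('cy'), el.get('r'), 'circle']
        elif tag == 'rect':
            words += [el.get('x'), el.get('y'),
                      el.get('width'), el.get('height'), 'rect']
    return ' '.join(words)

print(to_words('<svg><circle cx="10" cy="20" r="5"/></svg>'))
# prints: 10 20 5 circle
```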
Finally, I trained my own GPT-2-like model on the QuickDraw dataset [3]. It's not surprising that a sequence transformer can be trained to produce coherent brush strokes and recognizable images, as long as there is a way to translate graphical content into a sequence of tokens. That said, I found myself with more questions than I started with, and I am trying other ideas now.
This number is tokenized as a list: 43 20 59 83 409 58 340 9 58 340 95 30 95 809 34 850 9 34 850 34 809 58 340 9 58 30 49 850 385 30 49 58 30. If GPT recognizes the context "+ 1 equals" through the attention mechanism, it can predict that the next number in the sequence should be 31: ... 58 30 -> ... 58 31
Such a system could be fine right up until you need to open a graphical browser. If you can get by with 2008-era workloads, you'd be OK. I don't say this as a disparaging comment about the eeePC; it was an alright machine for the workloads of its era. It's more a problem with modern websites being festooned with JavaScript.
The other issue with an older eeePC is that the original SSD may have non-trivial wear and the battery is likely shot. Unless you get the machine for next to nothing, fixing those issues will cost as much as a brand-new cheapo laptop.
I loved my eee but I can't imagine going back to that platform these days; it's workable for programming but it struggles with almost every task. You're going to want to run xfce or something rather than gnome or kde, among other things. I'd prefer a system76 laptop these days.
>>I loved my eee but I can't imagine going back to that platform these days
I really miss my old eeePC 901 some days, but I agree. Going back to those slow Atom processors is really a nonstarter. But the size and form factor (with the exception of the nonstandard screen aspect ratio) made it just about the perfect 'take anywhere' laptop. I used Ubuntu on it, and if I had to do it again in this day and age I'd try to see if I could get Ubuntu MATE to work on it. Being able to set the font sizes and the icon sizes made that nonstandard screen aspect ratio work far better than it ever did in Windows XP.
My ex-girlfriend used to have a thing called Asus E200HA. It was amazingly cheap, cute as hell and even reasonably fast in Windows. I’d immediately buy a newer machine like that if it could run Linux.