I like dynamic languages too. But I don't like the idea of "optimization", and I would be super interested in a dynamic language that didn't attempt to divorce performance from correctness. The worst part about jumping through insane hoops to appease the optimizer is that it can all go wrong with the tiniest change: a flag here, a different usage pattern there, a new version, etc., and suddenly your program doesn't do what you need it to, as though an operation taking 1000x longer were just a matter of degree.
At the same time, no one wants their code to run 100x slower than it would in any typical statically typed language. Unoptimized dynamic languages are sloooooow.
RPython and Graal (and what else?) provide a JIT for free (or at least cheap).
Of course, this really only works for code that is (a) statically polymorphic but dynamically monomorphic, and (b) has hot loops, but qualitatively that conjunction does seem like it ought to cover a lot of low-hanging fruit.
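A tiny Python sketch of what (a) plus (b) looks like in practice (the function and loop here are invented for illustration; this is the shape a tracing JIT like RPython's or PyPy's specializes well):

```python
def add(a, b):
    # Statically polymorphic: no type constraints, works on ints,
    # floats, strings, lists, ...
    return a + b

def hot_loop(n):
    # Dynamically monomorphic: inside this hot loop, add() only ever
    # sees ints, so a tracing JIT can compile one specialized trace
    # and skip the generic dispatch entirely.
    total = 0
    for i in range(n):
        total = add(total, i)
    return total

print(hot_loop(1000))  # 499500
```

If `add` were instead called with a different type on every iteration, the trace would keep failing its guards and the JIT would buy you little.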
There aren't many people looking at these JITs at the moment. Stefan Marr[1]'s group[2] is, I believe, where most of the research is currently being done. A recent paper[3] compares the performance of interpreters written in RPython and Graal. Their baseline is Java, and they achieve performance close to V8, which itself is about 2x slower than Java.
My summary: you can write fast interpreters and get a JIT for free, but a fast JIT for a dynamic language still means 2x slower than a JIT for a statically typed language (and Java definitely leaves some performance on the table due to how it represents data).
> This makes all logical thought impossible, and means there cannot be correct or incorrect thinking.
No, this is conflating paradigms. If this were true, you would expect the same Buddhist texts not to mention "right" and "wrong", and boy do they ever.
Here's an interesting experiment you can do around retrospective perception and timing: clap your hands once, listening for the sound of the clap. Notice what it's like to remember the sound of the clap for a minute or so. Now, clap your hands twice, again listening closely and noticing what it's like to remember the sound for a little while. In the second case, is it possible for you to remember the sound of the first clap only, without an echo of the second?
Rapamycin and related rapalogs will be far better than metformin at inhibiting mTORC1, and at a low and intermittent dose of 4-6 mg ONCE per week are quite safe. Best paper I know of now is Kaeberlein et al 2023:
What about stuff like berberine? Is that all smoke and mirrors? It's always hard to tell whether the research around "traditional" compounds is actually legit or just other countries engaging in what is effectively marketing.
They're minimal; retinol is usually mixed with a carrier oil or cream in various concentrations and applied topically. I would be surprised if it has an impact on the superficial musculoaponeurotic system (SMAS) referred to in the article, however.
It's a great title/article pair for HN because it's unexpected/good, as opposed to clickbait which is unexpected/bad. The guidelines permit changing the original title in the case of clickbait anyways.
That depends on your interests. If you care about the finer points of sound editing embedded in a rather extensive personal narrative then yeah, it's good. Otherwise not so much.
Posts like this are easily among my favourite kinds (and this one was particularly gratifying). I wish I could go down these rabbit holes every day!
> [...] and why heterogeneous platforms are important
This prompts me to wonder vaguely whether there's any untapped juice in fuzzing approaches that might be relevant here. As in, how much of the platform (including configuration and heuristics, and so on) could be fuzzed as program input?
Thank you, glad you appreciated the post! I love writing up debugging/incident reports and this one was just really fun.
Regarding fuzzing... maybe? I've wondered that a couple of times myself, but in practice there are really only a finite number of platforms, and so much of this is determined at compile time by which library calls are available. But I'm probably not thinking as deeply about this as someone could be, and I'd be interested to hear other folks' thoughts.
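For what it's worth, a minimal sketch of what "treat the platform as fuzzable program input" might look like; the configuration space and the invariant check below are entirely invented placeholders, not anything from a real system:

```python
import random

# Hypothetical knobs a "platform" might expose: config flags,
# heuristics, tunables. Real systems would have many more dimensions.
CONFIG_SPACE = {
    "page_size": [4096, 16384, 65536],
    "use_mmap": [True, False],
    "io_depth": [1, 8, 32],
}

def random_platform(rng):
    """Draw one random point from the platform configuration space."""
    return {key: rng.choice(values) for key, values in CONFIG_SPACE.items()}

def check_invariant(config):
    """Stand-in for running the system under test with this config
    and checking some correctness/performance property holds."""
    return config["io_depth"] >= 1  # trivial placeholder property

rng = random.Random(0)
for _ in range(100):
    cfg = random_platform(rng)
    assert check_invariant(cfg), f"invariant violated under {cfg}"
print("100 fuzzed platform configs passed")
```

The interesting (and hard) part is the `check_invariant` oracle, and as the parent notes, much of the real configuration space is fixed at compile time rather than being runtime input.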