> I'm wondering if humans are mostly incapable of producing great things without (artificial) restrictions.
I think the real issue is that there isn't a programming language that produces a compiler error when the given code can exceed a specified maximum latency.
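No mainstream compiler can do that today (it would require static worst-case execution time analysis), but here's a minimal Rust sketch of the next best thing: a hypothetical `with_deadline` wrapper I'm inventing for illustration, where the caller declares a budget at the call site and a debug-build check fires if it's blown.

```rust
use std::time::{Duration, Instant};

/// Hypothetical stand-in for a compile-time latency bound: run `f`,
/// then panic (in debug builds only) if it exceeded `budget`. A real
/// solution would need static WCET analysis, not a runtime check.
fn with_deadline<T>(budget: Duration, f: impl FnOnce() -> T) -> T {
    let start = Instant::now();
    let out = f();
    let elapsed = start.elapsed();
    debug_assert!(
        elapsed <= budget,
        "latency budget blown: took {elapsed:?}, budget was {budget:?}"
    );
    out
}

fn main() {
    // The budget is part of the call site, so reviewers see it.
    let v = with_deadline(Duration::from_millis(5), || {
        (0..1_000u64).sum::<u64>()
    });
    println!("{v}");
}
```

A runtime check only catches the latencies you happen to exercise, which is exactly why it's a poor substitute for the compiler error described above.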
Even working on a program with soft real-time scheduling, I've had to constantly push back against patches that introduce some obscure convenience without anyone having measured the worst-case latency.
The problem is so bad I doubt most people realize it's there. I don't know what the answer is, but I have the feeling there's an intersection with timing attacks on software/hardware. Some kind of tooling that makes both worst-case times and variance as visible as the computed CSS in DevTools would probably help. Combined with some kind of static analysis, perhaps devs could hack their way to decently responsive interfaces and services.
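To make the measurement half concrete, here's a rough sketch (not any existing tool) of the sampling side: repeatedly time a code path and report the worst case and the variance, the two numbers I'm arguing should be surfaced by default.

```rust
use std::time::{Duration, Instant};

/// Sample a code path `samples` times and return (worst-case time,
/// variance in seconds^2). A real tool would attach this to every
/// handler and surface it in the UI, the way DevTools shows computed CSS.
fn profile(samples: usize, mut f: impl FnMut()) -> (Duration, f64) {
    let mut times: Vec<f64> = Vec::with_capacity(samples);
    let mut worst = Duration::ZERO;
    for _ in 0..samples {
        let start = Instant::now();
        f();
        let elapsed = start.elapsed();
        worst = worst.max(elapsed);
        times.push(elapsed.as_secs_f64());
    }
    let mean = times.iter().sum::<f64>() / times.len() as f64;
    let variance =
        times.iter().map(|t| (t - mean).powi(2)).sum::<f64>() / times.len() as f64;
    (worst, variance)
}

fn main() {
    let (worst, var) = profile(10_000, || {
        // black_box keeps the optimizer from deleting the work.
        std::hint::black_box((0..100u32).sum::<u32>());
    });
    println!("worst case: {worst:?}, variance: {var:e} s^2");
}
```

Even this toy version makes the point: the mean looks fine long after the worst case and the variance have quietly gotten ugly.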