I can't figure out what your first paragraph is about. The topic under discussion is Python performance. In performance discussions we don't generally try to measure something as fuzzy as "real work", as you seem to be using the term, because what even is that? There's a reason my post referenced "lines of code": still a rather fuzzy measure (as I already pointed out in my post), but it gets across the idea that Python has to do a lot of work for "x.y = z", covering all the things that "x.y" might mean, including the possibility that the user has changed what it means since the last time this statement ran, while compiled languages generally do over an order of magnitude less "work" to resolve it.
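To make concrete why "x.y = z" is so much work in Python: the meaning of that one statement can be changed out from under the interpreter at runtime. A minimal sketch (the `Point` class and `_set_y` setter are hypothetical names for illustration):

```python
class Point:
    pass

p = Point()
p.y = 1                 # plain instance attribute: stored in p.__dict__
print(p.__dict__)       # {'y': 1}

# The meaning of "p.y = z" can change after the fact: attach a
# property (a data descriptor) to the class, and the same statement
# now routes through a setter instead of the instance dict.
def _set_y(self, value):
    self.__dict__["y"] = value * 2   # arbitrary setter, doubles the value

Point.y = property(lambda self: self.__dict__["y"], _set_y)

p.y = 1                 # same syntax, different code path
print(p.y)              # 2
```

Because nothing stops code like that last class mutation from running at any time, the interpreter has to re-check what "p.y" means on every execution of the statement.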
This is one of the issues with Python I've pointed out before, to the point that I suggest someone could make a language around this idea: https://jerf.org/iri/post/2025/programming_language_ideas/#s... In Python you pay and pay and pay and pay and pay for all this dynamic functionality, but in practice you aren't actually dynamically modifying class hierarchies and attaching arbitrary attributes to arbitrary instances with arbitrary types. You pay for these features, but you benefit from them far less often than the number of times Python pays for them. Python spends rather a lot of time spinning its wheels double-checking that it's still safe to do the thing it thinks it can do, and that's hard to remove even with a JIT, because it is extremely difficult to prove those checks can be eliminated.
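Python does already offer one narrow way to opt out of part of this: `__slots__` declares a fixed attribute set and drops the per-instance `__dict__`. A small sketch of the trade-off (class names are made up for illustration):

```python
class Dynamic:
    pass                        # instances carry a __dict__, open to anything

class Fixed:
    __slots__ = ("x", "y")      # fixed attribute set, no per-instance __dict__

d, f = Dynamic(), Fixed()
d.x, d.y = 1, 2
f.x, f.y = 1, 2

d.anything = 3                  # allowed: the dynamic machinery is always on
try:
    f.anything = 3              # rejected: this class opted out of that flexibility
except AttributeError as e:
    print(e)
```

This is roughly the "declare what you won't use" direction, though it only covers instance attributes; the class itself can still be mutated, so the interpreter still can't drop most of its guard checks.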
I understand what you're saying. In a way, my comment is actually off-topic to most of yours. What I was saying in my first paragraph is that the words you use in the context of language-runtime inefficiency can also be used to describe why these inefficiencies exist in the context of higher-level processes, like business efficiency. I find your choice of words amusing given the juxtaposition of those contexts, even down to "you pay, pay, pay".
You claimed churning through Python interpretation is not "real work". You now correctly ask the question: what is "real work"? Why is interpreting Python not real work, if it means I don't have to check for array bounds?