I'd be curious to see how many clock cycles something as simple as "x = y + z" (where all three variables are integers) takes in various languages.
The compiled languages would likely output a single MOV and ADD and be done in a cycle or two (plus any time to fetch from memory). Something like CPython probably takes a couple hundred with all its interpreter dispatch and type checking. A JIT I would think would take much longer the first time the line gets executed (since it compiles the surrounding function or trace, not just that line), but then have a native MOV and ADD ready the next time, unless I'm completely misunderstanding JIT.
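A quick way to see why the Python case costs so much more than one ADD is to disassemble the bytecode for that line (this is just CPython's standard `dis` module; exact opcode names vary by version, e.g. BINARY_ADD before 3.11, BINARY_OP after):

```python
import dis

def add(y, z):
    # The single source line "x = y + z" compiles to several bytecode
    # instructions (loads, the binary op, a store), each dispatched
    # through the interpreter loop, with runtime type checks inside
    # the binary-op handler itself.
    x = y + z
    return x

dis.dis(add)
```

Running this prints the instruction list, so you can count how many dispatched bytecodes stand in for what a compiler would emit as one ADD.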