Perhaps, if you have a cost-approximation function that's alien to a human's.
Whatever that function is, for it to place more value on time than a human does, the agent must have less time available than a human. If this is a machine intelligence with a known lifetime of 52 days, then certainly. Otherwise, even something as trivial as saving $23.74 in fuel might be better value than the time savings it could expect to achieve.
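The lifetime-based tradeoff can be sketched as a toy calculation. The $23.74 fuel saving comes from the comment above; everything else (the goal value, the remaining lifetimes, the uniform pricing of hours) is a made-up illustration, not a claim about how any real system values time:

```python
# Toy sketch: an agent expecting to realize `total_value` over
# `remaining_hours` implicitly prices one hour at
# total_value / remaining_hours. All numbers are hypothetical.

def prefers_cheaper_route(fuel_saving, extra_hours,
                          total_value, remaining_hours):
    """True if the money saved outweighs the time spent."""
    value_per_hour = total_value / remaining_hours
    return fuel_saving > extra_hours * value_per_hour

# A long-lived agent (~50 years of remaining hours) prices an hour
# at roughly $2, so a $23.74 saving dominates a one-hour delay.
long_lived = prefers_cheaper_route(23.74, 1.0,
                                   total_value=1_000_000,
                                   remaining_hours=50 * 365 * 24)

# A 52-day agent prices the same hour at roughly $800 and refuses.
short_lived = prefers_cheaper_route(23.74, 1.0,
                                    total_value=1_000_000,
                                    remaining_hours=52 * 24)
```

Under these assumptions `long_lived` is `True` and `short_lived` is `False`: the shorter the horizon, the more each hour is worth, which is the asymmetry the comment points at.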
I don't think that's valid reasoning. Maybe the AI system has near-zero discounting, but when a human asks it to do something, the human obviously cares about how long it takes. It's plausible that this preference would get programmed in very early and perhaps forgotten about.