“Tend to” is a mathematical term meaning “get arbitrarily close to”. Formally, for any ε as small as you like, there exists an N such that every term beyond the Nth is within ε of the value being approached.
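A minimal sketch of that definition in standard ε–N notation, with a_n and L as assumed names for the sequence and its limit:

```latex
% epsilon-N definition of "a_n tends to L"
% a_n and L are illustrative names, not from the original text
\lim_{n \to \infty} a_n = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\; \exists N \in \mathbb{N} :
\; n > N \implies |a_n - L| < \varepsilon
```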
This is possible because we are sampling a probability distribution. A die is discrete, but the probability of rolling a 1 tends to zero as the number of sides increases, even though every individual die has a whole “1” face.
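A quick numerical sketch of that limit (purely illustrative): on a fair n-sided die the probability of rolling a 1 is 1/n, which shrinks toward zero as n grows.

```python
# Probability of rolling a 1 on a fair n-sided die is 1/n.
# It tends to 0 as n increases, even though every die in the
# sequence has exactly one "1" face.
for n in (6, 20, 100, 10_000, 1_000_000):
    print(f"{n}-sided die: P(roll a 1) = {1 / n}")
```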
An algorithm can't have that property for its expected runtime: if it runs in sub-constant time, it can only look at a sub-constant amount of the input. It's not possible for it to behave differently as n increases, because it doesn't have enough time to distinguish between different values of n once n is large enough.