My guess at an answer is that human beings are more comfortable thinking about numbers that are small integers (between 1 and 20 or so?), and that (roughly speaking) we often want to be able to give a bit more precision than you'd get from just "1" vs. "2".
So for baby growth, parents will talk about how many days old their child is for the first week or so, and then use "weeks" for the first few months, and then use "months" until they're around 2 years old. (There's also a real sense in which the pace of child development seems to progress on a sort of log scale: change is very rapid at first, but gradually slows down. The use of different age units seems to roughly parallel that.)
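To make the "small integers" idea a bit more concrete, here's a toy sketch (purely my own illustration, not anything rigorous): given an age in days, report it in the largest unit that still gives a value of at least 2 or so. The threshold and the rounded unit lengths are my own assumptions, but the rule roughly reproduces the days → weeks → months → years progression parents actually use.

```python
# Toy illustration: pick the largest unit that still yields a value of
# roughly 2 or more, so the number stays a smallish integer while carrying
# more precision than just "1" vs. "2".
UNITS = [("years", 365.25), ("months", 30.44), ("weeks", 7.0), ("days", 1.0)]

def describe_age(age_in_days: float, threshold: float = 2.0) -> str:
    for name, days_per_unit in UNITS:  # largest unit first
        value = age_in_days / days_per_unit
        if value >= threshold:
            return f"{round(value)} {name}"
    return f"{round(age_in_days)} days"

for days in (3, 25, 200, 1000):
    print(days, "->", describe_age(days))
# 3 -> 3 days, 25 -> 4 weeks, 200 -> 7 months, 1000 -> 3 years
```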
As an aside, this same human preference is presumably also why the English developed different units for (say) inches, feet, and miles rather than using one of those units for everything. [Side note: is there any common English unit between yards and miles? I grew up using "blocks", which is handy, but that's pretty city-specific.]
By "precision" I'm thinking more or less about "relative uncertainty". If you assume that an integer value is accurate to within +/- 0.5, then the percent uncertainty on 1 or 2 is so large as to make the information almost useless, while the implied uncertainty on a big number like 50 is probably smaller than is justified for most contexts.
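To spell that arithmetic out: under the ±0.5 assumption, the relative uncertainty is just 0.5 divided by the reported value. A minimal sketch of that calculation (my own illustration):

```python
# Relative uncertainty implied by a reported integer, assuming it is
# accurate to +/- 0.5 (the assumption made in the paragraph above).
def relative_uncertainty(value: int) -> float:
    return 0.5 / value

for v in (1, 2, 10, 50):
    print(f"{v}: +/- {relative_uncertainty(v):.0%}")
# 1: +/- 50%, 2: +/- 25%, 10: +/- 5%, 50: +/- 1%
```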
The furlong is, literally, the length of a furrow: a sensible length for farmers that later evolved into the acre, which is discussed later in this section. A standard furrow is 220 yards long, or ⅛ mile.