My guess at an answer is that human beings are more comfortable thinking about numbers that are small integers (between 1 and 20 or so?), and that (roughly speaking) we often want to be able to give a bit more precision than you'd get from just "1" vs. "2".
So for baby growth, parents will talk about how many days old their child is for the first week or so, and then use "weeks" for the first few months, and then use "months" until they're around 2 years old. (There's also a real sense in which the pace of child development seems to progress on a sort of log scale: change is very rapid at first, but gradually slows down. The use of different age units seems to roughly parallel that.)
As an aside, this same human preference is presumably also why the English developed different units for (say) inches, feet, and miles rather than using one of those units for everything. [Side note: is there any common English unit between yards and miles? I grew up using "blocks", which is handy, but that's pretty city-specific.]
By "precision" I'm thinking more or less about "relative uncertainty". If you assume that an integer value is accurate to within +/- 0.5, then the relative uncertainty on 1 or 2 is so large as to make the information almost useless (+/- 0.5 on a value of 1 is a 50% error), while the implied uncertainty on a big number like 50 is probably smaller than is justified for most contexts.
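To make the arithmetic concrete, here's a minimal sketch of that idea; the function name is just illustrative, and it assumes the +/- 0.5 reading error described above:

```python
# Implied relative uncertainty if an integer reading is only
# accurate to +/- 0.5 (the assumption made in the comment above).
def relative_uncertainty(value: int) -> float:
    return 0.5 / value

print(f"{relative_uncertainty(1):.0%}")   # half the value itself
print(f"{relative_uncertainty(50):.0%}")  # a far tighter implied bound
```

So "1 year old" carries a 50% implied slop, while "12 months old" is implicitly accurate to about 4%.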
Literally, the length of a furrow: a sensible length for farmers that later evolved into the acre, which is discussed later in this section. A standard furrow is 220 yards long, or ⅛ mile.
- Clothes are sized in months: 0-3, 3-6, etc.
- During doctor visits you discuss developmental milestones expressed in months.
Etc. You get used to it, since at that age the development of a child is extremely condensed and years simply don't provide enough resolution.
- 1.0833 years old: 13 months
- 1.4167 years old: 17 months
- 1.8333 years old: 22 months
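The conversion behind that list is just division by 12; a quick sketch (the helper name is my own, not anything standard):

```python
# Convert an age in months to a decimal age in years,
# rounded to four places as in the list above.
def months_to_years(months: int) -> float:
    return round(months / 12, 4)

for m in (13, 17, 22):
    print(f"{months_to_years(m)} years old: {m} months")
```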
So, is it easier to use years on the clean decimals and months whenever it gets hairy, or to just settle on months?