If I make it an int, then on a tiny machine it might only be 16 bits. That means the vector can only hold 64K elements. But that's probably OK: if I'm playing with a chip where ints are 16 bits, it probably doesn't have the memory or address space to hold more elements than that anyway, nor any need to.
But if I'm on a supercomputer, int might be 64 bits, and a vector might genuinely need to hold that many elements. That is, the size of int tends to scale in the same neighborhood as the other capabilities of the machine.
If I have to decide how big to make the size type of a vector, what do I pick? int64_t? That works for the supercomputer people, but it means an 8051 has to do 64-bit operations just to use my vector package. That's... less than ideal.
In fact, though, I would think that an 8051 vector would need to be profoundly simpler than a supercomputer vector; not just differently sized. When you're dealing with 1-16k of RAM and not much more ROM, a full "vector package" isn't what you need, but rather a really, really limited vector package that only does exactly what you need and no more.
In fact, you probably want a version with an 8-bit "length" value -- or even a 7-bit length with some other relevant flag stored in the final top bit, for space optimization reasons.
Source: I was the lead developer on an original Game Boy game, which ran on a CPU that, IIRC, was roughly an 8053 (it had instructions somewhat related to an 8080 or Z-80, lacking all the 16-bit instructions, but adding an 8-bit fast-memory operator reminiscent of the 6502). Back in those days you didn't have "packages." You had code snippets (in assembly language, of course) that you shaved bytes off of to make them fit. And then you shaved more bytes off.
[edit: clarity, and note that it's the Game Boy I was talking about; originally said Game Boy Advance, which was an ARM CPU. Also did a game on that device, but it was the Game Boy I meant to describe.]
Concretely, the per-platform choice can be expressed as a conditional typedef (the cutoffs here are illustrative; a 32-bit case would slot in as another #elif):

#include <stdint.h>
#include <limits.h>

#if UINT_MAX <= 0xFFFFu              /* 16-bit int targets */
typedef uint16_t vector_size_t;
#elif UINTPTR_MAX > 0xFFFFFFFFu      /* 64-bit targets */
typedef uint64_t vector_size_t;
#else
#error "Unsupported platform."
#endif