I haven't done C in a while, but I lost track of how many times I had to create a `struct array` just to ensure I always passed the pointer along with its length - it always felt like I was doing something wrong. Maybe I was. But the prevalence of overflow errors that could be fixed by size-checking makes me think there's something missing in the language or standard library that could have helped a lot for many years.
I never had a size issue with strings/arrays in a Turbo Pascal application.
That is, you have an allocator that views memory as a free list of tracked sizes, with the exact same memory loaned out to other parts of the system for various other uses, some of which are other arrays.
I think the problem is that you have to keep the array and its length together and always remember to call the appropriate array functions with the appropriate lengths. This isn't made easier by the fact that many string/array functions have variants that expect (and only respect) null-terminated strings (i.e. where you don't keep a length at all and rely on a terminal null byte).
The language could have decided that the first byte is the length and used that convention in string/array functions, but instead we have the convention you point out that the array is really referenced only by its first element and array-indexes are just sleight-of-hand for pointer addition.
Consider: in C, you can have an array and pass it in two parts to two different functions, each getting a different length. You couldn't do that if the array had its length as an intrinsic property.
You could have split the function space into operations that work on parts of an array and those that work on a full array. So I'm not claiming this as a full blocker, and I don't know which is ultimately better. We know where we ended up. It seems easy to speculate we could have avoided a lot of errors; I just don't know if I agree it's guaranteed.
I guess that makes the point that what they chose is probably correct, but it's still quite disappointing that the stdlib doesn't do more to prevent you from shooting yourself in the foot with pretty common operations and data types.
Interesting data point that in 1989, it was still a novel idea that buffer overflow was related to security.
> To encourage people to pay more attention to the official language rules, to detect legal but suspicious constructions, and to help find interface mismatches undetectable with simple mechanisms for separate compilation, Steve Johnson adapted his pcc compiler to produce lint [Johnson 79b], which scanned a set of files and remarked on dubious constructions.
Dennis M. Ritchie, https://www.bell-labs.com/usr/dmr/www/chist.html
When you read papers from the 60s, 70s and 80s it's amazing how many things we think are new today have already been discussed back then.