alloca isn't so bad if you have stack guards, which Microsoft has used extensively for a long time [1]. Unix has historically lagged behind here, so alloca has been a genuine security risk on Unix, and that is where the "alloca is evil" advice comes from.
There is also the fact that alloca() permanently costs you a register in your function: because the stack pointer moves by a runtime-determined amount, the compiler has to keep a dedicated frame pointer (ebp) to address locals. This matters on register-starved 32-bit x86 but is rarely discussed.
There is essentially no implementation difference between VLAs and alloca().
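A minimal sketch of that equivalence (alloca() is nonstandard; <alloca.h> is the usual header on glibc/BSD systems). Both functions below typically compile to the same runtime stack-pointer adjustment:

    #include <alloca.h>   /* nonstandard header; common on glibc/BSD */
    #include <string.h>

    void with_vla(size_t n) {
        char buf[n];               /* VLA: stack space sized at runtime */
        memset(buf, 0, n);
    }

    void with_alloca(size_t n) {
        char *buf = alloca(n);     /* same stack bump, different spelling */
        memset(buf, 0, n);
    }

One real difference: the VLA's storage is released when its enclosing block ends, while alloca()'s storage lives until the function returns.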
No, I think alloca was considered evil before Elias Levy (Aleph One) wrote "Smashing The Stack". I bet if you go track down the 'alloca' man page from FreeBSD 2.0, you'll find that its use was "discouraged" even back then.
There is no way to recover from a failed VLA allocation.
Also, VLAs are optional in C11, and may not be supported in otherwise-near-C99 compilers for small processors. (Of course, alloca() is not standard at all.)
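Since C11, an implementation that omits VLAs must define the feature-test macro __STDC_NO_VLA__, so portable code can guard on it. A sketch of that guard (process is a hypothetical function name):

    #include <stdlib.h>

    void process(size_t n) {
    #ifdef __STDC_NO_VLA__
        double *buf = malloc(n * sizeof *buf);  /* no VLA support: use the heap */
        if (buf == NULL)
            return;
        /* ... work with buf ... */
        free(buf);
    #else
        double buf[n];                          /* VLA is available */
        /* ... work with buf ... */
    #endif
    }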
> There is no way to recover from a failed VLA allocation.
There's no way to recover from any failed stack allocation, VLA or not.
    #define TOO_BIG 1000000

This:

    {
        double big_array[TOO_BIG]; // not a VLA
    }

is no safer than this:

    {
        size_t size = TOO_BIG; // size is not a constant
        double big_vla[size];
    }
If you use VLAs (variable length arrays) in a manner that permits arbitrary unchecked sizes, then yes, they're dangerous. If you carefully check the size so that a VLA is no bigger than a corresponding fixed-size array would have been, they don't introduce any new risk (a sketch of that checked pattern follows below).
(I'm not sure why VLAs were made optional in C11, at least for hosted implementations. Support for them was mandatory in C99, and I'm skeptical that they impose any significant burden on compilers.)
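A minimal sketch of the checked pattern, assuming a bound the fixed-size array would have used anyway (MAX_LINE and copy_line are illustrative names, not from the thread):

    #include <string.h>

    #define MAX_LINE 4096  /* the bound a fixed-size buffer would have used */

    int copy_line(const char *src, size_t len) {
        if (len == 0 || len > MAX_LINE)  /* reject sizes outside the fixed bound;
                                            a zero-length VLA is undefined behavior */
            return -1;
        char buf[len];                   /* VLA, now provably <= MAX_LINE */
        memcpy(buf, src, len);
        /* ... use buf ... */
        return 0;
    }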
It's a bit less evil because it's a declaration instead of pretending to be a function, but a lot of the criticism works for either.
Note that the overarching theme of C99 was making C a better language for numerical code (i.e., competing with Fortran), hence the introduction of VLAs (as well as _Complex, tgmath.h, restrict, and possibly other things I forgot).