I think that, for long-term security, we need devices that are resilient to orchestrated sabotage by their own vendor. Apple's current approach is great, right up until Apple is compromised in one way or another.
I have an idea for how this could be done (TPM-resident signing keys on each device, which must sign all binaries before they can execute). I might end up writing a blog post about it.
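To make the idea concrete, here is a minimal sketch of the sign-at-install, verify-at-exec flow. It's illustrative only: a plain in-process secret stands in for the TPM-resident key (in the real design the key never leaves the TPM, and signing would go through something like tpm2-tools rather than HMAC), and the "kernel-side" check is just a function call.

```python
# Sketch of per-device binary signing. DEVICE_KEY is a stand-in for a
# TPM-resident key and would not be extractable in the real design.
import hashlib
import hmac

DEVICE_KEY = b"secret-held-in-tpm"  # hypothetical placeholder

def sign_binary(data: bytes) -> bytes:
    """Device signs a binary once, at install time."""
    return hmac.new(DEVICE_KEY, hashlib.sha256(data).digest(), "sha256").digest()

def may_execute(data: bytes, signature: bytes) -> bool:
    """Kernel-side gate: only binaries signed by this device may run."""
    expected = hmac.new(DEVICE_KEY, hashlib.sha256(data).digest(), "sha256").digest()
    return hmac.compare_digest(expected, signature)

binary = b"\x7fELF...fake binary contents"
sig = sign_binary(binary)
print(may_execute(binary, sig))          # unmodified binary passes -> True
print(may_execute(binary + b"!", sig))   # tampered binary is refused -> False
```

The point of binding the key to the device (rather than to a publisher) is that a compromised vendor can't push a binary that any given machine will accept without that machine's own key signing it first.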
My experience with AppLocker is that it doesn't really work. As high-school students we would trade ways to break it so we could play games on our school-issued laptops (which had AppLocker enabled). If high-school students could figure out how to bypass it, I have no doubt there are more serious issues. In addition, I believe you can only whitelist based on:
1. Paths (like AppArmor).
2. Publisher (which I think is a signature, but it's the publisher's signature, not the machine's own -- so a compromised publisher could silently push you a bad update).
3. Hash (which is _okay_, but arguably requires more maintenance of the "good hash" list than requiring a specific signature -- though the nice thing about hashes is that you can disallow old ones).
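For contrast with the signing sketch, option 3 looks roughly like this -- a hedged, minimal illustration of hash allowlisting, where `approve` and `may_execute` are hypothetical names standing in for policy management and the enforcement hook:

```python
# Hash-based allowlisting: only binaries whose SHA-256 appears in the
# allowlist may run. Every legitimate update changes the hash, which is
# where the maintenance burden comes from -- but stale hashes can be
# removed to retire old versions.
import hashlib

allowed_hashes: set[str] = set()

def approve(data: bytes) -> None:
    """Add a binary's hash to the allowlist."""
    allowed_hashes.add(hashlib.sha256(data).hexdigest())

def may_execute(data: bytes) -> bool:
    """Enforcement hook: refuse anything not on the allowlist."""
    return hashlib.sha256(data).hexdigest() in allowed_hashes

old_version = b"game-v1"
new_version = b"game-v2"
approve(old_version)
print(may_execute(old_version))  # True
print(may_execute(new_version))  # False until its hash is explicitly added
```

Note the trade-off this makes visible: a signature scheme approves updates automatically (for better or worse), while a hash list forces an explicit decision per version.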
On Linux we have IMA, and there is quite a lot of work on using it to require signed-binary execution (from what I've heard in recent talks, it's still not there). But even with that, we'd need quite a bit of work to create an installer that bootstraps TPM-resident keys and signs all of the system binaries -- as well as requiring all new updates to re-sign them.
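As a rough sketch of what the IMA side could look like today (treat the exact flags as assumptions -- check your distro's ima-evm-utils docs): an appraisal policy that requires a valid IMA signature on anything executed, plus signing binaries with `evmctl`.

```
# /etc/ima/ima-policy -- require a valid signature to execute (sketch)
appraise func=BPRM_CHECK appraise_type=imasig

# Sign a binary's measurement into its security.ima xattr (run per binary,
# and again on every update; key path is illustrative):
#   evmctl ima_sign --key /etc/keys/privkey_ima.pem /usr/bin/example

# Boot with appraisal enforced (kernel command line):
#   ima_appraise=enforce
```

The gap the comment describes is everything around this: generating the key in the TPM during install, walking the whole filesystem to sign it, and hooking the package manager so updates re-sign what they install.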
I often wonder if the cat-and-mouse game of high-school IT restrictions is an underhanded way of training the next generation of security professionals.