Separately, I’m surprised Apple didn’t do this with Swift. I thought one of the goals of Swift was to be a good language for everything from low-level code up to high-level scripting.
This should be dynamic, but it shouldn't be Lua; it should be something like JSON. Lua may be sandboxed, which makes it safer than arbitrary code, but an embedded interpreter is still an easier and bigger attack vector for hackers than a pure data format like JSON would be.
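A minimal Python sketch of the data-vs-code distinction being argued here (the config keys are made up for illustration):

```python
import json

# A JSON "config" is pure data: json.loads can only ever produce dicts,
# lists, strings, numbers, booleans, and None -- it never executes anything.
config = json.loads('{"theme": "dark", "max_items": 50}')
assert config["max_items"] == 50

# Feeding the parser code instead of data is simply a parse error:
try:
    json.loads('__import__("os")')
    parsed_code = True
except json.JSONDecodeError:
    parsed_code = False
assert parsed_code is False

# An embedded scripting language, by contrast, hands callers a full
# interpreter: every feature (and bug) of that interpreter becomes part
# of the app's attack surface, sandboxed or not.
```

The point isn't that sandboxes never work, just that a format which *can't* express code has a categorically smaller attack surface than one that can.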
For any given programming task, there is a fixed (minimum) amount of work to do to solve it correctly, and the programmer's design choices merely decide the distribution of that work among the environment, the compiler, the standard library, the third-party libraries, and the programmer herself.
Meaning that, to use the programmer's time most effectively per unit of task difficulty, the language should be designed so that the compiler does as much of the work as it possibly can. Such a compiler simply won't be small, and probably won't be fast either. (IMHO the best it can or should do in the face of a syntactically valid non-program is to give as many errors at once as are applicable, so that the programmer can fix them all in one go.)
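That "report everything at once" style can be sketched in a few lines; this toy validator (the function and its rules are hypothetical, not from any real compiler) accumulates diagnostics instead of bailing on the first failure:

```python
# Hypothetical sketch: a checker that collects every problem it finds,
# so the caller can fix them all in one pass instead of one per run.
def check_identifier(name: str) -> list[str]:
    errors = []
    if not name:
        errors.append("identifier is empty")
        return errors  # nothing further to check on an empty name
    if name[0].isdigit():
        errors.append(f"'{name}' may not start with a digit")
    if " " in name:
        errors.append(f"'{name}' may not contain spaces")
    if len(name) > 255:
        errors.append(f"'{name}' exceeds 255 characters")
    return errors  # an empty list means the name is valid

assert check_identifier("total") == []
assert len(check_identifier("1st value")) == 2  # both errors in one pass
```

Real compilers do the same thing at a larger scale: keep a diagnostics list, recover from each error, and keep going as long as recovery stays meaningful.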
Rust, of course, in the grandly oversimplified view, takes this tradeoff in the same direction as Swift, and I'm sure you're familiar with all the ways that Rust is better for it ... and even a few ways that Rust is worse for doing the opposite. :)
(Though, then again, there's miri, for those that value correctness over performance. Unless it too has changed names.)
Behavior changes of the OS should not be quiet and magical; they should be intentional, and the user should be notified (if they care).
Presumably both of these have more overhead than Apple wanted in this case, but it does seem odd.
For a core feature, it makes sense to have minimal dependencies with a minimal footprint so plenty of room is left for user applications. Remember that Apple probably has a bunch of skunkworks devices in their labs they hope to productize someday. And many of these might be tiny portable devices with minimal hardware capabilities, like Apple Watch (or remember all the different iPods?).
I've worked with SpiderMonkey and have known the team for about a decade now. I'm aware of the complexity :)
FTL is gone, I believe. But anyway, JSC is not just a JIT. If you're worried about the JIT, just use an interpreter.
JSC uses a hand-written assembly interpreter. I would be shocked if Lua beat it in performance.
> For a core feature, it makes sense to have minimal dependencies with a minimal footprint so plenty of room is left for user applications.
JSC has been shipping since the first version of iOS in 2007. Tons of apps use embedded WebKit already.
This may well be the reason.
My hope is basically that many more rules will be written and they will slowly move towards handling more general problems as well as a very large variety of specific cases.
> 2.5.2 Apps should be self-contained in their bundles,
> and may not read or write data outside the designated
> container area, nor may they download, install, or
> execute code which introduces or changes features or
> functionality of the app, including other apps.
Having a self modifying app is a nightmare from a security & privacy standpoint.
"Hypocrisy" gets thrown around far too often on the Internet, and if you reach for it you're almost always wrong: either it isn't actually hypocrisy at all (the word is not a synonym for "anything I don't like"), or it's a meaningless thing to say compared with more substantive complaints.
Another thing: hypocrisy has no bearing on the conclusion of an argument. You can be a hypocrite and still be logically correct. OTOH, you can be nothing like a hypocrite and still be wrong.
> 5.2.4 Apple Endorsements: Don’t suggest or infer that Apple is a source or
> supplier of the App, or that Apple endorses any particular representation
> regarding quality or functionality.
Apple owns the entire platform, so I have no problem with them having "root" privileges. Honestly, I'd rather have a closed platform with strict guidelines than the wild wild west that is Android.
Required review by Apple? That malicious code is restricted from doing much precisely by the sorts of things the OP is complaining about? That the potential economic return isn't as high because Apple can forcibly clean it off? That it's harder to be anonymous when $100/year has to change hands for a cert, and Apple wants to know who you are on some level, so it'd be at least somewhat harder to avoid them going after you? None of these is bulletproof, even combined, but they're not worthless toward that goal either. Are you actually asserting that the state of malware on iOS is identical to or worse than that of Android or Windows/Mac/Linux? That's not something I've seen supported before, but I'd read a good source if you've got one.
0. Trying to prevent average developers from shooting themselves in the foot too often by shipping apps that depend on external servers to run, then having a big security incident and Apple being blamed for allowing insecure third-party code to run.
1. Trying to keep tabs on devs pivoting too far without a re-review. Can’t sell a Farmville app that then becomes a bitcoin wallet which then steals said bitcoins.
2. Keep the quality of apps up by curating through review rather than allowing free, unlimited publishing of random/broken apps. It’s a small barrier to entry and proof that someone invested in making something worth publishing.
3. Malware for old/jailbroken iOS existed way back, but it’s not really a thing anymore, because 0-days mostly get either reported to Apple or sold/auctioned off.
This limitation is not about improving security, it's about preventing developers from creating ecosystems inside their apps.
But sure, you totally know how to sneak malicious code past Apple's security processes, and for some completely sensible reason you've neither used it for your own nefarious gain nor reported the exploit and claimed the fat bounty that would inevitably await you.