I have built lots and lots of stuff on mobile web. Never once was JavaScript perf the bottleneck on iOS. Never (especially since the rise of retina screens).
I think there is a lot of hyperbole around the whole JIT in webview thing from people who don't have a good grasp of where the performance problems are on mobile web.
This. To my knowledge, JS perf isn't an issue in any modern browser (even the crippled iOS webview). Improving the performance of DOM access and manipulation (not just on iOS) would be a much more fruitful enterprise and is something I would like to see all browser makers focusing on.
A million times this. XML, at the end of the day, is a language with only two types: strings and children. This is woefully inadequate if you want performance.
Take a 3D matrix transform, for example. It has to be marshalled from an array of floats into a string to be applied, and then parsed back into an array of floats to position a texture on the GPU. Such a huge waste...
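To make the round trip concrete, here is a minimal sketch (the element id and matrix values are hypothetical):

    // Floats -> string, just so CSS will accept the matrix:
    var el = document.getElementById('box'); // hypothetical element
    var m = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1]; // 16 floats
    el.style.transform = 'matrix3d(' + m.join(',') + ')';
    // The engine then parses that string right back into floats before
    // handing the matrix to the compositor/GPU.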
The only way any JIT will ever be allowed inside UIWebView is if UIWebView makes the jump to WebKit2, i.e. using a helper process to actually run the web content.
They might be able to handle it with a remote view controller, an existing iOS technology that allows a view controller to front an out-of-process application. This is used for the mail composer and Facebook sharing.
(I suspect there are some constraints on how a remote view controller can be used, and they'd have to do some additional work to shunt in NSURLProtocol and all.)
That's quite unrelated. That's a technology that allows for e.g. the Mail compose screen to run out-of-process while still compositing into the same layer hierarchy as the host application and receiving touches. This not only allows restricting access to Mail accounts to the daemon that runs the remote controller, but also prevents the host application from doing things like hiding the mail compose controller behind another view and forwarding touches to it, so a press on an unrelated button actually triggers an email send.
WebKit2 is done differently. The UIWebView still handles all the user interaction and whatnot, and it talks to the Web Process to send it requests (e.g. "load this URL", "run this JavaScript in the context of the page", etc). I'm not quite sure how the compositing of the web content works, but it's not the same system as remote view controllers.
I might not have been clear in my grandparent post. I wasn't proposing this as a means to WebKit2. (I don't know what WebKit2 is.) I was suggesting this might be a way to get JIT support in native apps with the current iOS WebKit, by running the entire UIWebView out of process.
However, I have no idea if there are enough hybrid apps out there with JavaScript performance issues to justify going to the trouble of doing this.
Ah. Well, it's a cute idea, but it won't work with UIWebView. Besides the fact that existing apps already rely on UIWebView not being remote (because they can muck about with various view properties of it, including direct access to its UIScrollView), UIWebView also exposes at least one synchronous API (-stringByEvaluatingJavaScriptFromString:), and synchronous APIs are no good when doing stuff out-of-process. It also has a delegate that runs in a synchronous mode (i.e. it's asked questions it has to answer by returning a value, instead of just being notified of events or using a callback-based response), and that's the same problem as synchronous APIs (because now the remote process has to block waiting on a response from UIWebView).
WebKit2 is an existing effort by Apple to do out-of-process rendering. Various desktop apps already use it, including Safari, Mail, and iBooks. It features new APIs for doing everything, because it's all asynchronous. If third-party iOS apps gain out-of-process rendering it will be through WebKit2 and not some attempt to port UIWebView to the remote view controller system.
Incidentally, the same synchronous issues with UIWebView that I mentioned above suggest that porting to WebKit2 is a non-backwards-compatible change. If Apple makes this change, I would guess that either they'll introduce a separate UIWebView class that's used for WebKit2 and leave the existing UIWebView alone, or they'll give UIWebView two modes, a WebKit mode and a WebKit2 mode, and whichever mode it uses will be selected at initialization time (and using a WebKit method on a WebKit2 UIWebView would be an error, and vice versa). Either way, apps will have to be rewritten to take advantage of WebKit2.
What is it that makes a synchronous call to an out-of-process view on iOS so bad?
Also, Apple has changed the internal subviews of system views before. In some cases they seem to provide a backwards-compatible subview layout mode, triggered by looking at the compiler/linker/SDK version, so existing binaries will keep running. I have certainly seen previously working code like this break merely by recompiling with a newer Xcode, as if that were taken as an opt-in to new internal layouts - but not always; sometimes even existing apps need rush updates if they have been assuming too much about undocumented subview hierarchies.
> What is it that makes a synchronous call to an out-of-process view on iOS so bad?
Because it blocks the main thread. Worse, it blocks the main thread doing something that has effectively no upper bound on how long it can take. The Web Process is blocked trying to handle a previous request to calculate Pi to a trillion digits in JavaScript? Oops, your synchronous call to execute more JavaScript will just have to wait. Indefinitely.
> Also, Apple has changed the internal subviews of system views before.
I'm not talking about internals. UIWebView directly exposes its scrollView as public API. And you can muck with the view properties of the UIWebView itself to make changes. For example, if you really want to, you could set your UIWebView to 50% alpha (and this is totally legit, not depending on internal subview hierarchies). Whereas remote view controllers explicitly prevent you from mucking with any aspect of the view hierarchy.
It would still be blocking the main thread if it were in-process? I'm not seeing the big deal here, at least if there's a 1:1 relationship between web views and their out-of-process counterparts. And it's not like iOS is running a lot of foreground apps simultaneously either.
Fair enough about the scrollviews, although maybe they could be proxied somehow.
I suppose you're right in that my example of calculating Pi only makes sense if you're sharing the web process with another app. But the real point, which I skipped in order to make what I thought was the more dramatic (but flawed) point, is that out-of-process requests have built-in latency that in-process requests don't. It's just plain bad design to have a synchronous API that talks to another process.
I thought the latest Safari (desktop at least) was already on WebKit2. And I know Safari on iOS 7 tends to run one cycle behind (correct me if I am wrong).
After further digging, it turns out WebKit2 has been a private framework on both OS X and iOS. But on OS X it was extremely unstable; it wasn't until Mavericks that the process-per-tab model came and stability improved. I guess Apple still needs to work on it for iOS due to its heavy memory usage.
Hopefully they will allow you to ship a browser that contains a JIT. Apple doesn't currently allow that, so most browsers perform pretty much the same on iOS.
Apple doesn't allow you to ship a browser with its own JS engine full stop, let alone JIT. So every browser on iOS is just a fancy UI over the hobbled Safari engine with no Nitro. So there's Safari on iOS which is full speed, and then every other browser on iOS which is hobbled speed. This is by design and unlikely to change.
This is the very reason why Mozilla refuses to release a Firefox version for iOS[1]. Mozilla would like to ship Firefox with its own rendering engine, Gecko, while Apple policy dictates that all browsers on iOS are required to use WebKit[2].
Doesn't that apply to Firefox OS as well? You can't write a JIT in JavaScript. So, for example, Google wouldn't be able to port Chrome with V8 to Firefox OS, as far as I know.
No, it doesn't. Unlike iOS, which is a full operating system, Firefox OS is basically a web browser. The apps are all web-based apps. Firefox isn't just one app of many that runs on the OS (as Safari is on iOS); Firefox IS the OS, and all the apps running on it are JavaScript/HTML/CSS.
So, while Google could probably build Chrome out of JavaScript and run it atop the Firefox OS engine, it wouldn't make sense to do so. Asking if you can is basically like asking if you can create Chrome as a Firefox extension on Windows. Sure, you probably could with enough effort, but it would be a bit pointless in terms of functionality, speed, practicality, etc.
Additionally, this is a bit of a straw man that Apple fans trot out when people accuse Apple of being anti-competitive with iOS. It's not an equivalent situation. An equivalent situation would be if Google forbade other browser engines from running on Android. But they don't. So we have a nice competitive browser ecosystem on Android, with solid browsers like Firefox that I use every day.
Firefox OS takes the view that the system you are writing to is the browser. It is the OS you are running. From that world-view, writing a JIT would mean writing a compiler that outputs asm.js and then executing that output. So if Google modified V8 to have a different backend, they could definitely run it on Firefox OS. This would be the same process they would need to go through to port it to any other system, i.e. modify the JIT to output a different kind of machine instruction. It just so happens that in this case the machine the instructions are for is the JavaScript VM.
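For the unfamiliar, here's a minimal sketch of what "outputting asm.js" means; the module below is illustrative, not anything V8 actually emits:

    // A tiny but valid asm.js module. The "use asm" pragma lets the
    // JS VM compile it ahead-of-time down to fast machine code.
    function AsmAdd(stdlib) {
      "use asm";
      function add(a, b) {
        a = a | 0;          // coerce parameters to int
        b = b | 0;
        return (a + b) | 0; // int result
      }
      return { add: add };
    }
    var add = AsmAdd(window).add;
    add(2, 3); // 5, run through the VM's optimized asm.js path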
Apple's case, if I understand it, is different, because you cannot generate code at runtime and hence you cannot compile something "just in time". Nothing about Firefox OS disallows this, so the two situations are not comparable.
AFAIK, iOS JavaScript has eval(), so you are free to run Firefox on iOS in the browser in the same way you can run Chrome on Firefox OS or Linux on iOS (http://bellard.org/jslinux/ boots in 10.4 seconds on my iPad).
Well, it's not really the same. Firefox OS is literally the OS. Demanding that Chrome be allowed to run its C++ code would be like demanding that Google make it so Firefox OS can be installed on Android phones, replacing the Android system. They don't do that; instead they offer developers the same facilities for writing apps that they reserve for their own apps. Gmail for Android is written using the same language and APIs that you can use for your own application.
Firefox OS provides the same thing. They don't allow you to just replace the OS, which in this case is the browser, but all applications, including their own, are written with the same APIs available to you, the developer. They aren't writing some of their own apps in C++ and then demanding everyone else write in JavaScript. Apple is writing its own JIT and then demanding that everyone else interpret, which is not at all equivalent.
The difference here is that the browser is the OS. So the OS is at a certain level which you cannot replace or step into willy-nilly. Think of it like Windows enforcing hardware access through its drivers, not allowing you to send whatever you want to someone's disk drive and possibly damage the device. In the same way, they enforce a level of memory protection through the runtime, and beyond that they are on the same footing as you. We accept these constraints when we run OSes all the time, and I don't see it as being much different here.
That doesn't actually rebut the point. After all, Firefox OS's browser doesn't include an HTML renderer and JavaScript engine written in JavaScript; it outsources this functionality to a sandboxed mode of the underlying C++ engine. If you want to use a different engine, it has to be written in JavaScript, which is of course completely impractical for performance reasons.
The reason is basically that you can't easily provide a guaranteed W^X [1] environment alongside a JIT in the same address space -- that is unless you write a ROP-based JIT.
There's nothing that prevents W^X and JIT from coexisting. All the JIT has to do is change the permissions on the generated code after generating it so that it becomes executable instead of writable. W^X just says that any given chunk of memory cannot be both writeable and executable at the same time.
What Apple does on iOS goes beyond W^X. They give you a one-way trap door, where you're not allowed to mark any memory as executable, ever, once it's marked non-executable. (Obviously, there is an exception made for the OS facilities that actually load your code from the disk, and an exception made for Safari's JIT, but us mere mortals don't get access to that stuff.)
As for the reason, it's hard to say. It's probably because of code signing and Apple's desire for control to prevent you from downloading and running unreviewed, unapproved code, although even that doesn't entirely make sense, because you can always embed an interpreter and download and run code in that.
They can coexist, but not in the same process. If there's a JIT running in the same process as my C-based program, it has the same permissions as I do.
Any secure JIT would have to exist out-of-process with extended permissions, and would require some sort of RPC back-and-forth as the code was shuttled between the two processes.
W^X is not some code word for "secure". It has a specific meaning, and that specific meaning is completely compatible with JIT code. All it means is that you have to call mprotect() after generating code.
One can debate the security implications of allowing programs to mark code as executable at all, but that debate is unrelated to W^X.
1. Preventing any app code from allocating memory and marking it executable: Apple doesn't seem to want to allow dynamically adding code to any App Store apps. See: no .dylib support, the earlier bans on all types of interpreters, etc. It would be too easy to get a "Loader"-type app into the App Store that could download and execute random app code from the internet that hasn't been vetted by the App Store review team. (Emulators, plugins, addons that call private APIs, etc.)
2. Limiting the vulnerability surface: reduce the risk of WebKit exploits targeting App Store apps (or even random non-WebKit exploits in the app's code allowing loading of shellcode).
You are correct. I'm (incorrectly) using W^X here to refer to Apple's W^X, which is effectively setting W^X on pages at image load time for preventing execution of dynamic code -- and possibly for security, but I imagine that is more of a bonus from their point of view.
It is -- code-signing is how they enforce W^X. Only code loaded from signed applications is given pages with execute permission, and those pages are denied write access.
mmastrac is correct, the problem with JIT is W^X. The default sandbox for iOS processes prevents marking any page as executable that has ever been marked as writable. Safari is not "authorized to use JIT" so much as it has a special sandbox that lifts this restriction. It's not about code-signing, it's about the security provided by W^X. If it was merely about code-signing, the JIT could simply codesign its output using a certificate provided by Apple for this purpose.
I wanted to get into iPhone development, and I kept looking back thinking I was inept at Obj-C (and I may well still be), but thank you for saying this. My ideas mixed local and remote content with web technology, and I quickly realized that the WebView was a second-class citizen; I would have needed to reimplement a whole browser to make what I wanted on the iPhone. So I switched to Android, where the WebView is a better object, although it would be nice to have an embedded Chrome-based web view.
I remember when everyone took Microsoft to task for their "private" APIs in Windows, deep IE integration, etc. to give their own apps an advantage, and forced them to level the playing field.
Why is no one up in arms about Apple doing essentially the same thing with Safari vs. UIWebView?
There's a chicken and egg problem here. The OP said "maybe this will allow us to make high performance apps" - given that it currently isn't possible, it isn't very surprising that you haven't seen any great, high performance apps.
As for preferring the desktop sites, I'm afraid you are in a minority there.
At what point do we stop calling it a "browser" and start calling it "a crippled, inflexible OS"? Relying on the browser as the operating environment in which to run any and every kind of application seems like a misguided path to take. At some point the browser would have to become a kind of middle-man between the underlying OS and the end-user. There are very few good reasons to prefer "web apps" over native apps in any environment.
I don't think anyone thinks the browser is the perfect medium for something like this. But native development is still extremely costly, because every platform has its own language and environment.
If all the device manufacturers come together tomorrow and propose a cross-platform native programming platform then I'm all for it. Until then we'll rely on HTML.
Every browser has its own quirks as well, not to mention the multitude of front-end technologies, back-end technologies, and everything in between.
That is, developing for the web doesn't magically make these fragmentation issues you attribute to native development disappear. The cost is lower because the quality expectations are lower, both in terms of the end-product and the developers putting the product together.
Put another way: if browsers become sophisticated enough to be an appropriate medium for native-like apps (more sophisticated than basic form data entry), what makes you think the cost of finding competent, quality developers to write these apps to the level of quality end users expect won't be higher than it is today?
No the problem is some people are simply detached from reality.
We use apps like Photoshop, Word, Logic, Lightroom, etc., and not once has anyone come close to replicating the full experience within a web browser. Performance was never an issue there, so why do people believe that once mobile browsers are faster, native apps will magically disappear?
> why do people believe that once mobile browsers are faster, native apps will magically disappear?
I'm not sure that I've seen anyone suggest that. In any case, your point assumes that every app has the level of complexity that Photoshop, Lightroom etc. do and that's clearly not the case.
The simple fact remains that mobile web performance is sub-par, so every mobile web experience you've seen so far has also been sub-par. That doesn't automatically mean that amazing mobile web performance will result in amazing mobile web apps, but without trying it we won't ever know.
I'm confused. Was I talking about websites? I'm talking about developing cross-platform mobile apps that utilize Apple's new proposed JS engine and WebGL. Apple's policies prevent apps from using their own JIT JS engine (i.e. V8). Outside of Safari, we are stuck using a non-JIT (non-"Nitro") version of JavaScriptCore.
As a consequence, performance of JS outside of Safari is notably poor, and developers choose to go the "native" route for the sake of user experience.
My point is that we can do better. Tossing WebGL into the mix wouldn't hurt.
The title makes me cringe. It makes it sound like this is the first JIT from Apple, when it's actually the "Fourth Tier LLVM" JIT. It goes on top of the current JavaScript engine, which is already pretty fast.
Safari is basically the worst browser right now. They have a firm toehold in mobile, obviously, but in terms of desktop it's the worst browser to support, even behind IE. Nice to see them finally getting with the times. Every other browser has had this technology for years.
> Nice to see them finally getting with the times. Every other browser has had this technology for years.
Citation needed? Safari's had a JIT JavaScript engine since 2008, and was one of the first (if I remember correctly). This is just a new version again.
> Safari is basically the worst browser right now.
Worst browser for whom? Developers or actual users?
As a dev, I don't prefer using Safari to build stuff, that's for sure. Chrome and sometimes even Firefox is much more pleasant in that regard.
But as a user, the overall browsing experience is usually quite good. There are three things that it does really well IMO.
* It renders text and websites beautifully.
* I don't care so much about artificial benchmarks - Safari usually feels like one of the faster browsers in normal usage and for things like switching between tabs and all of that normal stuff.
* It takes advantage of some newer OS X technologies for extending battery life.
The only missing ingredient as a user is the much smaller extension ecosystem for Safari.
This is just flat wrong. Safari is basically at feature parity with Chrome, which isn't surprising given that they only diverged recently. I haven't to my knowledge ever had to work around differences between the two.
Indeed, Safari has had JITed JS for longer than most browsers - this is just an update.
No, it's not a joke. You've identified two features that are in Chrome that aren't in Safari. There are a few others - the new filesystem APIs, Shadow DOM etc. - there are also a few features in Safari that aren't in Chrome, like MathML and I think CSS region support.
IOW, they aren't far off each other. Safari suffers from a much slower release cycle in particular - Chrome is usually a few features ahead due to this. Same with Firefox.
Pretty much every game-changing web feature comes to Safari years after Firefox and Chrome. Yes, they are good at some things, particularly CSS, but not at the types of things that allow new types of applications to be built.
Indeed. There are many missing features in Safari, including such basic stuff as allowing you to set the download attribute on hyperlinks. That makes it very difficult to write client-side apps for Safari (at least if you want to allow the user to save the output to disk). This is a personal pain point that's bitten me several times.
For those who aren't familiar with the problem, say you've written a cool web app that creates (just for example) ZIP files. You want to let the user save these to disk. In Chrome, Firefox, or Opera, this is dirt-simple. Put the ZIP data in a data URL, set the href attribute of the link to the data URL, and set the download attribute to "yourcoolfilename.zip" (or whatever you want). Boom. Done. User clicks the link, data downloads to "yourcoolfilename.zip". Totally streamlined.
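In code, the pattern is roughly this (zipBytes, a binary string holding your generated data, and the filename are placeholders):

    var a = document.createElement('a');
    a.href = 'data:application/zip;base64,' + btoa(zipBytes); // the data URL
    a.download = 'yourcoolfilename.zip';  // the download attribute
    document.body.appendChild(a);
    a.click(); // Chrome/Firefox/Opera save straight to that filename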
Not in Safari. The best you can do there is open up the data URL in a new window (which of course ignores any mime type you've set, so the user gets an ugly screen full of binary data) and tell them to use Save As to save the mess. Very nasty.
Chrome has had this feature for 20 (!) versions, Firefox for 9, Opera for 5. It even works on the Android and Blackberry browsers.
Even IE allows this (in a completely non-standard and convoluted way, naturally, but it can be done).
Why can't they just use V8 or SpiderMonkey? Why invest time in creating something that will be only marginally superior (maybe?) to current open-source technologies?
It's great that Apple has smart engineers working on this. It's trying out a new approach that neither V8 nor SpiderMonkey has: using LLVM as a 4th-tier JIT. This is how interesting things happen.
The world would be a poorer place if we had fewer independent implementations, as each has its own design and architecture.