Considering the popularity of Electron, if WebAssembly does get as fast as native, or at least 80% of the way there (and that's a really big if), it would easily become the cross-platform language for the vast majority of desktop applications.
For mobile we will always be dragged down by Apple and its reticence to embrace web technologies. It's 2019 and we can't even show an add-to-home-screen native banner.
"Bringing the web up to speed with WebAssembly" [PDF]: https://github.com/WebAssembly/spec/raw/master/papers/pldi20...
I mean, yeah, it might be a little easier to distribute, but it's kind of absurd to agree to throw away 20% of performance for a bit of convenience and then shed tears over the latest Intel chips delivering only a 5% improvement over the previous generation.
Plus there is another big argument. An app written in wasm is not just compatible with today's platforms (iOS, Android, macOS, Windows, etc.), but also with tomorrow's platforms. The lack of apps is probably what killed Windows Mobile and resulted in the current duopoly. If all apps become compatible with any platform that implements a standard, then there is a chance for a challenger to break the duopoly.
Don't get me wrong, this is all to the good. But does WASM have some special way to prevent the various platforms from implementing their own unique, special, and (of course) incompatible APIs?
It probably would've caught on more/sooner on the desktop if it had adopted a native-look-and-feel GUI toolkit earlier rather than later. Hopefully a WASM reincarnation of Java's cross-platform goals will prioritize that sort of look-and-feel integration.
Specifically, if all we knew about user-facing Java was that most Android apps are written in Java (or something that differs from Java only enough to try to avoid infringing the intellectual property rights of Java's owner), then we would expect many cross-platform user-facing apps to be written in Java.
But we have no need for such indirect evidence because we can directly count the cross-platform user-facing apps that are written in Java. You mentioned one, Minecraft, and I will add Eclipse, IntelliJ and its derivatives and Jin (a "client" used to connect to the Free Internet Chess Server and Internet Chess Club). Since I know how hundreds of user-facing cross-platform apps are implemented, and since four out of hundreds is not much, I stick to my claim that Java never caught on for writing cross-platform user-facing code.
Depends on how you define "platform". Android apps certainly run on multiple architectures (mostly ARM, but also x86 and MIPS). They also run on non-Android operating systems (namely: ChromeOS; theoretically-speaking, other operating systems could run Android apps, too, so long as Dalvik runs on those operating systems, though unfortunately this is not the case for most operating systems).
And sure, we can plausibly claim that Java didn't "catch on" compared to the sheer volume of user-facing programs written in C or C++, but by that logic macOS didn't "catch on" compared to the sheer volume of user-facing computers running Windows (which might indeed be true depending on how you define "catch on").
So from an absolute purist point of view, yes, but practically it will be as cross-platform as it gets.
Which is always the case. The more you abstract the less you control, the more you need to control the less you can abstract. We've played this game before with Java.
Combining all of the technologies in the web stack carries a performance hit far higher than 20%.
It's like personal finance. At the end of the day you are not going to 'balance the checkbook' by identifying the 2 places where you are misbehaving the worst. You have to get every aspect of your spending below a % of your overall budget or you will always be in debt. Starting with the big things can be very motivational but it's not a strategy that works (or rather, it works by accident).
The relevant part for this discussion is that if you fix the slowest thing (interpreting code), then the effects of all of the other sources of slowness are magnified. Everything else gets 'slower' by comparison (if you fix the thing that takes 50% of the time, then all of the things that used to take single-digit percentages jump to double digits). You'll eventually get near your goal that way, but the entire process is a game of whack-a-mole, and nobody will ever support you going back for the last 2% in some section of the code, leading to death by a thousand cuts. (And honestly, I've watched a lot of people interpret performance reports, and most of us can't even spot problems that small.)
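The magnification effect is easy to see with a toy model (all numbers here are made up purely for illustration):

```javascript
// Toy runtime breakdown (made-up numbers): each component's share of total wall time.
const before = { interpreter: 0.50, layout: 0.08, gc: 0.05, other: 0.37 };

// "Fix" the interpreter: make it 10x faster. Everything else is unchanged
// in absolute terms, so the new total shrinks.
const newTotal = before.interpreter / 10 + before.layout + before.gc + before.other;

// Layout does exactly the same work as before, but its share of the
// (now smaller) total nearly doubles:
const layoutShareBefore = before.layout;            // 8% of the old total
const layoutShareAfter = before.layout / newTotal;  // ~14.5% of the new total
```

Nothing about layout got worse; it just jumped up the profile because its neighbor shrank, which is exactly why the process turns into whack-a-mole.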
Anyway, there are good uses for WASM even outside the browser: plugins, for example, or when you have multiple architectures to deal with.
Sandboxing and deterministic memory consumption are a great fit in certain applications. I'd like to use WASM on relatively small embedded devices (even as small as 64-256 kB of RAM) to provide extensibility, for example.
Performance just doesn't matter all that much in so many cases.
You can also build hybrid apps, where you take advantage of web tech for the presentational stuff and still go fast in the backend.
Still won't be a panacea for the Chromium runtime.
It will if native WebAssembly applications don't ship in Electron.
It's probably inevitable that all WebAssembly applications will ship in Electron until the heat death of the universe, simply due to network effects and inertia, but that doesn't actually have to be the case. It being called "WebAssembly" doesn't mean it has to run in a browser.
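As a concrete example, WebAssembly runs fine in plain Node with no browser in sight. Here is a complete, hand-assembled module exporting an `add` function, with the bytes written out from the binary format (a minimal sketch; real modules would of course come from a compiler, not hand-written bytes):

```javascript
// A complete wasm module, equivalent to:
//   (module (func (export "add") (param i32 i32) (result i32)
//             local.get 0  local.get 1  i32.add))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: one func of type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" -> func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

const { add } = new WebAssembly.Instance(new WebAssembly.Module(bytes)).exports;
console.log(add(2, 3)); // 5
```

The `WebAssembly` global is part of the JS embedding, not the browser, which is why Node (and standalone runtimes) can run the same module.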
Want your command line application to run on your desktop and phone?
Want to mock the hardware you are creating on your PC, so you can unit test the controller software?
If we decide that the web stack is what WebAssembly runs in both on and offline, then we're stuck with the limitations of those formats forever, rather than at least trying to come up with something more fitting.
From writing a hybrid app and replacing all the native parts with it (so Node in Electron), to even writing everything with WebAssembly and using native GUI.
It is already possible to remove Node and Chrome from a hybrid app by using webviews and Swift on macOS and C# on Windows. We're developing a cross-platform app using that technique and it's awesome. With WebAssembly, the native part could be completely cross-platform with comparable performance.
The only project I know that does this is NodeKit but it seems to be a bit abandoned.
The problem isn't JS performance; the scripting performance crown was won over a decade ago. The issue is the DOM being slow. If you write bindings to a native GUI framework and write your app in JS, performance is going to be extremely acceptable. The biggest issue is that most JS devs aren't comfortable in that universe.
A great example is VS Code switching from HTML to Canvas for the terminal: performance increased severalfold. React Native is another example; it performs much better than a WebView even though it still uses JS. GNOME and GJS perform better than any Electron app out there while using way less memory.
I assume that when you say query string, you are talking about document.querySelector() rather than the actual query string in the URL. Modern apps using something like React almost never use those. Even with a lot of care, those apps will still be slower than apps written with native GUI frameworks.
Calculating: 1/1.43 ≈ 70%, which is already close to the 80% mark today.
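The same conversion works for any of the slowdown factors quoted in this thread (1.43× here, 1.9× and 1.75× from the paper's averages):

```javascript
// Convert an "Nx slower than native" factor into a fraction of native speed.
const fractionOfNative = slowdown => 1 / slowdown;

console.log(fractionOfNative(1.43)); // ~0.70, i.e. ~70% of native speed
console.log(fractionOfNative(1.9));  // ~0.53 (Firefox average from the paper)
console.log(fractionOfNative(1.75)); // ~0.57 (Chrome average from the paper)
```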
That sounds like a feature.
There’s enough advantage from being accessible via URL, usable on any device, live updating, etc. to compensate that for many uses. But using web apps daily for any serious work is a huge loss in resource use, usability, polish, ...
The one that makes me saddest though is that web apps seem to be getting worse over time. For example most of Google’s suite of web apps were dramatically better a decade ago.
Where can I find this mythical mindblowing app?
Sure, you can get to the same result eventually. But the tools to do so are better and more intuitive for making Android or iOS apps.
I’ve seen nice results and crappy results regardless of what particular stack it’s made in. IMO the biggest difference shows up in integration with the rest of the system. If you want to hook into the Files app or Siri Shortcuts, native apps certainly have a leg up.
All this stuff is just done now regardless of what tech you choose. The arguments against web tech are old. The platform has caught up.
It has not. There are no tools for the web approaching even a Delphi RAD or a Qt Creator ca 2001.
CSS layouts are a joke, and trying to create anything as complex as a modern app with CSS is a road of blood and tears (go ahead and use CSS to implement something like Sencha. Hell, start with any constrained layout available for most native frameworks and toolsets out of the box).
By "most native APIs" you mean a very small subset deemed more-or-less safe to execute in the browser (you won't ever have full access to, let's say, UIKit, or AVFoundation).
And even if animation is rendered on GPU, 1) if you have a lot of them, the browser will struggle, and 2) web animations are extremely limited, primitive, and extremely constrained by layout. Good luck not running into reflow issues for animations which are a breeze on the native platforms. And good luck working around these issues using only animations which won't trigger them.
> All this stuff is just done now regardless of what tech you choose.
There are people who build OSes using only assembly. It doesn't mean that tools or capabilities of assembly are anywhere near available for other tech.
I have seen the add-to-home-screen banner dozens of times and clicked it never. It would quite obviously be better for me if the banner didn't exist.
That's all assuming you trust the efficacy of the app store scanners of course, and my trust in google on that is pretty low.
Microsoft already does this: https://docs.microsoft.com/en-us/microsoft-edge/progressive-...
The banner being a factor… how? iOS has let users add web applications to their home screen for over a decade.
> We've all become very used to the App Store but honestly the experience is pretty crappy.
As opposed to every fucking site spamming you to install on your home screen being a stellar experience.
Further bonus: no unstable proprietary API to reverse-engineer and implement and it teaches the user that there's nothing specific to it, they can pin any site they commonly use to their home page whether that site provides specific PWA support or not, such that they can perform the action at any point they want.
The browser is simpler, the method is less scammy and the user is empowered.
It's not like these things are cheap, and plenty of developers wouldn't have a shot at successfully delivering an app without the infrastructure in place to support them.
It should come as a surprise to absolutely nobody that the vendor wants their cut.
If we are having an honest conversation about who gets a cut of what, we have to realise that web apps still have bills to pay and people getting their cut of something — credit card handlers, payments systems, web hosting (and that ain't cheap, especially when demand rises) on top of the cost of hosting an API and databases, and plenty of other costs.
That's what people are asking for. If you want to provide services, sure, charge for them, but also allow developers to not use your services in this case.
Apple's monopoly enforcement, both here and in HTML rendering, is one of the biggest causes of stagnation in tech right now.
When I run a web app, sure, I need to pay lots of different middle men. But I get to choose which ones I use.
Nobody thinks it costs nothing; it's the draconian 30% they object to. Other app stores offer similar services with a much lower cut:
There are ~1400 app releases in App Store per day (all of them need to be reviewed, checked for compatibility etc. etc.).
There are ~2 million apps which have to be hosted, checked, delivered etc.
Then there are push notifications. During WWDC'12 (yes, 7 years ago) Apple was pushing 7 billion notifications daily.
In comparison: Epic store has 4 apps.
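To put the notification figure quoted above in perspective, a quick back-of-the-envelope calculation:

```javascript
// 7 billion push notifications per day, averaged to a per-second rate.
const perDay = 7e9;
const perSecond = perDay / (24 * 60 * 60); // ~81,000 notifications every second
console.log(Math.round(perSecond));
```

That is a sustained average; peak load would be considerably higher, which is the kind of infrastructure the cut is supposed to pay for.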
Not really, any sensible benchmark would be "X vs. the most relevant point of comparison".
As an example, the article mentions suboptimal register allocation, but without a compile-time comparison there's no way to know if there really is a simple way to improve the WebAssembly implementations.
But that's beside the point. The paper isn't suggesting switching to Clang or to the algorithms Clang is using. Rather, it's treating Clang's output as an approximation of 'optimal' code generation for the given C code. There are various reasons to compare it to WebAssembly JITs:
- For one, the paper identifies specific reasons the JIT output is slower, which shows potential areas of improvement they could focus on.
- The comparison also provides a sort of upper bound on how much the JITs theoretically could be improved. It's only a weak upper bound; the upper bound of code quality achievable at the required performance levels is lower and would be more relevant, but there's no way to measure that.
- It also indicates the potential of adding a higher tier to the JIT that optimizes very hot code using slower algorithms.
- And finally, most entertainingly, the benchmarks help answer age-old questions like "can WebAssembly replace native code?" :) Or, at least, "for what applications can WebAssembly replace native code and have acceptable performance?"...
Do typical web assembly implementations cache the result of compilation?
(I just skimmed the paper, sorry if I missed something.)
To my knowledge, Cranelift in Firefox is still disabled by default. https://searchfox.org/mozilla-central/source/modules/libpref...
You're asking for unified, standardized, cross-vendor hardware + instruction sets.
Why not do this the other way around, though? "Compile" JS to WASM, then to x86/x64, and finally just execute on the bare metal.
I'm sure you could compile WASM into Japanese or French if you had to. Maybe then, our French-native CPUs would finally be able to process "sudo fais moi un sandwich". My point being, you're asking a very broad, very theoretical question.
Are CPUs these days not performing well enough for most applications? If so, is it necessarily due to the complex instruction sets powering those platforms?
int_19h is not asking a very broad, theoretical question so much as that it is a difficult question to answer, requiring specific technical know-how about CPUs and compiler design.
Saying that one can compile WASM into Japanese or French is not an answer to "is x86/x64 (or ARM) an ideal target architecture for WASM?", so it's irrelevant.
I guess a very tiny microcontroller could be a use case, where it's a good tradeoff to save some memory (= cost) by eliding the JIT even if execution is slow. Even then you could just replace this with offline binary translation.
> Root Cause Analysis and Advice for Implementers:
> We conduct a forensic analysis with the aid of performance counter results to identify the root causes of this performance gap. We find the following results: (1) code compiled to WebAssembly yields more loads and stores than native code (2.1× more loads and 2× more stores in Chrome; 1.6× more loads and 1.7× more stores in Firefox). We attribute this to reduced availability of registers, a sub-optimal register allocator, and a failure to effectively exploit a wider range of x86 addressing modes; (2) increased code sizes lead to more instructions being executed and more L1 instruction cache misses; and (3) generated code has more branches due to safety checks for overflow and indirect function calls.
A surprisingly large amount of this boils down to "x86 needs more registers", but there's quite a lot of detail here. It validates my personal experience that Chrome doesn't do a very good job relative to Firefox, for example.
> Code generated by Firefox has 1.15× more branch instructions retired and 1.21× more conditional branch instructions retired than native code, while code generated by Chrome has 4.13× more branch instructions retired and 5.06× more conditional branch instructions retired.
> Chrome executes 2.9× more instructions and Firefox executes 1.53× more instructions on average than native code.
> On average, Chrome suffers from 3.88× more L1 instruction cache misses than native code, and Firefox suffers from 1.8× more L1 instruction cache misses than native code.
Overall, it was always obvious that Chrome's deficit compared to Firefox was just engineering work, but what is new is figuring out how difficult the native-code deficit would be to close. Some aspects are quite straightforward: browsers probably need to develop a more aggressive tier-2 WASM JIT, now that both browsers have a tier-1 baseline that is very fast to compile. This should include:
1. a better register allocator,
2. better peephole optimizations,
3. sophisticated loop optimizations.
Unavoidable slowdowns might include:
1. register pressure from reserved registers,
2. stack overflow checks,
3. function table bounds checks.
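As a sketch of what the "peephole optimizations" item means in practice, here's a toy pass over a made-up three-address IR. Everything here is hypothetical and has nothing to do with any real engine's internals; it just shows the shape of the technique: scan instructions, pattern-match locally, rewrite to something cheaper.

```javascript
// Toy IR: instructions are { op, dst, a, b }. Two classic local rewrites.
function peephole(instrs) {
  return instrs.map(ins => {
    // Strength reduction: x * 2  ->  x + x (an add is never slower than a mul)
    if (ins.op === 'mul' && ins.b === 2)
      return { op: 'add', dst: ins.dst, a: ins.a, b: ins.a };
    // Identity folding: x + 0  ->  a plain register move
    if (ins.op === 'add' && ins.b === 0)
      return { op: 'mov', dst: ins.dst, a: ins.a };
    return ins; // no pattern matched; leave the instruction alone
  });
}

const out = peephole([
  { op: 'mul', dst: 't0', a: 'x',  b: 2 }, // becomes add t0, x, x
  { op: 'add', dst: 't1', a: 't0', b: 0 }, // becomes mov t1, t0
]);
```

A real tier-2 pass would run many such patterns over machine-level IR after register allocation, but the control flow is the same idea.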
Is this line a typo? I can't make sense of it; I read 1.5 and 1.9 respectively from the table. (There's another mistyped sentence starting ‘Clang,’ but it is not a major issue.)
> On average WebAssembly in Firefox runs at 1.9× over native code and in Chrome runs at 1.75× over native code.
It's also the case that there's a lot more pressure on JS JITs to produce code fast, and so running multiple passes of low level optimizations was not desirable in a web context. This is somewhat bad news for WASM: better low level optimizations could mean longer compilation times. However, AFAIK, Chrome has been working on caching compiled code, so you may only have to compile WASM when it changes (or when you reinstall/update your browser).
Unless you have customers or want them, because Firefox's market share is generally too small to base your entire business around.
Nah, useless functionality. I've got feedly if I'd like to keep track of their news.
That being said, I work in ad arbitrage, so I understand the presumed efficiency of these notifications. As they say, our business is middle-aged Americans who can't use the Internet.
I know what you mean, but that's usually traditional news sites and the like. Having the Spotify web app notify you of the currently playing song, or a reminder app reminding you of an appointment, is of course a different case; hence the "web site" vs. "web app" distinction I made.
But that's beside the point: IMO the semantics of these notifications differ severely between applications. Moreover, their usefulness heavily depends on the ability to integrate them into the whole ecosystem.
Thus notifications from news websites fit right in: they don't need a snooze button like the one in the OS X Calendar app. But they are also the least useful.
So we are back to square one: if you allow notifications, then what kind? Do you allow arbitrary HTML payloads or only some?
That being said, my reply was about the original point: I understand why Apple doesn't like them, because they are generally useful only for a small number of use cases, 99% of which are spam. And the average user is going to blame Apple for allowing him to accept them in the first place.
The reason I feel web notifications are more problematic is the nagging aspect that I think would happen (like, for example, the 'install our app' interstitials and prompts). That's annoying enough without adding notifications to the mix.
It's the new IE, I'm constantly implementing workarounds for Safari.
Add to home screen is a good example
iOS 12.2's main updates are with regard to state management and authentication flows for PWAs.
PWAs have been around since 2015, so mid-2018 does indeed seem pretty fast for Apple to implement them.
I might believe it all when they have approved third-party web rendering engines, which is, to be honest, the only thing that will change my opinion about it. Implementing it very slowly is just a little better than not implementing it at all.
And I don't think Apple will ever do that. So it's been a rock-steady argument since 2010.
If the user doesn’t use the app for a few weeks, iOS will free up the app’s files. The icon will still be there on the home screen, and when accessed the app will be downloaded again
So much for PWAs on iOS.
They even auto-delete PWAs:
If the user doesn’t use the app for a few weeks, iOS will free up the app’s files. The icon will still be there on the home screen, and when accessed the app will be downloaded again.
Not very trustworthy for using a PWA on iOS it seems :)
I'll admit it's been little more than a fancy way to add a bookmark to your home screen with the option to hide the browser chrome, but it's been there for a long time.
Browsers have been working hard to make people not want bookmarks, but everybody I show how to get them just loves them.
(But well, they would be better if the mobile web didn't suck.)
Websites can show an 'add to homescreen' button.
And no, asking the user for confirmation is never sufficient to stop unsuspecting people from accidentally causing their own compromise. It's a trust thing. Once they trust one dialogue from a legit website, they'll trust it over and over.
Let web sites leave the home screen on my phone the same way web sites have always left my bookmarks/favourites: alone
Edit: It looks like Chrome may have changed this, what I’m referring to above was the initial implementation. I think it’s a reasonable objection to say this should be under the control of the browser. But Safari could just implement the former without the latter.
My main gripe with Safari, at least in my use of web apps, is that it doesn't allow users to install PWAs without the browser chrome, and its implementation of Service Workers is subtly broken. Those are the biggest usability issues.
If a user goes out of their way to go to the share menu, scroll along and click ‘add to homescreen’ why not allow them to specify to use what is clearly an app-like link without browser chrome? And why implement Service Workers but not according to the standard?
Having said that, if PWA support in iOS moves forward I wouldn't be surprised if that API is added to iOS, since then it'd be part of a bigger-picture thing.
Technically not quite the very start (but it was added in iPhoneOS 2.1 so close enough).
We already have a "X company has an app" banner API on iOS, it seems fitting that webapps have an option to match.
And the Safari provided "X company has an app" banner is a lot less annoying than any third party "HEY WHY AREN'T YOU USING OUR APP" banners in that it goes away when you scroll down instead of being fixed to the top of your screen. If a native feature can keep people from making a shittier version and sticking it all over, then I vote yes.
Having the native API to do something doesn't make the third-party crap banners go away.
I guess the real question is "If we didn't have smart banners, would even more websites use obnoxious full screen popovers and fixed position headers that pop back up every time you visit the page?" I think the answer is yes, fixed position headers are a really easy/lazy thing to tack onto a webpage, and not bothering to persist the fact that you already closed it 10 times is easier than saving that preference and checking it.
If you want people to consider doing it the nicer way you have to make it just as easy.
I think the answer is no: smart banners only increased the pop-over spam, and adding a smart banner is pretty much free, it's a single line to add to a header and is guaranteed to only trigger on the relevant / target population. A hand-rolled version not only is significantly more expensive to implement but it will misfire and lose visitors.
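For reference, the "single line" in question is Apple's documented Smart App Banner meta tag. The app id and URL below are placeholders:

```html
<!-- Safari shows its native "View in the App Store / Open" banner when this tag is present. -->
<!-- app-id is a placeholder App Store id; app-argument optionally deep-links into the app. -->
<meta name="apple-itunes-app" content="app-id=123456789, app-argument=https://example.com/page">
```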
What? It's a single tap to open the share sheet and the button is in the system row "above the fold" (because there aren't even enough items for anything to even be below the fold). If that's "buried deep" I'd shudder to know what you think of uninstalling an application or adding a keyboard.
And in answer to your last point, I'm not optimistic about the average user's ability to add a keyboard either.
You mean the thing that's been there since iPhoneOS 2.1? (https://appleinsider.com/articles/08/10/03/latest_iphone_sof...)
Or are you just whining about other implementors not adopting Chrome's proprietary crap wholesale?
So much for PWAs on iOS... Apple's willingness to embrace web technologies is a joke.
The way they banned Steam from streaming games, by retroactively changing App Store rules, makes it appear the motivation is to increase App Store revenue by crippling the iOS browser.