The elephant in the room is download size. A wasm photoshop, even if it works and performs well, is still a multi-gigabyte "web page". The browser is in no way set up to handle that.
wasm as it's implemented is going to have a very narrow band of usefulness. Basically, isolated computational modules (e.g. a physics simulation), and games. The games won't be, like, "Call of Duty on the web", they'll be little flash-type games with not too many assets. Creators will have to put a lot of effort into asset loading and compression to get around browser limitations.
This just expands on what can be done now. No one is claiming this will fix everything, it will just allow more than we have now and leverage many of the advantages of native development.
As for Photoshop, why does a Wasm application need to be structured the same way as a desktop app? Desktop apps shipped all the software in a single download because that download came from a disc or was expected to be used after being disconnected from the Internet. Why not download the functionality piecemeal? Download a small set of libraries that enable core functionality, then download modules as they are requested. When the user clicks a menu or button, go get the code that makes the functionality behind it work. I am not saying break up every button, but if there is a new screen or group of functionality, make that a module and go get it when needed (see the rough sketch after the next paragraph).
In a video game it could be broken into levels or regions of the in-game map. In gaming, paged loading is a solved problem, so it seems like it could be applied here: the next level or nearby regions can be downloaded while the player explores the current one.
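A rough sketch of what that on-demand loading could look like with today's emscripten tooling, assuming the app is linked as a main module and the optional feature is built separately as a side module (the URL, file name, and exported symbol below are made up):

    #include <emscripten.h>
    #include <dlfcn.h>
    #include <stdio.h>

    /* Runs once the side module has been fetched into the virtual filesystem. */
    static void on_module_fetched(const char *path) {
        void *handle = dlopen(path, RTLD_NOW);
        if (!handle) { printf("dlopen failed: %s\n", dlerror()); return; }
        /* "run_blur_filter" is a hypothetical symbol exported by the side module */
        void (*run_filter)(void) = (void (*)(void))dlsym(handle, "run_blur_filter");
        if (run_filter) run_filter();
    }

    static void on_fetch_error(const char *path) {
        printf("failed to download %s\n", path);
    }

    /* Hooked to the hypothetical "Blur" menu item: nothing related to this
       feature ships in the initial download; the first click fetches it. */
    void on_blur_menu_clicked(void) {
        emscripten_async_wget("modules/blur.wasm", "/blur.wasm",
                              on_module_fetched, on_fetch_error);
    }

After the first click the feature behaves as if it had been statically linked; repeat visits are subject to normal HTTP caching.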
So you split it into chunks and download the bits you need as you go.
Microsoft has already figured out how to do this -- you can run Office (the real, full Windows version) basically streaming from the Internet already.
at last, the renaissance of the microcomputer programmers (like the COBOL renaissance of the mainframe programmers)! that kind of technique was common back in the 80s to get things done
While it seems early, I can't name many web standards that Microsoft, Google, and Mozilla have agreed on and implemented the same way.
I disagree with the notion that wasm, or any other tech will be a failure because it can't succeed at being all, or nothing.
There's a lot of problems that aren't being solved in the browser that will now start to become possible.
Unity games can be compiled directly to wasm, and it's still early: http://webassembly.org/demo/Tanks/
The ability to create rich experiences that are code once, deploy everywhere, sans browser interpretation differences, is a big deal.
A wasm photoshop would cache the download, use a HTML front end, and be written in a compiled language.
Sure, you'll be able to use the tech to do stupid things like running a Python interpreter inside a WebAssembly virtual machine. But then your argument is just that if you use the tech to do stupid things you'll get stupid results. Total straw man.
Given an enormous WASM binary one could write a small bootstrap binary that fetches the binary and stores it in local cache. Future updates to the mega-binary can be done with some kind of delta-diff mechanism.
Now, let's say you rely on Qt in some manner. Well, likely the WASM package resides at some known URI and is subject to standard browser caching.
There is already a movement towards using the DOM for web game UI instead of canvas, for example this recent article makes some compelling arguments: https://blog.pocketcitygame.com/5-reasons-to-use-dom-instead...
Using web platform technologies can actually reduce the download size compared to native apps. In the game I am working on, written in C and compiled using emscripten, as a rough measure, the equivalent program compiled natively totals an executable size of about 2 MB, not including texture resource data. An optimized asm.js build is about 950 KB and WebAssembly only 580 KB; this includes the .html shell, .js loader, and the .wasm binary itself.
This is not a completely fair comparison because I compile out some native code not relevant to the web, and vice versa, but here are a few specifics of where I believe the gains may come from:
curl: the native C app uses libcurl for fetching resources from HTTP and HTTPS servers, but on the web we have XMLHttpRequest and HTML5 Fetch. Emscripten provides the built-ins emscripten_wget() and emscripten_async_wget() for these purposes. No need to ship HTTP and SSL stacks because the browser already includes them.
glfw and glew: libraries for wrangling OpenGL. Emscripten has its own implementations, which largely just bridge to WebGL and other HTML5 APIs, a very thin layer. SDL, too.
databases: many apps bundle their own copy of SQLite, often as the single-C-file amalgamation. I used to as well, even through emscripten, and it worked fine (there is even a pre-packaged emscriptenified sqlite.js). Admittedly I haven't looked into it much yet, but the web platform supports IndexedDB built-in, no extra dependency needed.
We may see a resurgence in "small C libraries" targeting WebAssembly. There is a growing trend of header-only libraries, especially the popular stb: https://github.com/nothings/stb#stb_libs and there is a growing list at https://github.com/nothings/single_file_libs that I consult to find tiny libraries appropriate for linking into a web-based C application. lodepng (for decoding PNG images) and miniz (for reading and extracting zip files) are about the only substantial dependencies I have beyond what is in the emscripten standard library. Everything else is provided by the browser; the web has grown into a surprisingly powerful and complete platform.
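A sketch of the kind of glue this ends up being, e.g. decoding a PNG with lodepng (the wrapper and paths are made up; lodepng_decode32_file and lodepng_error_text are the library's actual API):

    #include <stdio.h>
    #include <stdlib.h>
    #include "lodepng.h"   /* the single-file PNG decoder mentioned above */

    /* Hypothetical helper: the browser already handled the networking and the
       file is assumed to be in emscripten's virtual FS (e.g. via --preload-file);
       the single-file library only has to do the decoding. */
    unsigned char *load_texture_rgba(const char *path, unsigned *w, unsigned *h) {
        unsigned char *pixels = NULL;
        unsigned err = lodepng_decode32_file(&pixels, w, h, path);
        if (err) {
            fprintf(stderr, "lodepng: %s\n", lodepng_error_text(err));
            free(pixels);
            return NULL;
        }
        return pixels;   /* 8-bit RGBA, caller frees */
    }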
WebAssembly defines no new platform APIs other than some APIs for loading and linking WebAssembly code, relying on standards-based Web APIs for everything else. WebAssembly differs from asm.js by defining a bytecode format with some new operations JS doesn't have, so some spec work was required (and has been done!). Like asm.js, WebAssembly application call-stacks are maintained by the JS VM, outside the memory addressable by the application, which reduces the exploitability of application bugs. (Though, again like asm.js and unlike PNaCl, the compiler is trusted.)
What could an evil compiler really do? Maybe add a spinning loop to waste CPU cycles, but that can be done in JS too.
This is a real issue, but to some extent this extra attack surface is mitigated because vendors are reusing JS compiler backends that are already part of the TCB.
As the verifier for NaCl is likely to be an order of magnitude smaller than the component in a WebAssembly implementation that verifies and compiles the bytecode to native code, the PNaCl attack surface is much smaller.
And… well, you're right. It's the JIT's job to produce sensible code, and it's the browser's job to sandbox that well.
But what's wrong with that?
At least with WebAssembly it's a well-defined, small language that's easy to verify.
The JS/WASM VM does. If unsafe WASM code is allowed to execute, there is a bug in the VM; the VM must prevent semantically incorrect WASM from executing. Control-flow integrity and incorrect use of pointers are checked at load time, and there are runtime traps for invalid indexes into the index spaces, exceeded stack limits, and the like.
(I should not need to mention this, but compiling is a form of verification.)
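To make the trap behaviour concrete, here is a tiny hypothetical C fragment. Compiled natively, the bad call is a classic exploit primitive; compiled to WebAssembly, the indirect call is checked against the function table and its type, so it traps instead of handing control to attacker-chosen code:

    #include <stdio.h>

    typedef int (*op_fn)(int, int);

    static int add(int a, int b) { return a + b; }

    int main(void) {
        op_fn good = add;
        printf("%d\n", good(2, 3));      /* valid index in the function table */

        op_fn bad = (op_fn)0xDEADBEEF;   /* forged "pointer" */
        printf("%d\n", bad(2, 3));       /* native: undefined behaviour;
                                            wasm: trap at the call_indirect */
        return 0;
    }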
Not like the other one, though. You can statically verify NaCl code because it was designed for that to be easy. If WebAssembly doesn't do that, then it's a step down from NaCl, at least on this risk. That it matters should be obvious given all the research on eliminating that risk from the TCB by the kind of CompSci people who invented NaCl. It's part of a larger concept where you make the things generating the risky code untrusted, with the trusted checkers so simple they can be mathematically verified. Here's an example from a top group working on NaCl:
Why is this so?
Webassembly is an overcomplication. But, as always, I suppose we have to live with it.
Probably! I don't pretend to be able to predict what path companies would have taken over the next few years.
It's really good to see that there is a will to agree and that there is more than one player. There could always be more, which also helps keep standards sane. (WebSQL officially failed because there wasn't enough implementation variety.)
Do you mean ActiveX rather than DirectX? As in the OLE and COM technology for embedding user interface components on Windows?
DirectX was hardly a dark age, DX8 and onwards was a step ahead of GL.
I don't actually know too much about graphics programming. Stupid mistake from my side. Sorry!
I think this will lead to an explosion of browser-hosted applications with much more power than before, and will draw many more programmers into serious frontend app development.
Also, as a personal wish of mine, this enables the possibility of programming in the language of your choice both on the browser side and on the server side. For example Haskell/Haskell, Common Lisp/Common Lisp, Clojure/Clojure, Racket/Racket, Python/Python, etc. And I mean using in the browser the FULLY FEATURED version of the language, not a subset or a limited version like ClojureScript, PyJS, Transcrypt, etc., but a full version of the language supporting the full set of libraries available for it.
This also gives us a small step forward in liberating ourselves from being tied to the mainstream operating systems (Windows, OS X, Linux, BSD), because more and more apps will target the browser environment rather than the operating system directly.
Roll your own operating system and still keep compatibility with most apps!!
Does this mean, simply said, that I can take any arbitrary windows desktop app written in C++, for example, and run it in browser on every other OS?
It required 3 days of porting from what I remember hearing.
What follows is speculation:
At a guess, they feel burnt after they lost the closed-source DRM battle in the W3C spec process. They realize that something like WebAssembly will become a thing in the next few years, and that if they don't push super hard for a completely open solution from day one, then they're afraid that Google, Microsoft, and Apple will get together in a room and make a deal without them.
At the end of the day Mozilla don't only care about delivering a browser, they care about delivering a completely open browser, and they don't want their ability to deliver that to become more threatened in the future.
I think you will be surprised how many apps will use WASM in the future. I'm a C++ dev and working with WASM (and asm.js as a fallback) full time. It's going to be absolutely huge.
In our case we use the same library code on iOS, Android, Windows and in the browser to do computationally expensive operations.
Also, to dismiss productivity apps such as PS and CAD as unimportant is kind of crazy. Huge business model change? They are still an enormous part of the software industry and the browser a great platform for distribution for many cases. Performance and code secrecy being two barriers that WASM solves for them.
Imagine if every AAA game had a web demo?
Imagine real support for Peer to Peer video that worked in several browsers and platforms?
What could editors in browsers be? Not just text, but any kind of office suite that exists on the local machine today?
Is it? Since Adobe switched to the subscription model and the "Adobe Creative Cloud" I'd say they would be quite happy if that infrastructure would be good enough to run their very big application suite(s). Not to mention the savings of not having to support two major platforms (and several OS versions for each) - even realizing them only partially would be big. Of course, given the size of their product I'd say there is little use in talking about this at this point, the web platform would have to mature a lot more first.
If they do this right, I suspect it will be much more heavily used than that. I can already see a world where every major webapp is using this (indirectly, using a language that compiles down to this) for the performance and user experience improvements it could provide.
(Yes, I know Safari technical preview has support, but shipping Safari does not. We'll see whether Firefox ends up shipping support before Safari or not; the patches to implement <link rel="preload"> in Firefox got posted to https://bugzilla.mozilla.org/show_bug.cgi?id=1222633 earlier today.)
This type of thinking and behavior is going to make Chrome today what Internet Explorer was in the 90's: a toxic, dangerous platform that ignores standards.
If that was that critical, it'd be easy to point to e.g. webpagetest.org traces showing Chrome loading a site significantly faster. And, yes, you can find differences in microbenchmarks but it's pretty rare to find something where rel=preload is a game-changer.
If you actually run benchmarks, it's nice but hardly a game-changer, especially in the post-HTTP/2 world. If you're concerned about cold page load times most sites will see significantly greater benefit from using fewer blocking resources.
Same for modules, or Intersection Observer, or many other features that FF is not shipping. Stop trying to prove me wrong and understand my point. Firefox is slow on stuff that makes regular old web developers' lives easier, because they're chasing 3D games.
> Stop trying to prove me wrong and understand my point.
I think you should focus on making your point more clearly rather than defending what was clearly an overstatement. For example, you cite modules as something which is apparently a big deal for web developers but not shipped by Firefox. Sounds like Mozilla needs to get cracking … unless you know that only Safari has shipped it and that the Chrome, Firefox, and Edge teams all have it available, but behind a feature flag for testing:
That doesn't support your narrative that the Firefox team is ignoring this or that they're behind the market. And since anyone who isn't targeting only the latest version of Safari is either polyfilling or continuing to use their existing strategy, there's an upper bound on how bad that can be, too.
Similarly, with Intersection Observer you can see that it was enabled in FF50 but had stability issues which led to it being disabled; it is likely to be re-enabled in FF54 based on testing. Unless you have some evidence that the developers who were working on that were pulled off to work on WebAssembly, it doesn't seem like an especially compelling argument.
Again, I'm not saying that any of these things are useless — only that the narrative you're insisting on where Mozilla is ignoring web developers doesn't seem to be well supported by the evidence. At least for the projects I work on, I'd level that criticism at Safari or Edge first and in the much fewer cases where either Chrome or Firefox has a bug or limitation I need to work around it's about as likely to be Chrome as Firefox requiring extra work.
Do a quick text search of this Hacker News thread for people talking about Firefox "feeling" slower than Chrome.
Preload is a big underlying "why" of that experience.
I've read many comments where people mention performance in ways suggesting they have issues with UI performance, launch time, extensions, etc. Optimizing cold-cache performance the first time someone visits a site won't help with that at all.
But what do we gain with WebAssembly? Faster download+compilation times? How much faster?
TL;DR: much faster parsing than asm.js (10x to 20x faster), parsing should also use much less memory, 10% to 20% smaller downloads (when comparing compressed sizes; uncompressed WASM is several times smaller than asm.js), and 64-bit integers (these have to be emulated in asm.js).
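On the 64-bit integer point, a trivial made-up example: asm.js only has 32-bit integers and doubles, so the multiplication below has to be lowered to several 32-bit operations, while WebAssembly has a native i64 type and emits a single i64.mul:

    #include <stdint.h>
    #include <inttypes.h>
    #include <stdio.h>

    int main(void) {
        uint64_t a = 0x123456789ULL;      /* arbitrary values */
        uint64_t b = 0x1000003ULL;
        printf("%" PRIu64 "\n", a * b);   /* one i64.mul in wasm, emulated in asm.js */
        return 0;
    }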
much faster parsing
3D gaming comes to mind: most games have a tight budget of 16 ms per frame, and at those speeds 0.1 ms is a real chunk of that. If I am going to download a new module and then try to load it, the game shouldn't have to hiccup for that.
And hey, even if Photoshop is 1000 times bigger than jQuery, that would still compile in one tenth of a second.
EDIT: Also, if you think 10%-20% smaller downloads aren't worth it, you're clearly not working at a CDN.
For asm.js/webassembly we're talking about source code sizes in the multiple megabytes range after gzip compression.
There's a talk by Alex Danilo from Google IO 17 explaining it in detail.
Here, I'll link you directly to the part where he starts explaining the differences on a technical level:
Another part is that I suspect it makes maintaining a good JS compiler easier, because all of that asm.js code can be removed. I'm sure the browser vendors are happy about that.
Without WebAssembly, browser vendors have to strike a balance between optimizing the JS as much as possible (for increased performance), and running the script as soon as possible (so the user isn't waiting too long for execution to begin).
edit: Demo of this in-browser video editor charts a FPS difference between the JS and WebAssembly implementations.
(edit: replied to the wrong comment, apologies)
You would be wrong then. Quoting Unity https://blogs.unity3d.com/2015/06/18/webgl-webassembly-and-f...
So WebAssembly is 1.4x smaller, after gzip.
Go or Rust for example. Also, I watched a talk from DConf and saw that D is adding memory safety as well. You need to mark any part of code doing pointer arithmetic as "system code" or something like that.
I think it's a shame one couldn't have happened before the other. I wish something like D or Go or Nim or something else would have won. Then there would be this effort to get that language fast on the browser.
Is anything I'm saying making sense? I don't know enough about WebAsm, is it really tied tightly to C or could Go or Rust or some future version of statically typed Python become a first-class citizen?
WebAssembly offers a compelling "alternative": instead of (or in addition to, if you desire) writing in memory-safe languages, with all the cost that incurs, you can write in unsafe languages and the consequence of memory errors is limited by the sandbox.
This approaches the problem from a different direction. On one side we have e.g. Go and Rust: Go with garbage collection and its overhead, or Rust with zero-(runtime)-cost abstractions which push the burden onto the programmer at development time. C compiled to WebAssembly is low-overhead but safer than native code, giving the benefits of both worlds.
I would have never expected it, but now believe C is the language of the future for the web. Built on decades of history with an unbeatably large existing codebase, extensive analysis and tooling support, standardization, raw power without the shackles of safety, yet confined to limit damage by the browser sandbox. True cross-platform compatibility with powerful HTML5 web APIs.
My experience so far (about a month) developing a C application compiled to WebAssembly/asm.js using emscripten has been surprisingly smooth. I can compile and test natively, including enabling the clang static analyzer or -fsanitize=address and -fsanitize=undefined to find bugs endemic to C, fix them, and then deploy and run on the web. For the most part I can code directly to OpenGL and GLFW, which emscripten bridges to WebGL and other browser APIs seamlessly. I had to contribute a handful of fixes to emscripten, as well as an implementation of glfwJoystick to the HTML5 Gamepad API (also working on file drop and a monitor API), but this was straightforward and easier than expected, and emscripten happily accepted the patches. There are Rust (https://github.com/thinkofname/steven) and Go (https://github.com/thinkofname/steven-go) applications in this problem space, but porting a similar application written in plain C (https://github.com/fogleman/Craft) to emscripten was nearly trivial (if there is any interest: https://github.com/satoshinm/NetCraft). After about a week I was able to consider the web-based port finished, and then focus on developing new features, for both web and native.
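For anyone curious, here is a minimal sketch of the dual native/web main-loop pattern (emscripten_set_main_loop and the GLFW calls are real APIs; the rendering is a placeholder and not taken from the project above):

    #include <GLFW/glfw3.h>
    #ifdef __EMSCRIPTEN__
    #include <emscripten.h>
    #endif

    static GLFWwindow *window;

    static void frame(void) {
        glClear(GL_COLOR_BUFFER_BIT);   /* placeholder: draw the scene here */
        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    int main(void) {
        if (!glfwInit()) return 1;
        window = glfwCreateWindow(640, 480, "demo", NULL, NULL);
        glfwMakeContextCurrent(window);
    #ifdef __EMSCRIPTEN__
        /* the browser drives the loop; fps = 0 means use requestAnimationFrame */
        emscripten_set_main_loop(frame, 0, 1);
    #else
        while (!glfwWindowShouldClose(window))
            frame();
    #endif
        glfwTerminate();
        return 0;
    }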
Is C compiled to WebAssembly a panacea? Not by a long shot, there are many (perhaps most) scenarios where memory-safe languages such as Rust would be preferred. But for games and other programs where performance is more critical over correctness and safety, WASM is a godsend.
We have yet to fathom how far reaching WASM will be. Did I say WASM enough times? I end with only this. WASM.
I don't think so. The only languages that can currently target WASM are languages that don't need GC. It'll take a while before the tooling and functionality is mature enough to support the most commonly used languages (aside from C/C++).
Looks like the "compressed bytecode that runs really fast" part is now a reality. I'm hoping they'll work on making deliverable packages that are convenient for programmers and users.
Anyone know if there is anything like this in the works?
Also, wasn't asm.js just a subset of JS? This line confuses me:
> asm.js and PNaCl represented quite different visions for how C/C++ code should be supported on the Web
Yes you could compile C to asm.js, but it would still be compiling it to JS at the end of the day. WebAssembly is completely different in that regard.
WebAssembly (at least to start with) is mostly intended as a more compact binary encoding of asm.js. It still mostly executes the same way and offers the same APIs as asm.js.
But you can use the WebAssembly Explorer: https://mbebenita.github.io/WasmExplorer/ There is also a nice video about using the WebAssembly Explorer, which even builds a simple WebAssembly module: https://www.youtube.com/watch?v=3XrGjSnPHGY&t=310s
To run it you need at least Node 8.0.0 and the `--expose-wasm` flag.
Sorry for using modern JS, you will need to use babel-node to run it. Or you can port it to old fashioned JS.
(It is pretty amazing you can compile C code to something that actually runs on multiple machines, it's very nice.)
Can this be used as alternative to native modules in Node.js?
It is not impossible to make pure JS run this fast, but it's not going to happen without herculean effort put into JS runtimes. It is just easier to do it this way, and we get to leave JS, which is a benefit to everyone except people who actually like JS.
The other thing is it's not just about performance, but also about running existing and future C/C++/Rust etc code on the Web.
I know these are coming to WebAssembly, but who knows how long it'll take.
Chromium is refocusing on WebAssembly
I think looking at Quantum (https://wiki.mozilla.org/Quantum) and Photon (https://www.ghacks.net/2017/03/31/firefox-photon-new-design-...) this becomes even more clear.
That said, the article is very informative, and well-balanced. It was really good of Google and the other browsers to join the wasm bandwagon. And yes, although as the author himself points out, "proclaiming a "winner" is unimportant or even counterproductive", Mozilla does deserve a lot of credit here.
That's the domain of reddit.
Also known as "culture". It's part of what keeps communities alive, and I don't think it's bad in and of itself.
More importantly, the posts weren't written for HN specifically, but in the context of discussing the open web and WebASM. Within that community this reference will be pretty obvious.
I do, that's the second part of my comment :)
That said, I actually read the other headline, but just forgot. So much for my memory...
The general message being: "Mozilla are being very diplomatic and restrained, whereas many in their current position would be outright celebratory"
When the open web wins, Mozilla wins.
Servo, which I think is the most important software project in the world, is where it will start to change. That's when those of us who may not be directly contributing code into Servo need to come out and do our thing. I still fondly remember the NYT ad and the crop circle. We should do it all over again.
Thousands of developers from HN make websites. If they switch to Firefox, at the very least, the websites they make will support Firefox. This won't necessarily make Mozilla commercially viable, but if people are really that concerned about it, they can donate to the Mozilla Foundation.
But even taken on business terms, you're sweeping a lot under the rug. As developers and entrepreneurs, we've benefited hugely from the web being an open, competitively specified platform. The more one large company can control the platform, the more it will get tilted toward that company and away from the rest of us.
That may not be bad for any given business next week; these things take time. But for anybody building a serious business, you're going to have to worry about the long-term, large-scale stuff. Google's been going 20+ years; Microsoft and Apple, 40+; IBM, 100+. They didn't get there by only thinking about the next quarter, and you won't either.
but it's just browser preference, so the whole "moral" thing factors in less than whatever logo is printed on the pen I take from the junk drawer. I just want a pen that works.
I'll note that it's a different bad argument, one about consumer choice, than the one I was addressing, which was about business choices. But consumer choices too always have implications. That's why, e.g., boycotts are a thing: small decisions add up.
Everything is a "moral choice" when the person demanding the choice feels strongly about it, but that typically means you just lack perspective.
At the end of the day we're talking about browsers and websites, and while people may not LIKE it, when a business writes software it's a business decision as to whether or not they'll target all browsers or a subset.
In that way there was a quiet revolution toward cross browser support.
The actual argument against (which others are making and which I'm sympathetic to) is that one shouldn't optimize only for direct bottom-line business interests, that businesses and people have a social responsibility, etc etc.
But that's entirely different from what you're talking about.
It's a business decision.
That's it. I didn't say we should optimize for direct bottom-line business interests. I said IT IS A BUSINESS DECISION.
It is not the decision of the developers unless the BUSINESS GIVES THEM THE ABILITY TO CHOOSE.
And even in THAT, it's a business decision.
That's all I said. The business that pays for the labor and chooses the direction they go in.
This idea that a business targeting a specific browser is some horrible social problem is silly. If I'm making a product that's meant to sit in a kiosk running Chrome OS, I'm sure as shit not going to pay for FF and Edge support. If I get it by accident, fine, but if something breaks in FF I'm not putting any effort into fixing it.
The price for me doing tech support for free on your laptop is the default browser gets switched to Firefox and Chrome gets uninstalled :-)
I use both Firefox and Chrome pretty regularly, but I'm under no illusions: the quality of Firefox is quite a bit lower in multiple very concrete ways for my day-to-day usage (presumably because Mozilla has fewer resources than Google).
In the past, I have switched family members to a _better_ browser, but I'm talking about IE6 to Firefox, the usability gap between which was 1) in favor of the switched-to browser and 2) waaaayyy bigger than Firefox v Chrome in 2017. Even then, if I had to make that same decision again today, I would probably first convince the person whose computer I was modifying. Especially for non-technical users, having things suddenly change out from under you can be really jarring in an environment that's already pretty confusing.
Stylo (Servo's style subsystem) has already landed in Firefox behind a preference flag (not everything is wired at this point), and it also improves perf.
So, yes Servo is of paramount importance for Mozilla's future, because it does make a difference to end users.
Another advantage of Rust is that it allows devs to avoid a whole range of bugs, making it easier for them to iterate and ship updates without introducing new tricky bugs (race conditions can be hard to debug).
While dynamic languages may be nice when you want to explore a problem space, a stronger/more static type system is a benefit when upgrading large, mature code bases...
Stylo is still a compile-time option, but will soon be built by default and controlled by an about:config flag. You can watch the progress to build by default in this Firefox bug:
Have you gotten amazing results on your machine? I ask because I think Servo, Rust, WebRender are awesome and I'm rooting for them, but the performance has not been great when I try Servo or WebRender in Firefox on multiple machines. Maybe it's just the machines I've tried on though.
Browser benchmarks should include how long it takes to watch a 30-second YouTube video from application start to finish, or how much 3rd-party feature/bloat/mal/adware a browser downloads when connecting to $major_site.
As the lead dev of VLC said in a recent interview, they've been offered huge amounts of money to include Google Chrome in their installer, and saying no was the hardest decision he's ever made.
As long as Google has fraudulent ads for Chrome "your browser is outdated, update now to Google Chrome" on their websites, as long as Google intentionally makes the experience worse for Firefox (see the youtube redesign), as long as Google pays developers to ship Chrome as malware with every single installer, as long as Google forces OEMs to install Chrome with Android, Chrome will rule the market.
The only solution now is the EU.
MS was a much bigger impenetrable monopoly and the Web was won back. It can be done again. Having a great product and grassroots evangelism certainly help.
(Not that I think Chrome is THE ENEMY. It's constantly evolving, multiplatform and open source. IE was none of that. But I agree Google's practices you described are despicable. Huge kudos to VLC for doing the right thing)
With IE, Microsoft was influencing what the web was viewed with via its control of the client - Windows.
Google is influencing what the web is viewed with by simply being such a key part of the web itself, and using its weight from that direction instead.
(One could argue that they have Android for the client, but as a percentage of web users it's still far from what Windows had in the IE6 days.)
Firefox won back the web by having a great product, grassroots evangelism, and a Microsoft who badly neglected their competing product for many years. They left an opening.
Chrome might be losing its lustre but it's certainly not being treated the same way. I think Firefox's new battle for market share might be harder than it was vs IE, simply because Google is still so active on this front. In response, the only real new thing in our arsenal is hindsight, which I guess is what the VLC example above is a result of.
Microsoft was trying (and succeeding) to keep the web from becoming a preferable API to Win32, so as to maintain a high barrier to entry into the OS business. Which is why IE was squarely against standards.
The Web won because a) its introduction was a one-time technological change whose social impact was on the order of the printing press, and b) Microsoft had gotten fat and lazy on their monopoly revenues from Windows and Office.
We can't get cocky here. It is perfectly plausible that HTML will still render 500 years from now. And I'm dead certain that it will still dominate 20 years from now.
Growing up in a major technical revolution, it's easy for us to assume that the future will have a lot of technical revolutions that will keep knocking monopolists, rentiers, and authoritarians off their perches. And if that happens, great. But we should really be planning for the opposite case.
Internet Exploder was constantly evolving and multi-platform.
Did you forget IE ran on PPC Macs, X86 Macs, Windows, and CP/M?
IE updated... just slower than Netscape.
For a long time being a 90s Mac user sucked as different websites required you to use IE on a Windows PC.
On the Mac it went as far as version 5 only. And never made it to x86 (unless you count Rosetta)
So, through the dark ages, IE was very much frozen and Windows-only, for all intents and purposes.
I'd love to use Brave on my desktop too, but their lack of plugin/extension/add on support cripples it a little. There's a couple I just can't live without. Using Iridium on the desktop instead.
I believe Chrome has to be installed, but it clearly doesn't have to be the default. Just see all the Samsung phones with the default Samsung Internet Browser as evidence.
As long as Chrome is preinstalled, it will win.
Where do you see this? I have Firefox open and don't see this on Search, Drive, Google Music, etc.
Transcribing the subtitles:
> "To be honest we’ve been offered some insane amounts of money to do bad stuff around VLC, like shipping tool bars at the same time of the installer of VLC or or installing other software like Google Chrome and so on. And when you see the numbers they propose to you, you’re just like: How the fuck am I going to say no to that?"
> "The thing is, it’s not only my project so I’m not allowed to do that. [It’s the] legacy of other people. That wouldn’t be moral."
I'm asking genuinely here. Is Servo expected to have much better performance?
Demonstrated a successful code execution attack against Safari to gain root privileges using a use-after-free vulnerability in Safari and an out-of-bounds vulnerability in Mac OS X.
Demonstrated a successful code execution attack against Microsoft Edge in the SYSTEM context using an uninitialized stack variable vulnerability in Microsoft Edge.
Demonstrated a successful code execution vulnerability against Microsoft Edge in the SYSTEM context using an out-of-bounds vulnerability in Microsoft Edge and a buffer overflow vulnerability in the Kernel.
etc. Highlights mine. All of these are prevented in safe Rust.
My general belief is that general-audience users don't care at all about bugs like those. Which is why we have so very many of them, and have for decades.
It's the same benefit you get from strong, static types in a large project vs one with dynamic/weak types, but for another category of bugs. In large projects, it makes a difference.
0. Hopefully you'll forget the meme ;-)
Software stabilizes over time; you find problems and then you fix them. How often do these classes of bugs cause catastrophic issues?
That's really the important question.
Project Quantum will use only pieces of Servo coupled along with Gecko. Maybe Mozilla will fully replace all of the old single threaded code at some point in the future but I imagine that's a ways off.
Given the gradual progress of these changes it's likely that any successes in performance it brings will be copied in the other engines before they are too far ahead.
I use Firefox as my main browser but my girlfriend uses Chrome on her computers, so I get to use it from time to time. I don't notice any major differences: the extensions I care about are available on both browsers, the speed is not noticeably different, etc. On top of that, Firefox predates Chrome, so it's not like the IE situation where people didn't switch because they didn't know better. So what happened exactly? Is there some Chrome killer feature that I just happen not to use myself?
Because every time people go to google.com they see a popup that google works better in Google Chrome.
It was dead slow, actually, unless you were running a modern (i.e. very fast) multicore PC and with a very small number of tabs. For example, when trying Chrome out soon after release, it managed to grind my PC to a halt because I dared open something like 5 tabs (whereas Opera was happily running double to triple digits). Back in 2008 multicore CPUs weren't as widespread as they are today, so for the most common cases Chrome was just slow, context switching PCs into the ground.
> I think that initial period is why so many regular people use chrome.
Regular people don't simply install new browsers; it's the people familiar with computers (like the family geek, or the guy maintaining PCs for a living) who push them onto regular people. Anecdotally it went something like this: technically inclined people were supporting Firefox (because it wasn't IE and because of A LOT of marketing) despite it being a crappy browser and there being better alternatives. Now, Firefox was not slow per se, but it definitely FELT slow, so when Chrome was launched the same people who had popularized Firefox started promoting the new shiny trinket. Everybody kept saying "it's so fast!" - well, it certainly FELT faster than Firefox, at least when it came to the UI, and that was enough to switch.
> I think the other thing that pushed it this far is just the fact that it's new.
I think so too. Shiny new things have the side effect of attracting the enthusiastic bandwagon jumping types, and enthusiasm can be contagious.
There's no useful distinction between "Felt slow" and "Is slow" in getting users to adopt a product.
If your UI feels clunky because you open a window and paint it white before filling in the UI itself, stop painting it white.
I totally dispute that. I had a very modest PC at the time and I remember vividly using Chrome for the first time, noticing how much better it performed compared to firefox, especially if you had many tabs open.
I'm back to firefox now.
Given that I used Opera's performance as a basis for my statement I guess your comment, instead of being a glowing praise of Chrome, simply reflects very poorly on Firefox.
At the end of the day, Chrome ended up using too many system resources to be a viable option for me at the time. People accused me on occasion of being an Opera fanboy, but, objectively, it was hard to justify why Chrome would need more resources than Opera while delivering significantly less features.
But, as a quick test, I closed Chrome, and it was up again instantly (say 500ms). I did the same with Firefox (after a generous warmup / caching session), and got 4 to 6 seconds each time. Clicking links and page loading feels similar; on Chrome I don't notice it, on Firefox I always do. Am I the only one that feels this way?
Because Chrome, by default, enables prefetching (it loads links before you decide to click on them). Firefox will never do that due to obvious privacy concerns.
It's the usual "principle vs convenience" thing, where most people choose the latter.
> Link prefetching is when a webpage hints to the browser that certain pages are likely to be visited, so the browser downloads them immediately so they can be displayed immediately when the user requests it. This preference controls whether link prefetching is enabled.
> Possible values and their effects
> Enable link prefetching. (Default)
You can check the value in your Firefox with about:config and searching for network.prefetch-next
My one is "false" and it's shown in bold face, meaning that I changed it to that value. Privacy concerns and also legal concerns: what if a site links another site that the legislation of your country (or the country you're travelling to) doesn't allow you to access? At least I get some hints of where I'm heading to if I'm loading pages myself.
Nope. I really want to like Firefox but its performance is just so much worse across the board than Chrome.
Another example is video playback: Firefox, when viewing video, heats up my laptop to the point where the fans kick in at full blast. Chrome stays nice and cool on the same material. The difference in battery life is noticeable too.
Sorry Firefox, you need to do better.
Google aggressively pushed Chrome on web users.
Banner messages claiming the user's browser was out of date (it wasn't, they just weren't using Chrome), or that this website works better on Chrome.
Redesigning their web services to be coincidentally worse on Firefox.
The numerous and excessive ways it was bundled with various other application installers. Most users don't customise an application install; they just stick with the defaults, and the result was that they ended up with Chrome as the default browser.
I got into chrome as a teenager, though, when my dad switched from firefox to chrome due to Chrome being apparently faster than firefox at its debut.
I have given Firefox plenty of chances. It's just too slow: webpages load slower, and when they are loaded interactions feel awful (low FPS on large webapps).
I like that I can install ublock origin on mobile firefox. Chrome does not support this option.
That said, Mozilla is still awesome and FF will most likely remain my second go-to browser.
What has always frustrated me most about Chrome is that certain bugs can take an eternity to fix despite there being many reports on an issue.
I'm glad to see it killed off, and have been expecting it for a few years now. JS/asm and (hopefully soon) WebAssembly have supplanted many of its features and benefits.
Still, this doesn't strike me as anti-open web. Google offered a solution when one was needed - it didn't gain traction, so they eventually retired it.
I'd also go so far as to say that Google's AMP project, as implemented, displays a distinct step away from an open web.
As a developer I'm annoyed every time I enter "com.scala.List" in the address bar and FF does not use Google to search but thinks this is a URL. No, "List" is not a TLD, and no, that website does not exist.
Eg. if I type "cheese", it shows "Did you mean to go to http://cheese/"? If I click that link I get TalkTalk's "Error Replacement Service" full of ads (or at least I did, till I switched to Google DNS because TalkTalk's "opt-out" system has been conveniently broken for years)!
The new Google "did you mean to go to ?" nonsense is something else to add to that link!
My ISP (TalkTalk) claims to have an opt-out page but the forums suggest it's been broken for years, and today it is a 404. I have an open issue with the CEOs office to opt me out manually but they've been pretty useless so far.
I did see someone from Google ask if it'd be useful if after the first time, when Google knows it's a valid domain, it should just go there directly (even without the slash). Everyone said yes, but it doesn't seem like it was ever implemented!
Mozilla focuses on thousands of other things in addition to making FF a great browser.
See project Quantum, WebExtensions (some would argue that it makes the browser worse, but the goal is undeniably to make it better), and Photon.
The point of these commands (invented by Opera, by the way) is to give you a choice of which search engine to use while not sacrificing performance. Google's approach is different: remove choices that might confuse or distract you. Choose what you like.
I've only done a little front-end development but I tried both browsers debugging tools and other than layout, I couldn't see any differences.
They both seem to offer the same things.
At my house (ubuntu gnome), windows firefox on wine is the only way I have found to access some webTV based on flash.
Users will choose the fastest browser that works, in general.
To my taste, Chrome juuuust edges out at a cursory glance, but barely (though at this point, it's a little sticky for me because it has my Google account credentials, my bookmarks backed up to the cloud, etc., etc.). But it's definitely looking better than it did when last I tried that comparison.
Second, the zoom functionality in Firefox is broken. On a 4k display, pages just get jacked up after doing Ctrl-+ a few too many times. Chrome's zoom is far superior. People have told me, "just change the default pixels per inch" nonsense. No, Firefox's zoom is just broken.
Lastly, after the SJW witch hunt and ousting of Brendan Eich, I don't care what happens to Mozilla.