
Bravo! As a Firefox user, the trend of web developers only testing in Chrome (and ending up using features that only work in Chrome) is getting really annoying. Most of the time, the fixes are really simple, like including more vendor prefixes than just -webkit.
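As a purely illustrative sketch (the property names are just examples, not anyone's actual fix), covering more than one prefix is often just a matter of probing for whichever variant the browser understands:

    // Probe for the standard property first, then common vendor-prefixed
    // variants, instead of assuming -webkit- is the only one that exists.
    function supportedProperty(candidates: string[]): string | undefined {
      const probe = document.createElement('div').style;
      return candidates.find(name => name in probe);
    }

    // e.g. cover Gecko and old IE as well as Blink/WebKit:
    const transformProp = supportedProperty([
      'transform', 'webkitTransform', 'MozTransform', 'msTransform',
    ]);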

Very happy to see Mozilla raising awareness around this issue.

Personally, I have Firefox and Chrome open side by side for development. If those two work, Safari and Edge generally will too. I'll add IE specific rules/hacks afterward.




If your app only works in Webkit / Blink, you've written a Chrome App and not a Web App.

Your CUSTOMERS do not want to see "To use this site / app, stop using the web browser you've chosen to use (or your company has chosen for you) and go install this instead!"


Here's a thought exercise: how will this trend be exacerbated by the Electron/Chromium stack? Should Mozilla develop a competing engine for "native" apps?


Mozilla is developing a competing engine for native apps. The word "embedded" is part of Servo's [1] elevator pitch:

Servo is a modern, high-performance browser engine designed for both application and embedded use.

Sponsored by Mozilla and written in the new systems programming language Rust, the Servo project aims to achieve better parallelism, security, modularity, and performance.

[1] https://servo.org/

EDIT: seeing Etzos's answer, I see better what you meant (de facto "standardizing" around Electron). Positron looks like a better answer, then. But long term, there's clear interest from Servo developers; see http://blog.servo.org/2015/05/01/forward/ and Ctrl+F for "Chromium Embedded Framework".


Great. I'd prefer to see Servo and browser.html replace the heavy Electron stack.


How light is Servo + browser.html? You seem to be implying it's much lighter than Electron. Any figures? I haven't looked at Servo / browser.html / Rust at all, apart from seeing a few news articles pop up on HN over the last few months.


See this demo comparing Chrome, Firefox and Servo's WebRender: https://www.youtube.com/watch?v=u0hYIRQRiws WebRender uses the GPU (on top of Servo's CPU parallelism), making it much faster than classic renderers; think of it as a web browser built on video game rendering technology.

There's a talk about this at https://air.mozilla.org/bay-area-rust-meetup-february-2016/ (HN discussion: https://news.ycombinator.com/item?id=11175258)

browser.html is just the current "skin" for Servo.


I think the grandparent was referring to things like startup speed and memory consumption when they said "lighter", not rendering performance.


Any tests done on integrated graphics such as those in normal laptops?


In https://air.mozilla.org/bay-area-rust-meetup-february-2016/ ,

- At 03:00: pcwalton explains how this experiment leans on the GPU-ization of our Intel CPUs since Haswell.

- At 14:50: slide says "WebRender supports OpenGL ES 2.1 and OpenGL 3.x"

- "Benchmarks" at 26:00 running on his macbook, which may fit what you are looking for

TL;DR, yes this early work is for "integrated graphics such as those in normal laptops". Or try it yourself on your laptop with a nightly build: http://blog.servo.org/2016/06/30/servo-nightlies/


It works awesomely on both integrated and discrete graphics for me. IIRC Patrick and Glenn, the people who wrote webrender, by default use integrated graphics anyway (at least one of them does).

Software rendering makes it choke sometimes (other times it works surprisingly smoothly, but it depends on the load), but that is to be expected :)


Hard to say since it's still in development. Maybe someone with more knowledge can chime in. But its focus is on parallelism and performance, so I have high hopes. Slack (a flagship Electron app) is currently using 850MB of RAM, idle in the background. Firefox with 50 tabs open is using the same amount.


Holy crap! 850MB for a chat client!? That just made the argument for truly native apps right there.


Servo is super long term; it is more of a testing ground than a product. It is possible that it will become a product in the future, and embedding is a really lucrative space for us to try, but no plans for a product right now.

Positron is indeed the thing you are looking for.


They are actually already doing this[1].

[1] https://github.com/mozilla/positron


Don't forget Cordova, where most views are also Chrome-based. (The exception being Windows, by default, and many don't seem to test on Windows.)

Obviously Mozilla tried and failed with Firefox OS to play in that market at the mobile level.

It might be nice to see an Electron- and/or Cordova-compatible view engine from Mozilla, but I'm not sure how much adoption or testing it would see unless it became the default (and even then you'd probably have a bunch of web developers revert to Chrome views simply because it's what they're comfortable with).


Mozilla actually had one for years, called XULRunner.


It's not good; people use Electron instead of XULRunner because XULRunner was barely maintained. It would randomly stop working on major Firefox releases, and would stay broken for weeks, because nobody noticed or cared that XULRunner didn't work, as long as core Firefox kept working.

Even when it did work, working with XPCOM sucked.

Positron https://github.com/mozilla/positron is the new XULRunner. WIP, doesn't actually work yet. :-(


The trouble with that is: what do you do with older versions of the browsers? Edge is okay feature-wise, but then you have people who complain that your website looks funny in their browser because they still run IE 8 or 10.


I didn't realize Chrome was becoming the default testing environment. I find it weird, given its terrible font rendering. I tried using Chrome but just couldn't. Smaller fonts render weak, pale, and bleak, as if drawn with a dull pencil. I found it rather difficult to read large bodies of text in Chrome, which was giving me quite a bit of eye strain. Not sure why anyone would want to test their pages in what is, in my opinion, a broken rendering engine.

I'm aware there is a setting which has to do with subpixel rendering or whatever it's called; I tried switching it on and off with no perceivable visual change. So I gave up on Chrome.

IE also has had broken font rendering since they moved to subpixel rendering several years ago, but at least it draws fonts in a strong and dark fashion making them more readable than in Chrome.

Personally, I design for Firefox which I consider the gold standard of web development. Then, at some intervals I check if things are okay in IE and Chrome and usually they are fine. Chrome was useful once in helping me spot some sort of a race condition, its developer tools also conveniently allow you to quickly bypass caching for testing purposes, but other than that I found it useless and unusable.


I mainly use Chrome, with some testing in Firefox. It used to be the other way around, but Firefox's single-threaded nature just makes it very painful when you have multiple windows with multiple tabs open. Having it use only 12% of the available CPU in the system when I'm doing lots of work is not acceptable, especially when I need the browser to be responsive.

Yes, I know about e10s and Servo, but they aren't in regular Firefox right now, and especially weren't several years ago when I was forced to change from Firefox to Chrome. I'd love to go back ...


You're probably seeing this bug https://bugs.chromium.org/p/chromium/issues/detail?id=146407... and it is indeed a sad story, especially because Chrome 21 worked just fine. Then they rewrote some gamma correction code in the Skia that shipped with Chrome 22.


Thanks for the link. Based on some of the screenshots posted over there it indeed looks like what I've been observing on my machine (version 47-something).

Not a problem really. Never used Chrome before. Only installed it out of curiosity and also for testing purposes. Within the first day of using it on my development machine it became clear I wouldn't be using this piece of software in the future either.


>I didn't realize Chrome was becoming the default testing environment. I find it weird, given its terrible font rendering. I tried using Chrome but just couldn't. Smaller size fonts render weak, pale, bleak, like drawn with a dull pencil.

I find the opposite: I never could stand Firefox's font rendering -- and I used the thing for years, back in the early '00s.


I think it is in certain circles. For Google, obviously, in particular. Many of their new sites launch completely incompatible with Firefox and Edge. Some of the frameworks they push, like Polymer and Angular, have been repeat offenders in the past, which means web apps built with them end up broken as well.


Edge actually pretends it's Chrome in its UA


To be clear, it still specifies "Edge" in its UA string and can be specifically detected. And just about every browser now mentions Mozilla, Chrome, and WebKit in their UAs.
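For illustration only (token formats vary by version), a check along these lines can still tell them apart:

    // Legacy Edge UAs mention Mozilla, AppleWebKit, Chrome and Safari, but
    // also append their own "Edge/<version>" token, which can be matched.
    const ua = navigator.userAgent;
    const isEdge = /\bEdge\/\d+/.test(ua);
    const isChrome = /\bChrome\/\d+/.test(ua) && !isEdge;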


Right on. The tools we have at our fingertips as web developers are so accessible and comprehensive that there is little excuse for not testing an app across a range of user interfaces/platforms. I shiver to think how much deep DIY happened at every layer of development and deployment in previous software generations.


The problem is caused by the web spec being way too complicated.

Instead W3C should have chosen simpler primitives from which developers can build complicated (formatting) rules themselves.

Simpler primitives are also better from a security perspective.

By designing the web for average users instead of for developers, W3C has shot itself in the foot.


> The web spec

There are hundreds of web technologies, each with their own spec. None of which are actually that complicated if you read them.

> Simpler primitives ... formatting rules ... better from a security perspective.

So you mean CSS? What does that have to do with security?

> Designing the web for average users

Not sure what you're trying to say.


>There are hundreds of web technologies, each with their own spec. None of which are actually that complicated if you read them

The first part is true. There are indeed hundreds of web technologies. The second is absolutely false. A lot of them are (and have historically been) horribly complicated to implement, with lots of edge cases and strange interactions. In fact prominent web standards people have talked about such cases many times.

>So you mean CSS? What does that have to do with security?

No, he means "simpler primitives" across the board. And even CSS has to do with security (e.g. loading third party fonts, etc).

>> Designing the web for average users

> Not sure what you're trying to say.

He's obviously trying to say that the W3C et al piled on features to please end users and satisfy end-user needs, without giving much care to making them consistent and coherent for web developers.


Complicated to implement, yes. Complicated to read and understand why the browser is doing x? No.

I'm not sure if end users are the people W3C is trying to satisfy. Browser vendors do enough of that. If anything, they're trying to satisfy media corporations that want DRM standards, or ad corporations that want pervasive tracking.


>Complicated to implement, yes. Complicated to read and understand why the browser is doing x? No.

CSS, for one, is notorious for being complicated to understand, with tons of complex cases, especially in the layout department, where whole cottage industries of "tips" and "workarounds" for the simplest of things -- and I'm not talking about browser incompatibilities -- thrived for years (until flexbox and the like, at least).


>There are hundreds of web technologies, each with their own spec. None of which are actually that complicated if you read them.

I agree with your overall point, but there is a reason sites like MDN exist. It's because some of the HTML/CSS/JS specs are crazy complicated, contain years and years of edge cases and bugs that have become standard, and are written in standards-ese (which is easy enough to read once you know it, but can be a bit of a learning curve for someone who just wants to know the order of function parameters).


MDN will be a huge legacy for Mozilla. Of all of their projects, I think MDN will have the biggest impact on the web. It's absolutely needed for someone who just wants to know the order of function parameters.

For a more nuanced understanding of browser behavior, you have to read the spec. Worked on a high performance network app, and I think I know the XHR spec by heart now.


> Worked on a high performance network app, and I think I know the XHR spec by heart now.

I am curious, can you elaborate on this? :) What part of the performance related work involved peeking at the XHR spec so much?


For mapping applications, tiles are loaded to display data. These requests have to be handled carefully, especially when there is a lot of data, because every map movement fires tens of requests at once. Their lifecycles have to be managed so they can be cancelled as soon as their result isn't needed (the user panned away, or changed zoom level).
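A minimal sketch of that pattern (made-up names, not the actual app's code): keep a handle on every in-flight XHR so stale tile requests can be aborted as soon as the viewport changes:

    // Track in-flight tile requests so they can be aborted when stale.
    class TileLoader {
      private inflight = new Map<string, XMLHttpRequest>();

      load(url: string, onTile: (data: ArrayBuffer) => void): void {
        const xhr = new XMLHttpRequest();
        xhr.open('GET', url);
        xhr.responseType = 'arraybuffer';
        xhr.onload = () => {
          this.inflight.delete(url);
          if (xhr.status === 200) onTile(xhr.response);
        };
        // Cancelled or failed tiles must not leak map entries either.
        xhr.onerror = xhr.onabort = () => { this.inflight.delete(url); };
        this.inflight.set(url, xhr);
        xhr.send();
      }

      // Call when the user pans away or changes zoom level: abort every
      // request that is no longer part of the visible tile set.
      cancelExcept(visibleUrls: Set<string>): void {
        for (const [url, xhr] of this.inflight) {
          if (!visibleUrls.has(url)) xhr.abort();
        }
      }
    }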


I couldn't agree more, and I also think it influenced the standards bodies. New ES[year] specs include lots of very MDN-esque explanations, examples, and "polyfills" for the new features.
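For example, the "polyfill" sections in those specs read a lot like this simplified sketch (the real spec text also handles NaN and negative fromIndex):

    // Only define the method if the host environment is missing it.
    if (!Array.prototype.includes) {
      Array.prototype.includes = function (searchElement: any, fromIndex?: number): boolean {
        const len = this.length;
        for (let i = Math.max(fromIndex || 0, 0); i < len; i++) {
          if (this[i] === searchElement) return true;
        }
        return false;
      };
    }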



