Hacker News
Mozilla Wants To Split Off Its Thunderbird Email/Chat Client (techcrunch.com)
444 points by dest on Dec 1, 2015 | 433 comments



I think this is the third time I've heard that Thunderbird is, or is going to be, unmaintained and that I should avoid it, but I've yet to see a good alternative. Claws was crashy and didn't DPI-scale properly on Windows; OS X Mail had strange behavior with my IMAP server, and anyway I want a consistent UI across my Windows and Linux systems. My next option is webmail, but there's no good IMAP webmail client that can handle 8 accounts at once with thousands of messages in each. The best alternative I've seen is probably Alpine, a CLI-based client, but it was difficult to use and seemed to randomly send deleted characters from emails to recipients, which nearly caused a major problem on one occasion.

I'm just going to keep running Thunderbird until there are gaping security holes. People say Thunderbird is bad, but everything else is worse.


I think a lot of people are forgetting that there is still a community developing it; Mozilla stopped "updating" it, beyond maintenance, back in 2012. The product isn't dying, it just will no longer be Mozilla-backed. I assume that maintenance, like ongoing updates, will be passed to the community, and hopefully another company steps up to back the project.

Repo for Thunderbird: https://hg.mozilla.org/comm-central


Thanks for highlighting the repo!

I too had heard development had stopped and misconstrued that to mean all development.

Now I'm wondering if there are any issues I might be able to help fix. I still use Thunderbird so if I can help keep it alive, I'd like to.

As parent pointed out, most of the alternatives are worse.


No problem. I think everyone got out their pitchforks without reading Mozilla's announcement and looking at the repo.

There are always issues to help out on. Here's a bit more info on contributing [0]. I imagine it won't change much as the project is officially handed off to the community; it's largely led by the community anyway at this point.

[0]: https://developer.mozilla.org/en-US/docs/Introduction


Thunderbird usage continues to grow, almost reaching 10M users:

https://blog.mozilla.org/thunderbird/2015/02/thunderbird-usa...


Is that more than Firefox OS?


Yes, I think so. There might be a mention of it in the article.


I saw this on Hacker News a while ago, and it seems like a great alternative to Thunderbird. I've used it, and I really like the look and feel compared to Thunderbird. I'm pretty sure I saw an option for IMAP too. It's called N1, and it's from Nylas. https://www.nylas.com/n1


Nylas is a beautiful app and, by design, easily extensible. As a result, I think it has a better chance than other open source mail clients of providing an experience that's as polished and complete as Gmail. Doing that requires a bunch of affordances like calendar integration, YouTube previews, a warning if you forgot to add an attachment, a 10-second "undo send", and so on. Letting the community write those as add-ons seems smart.

To really make Nylas compelling over Gmail, and also to replace Thunderbird, here's what I think needs to happen:

1. Bundle Sync Engine and N1 into a single program.

2. Implement local search. Faster than IMAP search and doesn't rely on a network connection.

(As it stands, Nylas requires an internet connection to work. As long as that's the case, it's no Thunderbird replacement at all, and will have a very hard time winning over Gmail users.)

The good news is that both Sync Engine and N1 are GPLv3, so it's totally possible. The bad news is complexity:

* Sync Engine is written in Python and uses MySQL for persistence

* N1 is built with Coffeescript + React + Electron, and uses SQLite for persistence

So we need to simplify. We can't just bundle all that into an app; it would be a monstrosity. Here's how I think we could do it instead:

* Use Sync Engine as the basis for a Node library for IMAP sync

* Use the SQLite full-text feature (fts4) for search

That way you'd have a single Electron app with all persistence in SQLite.
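For the local-search piece, SQLite's full-text index makes this pretty simple. A minimal sketch in Python, just to show the shape of it (the real thing would be JavaScript in the Electron process, and the table and column names here are invented for the example):

```python
import sqlite3

# In-memory DB for illustration; the app would use its on-disk SQLite store.
conn = sqlite3.connect(":memory:")

# fts4 virtual table indexing the subject and body of each message.
conn.execute("CREATE VIRTUAL TABLE messages USING fts4(subject, body)")

conn.executemany(
    "INSERT INTO messages (subject, body) VALUES (?, ?)",
    [
        ("Quarterly report", "Numbers attached, see spreadsheet."),
        ("Lunch?", "Anyone up for tacos at noon?"),
        ("Re: Quarterly report", "Looks good, ship it."),
    ],
)

# MATCH runs against the local full-text index: no network round trip.
rows = conn.execute(
    "SELECT subject FROM messages WHERE messages MATCH ?", ("quarterly",)
).fetchall()
print([r[0] for r in rows])
```

Search stays fast even with the connection down, which is exactly the property IMAP search can't give you.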

I'm prototyping this now. Thoughts? Let me know if you have ideas or want to help!


N1 actually works offline. Obviously it won't fetch new mail, but you can read, archive, compose, and search for local messages. When you reconnect, those tasks will persist to the backend and Gmail/iCloud/Exchange.

Search is definitely an area we can make dramatic improvements, and we're working on it.


It may be open source, but it's a thin client, with your data being handled by a third party's servers. No thanks; trusting Google or FastMail with my email is a tough pill to swallow already, and trusting an intermediary as well is way too much.


You can actually host the Nylas Sync Engine yourself so if you're seriously considering it, it's still an option.


Yes, but why not just use a client that doesn't need a server-side component?


Looking through their API docs, one of the things that I noticed is that they added a transaction log ("delta" field), which IMAP doesn't have: https://www.nylas.com/docs/#deltas

Having a server communicate with IMAP (which is slow), cache the results, and export a more efficient/log-structured alternative probably makes syncing client computers much faster.
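The idea is that the client keeps a cursor and asks only for entries past it, instead of re-scanning every folder. The real Nylas deltas endpoint is an HTTP API; this is just a toy in-memory version to show the shape of cursor-based sync (the class and field names are invented, not Nylas's actual API):

```python
# Toy transaction log: each entry is (sequence_number, event dict).
class TransactionLog:
    def __init__(self):
        self.entries = []

    def append(self, event):
        self.entries.append((len(self.entries) + 1, event))

    def deltas(self, cursor):
        """Everything that happened after `cursor`: one cheap scan,
        instead of re-listing every IMAP folder."""
        return [(seq, ev) for seq, ev in self.entries if seq > cursor]

log = TransactionLog()
log.append({"op": "add", "id": "m1", "folder": "INBOX"})
log.append({"op": "add", "id": "m2", "folder": "INBOX"})

cursor = 2  # the client has already seen everything up to seq 2
log.append({"op": "delete", "id": "m1"})

# The client asks only for what changed since its cursor.
changes = log.deltas(cursor)
for seq, ev in changes:
    cursor = seq  # advance the cursor as events are consumed

print(changes)  # only the delete event
print(cursor)
```

With plain IMAP you'd have to diff folder listings to discover that delete; a log makes it a single incremental fetch.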


Interesting. Will take a look.


Nylas N1 requires a custom server (the Nylas Sync Engine) to do notifications. Why? Their FAQ is extremely vague about it. Apparently it suffers the same performance issues as plain IMAP for search. It seems this might be to support their Electron-based UI, which seems... bloated, to say the least.


Hi, I work at Nylas. Happy to jump in here :)

The sync engine handles all the compatibility for dozens of IMAP and Exchange servers, and uses a variety of heuristics for real-time notifications. It also does the heavy lifting for the huge amount of data in most people's mailboxes. The N1 app can cache your entire archive in about 10% of the disk space that downloading all the unprocessed headers and attachments would require.

Right now our search implementation just sends a proxy request to the IMAP search, so you'll get the same performance as Gmail web, Mac Mail, etc. It's not ideal. We're working on our own faster search system, but it's tough to build a huge distributed search cluster that handles tens of terabytes. (And is growing quickly!) If anyone here wants to work on that, we're also hiring. ;)

N1 is built on Electron, which under the hood is Chromium+NodeJS so that comes with a bit of weight. But the app itself is pretty manageable. You can check out the source here: https://github.com/nylas/n1


The Electron UI is all local.

I believe the sync engine translates between IMAP and a custom JSON protocol they've created. I'm not sure what the performance difference is, but looking at the API docs, it looks a lot nicer to work with than IMAP: https://www.nylas.com/docs/

It also seems to more accurately represent modern semantics for working with email. For example, it allows using OAuth for authentication, and it has separate fields for folders and labels, depending on the backing email service. (And unlike IMAP, the labels field is actually useful.) You can also download a transaction log to make sync faster, which I don't think has an analog in IMAP.

Subjectively, I've found the performance to be rather snappy in most cases.

(You can also run a local copy of the sync engine, if you don't want to trust their hosted one: https://github.com/nylas/sync-engine)


I've moved to Mutt/OfflineIMAP on most every machine I own, and it's worked remarkably well. It's a CLI app, yes, but it outshines Thunderbird (and most other GUI clients, for that matter) in a number of ways:

- FAST. Mutt lets you process thousands of messages in short order. Mutt is directly responsible for helping me dig out of a 50k-deep email hole, brought on by years of Gmail's approach (archive, never delete), in a few days of on-and-off cleanup.

- Plays well with others. Uses a bog standard Maildir format that anything sane can read (including Thunderbird, so migration should be easy), provides a very powerful hooking system which can do anything from verifying PGP signatures to checking your spelling before sending to displaying attachments.

- Sane defaults. Doesn't require a lot of in-depth customization to provide the basics, and the bells and whistles are hardly out of reach.

- You have a backup of your mail for free (by using OfflineIMAP)

- Not hard to learn (at least for the crowd here). If you can use Vim, you can be using Mutt at full speed in less than a week.

- Search capabilities that blow nearly every other project out of the water (regexes with some very cool niceties)[1]

- Secure by design. Mails are rendered as plaintext; HTML is decoded by piping messages through a program like w3m. Tracking bugs can't do their job, spammers can't see that you've looked at their stuff, and you're immune to whatever image decoding bugs crop up. (You can still view images, but it's an explicit process.) Also, built-in GPG support.

I followed Steve Losh's guide[2] on setting it up.

[1]: http://www.mutt.org/doc/manual/manual-4.html

[2]: http://stevelosh.com/blog/2012/10/the-homely-mutt/


I used to use offlineimap. Now I use isync. Much more efficient and simple.

I also recommend an indexer like notmuch or mu to gain Gmail-like search and agenda capabilities.


How well does mutt work with multiple accounts though? Last I saw it seemed more cumbersome than Alpine for that.


That I'm not 100% sure of. I know that for receiving, you'd just configure OfflineIMAP to deliver each account's messages into a different folder and switch between them in Mutt, but I don't know how it would work for distinct from-addresses, contact lists, etc.

Some quick googling[1] confirms that approach: deliver each account into its own folder, then set up a hook that loads a different configuration file when you enter another mailbox's folder.

[1]: https://www.df7cb.de/blog/2010/Using_multiple_IMAP_accounts_...
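A rough sketch of what such a multi-account muttrc might look like (all paths and account names here are invented; see the linked post for a real walkthrough):

```
# ~/.muttrc -- hypothetical two-account setup
set folder    = ~/Mail
set spoolfile = +work/INBOX

# Load per-account settings (from-address, signature, SMTP) when
# entering that account's folders.
folder-hook work/*     'source ~/.mutt/work.rc'
folder-hook personal/* 'source ~/.mutt/personal.rc'

# Quick switches: press "w" or "p" from the index.
macro index w '<change-folder>+work/INBOX<enter>'
macro index p '<change-folder>+personal/INBOX<enter>'
```

The folder hooks are what keep the right identity attached to each account when you reply.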


I stopped using mutt a while ago, but I just used separate config files for every mail account and a bash alias to launch mutt for each one.

Unless you actually need to do things like move entire e-mail threads between accounts, it works well enough.


Works fine. You can switch between IMAP accounts like switching between folders.


Is there a version of Mutt with a GUI yet?

While I like command line apps, and am a die hard vim user, there are just some applications that I really want to use a mouse for, and email is one of them. I've tried mutt a few times before and it's just never clicked with me. It's not that I can't learn how to drive it --- it's that I don't want to have to. Finding little-used commands is so much easier if there's an actual menu.


That's funny, because every time I use an MUA other than mutt I'm frustrated by how janky it feels to process email with a mouse. :)

The little-used commands can be found by pressing "?" in any mode. You get a nice list including any custom macros you've set up (I don't use many, but a few are essential). It's far simpler than clicking all over menus and dropdowns and hovering inscrutable icons...


I set up something along the lines of the Homely Mutt guide; the setup time is obscene, but the result is very quick and reliable.


For Webmail or IMAP mail, look no further than Fastmail. These guys know exactly what they are doing with IMAP, and they can easily accommodate your multiple accounts.

I've been a Fastmail user for almost a decade. I've never been happier with any email provider (or any provider, to be honest). You pay for the service, but it's worth the money and more. You are treated like a valued customer and the support, if needed, is rapid and professional. I cannot recommend Fastmail enough.


ultramancool wants a desktop email client. Many people strongly prefer to do email on the desktop outside of a Web browser.

(Fellow FastMail user here. They're really great.)

EDIT: Well, I see her/his comment about Webmail albeit as a second choice. It seems the preference is for a desktop client, but I shouldn't assume.


I'm basically okay with Webmail if it's self hosted and supports notifications on many accounts.

If I recall correctly, Fastmail cannot be self-hosted, and I really don't feel like trusting some random service with my personal and business email. Or paying them money for something I currently get for free, for that matter.


I understand, but send these guys an email. One of the founders will likely respond to you.

I used to run all my own stuff, but when my children came along, I didn't want to spend all my time chasing issues with servers.

Look the Fastmail guys up. If anyone is doing email correctly, it's Rob, Bron, and the rest of the team at Fastmail. They actually write some of the code for the Cyrus IMAP server, so if you were in doubt as to how talented these guys are, don't fear. I would now NOT trust my email to anyone else; they are that good.

Yes, I know, I'm some random guy on the Internet, but I'm an IT guy, which in itself says nothing, except that I'm nothing if not extremely picky about my own IT. These guys can fix any issues you may have and probably give you some awesome suggestions. They are approachable, something you will never get from any other company. They are based in Melbourne, Australia, but use NYI here in the US. Worth a call if you value good service and top-notch know-how from people who know email better than anyone else I'm aware of. They routinely post in the Email Discussions forum.


> They are based in Melbourne, Australia, but use NYI here in the US.

Mhm, two of the "Five Eyes" countries. Yeah, you're really selling me on this one.

I'll admit I'm being a bit ridiculously paranoid, but it's cheap paranoia for me, Thunderbird works well as a client and I spend maybe 5 minutes per month doing mail server admin stuff. I'm really not willing to switch to any hosted solution for this.

Just looking for another mail client option if/when Thunderbird goes down hill, not a mail server.


Well, for Webmail, then, I would say Tutanota or ProtonMail, both of which are good.


Have you looked at Mailpile[1]? I haven't used it, but their idea was basically "self-hosted Gmail".

https://www.mailpile.is


Yeah, definitely looks interesting, but the issue I have is again the multi-account issue. Thunderbird has very nice handling of many different email accounts.


How does it compare to, say, Gmail enterprise? Well, apart from the whole Google Apps suite.

I'm curious because we'll have to migrate to a better email provider in a month or two, and Gmail enterprise seemed to be the best fit for us (price, storage, good iOS app to leverage push notifications, etc).


Google's apps for iOS are subpar compared to their Android counterparts, or even to other iOS apps. I'm a heavy Android user, but on my new iPhone I actually prefer the native email app, because it is simple, effective, more standards-compliant than Gmail, and has quick swipe actions for both delete and archive, which is all I need.

For push, Google Apps gives you ActiveSync as an alternative protocol to IMAP. But the integration with iOS Mail is weird, and I could not use it.

FastMail, on the other hand, in a twisted turn of events, supports push through Apple's own APNs, getting the same treatment as iCloud: https://blog.fastmail.com/2015/07/17/push-email-now-availabl...

FastMail also has a native app that does push notifications, but it's just the packaged web interface with push notifications added. This is both good and bad. It's good because the mobile web interface is very decent compared to Gmail's, and it means you always have a decent UI on whatever OS you have. It's bad because it doesn't feel native, but then there's nothing more native than the iOS Mail app.


Google's rendition of IMAP doesn't follow any standard but their own. Fastmail uses the Cyrus IMAP server, which most certainly follows standards.

I've been a Fastmail user for almost 10 years. Never an issue that was not solved in very short order, and most professionally. Fastmail offers a modern product with old-world service and charm. A win-win.


We use either Postbox, Thunderbird, or Apple Mail, and all of them should work with Gmail, like most common mail clients do. So the point about IMAP doesn't have a practical impact for us. Am I missing something? Good to know about their strong reliability, though.


I would simply add that whilst Google have their fingers in so many different things, Fastmail does one thing very well: email and email-related services.


Thunderbird user for years here. I now use Postbox, which is (I believe) a commercial fork of Thunderbird. https://www.postbox-inc.com/


Postbox is pretty good, and would have been the solution to my mail client woes, except that:

* even after you've paid for the product, the only way to get support is to pay $10 for every question you want to ask, however basic or advanced it is - and you'll need support, because...

* there's erratic, unpredictable behaviour in many places; is it some subtle bug in the software? Or some weird about:config option that you have to set to get things working as expected? Who knows! If you're lucky, you'll find a solution in the Thunderbird forums that works. Often, though, that dialog option has been removed from Postbox, or that about:config option no longer does anything, and...

* the documentation is no help. It's no worse than the average startup's help pages, except in this case, as mentioned above, there's no support - so you'd expect that to be compensated with exhaustive documentation, but no such luck. The docs cover just the most basic cases and leave all the complex interactions of emailing to your own guesswork. Also...

* the devs avoid users like the plague. The paid-support-only policy is (explicitly stated as) a consequence of this, but it extends even to bug reports and feature requests, which are free of cost to submit; the two bug reports (and one or two feature requests) I submitted were met with only gaping silence. Their interaction on Facebook and Twitter, too, seems limited to version announcements and rare one-sentence replies.

I really wanted to like Postbox, for it to be the solution, because it was quite a good product with advanced capabilities and a quite reasonable cost. But it didn't seem like a reliable option for the long run given these limitations.


> the devs avoid users like the plague

A recipe for a terrible product. I tried Postbox when it came out and found it to be unbearably buggy (like they had released an alpha version). I couldn't even get Gmail to work, which should have been user story #1 for them: set up Gmail account.

Now I understand the culture that made it such a piece of garbage and will continue to avoid it like the plague.


Interesting, I'll definitely take a look at this. I don't mind paying for it, but I'd be a bit iffy on going from a nice open source solution to a proprietary one.

EDIT: Nevermind, no Linux build, that's out for me unfortunately.


"4 GB of RAM (8 GB recommended)" Wow why does it need so much RAM.


Interesting story tangentially related to this question. I just finished building my new gaming rig - the only thing left is a GPU which should be coming in the mail pretty soon. The CPU I have, a core i7 4790K, has a pretty decent GPU so I was testing it out the last couple of days. The first game I tried was GTA 5. It ran smoothly (albeit at the lowest possible settings and 720p resolution). I was pretty impressed with the integrated graphics' performance. Yesterday I tried GTAV out again for a little bit and I observed that it was ... way too choppy and basically unplayable. I couldn't figure out the reason why. Then I saw that Thunderbird was the only application that was open aside from Steam and GTAV. I closed it down again and boom, back to consistently high framerates.

What is the catch here? Granted I'm using the integrated graphics that came with the CPU but I have 32 GBs of RAM. What gives!??


How long was Thunderbird open before this happened? It could have been a memory leak, which can be really expensive when working with web-rendering software.


It was open for a bit. This experience has made me start to question all those times I had slow-ish response times and Thunderbird was open. Slowness or game choppiness should be unacceptable given that I have a 4.0 Ghz CPU and 32 GB RAM. I guess it is time to look for a new email client :/


Not sure, but it's currently using just over 200 MB of RAM here (on Windows 10).


Additionally, the Postbox search feature is way better than Thunderbird's.


"didn't DPI scale properly", "strange behavior with my IMAP server", "consistent UI with my Windows and Linux system", "handle 8 accounts at once with thousands of messages".

If you have enough rigid requirements nothing will ever be a good fit.


This is true, but sort of specious. The point is that for the grandparent (and me) Thunderbird meets all these requirements, and other stuff doesn't. Could we adjust our decades-mature workflow to accommodate other tools? Sure. But it would suck. So we use the tool we have that does what we want until something unambiguously better comes along.


Of course! But Thunderbird does a pretty good job on all these requirements; it's the only reason I feel I can be so stringent about them. The only reason I've been trying to leave is everyone else telling me it's bad or that it's going to stop being maintained properly.


I second that. It's hard to find a functional and usable mail client. I may switch to Claws for personal use, but I am still using Thunderbird for work.


KMail is rather good, though I find the settings UI non-intuitive and poorly organized. Hopefully the KF5-based version will address some of that.


They already released 5.0, and didn't change anything there. What is more likely to happen is the development of a Plasma Mobile QML UI built on top of Kontact and the new Akonadi, which you can switch between; the desktop version might then use a responsive mobile-style UI as the default, with the current UI available as a toggle option.


That's a pity. I haven't tried the KF5 one yet, since it's still experimental in Debian[1] and I assumed it's not usable enough. The settings UI in KMail really needs some serious streamlining.

[1]: https://tracker.debian.org/pkg/kdepim


I feel the same way, I love KMail but it feels like a mess whenever I have to configure it.


How's the support for it on Windows and OS X?


I don't know about support, but my experience is it works fine on Windows.


Well, that's what I meant by support. Sorry for not being more clear.


Thunderbird has been forked. http://www.fossamail.org/


For a good IMAP webmail client I recommend Rainloop: http://www.rainloop.net/


Yeah, I actually use it in some places, but unless I'm missing something, you can't connect to multiple accounts, see them simultaneously, and get notifications on all of them.


While we’re still at a pretty early pre-release version, we already do multiple accounts and unified inbox with ownCloud Mail: https://github.com/owncloud/mail

If you do try it, please let us know of your feedback in the issue tracker! :)

(That said, you can also run ownCloud with the Mail app locally. We're working on improved caching.)


I personally use mu4e. If you tried it a year or so ago, it has made many improvements since then. It is terminal-based, but after switching I find I can't use Thunderbird any more.


So do I: OfflineIMAP for syncing mail with my IMAP server at Fastmail, and mu4e for reading it. Search is superfast, but I miss better handling of spam. Thunderbird is really good at sorting out spam once you've trained the spam filter.


Check out MailMate! It's great!

It's a robust IMAP client with cross-account message rules, smart filters, blocking of external images, OpenPGP support, handling of Google's wonky IMAP implementation, great support, and a lot more.

http://freron.com/


It's great, but not the fastest client on OS X. For me, Mail is faster. YMMV.


> probably Alpine, a CLI based client

I think Mutt might be a better option, in that it is still maintained.


I believe Alpine is still maintained, and it has much better multi-account support than Mutt.


I'm certainly at a loss for another decent mail client that works on OS X and with Exchange.


I've been using Mail.app with Exchange for 3 years, never had a problem. Is there anything in particular missing?


I'll take another look, thanks.


I run Zimbra, and their webmail UI is the best I've seen - much more functional than GMail.


They want to kill XUL for Firefox so they can be all fancy HTML. So they have to kill Thunderbird, a XUL app.

In a few years the all-new HTML Firefox will come out. My bet is that it will suck. It will lack a TON of features that the existing Firefox has, but hey, it's all HTML! And you won't be able to stick with the old one, because within a week or two some critical security flaw will be discovered, and eventually (like six weeks later) they'll stop shipping fixes for the old Firefox.

Initially, the HTML Firefox will suck. When you take an app that's been worked on for 15 or so years and then replace its UI, you're going to lose a TON of features. They'll slowly reintroduce some of the most popular ones (hamburger menu will be priority #1!) but there will be a TON that they will not reintroduce. Why? Because when they were first introduced a decade ago, each was a cool idea someone had, and no one knew how popular it would be, so heck, why not implement it. But now they know that only 10 million or even 1 million people use that feature, and they're only interested in 100-million-user features! If Google Chrome doesn't have it, it must not be important!

As much as people complain about XUL not looking native, wait for HTML Firefox, it will take them forever to get where XUL was years ago.

They can't just kill XUL for Firefox though, they have to burn down the XUL ecosystem first so they're just releasing a new Firefox, nothing to see here.

1. They try to kill xulrunner as a project separate from Firefox. They try to move everyone to firefox -app.

2. They stop releasing binaries for xulrunner.

3. They deprecate XUL extensions.

4. They distance themselves from Thunderbird. They say it's better for Thunderbird. Yeah, right! Thunderbird is built on XUL; it's not going to be rewritten in HTML any time soon, definitely not by volunteers. It's not going to be able to maintain XUL itself either, and when Mozilla stops supporting XUL for Firefox, a few years after deprecating XUL extensions, Thunderbird will be screwed. But hey, it's not our project! We abandoned it years ago!

So when the crappy HTML Firefox shows up, with way less features than the Firefox of today, remember that this (Thunderbird) was one of the things given up to have it.

But hey, donate to Mozilla! $5, $15, $25, anything helps. Because we already make hundreds of millions of dollars and we do whatever is shiny and new, screw the "community" of existing stuff. We're fighting for an open web! (where you can use Gmail for email)


XUL was a weight on Firefox development. It worked like HTML, but was nonstandard. At first, that meant they could quickly implement things like flexbox (<hbox> / <vbox>) and grid layout (<grid>) long before they were a thing in CSS.
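For instance, a XUL box with flex attributes maps almost one-to-one onto what CSS flexbox later standardized. A sketch (illustrative markup, not actual Firefox chrome code):

```html
<!-- XUL (nonstandard): an hbox with two buttons sharing the width -->
<!--
  <hbox>
    <button flex="1" label="OK"/>
    <button flex="1" label="Cancel"/>
  </hbox>
-->

<!-- The equivalent once CSS grew flexbox: -->
<div style="display: flex">
  <button style="flex: 1">OK</button>
  <button style="flex: 1">Cancel</button>
</div>
```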

But as HTML and CSS slowly gained more features than XUL, XUL development slowed down, to the point where writing the Firefox UI in XUL became a pain because of poor tooling and sneaky bugs. More and more pieces of Firefox got written in HTML inside XUL, and sharing code between the pieces in XUL and those in HTML was nightmarish.

Dropping XUL means putting those bugs and issues behind us, and focusing development on a single DOM language. You would probably be surprised by how much of the UI already is in HTML; tab groups is almost all HTML, and the DevTools' editor and DOM inspector are in HTML as well.

As for donating to Mozilla, the distinction between Mozilla Corp and Mozilla Foundation is understandably complex for outsiders, but basically only Mozilla Corp makes money from the partnerships.


> It worked like HTML, but was nonstandard.

Being nonstandard is completely irrelevant here. Firefox's UI doesn't need to be rendered by Internet Explorer or Google Chrome.


Much as I despise HTML UIs (see the reasoning in my rant below: https://news.ycombinator.com/item?id=10655606), it is relevant, and XUL is horrible.

I'm glad to see it disappear, it's one of those things the world doesn't need. A failed experiment. And an HTML UI for firefox makes sense in the long run.

Fun fact of the day: did you know XUL uses DTDs to store translations? That's right: if you have a string you want to translate, you just have to declare a new XML entity in a localized DTD file. Isn't that just a wonderful idea.
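Concretely, a localized string in XUL looked roughly like this (file names and entity names invented for the example):

```xml
<!-- locale/en-US/greeting.dtd: one external entity per translatable string -->
<!ENTITY greeting.label "Hello, world">

<!-- greeting.xul: the doctype pulls in the locale's DTD... -->
<!DOCTYPE window SYSTEM "chrome://myapp/locale/greeting.dtd">
<window xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul">
  <!-- ...and strings are referenced as entities -->
  <label value="&greeting.label;"/>
</window>
```

So every translatable string was an XML external entity, resolved by the DTD parser at load time.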


I was just going to mention that, but you beat me to it. It's just bizarre to use XML external entities for internationalization.

I worked on TomTom Home, which was implemented in xulrunner, and I developed some internationalization/localization tools that had to deal with XUL DTDs as well as several other different and incompatible file formats for storing translations. I could never for the life of me figure out why they decided to use DTDs with external entities for translations.


XUL wasn't a failed experiment. From what I understand it served as the inspiration for some of the new features found in HTML. If that's the case it was a useful experiment.


What do you think is so horrible about XUL, apart from XBL being really, really verbose? The basic UI part is fairly decent, IMHO.


> Being nonstandard is completely irrelevant here

It was meant as a pro. Since it was their own language, they could do anything with it. They could make a better HTML.

But HTML caught up. Currently, I believe HTML is better than XUL, and making XUL great again is both a silly reuse of a political slogan and a waste of effort.

I for one would rather see efforts made to allow CSS styling of all input elements in HTML.


Good luck - the main advantage of XUL IMO is that it looks and feels native to the platform. This is completely lacking in any HTML based UI I've ever seen. It's been a major advantage for FF extensions and I can't help but feel like it's a major step backwards to lose it.


> Good luck - the main advantage of XUL IMO is that it looks and feels native to the platform.

That you say that is a testament to how well the meticulous CSS styling of the XUL elements worked; that styling applies equally well to HTML (try it in a browser chrome shell!).


I don't think that has anything to do with XUL directly (you can still find its original look by firing up SeaMonkey); rather, the Firefox developers opted to rework the UI subsystem so that it translated XUL elements into native widgets as much as possible. IIRC, this was done in part to speed up the Firefox UI compared to SeaMonkey.


> Being nonstandard is completely irrelevant here.

It's relevant in that much of the work has to be duplicated (documentation but also layout implementation & the like) and none of the new and improved web development tools can be used for XUL.


The problem as I understand is more about the documentation, the hidden quirks and also the barrier to bring new contributors to the source code. It's always easier when they don't have to learn a custom language first.


> Being nonstandard is completely irrelevant here.

No, it's not irrelevant. Standards are also about documentation (it's easier to document standardized things) and familiarity (people are more familiar with standardized things).


It is not entirely irrelevant.

Given that there are more developers familiar with HTML development, it may lower the barrier to developing plug-ins, or providing patches.


I was poking at the non-standard portion. Lowering barrier to development of plugins, patches, etc is fine, but being standard doesn't help or hinder that. Popularity and ease of introduction help that. Javascript (that is, the dialect of ECMAScript that is implemented by Firefox) itself is non-standard and Mozilla isn't throwing that one out for exclusively ES2015.


Actually, tons of nonstandard SpiderMonkey features have been removed. Sharp literals were axed, E4X was removed, "let" is being changed to the ES6 behavior, and so on.
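For anyone who hasn't followed the `let` change: the ES6 behavior being adopted is block scoping with a fresh binding per loop iteration. A quick sketch of the standardized semantics (plain ES6, nothing SpiderMonkey-specific about it):

```javascript
// ES6 `let` is block-scoped, and each loop iteration gets its own
// binding of `i` (unlike the old non-standard `let`, and unlike
// function-scoped `var`).
function scopes() {
  var results = [];
  for (let i = 0; i < 3; i++) {
    results.push(() => i); // each closure captures its own `i`
  }
  return results.map(f => f());
}

console.log(scopes()); // [ 0, 1, 2 ]; with `var i` it would be [ 3, 3, 3 ]
```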


FWIW, there are real intentions to standardise all the JS extensions or drop them.


It happened before, with the transition from Mozilla Suite to Firefox. And let's be honest: XUL was just lipstick on the pig that is cross-platform development. HTML/CSS/JS are now fast enough to look like a slightly better pig, so here we go.

Also, there's a generational shift underway. You and I might find it crazy that people would openly choose to use IDEs built on HTML/CSS/JS, but that's what a lot of young folks are doing (Atom, VSCode, etc.). That's their world, that's what they like. An entire generation now exists who learnt to code from web scripting rather than C or BASIC. They have taken over. It's just how it is.

(this said, I agree that donating to Mozilla feels a bit silly, looking at how much money they make from commercial agreements. It's like donating to Ubuntu or RedHat.)


I'm all for writing new apps in HTML, and I think Atom and VSCode are awesome, but I'm not for rewriting huge legacy apps as HTML apps for no good reason. The reason given, that XUL requires maintenance that Mozilla engineers don't enjoy doing, is a joke considering the amount of effort to maintain XUL is less than 1% of the effort to move Firefox to HTML.

No one has listed the ten awesome features that we're going to get from HTML Firefox (cause there ain't many) or the 1,000 features (tons of little details) that will be lost. If users listed their 10 biggest problems with Firefox I doubt any of them would be solved by moving to HTML.

Imagine if instead of writing VSCode from scratch and releasing it alongside Visual Studio Microsoft had rewritten the Visual Studio UI in HTML, abandoned all the nonessential features, and abandoned the old native Visual Studio.

One might say that Mozilla will wait to release the new Firefox till it has all the old features of the old Firefox, but that's not been my experience with how teams work. They'll get frustrated with the rewrite and want to get it out the door. "We can add those features later" they will say, and then they'll never get added.


> The reason given, that XUL requires maintenance that Mozilla engineers don't enjoy doing, is a joke considering the amount of effort to maintain XUL is less than 1% of the effort to move Firefox to HTML.

Ah, but maintaining XUL means working on old code (which is boring), but moving Firefox to HTML means working on new shiny code (which is exciting).

https://www.jwz.org/doc/cadt.html


Well, or Firefox as a web browser has to render HTML/CSS/JavaScript no matter what, and now that HTML/CSS is at feature parity or better with XUL in the space XUL is meant to occupy, it doesn't make sense for Mozilla to maintain two competing technologies when one receives 95% of their internal developer attention and 99.99999999999999% of external developer attention.


But, sadly, they aren’t replacing XUL with HTML.

Instead, they plan to render the UI natively, with only "some" parts in HTML.

So, instead of XUL + HTML, we’re going to get GTK + WinForms + Cocoa + HTML. Great, eh?

And we lose the ability to style it with addons – your themes can only change the background image of the header bar, that’s it.

And the remaining addons can’t modify the UI (tree style tabs, bottom tabs, etc) anymore either, instead you can only modify page content.

I’m seriously pissed off now, because Firefox was the last browser where I could actually customize it how I liked it.

I hope the person who made this decision is going to have to use software without any config options and with horrible defaults, like GNOME. For the rest of their life. May their car always have either 60°C+ heat or -20°C AC, and may the screen of their phone always be either too dark or too bright.


Yes the plan is to replace XUL with HTML. I have no idea where you heard otherwise.


> Part of the decision has already been made. We are moving Firefox addons (themes and extensions) away from a model where you can perform arbitrary styling or scripting of the browser chrome. This is an engineering-driven decision, and it's unavoidable and necessary for the long-term health of Firefox. Not only are we moving Firefox away from XUL, but we are likely going to make significant changes in the way the UI is structured. It is likely that some parts of the UI will be implemented using native widgets, and other parts will be implemented in HTML, but the exact DOM structure may involve independent frames connected with well-defined API surfaces.

Official statement from the Mozilla post in the discussion regarding removal of support for "heavyweight" themes. Emphasis mine.

That’s a pretty clear statement that it won’t be 100% HTML.

Also, the fact that "arbitrary styling and scripting" won’t be possible is another issue.

Tell me how I am supposed to write an addon that adds tab-previews as thumbnails when you hover over a tab like Vivaldi is doing it: http://i.imgur.com/vqysJs1.png ?

How am I supposed to write an addon that colors the navbar and the current tab in the theme color given by the HTML, or, if not existing, the favicon?

With current addons I can do that, with the new addon system, I’m seriously fucked.



You're citing jwz's CADT post in a thread discussing Firefox? It's a product for which people regularly complain about open bugs that are 10 or more years old.


Killing XUL has the potential of killing Firefox. I bet the last good XUL version will be forked.


It essentially already has: http://www.palemoon.org/

The Pale Moon developers understand the folly of ditching the flexibility of the XUL interface. They won't be removing it.


And then the person who forked it gets to maintain it!

Lucky them.


No need to be so snarky, he'll have all those other people out there who love XUL to help him.


He's right. Mozilla, a $200-300 million a year outfit, currently maintains their software including Firefox. It's a huge C++ application. People who code XUL in their spare time, even a bunch of them, aren't likely to make a dent in keeping parity between a Firefox fork and the main release. They'd likely have trouble even porting it.

So, a fork is a rough solution and will have maintenance issues for an app this size.


I was being facetious.


Ah. You got me there haha.


As an active user of Thunderbird I am very disappointed by this news.

But these rants are just silly. XUL is a technology that needlessly duplicates what HTML/CSS do these days. And if you ever want to have a smooth transition to Servo, which solves real, deep problems, having an HTML UI is going to be dramatically important.


I really want to agree with you that Atom is awesome, and it is, in principle, but in reality a text editor should not be using 300MB of RAM. Sublime Text, which I consider to be a direct competitor, barely uses 20MB of memory on my machine even after hours of use. Heck, does IntelliJ even use that much memory?

I'm just really discouraged with how more and more desktop apps are being written in HTML, CSS and Javascript and suffer in quality as a result.


Considering my first 16-bit computer had 3MB of RAM and ran MS Word alongside a GUI, I am wary of saying "amount X of RAM is preposterous for a given task".

As long as Moore's law provides enough lift under our wings, RAM usage is one of the less important aspects of an application. However, I fear the trend of building application UIs in HTML/CSS, because invariably it will lead to wildly inconsistent look, feel, and behavior.


Reminds me of the quip that goes something like this: "What's the best way to save 10 dollars? Earn 10 dollars more."


Eight Megabytes And Constantly Swapping used to be a joke.

Sigh. I feel old.


IntelliJ uses somewhere between 100MB and 700MB, depending on how many projects you have open and what language you use.


It's worth pointing out that for a couple of years Mozilla has been building a replacement for their rendering engine (Servo), so I imagine that to some extent deprecating XUL now is preparation for not having to re-implement it for Servo. Just speculation.


> One might say that Mozilla will wait to release the new Firefox till it has all the old features of the old Firefox, but that's not been my experience with how teams work.

Realistically, getting to feature parity after rewriting a core part of any application is going to be nearly impossible. You end up with different features, hopefully better ones, but not exactly the ones you had before you started. You can't step twice in the same river.


> You and I might find it crazy that people would openly choose to use IDEs built on HTML/CSS/JS, but that's what a lot of young folks are doing

I'm 25 and I feel too old for this industry already.


What bothers me about being older is my first browser was Internet Explorer, then I got to play with Mosaic's slow arse in school, then used Opera/Mozilla, and so on. Got to see where it came from. And the new stuff, especially Firefox, is coming full circle in how friggin' slow they run to serve the lowest common denominator of web pages.

It's annoying. I miss Web 1.0. Add just a bit of dynamic functionality plus broadband and it would be fine for 80-90% of cases. And FAST!

Instead, we get Web 2.0, 3.0 (4?) which makes pages on 40-50Mbps load like my old 28Kbps modem on AOL. Seriously...?


> And the new stuff, especially Firefox, is coming full circle in how friggin' slow they run to serve the lowest common denominator of web pages.

It should be easy to find performance numbers showing how Firefox 42 renders old, pre-CSS pages slower than, say, Netscape 4, then.

Netscape 4 didn't have a JIT, didn't use hardware accelerated layers, trapped into kernel mode for GDI calls, didn't use accelerated SIMD for painting, and didn't have HTTP 2. It barely had any optimizations for dynamic restyling, so tons of stuff would get reflowed when it didn't have to. This is just the tip of the iceberg.

Browsers have gotten more complex, but the complexity is often in the service of making things faster.


Notice my references to Web 1.0, 2.0, etc.? That means my comment was talking about not just the browsers but the sites designed for them. The combination of the two has made web sites really slow when they could be designed to load up instantly. Instead, they load as slowly as some sites did on my old Pentium 2 running Opera, etc. You'd think they'd be significantly faster with all the Moore's law iterations and browser improvements. Modern sites make sure that doesn't happen, though.

And I never mentioned Netscape: it was called Netscrape then and hackers despised it. I used Opera and IE mainly.


Opera and IE were no different in architecture.


What are you referring to? Opera and IE certainly used different rendering engines: Trident for IE, Presto for Opera (before it moved to Blink).


Back in the Netscape 4 days, the architectures of those engines were broadly the same (in the sense that Linux and FreeBSD have broadly the same architecture). Of course the codebases were different.


How do you know this? Trident and Presto were both closed source, did you do some contract work with both Microsoft and Opera at the time?


Presto wasn't released until 2003.

In the late '90s, the time frame this thread is about, it was well known that the layout engines at the time were not dynamic: they could not in general reflow only parts of the page. Everything else I mentioned in the post that triggered this subthread is obvious simply based on browser engine and OS history.


Architecture != functionality.


I don't even know what we're arguing about anymore. Do you dispute this?

> (Netscape 4/Opera/IE) didn't have a JIT, didn't use hardware accelerated layers, trapped into kernel mode for GDI calls, didn't use accelerated SIMD for painting, and didn't have HTTP 2. It barely had any optimizations for dynamic restyling, so tons of stuff would get reflowed when it didn't have to.


The internal structure of a program = architecture.

The features of a program = functionality.

So we're getting our wires crossed due to word choice.

In any case, I'll admit this has dragged on long enough. Truce?


Wasn't aware of that. Interesting.

It was just faster, better looking, and skinnable. Plus malware kept hitting IE, so there was that too.


I don't think the statement you replied to is correct.


Yep. No point in fixing it though.


Disable javascript. Seriously.

Disabling JS has given me the fastest turn-around on page-load times. It's usually not the browser but the loading of 15 different un-optimized JS frameworks that causes the problem.


I use NoScript. JS heavy sites are the worst though.


I am 29, started coding at 14, I've built UIs based on mIRC Scripting, VB/Winforms, C++/Qt, C#/WPF, Java/Android layouts. Using IDEs such as Visual Studio, Eclipse, Qt Creator and Android Studio.

I must say: Atom Editor is great. Using HTML/CSS/JS to build desktop/mobile apps really makes sense to me, especially given that:

- You only have to support one rendering engine.

- You have access to the latest Web Components/ES6/CSS3 features.

- You can rely on native Node.js modules when needed.

It's good for portability, HTML/CSS became better at UI, JS becomes a better language, current IDEs are great, live debugging tools are great.


You also forget:

- There are far fewer native components, leaving accessibility down to the developer of the app and making it nonexistent.

- Platform integration is impossible, which means there is no way for the framework, for example, to create widgets differently on OS X, Windows or Linux (these platforms have many different conventions)

- Theming globally becomes impossible. If you want to write a dark theme for your desktop, you go from writing a theme once for GTK and once for Qt to once for each and every app you run. Uuuurgh.

But only supporting one rendering engine, yay! Much better than the 6 different engines we have to support in Qt (huh?).

And having access to all the latest JS additions! ... that were copied from other languages you could develop desktop apps in, because JS is a terrible hack.

And relying on native Node.js modules, yay! As opposed to native modules for literally every other better language out there.

I'm not sure what you're actually comparing this workflow to. Maybe one day writing HTML apps will be great, but today is not that day. Today, writing HTML apps is only beginning to be an idea that doesn't completely suck. But for the user, it does massively suck. Massive apps that ship their own copy of webkit/blink/whathaveyou, with security flaws that won't get patched, disgusting performance on low-end hardware, atrocious battery usage and decades of UX knowledge thrown out of the window just because the app developer doesn't have the knowledge to see it.

I don't look forward to this. And I'm younger than you.


> "Platform integration is impossible, which means there is no way for the framework, for example, to create widgets differently on OS X, Windows or Linux (these platforms have many different conventions)"

So web apps can't have reusable components now?

> "And having access to all the latest JS additions! ... that were copied from other languages you could develop desktop apps in, because JS is a terrible hack."

WebAssembly will take over for web apps eventually.

> "decades of UX knowledge thrown out of the window"

UX knowledge isn't toolkit dependent, UX knowledge is just as applicable on the web. You can make a dog of an app with any toolkit, native toolkits offer no guarantees for good UX, you can only hope that designers choose to follow best practices.


> So web apps can't have reusable components now?

I'm not saying they can't, I'm saying they don't. You go ahead and try to fix that, create widget#772981 that still won't support typed-selection, or will break with large text or what not... I've seen too many of those, most of them bad, and none of them standard. So far, React is the only thing that even comes close to a sane model for a contender to UI development on the desktop, and it still mostly fails the accessibility checkbox.

> WebAssembly will take over for web apps eventually.

More eventualism. Do you have evidence for that? Do you even have evidence that it'll be better than what we have now in other languages if it does take over?

> UX knowledge isn't toolkit dependent

A lot of it is. You'd be surprised just how much UX is crammed into Qt Widgets for example. Years of experience making them more accessible, more usable, more consistent with the platform they're running on, etc.


> "I'm not saying they can't, I'm saying they don't."

They already do. Electron, Web Components, etc...

> "Do you even have evidence that it'll be better than what we have now in other languages if it does take over?"

Compare the performance of vanilla JS vs. asm.js. It is clear from what developers have stated that WebAssembly performance will exceed the performance of asm.js, the threading improvements alone should offer noticeable benefits.
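To make the comparison concrete, here is what a toy asm.js-style module looks like; the `|0` coercions are the static type annotations that let an engine compile it ahead of time, and the same code still runs as ordinary JavaScript in any engine. (Hypothetical example, not a benchmark; a strict validator would also want the stdlib arguments wired up, this only shows the shape.)

```javascript
function AsmAdd(stdlib, foreign, heap) {
  "use asm"; // opt in to the asm.js validated subset
  function add(a, b) {
    a = a | 0; // parameter type annotation: int32
    b = b | 0;
    return (a + b) | 0; // return type annotation: int32
  }
  return { add: add };
}

var mod = AsmAdd();
console.log(mod.add(40, 2)); // 42
```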

> "Years of experience making them more accessible, more usable, more consistent with the platform they're running on, etc."

So what are we looking at to replicate that? A theme per platform? Some accessibility work? What else?

It should be noted that the web isn't starting from zero with UX either, we've already had 20+ years of refinements to the web user experience.


> They already do. Electron, Web Components, etc...

And nobody can agree on which to use. It's not standard if it's just "some set of components some people reuse". A far cry from standardization.

> Compare the performance of vanilla JS vs. asm.js.

That's not what I'm comparing. I'm comparing the performance of native toolkits vs web toolkits on asm.js. It pales in comparison, and the battery usage is through the roof. ymmv?

> So what are we looking at to replicate that?

1. Standardization of components (developer does not have to build their own scrolling system, context menu, etc.)

2. Themability of components at the application level (developer can style components not to clash with the style of the application)

3. Themability of components at the platform level (user can style the application not to clash with the style of their own desktop)

4. Performance needs to shoot way, way up. Apps can't rely on a performant GPU; it's unreasonable to ask that of every device at this point in time. Some day maybe every device will come with its own high-performance GPU, but eventualism cannot excuse bad coding practices and unnecessary layering.

The rest should follow. But I still don't see us getting any of those things, any time soon. These are not easy problems to solve.


> "And nobody can agree on which to use. It's not standard if it's just "some set of components some people reuse". A far cry from standardization."

Atom uses Electron, VS Code uses Electron, Light Table uses Electron (starting with v0.8).

As for the web side, web apps are a young field, what else would you expect?

There are certainly popular UI elements, Bootstrap for example.

If you want to make a web app that looks native, using React Native could be a good solution.


An app based on React Native isn't a web app anymore. But it's not quite native either; it doesn't use the native button and list view controls on either iOS or Android.


>I'm not sure what you're actually comparing this workflow to.

He is comparing it to what they have now: XUL + CSS + JavaScript. Replacing it with HTML isn't as big a change as it sounds; their UI is already written in XML and rendered by Gecko. All they are doing is moving from a custom XML dialect, like XAML (MS) or FXML (Java), to standard HTML rendered by Gecko.


Unless you know something I don't about Kunix specifically, I don't think he's talking about Mozilla development in particular, but rather development in general.


In that case, could the XUL dependency be removed from Thunderbird?


Unlike the original parent post, I don't think this announcement has anything to do with XUL. Thunderbird just doesn't have the userbase Mozilla expects it to at this point because, surprise, the people who want an email client are a small subset of the people who want a web browser.

<rant>

This is lost in a sea of replies now, but I'm sure pissed off that Mozilla is completely losing their root mantra of fighting for the free web. Persona and Thunderbird were two critical components of a "free web": free global authentication and a free email client. I'm sure at the next MozFest the same suits as every year will talk about how they're so proud of "keeping the web open". What a crock of crap. Firefox isn't even that good of a browser anymore.

</rant>


It is easy to forget that freedom and creativity are not a numbers game. There are benefits to a majority when a minority is free to create. An argument could be made that the majority consumption patterns are made possible by the minority creator patterns, hence it makes economic sense to fund the minority out of majority profits.

This is one of the reasons why iPads have stalled - pro developers can't make money, for well documented reasons. The web equivalent will be the starving of open communication, thought and creativity, leading to homogeneous noise as a poor substitute for ground-breaking content.


I don't see why it couldn't; they are doing it for Firefox. On the other hand, Mozilla has been on the path to retiring Thunderbird for some time; this is just the next step.

Mostly I think it is an effort-vs-reward thing. Thunderbird doesn't have the user base, and it doesn't have a revenue stream the way Firefox sells the search box. I wonder if anyone over there has thought about cleaning it up and selling it as a white-label email client.


> - You only have to support one rendering engine.

> - You have access to the latest Web Components/ES6/CSS3 features.

> - You can rely on native Node.js modules when needed.

So, developer convenience trumps user experience? Who cares about your battery run time, how hot your PC runs, the bandwidth overhead, the massive attack surface from all the useless components shipped, how badly the webapp integrates into your OS, as long as we can ship faster and faster?


As a user, my experience with HTML/CSS/JS apps is great. Look at the excellent feedback the Atom Editor is getting.

As a developer, I feel that it's easier for me to create good user experience (nice UI, easy non-blocking I/Os by default, etc).

I don't say it's flawless, I say it's now becoming a very good alternative.


Oh, I'm writing web apps myself, but Gods, am I hating myself for it. OS integration is somewhere between a nightmare and impossible – and yes, it is desirable, unless you're on like Gnome 3 –, the resource requirements are abysmal (400 MB for an app that consists of a single input form, a table, and a search field, really Chrome?), performance is actually pretty lacklustre (non-blocking is one thing, actual multithreading another!), the UI isn't actually all that nice if you want to use it, not just look at it (say goodbye to accessibility; hell, even just proper keyboard navigation is black magic for most frameworks); and if you ship the engine yourself, you're now responsible for orchestrating and shipping bi-weekly browser updates to your customers to make sure your browser engine stays patched.

Is all that bullshit really worth… whatever we're saving? (Our company is mainly saving on developer salaries, because we can force kids fresh out of school to work for minimum wage instead of hiring experienced developers… we'll see how long this keeps working.)


> As a user, my experience with HTML/CSS/JS apps is great. Look at the excellent feedback the Atom Editor is getting.

For some less positive feedback on Atom, which reinforces some of creshal's points, see the Atom section of this text editor rundown:

http://eev.ee/blog/2015/05/31/text-editor-rundown/

I've also frequently seen complaints to the effect that Atom is much too slow, even on machines only a few years old. Even my 486 laptop back in 1997 could run a text editor with syntax highlighting (specifically, JPad for programming in Java). What are editors doing these days that needs so much CPU power?


> For some less positive feedback on Atom, which reinforces some of creshal's points, see the Atom section of this text editor rundown:

> http://eev.ee/blog/2015/05/31/text-editor-rundown/

This review (6 months ago) is based on an early version of Atom Editor (0.204.0), the latest stable (1.2.4) is way more mature and provides very nice packages [1].

[1] https://atom.io/packages


JavaScript programmers are truly delusional sometimes.


It’s not really JavaScript, I’d argue it’s a different reason.

node-webkit apps are cheaper to develop. Far cheaper.

Everyone can write a webapp; a native application requires professional developers. The payroll looks completely different.

And then the devs who have only worked with node-webkit don’t know how much better they could have it. I know devs who refuse to use map, reduce and filter, because "it’s black magic and we always used for".
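For the record, the "black magic" in question usually amounts to no more than this (made-up data):

```javascript
const orders = [
  { product: "widget", qty: 2, price: 9.5 },
  { product: "gadget", qty: 0, price: 20 },
  { product: "gizmo",  qty: 1, price: 5 },
];

// Total cost of non-empty orders, as a pipeline instead of a for loop:
const total = orders
  .filter(o => o.qty > 0)          // drop empty orders
  .map(o => o.qty * o.price)       // compute each line total
  .reduce((sum, x) => sum + x, 0); // add them up

console.log(total); // 24
```

Each step says what it does; the equivalent `for` loop buries the same three operations in one body.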


The fact that you are being (edit: were) downvoted for a constructive opinion shows the unwarranted prejudice of the HN hivemind towards web technologies. HN discussions on this topic are effectively useless; any constructive truth-finding is drowned out.

edit: I'll take my downvotes with pleasure. The fact that I am able to use Atom or Nylas N1 or Nuclide on Linux with zero problems is alone enough to welcome the proliferation of web tech on the desktop.


Your decrying of prejudice would be more believable if it weren't built on unsorted generalizations (i.e., prejudices).


Thanks for your support of N1. :) Feel free to ping me if you have feedback. I'm mg@nylas.com


Same here, I'm 26. The industry changed a lot once people realized you could make a shitload of money with the web/scripting stuff. Coding schools, open source communities, and huge companies love HTML and JavaScript and Python and Ruby for their simplicity. Just take a bunch of people, tell them they could make a lot of money by learning some dead-simple languages, and there you go. Doesn't matter if they write the most disastrous code in the whole universe.

Look at Codecademy, for example. They add new programming languages and technologies every now and then, but basically it's always the same. Just like their audience. They won't add C to that list, because that wouldn't work for this average not-nerdy-enough-for-real-programming audience.


Yes! JS, Python and Ruby are just "dead simple" "scripting" languages that only serve to "make a shitload of money" and aren't "real programming".

Real programmers (like me) use C.

Seriously, no. Just no.

And if I wanted to prevent people from writing "the most disastrous code in the whole universe", teaching C instead of JS would be much, much lower in the list than teaching how to split code into modules/libraries, write testable code, etc.


> And if I wanted to prevent people from writing "the most disastrous code in the whole universe", teaching C instead of JS would be much, much lower in the list than teaching how to split code into modules/libraries, write testable code, etc.

And for me that would be much, much lower than teaching how to keep things simple. The whole modern web has a bad case of over-engineering: everything is modules of modules of modules, with so many associated tools that you get a headache trying to install a simple JavaScript library (what is wrong with a download link so I can just drop the lib on my page? Nothing, that's what).

I'm all for modern approaches, but I can't help feeling that many developers have lost touch with what writing clean code is. It's not making modules, libraries, or even tests; it's making sure what you are doing is as simple as it can be, and efficient at it. Sadly, most modern web stacks fail at that: everything is hidden in a mumbo jumbo of modules and dependencies no one really needed or asked for, often created by people who never questioned the purpose of what they were doing, or whether the whole internet needed it (just because you are at Google and have found a neat way to deal with your huge JS stack doesn't mean the whole web needs it too, or that you need to spend a whole lot of effort making people adopt it).

As much as you make fun of C, learning and writing C will teach you to keep your programs simple and efficient, because the language requires it. And that's coming from someone who started programming with Perl, then PHP, and only learned C later on.

"Make things simple, not simpler" should be the cardinal rule of programming, not "modularize and test everything". Those are situational; the former applies all the time.


> As much as you make fun of C

I didn't make fun of C. I made fun of a comment posted by a C programmer, which is very different. I have absolutely nothing against C.

> C will teach you to keep your programs simple and efficient, because the language requires it

From what I've read, the OpenSSL codebase is definitely not simple, and I'm not sure it's efficient either—it would depend on how you define efficiency. So your claim seems factually incorrect.

Other than that, I agree with your post. Simplicity is awesome. No point in using Angular to build a landing page if static HTML can do the job just as well. (Edit: let me take that back. There can be a point: the pleasure of experimenting and learning something new.)


Learn C, use python/go/whatever for most tasks.

C teaches you how stuff works. It's important.


Question coming from a long-time sysad whose experience is almost entirely batch and PowerShell, what does it mean to write "testable" code?


You write code that can be broken up finely enough into discrete, stand alone modules that can each be run through a series of tests.

For instance, you might create a model class, and then you know that model should have a name, shouldn't be able to be saved without a name, the name should be x-number of characters.

Then you can write a series of tests that makes sure that, regardless of how the model implements that name, all your assumptions about what that name should look and act like don't change without throwing a red-flag up to whoever is changing that model.


Say you want to test how a program fares when strings are malformed, or the disk is full, etc. It can be hard to simulate this if the code refers to variables and results from functions from all over the place.

Making code into reusable modules is generally a good thing, but it's even better if you make them in a way that helps you test those chunks independently as well; that is testable code.


Usually it's about keeping it granular enough that various sub-activities can be tested in isolation. If you have one function with 10000 LOCs, that's not really testable beyond "something doesn't work".

Then of course you need some way to automatically run these tests, but that's usually provided by IDEs or standard libraries these days.

Edit: most curious downvote ever...


Obligatory because of "real programming" reference:

https://xkcd.com/378/


The only good thing (I think) about being a generalist is that one is used to adapting, but yeah... it gets boring sometimes. Although after a while, it is interesting to see how we reinvent wheels and present them under fancy new names; it is a pattern you see more or less every 10 years.

Sometimes it feels like everything was already invented in the 60's.


> Sometimes it feels like everything was already invented in the 60's.

Reading up on the capabilities of early mainframes is eye-opening if you (like me) grew up on Pentiums.


>Reading up on the capabilities of early mainframes is eye-opening if you (like me) grew up on Pentiums.

Close enough... I grew up on XTs, i286, i386, so on :) All of them way weaker than my phone (Not sure if that was what you were referring to).

However, in terms of architecture, algorithms, programming language features... it feels like we haven't advanced much. Actually, the opposite: we're now encouraged not to be too clever in our programming, because processing power and memory are close to being commodities and clean code is more important (which is fine, but less fun).


Referring to architecture, algorithms, language features, etc obviously. They were all better on many machines from 1960's-1980's. Market kept rejecting anything that wasn't backward compatible with existing garbage and had max performance per dollar. So, dumb CPU's, COBOL, and C it is. :)

Gave examples of what features old ones had in the essay below with the first link mentioning the specific systems for further inspection:

https://www.schneier.com/blog/archives/2014/04/dan_geer_on_h...

Note: B5500, System/38, and Ten15/FLEX are all especially worth considering. Two were basically HLL machines with type-safety and interface safety enforced at hardware level. System/38 was object-based with HW- and SW-level protections plus portable microcode layer.

I'd say Channel I/O counts as one that kicks modern systems' asses. Servers have been copying it bit by bit over past ten years, maybe even exceeding it, but server OS's are inherently inferior in usage given mainframe OS's are designed for I/O offloading at core. Near interrupt-less architecture with acceleration engines makes many apps scream with performance. And would only cost $10 per CPU on desktops but would require Windows & Linux rewrites. (sighs)

https://en.wikipedia.org/wiki/I/O_channel


The modern equivalent to System/38, IBM's POWER hardware running IBM i still has the same benefits. I actually really like the concept and the way the ILE runtime works, but it's too bad that much of the platform is stuck with legacy design decisions and hasn't been modernized.


I agree with about everything you said. The System/38 design was one of the best cathedrals of old. Very forward-looking, thorough, consistent, and great for admins of the time. Still the only capability system bringing in revenue. Adapted pretty well to modern stuff but main OS's issues & stagnation hurt it as you said. I think the fact that it's pricey and proprietary kept the OSS innovation out, too.

However, the change to the AS/400 and POWER cost it one of its greatest features: hardware-enforced integrity at the object level. That's the feature that would still be giving hackers hell if it was widely deployed. The Intel i432 APX and i960MX had similar property. Interesting enough, IBM actually has secure CPU's they've prototyped and even sold to select customers. Would be great if they integrated one with IBM i at microcode, compiler, and OS levels. That plus an optional interface for new customers without legacy crap would be a huge differentiator that might give it new life.


Yes, especially the MULTICS operating system was so advanced and decades ahead that we still borrow from its concepts. It had, for example, 16 security rings. Intel CPUs support only 4 rings, and Windows, e.g., uses only 2 (for kernel mode and user mode; hypervisor mode uses another ring in recent iterations).


The initial software-based Multics had 64 rings, and as I recall only 8 in the hardware versions. No more than 4 were needed in practice: 0 for root, 1 for mail (e.g. you could delete mail you'd sent to other people from their mailboxes if they'd not read it yet), 4 for normal users, and 5 for some stuff that anyone was allowed to use but that was restricted from touching anything deeper in the system.

AMD dropped rings in their 64 bit architecture which Intel was forced to adopt, so they're becoming a historical curiosity.


> AMD dropped rings in their 64 bit architecture which Intel was forced to adopt, so they're becoming a historical curiosity.

Not quite. Hypervisors are operating in Ring -1, SMM is equivalent to another ring above that, and I can't find anything about AMD64 dropping Ring 1/2? Ring 0/3 at least are still in use.


AFAIK 64-bit linux has kernel in 0 and userspace in 3?

http://wiki.osdev.org/SYSENTER

http://www.x86-64.org/pipermail/discuss/2000-October/001068....

I tried looking at the latest 4.2 kernel tree - but the assembler/C code that sets up and deals with syscalls has been rather refactored, so I'm not entirely certain it's ring 0 and ring 3 for both 32-bit and 64-bit - but I think so? (From a quick glance, no sections stand out as calling out ring 3 explicitly when talking about returning to userland -- granted, I didn't do any searching over the code.)


I submit that "supervisor/user" isn't an implementation of "rings", plural (and it's got to predate Multics by a lot, but I can't quickly prove that), and that SMM is something entirely different. Hypervisors and their use really aren't comparable to Multics rings.

A couple of minutes with Google only found hints that confirm my memory WRT AMD64 and rings, and/or Intel not copying a segmentation feature added to later versions of AMD's chips.


Could you recommend an overview?


So this generation is rewriting everything that was made and working in the 90s. Was everything written in the 90s also a rehash of stuff from the 70s/80s? Just curious, what was the equivalent of HTML/CSS/JS in the 80s?


In a way, yes. If you look at OS interfaces from the '80s and compare them with modern ones, you'll see a lot of cpu power is now spent on eye-candy but functionally they're not terribly different. Except they're all built on C++, whereas before they were in C or lower-level languages. OO was the HTML/CSS/JS of the '90s.


Could all the large software applications of today have been created without OOP, just on functional programming paradigms? If OOP was beneficial 10 years on, maybe the cross-platform nature of HTML/CSS/JS will also be vital to future applications.


> maybe the cross-platform nature of HTML/CSS/JS will also be vital to future applications.

There are already cross-platform toolkits that can deliver everything a browser engine can, faster and safely.

Face it, "web technologies" are not winning because of any massive technological advancement, just like C++ wasn't this huge advancement over C. They just managed to achieve enough critical mass to make everything else look less popular. In the '90s, OOP did that through academia and commercial push (in what was a much smaller tech sector); html/css/js did it through the accidental monopoly that is the web browser. The end result is basically the same.


> "There are already cross-platform toolkits that can deliver everything a browser engine can, faster and safely."

Nothing with the reach of web technologies. The closest I can think of is Qt, but there are issues with deploying Qt apps on some platforms.


So our choices are

• Native applications that give users excellent UX and performance

• Web apps that are equally quirky and slow on every system

?


As with all design decisions, it's a trade off. The main benefit of web apps is found in their cross-platform nature. If this is desirable then you may choose to sacrifice a little performance to get that.

To give an analogy, it's like programming languages. It's possible to write very fast code with assembly languages, yet their portability to other architectures is practically non-existent. Part of the reason higher level languages like C/C++ are used is because they are much more portable.


LISP, a semi-functional language, was originally invented to solve the biggest, hardest problems. Scheme, Common LISP, Ocaml/ML, and Haskell have all been used in large systems with good performance. Entire OS's were written in LISP's with some benefits that modern machines still don't have:

http://www.symbolics-dks.com/Genera-why-1.htm

Note: And some that are laughably obvious and available today lol.

So, yes, what people use today is an accident of history. That includes COBOL, C, C++, OOP languages, HTML/CSS/JS, HTTP-centric everything, and especially whatever crap is being built on them next.

If you're curious, here's the history I put together on C language and UNIX in numbered list form. You'll see how IT evolution often works in practice to give us lowest common denominator. And afterwards people swear it was product of good design and great achievement. (rolls eyes)

http://pastebin.com/UAQaWuWG


The most remarkable part of "web apps" is probably what they do not improve on:

Smalltalk had (has) messaging and decent object orientation (and you have that especially in latest js) - but Smalltalk never had one standard vm implementation - different versions had different image (ram-saved-to-disk, source code and byte-code) and vms. Javascript has common source code, but no common vm/image format.

Office Suites had rich documents with smart(ish) widgets, but no security - a macro in Excel had access to all your spreadsheet data. Web apps don't really have any good encapsulation either -- so we'll likely repeat the macro-virus era with web virus era (I'm not sure if we already are or not, there's certainly been a few self-replicating ones, that eg spread via facebook updates etc. Not sure if they generally live in the phone-apps or various web-apps. Probably both).

We already had the future of web applications within reach years ago - but apparently no-one cared: http://lively-kernel.org/


> what was the equivalent of HTML/CSS/JS in the 80s?

Depends on what you're looking at as an equivalent. Browsers didn't exist, but markup languages have been around since the '70s (first mentioned in the late '60s)[1]

Maybe an equivalent for expressing GUIs as resources? Yeah, we called it RAD back in the day, and those tools were quite popular in some circles. Does anybody remember HyperCard? Or even the first versions of Visual Basic?

[1] https://en.wikipedia.org/wiki/Markup_language#History


Mainframes and thin clients maybe? "All of the computing power will be over here, and the clients will be dumb and just render stuff!"


Stick around for another 25 like I have...the fun is just starting!


Fun as in yelling at people to get off my digital, hand-optimized assembly lawn?


Written and read in hex code


I still get mad when people talk bad about Fortran and replace the code with some slow solution. CPU power and cheap RAM have caused the world to favor the hand-holding of developers.


It is sad to see Web GUIs taking over native GUIs. Everyone runs their apps in a virtual GUI inside another GUI. That's meta-meta-crazy.


There's nothing native about XUL, though.


Atom and VSCode both perfom poorly, and look like garbage on Linux.

Whoa! They have syntax highlighting! And intellisense! And a package manager!

Kind of like Notepad++ or half a dozen other editors?

I'm not saying "we should NEVER use HTML for native dev", but it's not there yet, and I don't understand the hype around these two tools.


> It happened before, with the transition from Mozilla Suite to Firefox.

And it nearly killed them.

http://www.pubarticles.com/member/user_img/216/1279777216.jp...

http://www.joelonsoftware.com/articles/fog0000000069.html


No, what almost killed them was the transition from Netscape Navigator 4.0 to Mozilla Suite 1.0 — that's where the massive rewrite happened. Looking at that first link, it doesn't point out where Firefox development started: mid-2002. By that point Netscape had already almost entirely lost its market share.


> You and me could find crazy that people would openly choose to use IDEs built on HTML/CSS/JS, but that's what a lot of young folks are doing (Atom, VSCode etc etc). That's their world, that's what they like. An entire generation now exists, who learnt to code from web scripting rather than C or BASIC. They have taken over. It's just how it is.

They could still be doing it wrong.


Dude, you're likely typing on the "wrong" keyboard layout and use a "wrong" calendar... convention trumps correctness pretty much all the time. What matters is that enough people are doing it to make it "the way". I fully expect that we will eventually see a Javascript OS, because "it's so much easier to maintain".


> you're likely typing on the "wrong" keyboard layout and use a "wrong" calendar

When someone shows me/I find a better way to meet a requirement I will adopt it (static site generators, Go routines, ...). I won't doggedly stick to the first thing I learned; I don't expect the world to adapt to suit me.



We won't, because you can't write an OS in JavaScript; you need native code. Maybe a desktop environment or something, but not an OS.


All languages can be compiled to native code, it's just harder for some. For example, here's an AOT compiler for JS: https://github.com/tmikov/jscomp


It's still a scripting language, and you still can't write an OS in it. There's no low-level access to the hardware. You could of course write all the low level parts in C, but then you haven't written an OS in JavaScript.


C also doesn't have low level access to the hardware. The interesting bits of the OS are written in assembly.


I think the only bit that really needs to be in assembly is the context switch, but even that can be embedded in the C (which you might consider cheating, but isn't even an option in JavaScript).


The developer community is a good benchmark. When you do it wrong, people usually don't use your software.


> "You and me could find crazy that people would openly choose to use IDEs built on HTML/CSS/JS"

Why is that crazy? Makes perfect sense to me, especially as the performance of the programming language element improves. Ultimately these IDEs are sure to make use of WebAssembly, which should take away the remaining performance concerns.


It's crazy because, apart from performance concerns (it looks like we love to make our computers slower and slower every 10 years), you're just discarding all the features of the containing desktop OS. Formatted copypasting, usability features, network features, etc etc... you'll have to reimplement them all, solving all the problems that systems developers solved 10 or 20 years ago. The OS will become little more than a very expensive pixel pipe. But that's what people like, because C++ is hard, native widgets are hard to customise, and everyone loves designing interfaces, so that's where we're going.

If anyone ever starts making javascript-optimised CPUs and GPUs, they're going to make billions. At the moment we only have micropython, but who knows...


> If anyone ever starts making javascript-optimised CPUs and GPUs, they're going to make billions.

They said the same about Java in the 90s. Java accelerated CPUs never really took off.


Let's look at the Atom/VS Code use case...

> "Formatted copypasting"

Do you need this for a coders editor?

> "usability features"

Like what?

> "network features"

Which network features do you need for a coders editor? In the case of Atom/VS Code, can't think of a single one that a browser engine doesn't already provide. Not going to need things like AD-integration, etc...

> "The OS will become little more than a very expensive pixel pipe."

If that's what people want, then so be it. I see no problem with simplifying the OS, most of them are already too bloated.

> " But that's what people like, because C++ is hard, native widgets are hard to customise, and everyone loves designing interfaces, so that's where we're going."

The main advantage is the cross platform compatibility. If certain OS vendors didn't make it hard to build apps that utilised a common base then there'd be much less drive to produce web apps. It has very little to do with the complexity of C++.

> "If anyone ever starts making javascript-optimised CPUs and GPUs, they're going to make billions."

As I've said a number of times now, the final target with web apps won't be JS, it'll be WebAssembly.


Not to mention the death of accessibility features - those HTML apps are effectively unusable for disabled users because they don't implement the accessibility features available in native UIs.


It's possible to build websites with accessibility features, is it not?


Partially, but the point is that you're (manually and expensively) reinventing the wheel that desktop toolkits had already built for you.


Sure, but with the end benefit being a universal UI toolkit.

Also, 'expensively' is debatable, I'm sure it'd be possible to have reusable accessibility components, wouldn't necessarily have to reinvent the wheel for each new web app.


So we finally have a free choice of OS. It's not the full story to complain about people solving the problems that the OS side had already solved decades ago. The big new thing is that they are doing so in a platform agnostic way. That I have nine different virtual machines installed on four different operating systems and the same code base can run on all of them smoothly.


Except that's not true: Webkit, Gecko, Trident... they are all different "OSs" you're writing for, you just wave them away by shipping the OS with the application. You could do the same by shipping a virtualised image running a stripped-down Linux configured to run only one application. One of these solutions is now socially acceptable, but both manage to completely discard everything the desktop OS achieved in 30 years.


> So we finally have a free choice of OS

Not really, you still need a browser that implements all this stuff. The only difference is open tech vs proprietary.

Open technologies are obviously a good thing. But writing software as complex as Photoshop, for instance, with the exact same features in HTML/CSS and JavaScript isn't going to fly and be usable for someone who has to work in it 10 hours a day. The performance issues will be significant. It's not a big deal for a text editor, though I still can't open a 5MB log file in Atom for some reason. No problem with Sublime Text 2 or Vim. Why is this?

My point is Web techs aren't a silver bullet.


My point wasn't that they are a silver bullet. My point was simply to say that while something is obviously lost (performance, native integration, etc...), something else is gained. And that is massive portability.


s/smoothly/in the same crappy but standardized way/


To be fair, what I have heard (I am not a Mozilla insider so I can't say personally) is that donations go to the Mozilla Foundation and the commercial agreements give money to Mozilla Corporation. So you aren't donating to the same place.

From the same source, apparently the Mozilla Foundation does a lot of important work that the Mozilla Corporation can't/won't do but most of the development is done by the Mozilla Corporation currently (possibly due to not enough donations to the Foundation?).

I haven't really verified that, but thought it was worth pointing out.


When you take an app that's been worked on for 15 or so years and then replace its UI, you're going to lose a TON of features. They'll slowly reintroduce some of the most popular ones (hamburger menu will be priority #1!) but there will be a TON that they will not reintroduce. Why?

Hmm, that's how Firefox (then Phoenix) started in the first place in comparison to the bloated Mozilla Application Suite.


>So when the crappy HTML Firefox shows up, with way less features than the Firefox of today, remember that this (Thunderbird) was one of the things given up to have it.

Please recall that Phoenix/Firebird/Firefox was born as a lightweight, fast alternative to the bloat of Netscape Communicator/Mozilla Suite. Dropping features from an old bloated application is what launched Firefox to fame in the first place.


And then all the extension authors wasted years of their life reimplementing all the features that Firefox had lost along the way.

This isn't progress.


> In a few years the all new HTML Firefox will come out. My bet is that it will suck. It will lack a TON of features that the existing Firefox has, but hey, it's all HTML!

> As much as people complain about XUL not looking native, wait for HTML Firefox, it will take them forever to get where XUL was years ago.

And isn't Mozilla/Netscape literally the poster child for how to destroy your product with a rewrite? They only regained their market share because Microsoft let IE stagnate to a ridiculous degree.

> But now they know that only 10 million or even 1 million people use that feature, and they're only interested in 100 million user features! If Google Chrome doesn't have it, it must not be important!

I've pretty much only stuck with Firefox because of its extensions and the quirky little features it has. The more they focus on aping Chrome, the more they decrease the friction for switching away to it.


Vivaldi seems neat, which is essentially Chromium with a HTML5 GUI built by the old Opera folks intended to restore the best features of old Opera, with a heavy focus on customization.

My hope is that Mozilla will take the direction of using Servo to build something Vivaldi-like (instead of everybody running with Webkit/Blink) and start to restore old APIs and frameworks from old XUL-Firefox that all the best old addons relied on (things like NoScript, Session Manager, Vimperator, uBlock, Tab Mix Plus, etc...) instead of just sticking with a blackbox rendering engine and settling for Chrome addon API parity.


Have you tried using Vivaldi? It's awfully slow to respond to user interaction because it's all HTML. Say what you want about XUL, but for something aiming to be like HTML it never felt slow. If Vivaldi's non-existent responsiveness is what the new Firefox will be, then they should seriously consider rewriting the whole thing like a game (in OpenGL/Vulkan) instead. Oh, and XUL was optimized for memory efficiency, which will have to be redone for the HTML-only interface, but that could benefit the whole web.


Vivaldi is much faster than Firefox on both Linux and Windows, where I'm using it. Of course it's still alpha software, but the UI is blazing fast compared to FF.


It's substantially slower than Chromium and Firefox for me on Linux. Both Chromium and Firefox are very fast in comparison on my machines. I'm glad it's faster for you, I really am.


I do believe Otter is doing a better job of restoring Opera's past glory, feature-wise. Vivaldi, even with its HTML GUI, feels like another opera/chrome/chromium derivative.

Plug: http://otter-browser.org/


They still suffer from the "different" rendering engine. Some things, it seems, can't be achieved beyond that.


Qt5 ships with WebKit.


> " When you take an app that's been worked on for 15 or so years and then replace it's UI you're going to lose a TON of features. "

What makes you think they'll drop XUL as soon as the first release of the HTML-based UI? It's pretty obvious they'd want to support both until the HTML UI was close to feature complete. You're finding problems where there aren't any.


Have you ever met a team that wants to support two things instead of one? :) No one wants to support the old stuff, especially when no one's paying for it. If they make staying with the old too convenient, people won't convert to the crappy new, and their adoption graphs will suck! Can't have cannibalism!


> "Have you ever met a team that wants to support two things instead of one?"

It happens all the time. Do you really need me to point out examples?


I spent a good deal of time dealing with XUL, heck I even helped with the French community. Designing UI was a ton more robust with it than with HTML, but XULrunner was (and still is from what I saw) not the most pleasant beast to run.

I'm not advocating a rewrite for the sake of novelty, but they're part of a product life cycle.

You mentioned that it would mean dropping some features. I think that's a good thing; an opportunity to cut the fat and make sure only what is proven and useful makes its way back into Firefox.

You seem to think volunteers won't pick up the maintenance of Thunderbird; if so then why should Mozilla care about it?

Now, I don't think HTML, CSS and JS are the best technologies. They each suck in their own way. But they are winning, and Mozilla is merely embracing that.


Actually Thunderbird developers are planning to go full on html+js as well:

https://mail.mozilla.org/pipermail/tb-planning/2015-Septembe...


Don't forget servo. Even if it's not official yet, Servo is the way to go for Firefox. Implementing XUL in servo would be a huge mistake.


Project Oxidation is what you're looking for.

https://bugzilla.mozilla.org/show_bug.cgi?id=1135640


This reminded me of jwz's old rant, "The CADT Model": https://web.archive.org/web/20151126183335/https://www.jwz.o....


> But hey, donate to Mozilla! $5, $15, $25,

Ah yes, thank you for the reminder!

http://imgur.com/hLZp8SG

Now that I actually have a career and money I have no problem giving back. I'm glad I can donate to Ubuntu and Wikipedia nowadays. I am grateful for everything Firefox has given me for well over 10 years...

I really don't know what XUL is (intermediate language between HTML and FF UI?), but I guess I do feel sorry for people who have been using Thunderbird. I hope it's significant enough a property that people will want to continue, write it in Rust?


Not sure if serious, but...

XUL (or rather XULRunner) is a cross-platform UI toolkit, basically a Javascript+XML runtime built with C++. It's what Mozilla programs are built with today, abstracting out a lot of OS-specific details. It's unlikely to ever be rewritten in any language, and pretty much failed to get any traction outside of the Mozilla ecosystem.


Why rewrite it in another language? It isn't currently broken.

Things that are not broken do not need rewriting.


> Why rewrite it in another language? It isn't currently broken.

You need to support that claim. The people who actually work on Firefox have a list of reasons why they made this decision:

https://blog.mozilla.org/addons/2015/08/21/the-future-of-dev...

Why are we to believe that you are better qualified to make that call?


Maybe they decided there's too much technical debt and they can't make changes to the browser as fast as they wish they could in the current framework?

Not 'broken' in a traditional sense but still a valid reason to rewrite.


Yeah, but there is a trade-off involved. If you spend 3 years rewriting your toolkit, then the time you will eventually save on further changes has to offset those 3 years, and that's very hard for most projects.


This is not the first time (Mozilla aka Netscape) and it reminds me back when I first started my programming career and would read good ole Joel:

http://www.joelonsoftware.com/articles/fog0000000069.html

Joel was right and wrong.. Firefox was a huge success (or at least in my mind it was) but some might say the Netscape company was trashed during the process (I don't know if I agree).

Starting over isn't always a bad thing but I agree that is also highly overrated.


It was a failure, as he said, because the goal of the rewrite was for the company to do better; they failed at that. More work was done, including by Mozilla. That worked but was too bloated and all-in-one. Eventually, someone trimmed it up to make Firefox and added the customization features. That succeeded.

So, there was a failure, years of struggling changes, several new audiences, and another big change before it made it. Not seeing it as a counterexample as much as good luck for a project that seemed doomed to failure.


I don't disagree it was a bad idea to do the rewrite but rather the company (Netscape) was on a downward spiral regardless (it didn't help but I don't think it was the sole reason).


Definitely wasn't the sole reason, their server software generally sucked, e.g. a lot of people happily switched to Apache once it was perceived as being sufficiently trustworthy. I could see the latter happening in the 1995-6 period.

One thing I've read about the rewrite is that Netscape acqui-hired (back before that was a word) a failed company, and put its failed managers in charge, who I guess were good at what really counts in the short term (looking out for themselves). An obvious corollary to "don't do a rewrite" is "if you're going to do one, employ really good people to do it".


It's funny (and ironic to me given the topic) that you mention Netscape and Apache. One of my first jobs out of college was rewriting an old LiveWire application (yes... the original server-side JavaScript) as a JSP Tomcat application.


I forgot LiveWire existed. One of few that finally faded from memory. Of old & commercial ones, AOL server + a TCL web framework are still getting updated. Opera is getting redone. The product with the coolest name is still scraping by per .cfm pages I see. Some ancient tech still around but mostly going bye bye.

Only thing left is maybe to redo Mosaic in Ruby or Java to see if it's technically feasible to slow its rendering down any further.


I welcome a Servo based browser with an HTML UI.

A one-page email app running on Servo, inspired by GMail's interface, would be great.

At the moment I use Thunderbird, Outlook 2010 and GMail. Outlook has good calendar integration and GMail the better tagging, search function and UX.

Opera 12 had an inbuilt HTML+JS based email client. Moving to Chrome/Blink with Opera 15 they stopped shipping the inbuilt email client.


Was it really HTML+JS? Because it was fast, many years back (used it just as an RSS reader though). Also, they shipped it as a standalone product[1], but I don't think it ever got updated after that.

1. http://www.opera.com/computer/mail


From memory, it was C++, much like the rest of the Opera UI. It used Quick — Opera's in-house cross-platform UI toolkit, and it died (along with the rest of the old Opera) because Quick was heavily entwined with Presto.


It's all open source right? You are not the first person I see complaining about losing XUL, why is no one forking it instead of just making sarcastic comments? I thought that was the whole point of open source, that when the original maintainer loses interest the community takes over. You even say in another comment that the effort is not very big (I don't know, never seen the code) so why not?


You have to build the entire pipeline again. I'm not sure Mozilla is even interested in keeping the TB build infrastructure around.


"Why you should donate to Mozilla" - A big thread from yesterday. This is why you shouldn't.


But hey, donate to Mozilla! $5, $15, $25, anything helps.

They're starving over at Mozilla. They took in a mere $323 million in 2014. How can you expect them to properly fund multiple projects on such a pittance?

/sarcasm

https://news.ycombinator.com/item?id=10650325


Which would be a ton of cash for most applications, but this is a web browser and they are in direct competition with not just one, but several, of the biggest companies in the world. Here are their 2014 revenues: $86 billion (MS), $182 billion (Apple), $66 billion (Google). Budgets for the other browsers aren't clearly defined, but we do know it's a big focus for all the manufacturers, and they each have major incentives to keep users inside their own ecosystem.

Even just keeping up with all the standards, and contributing to them, is an enormous undertaking at this point in the web's evolution. Let alone UI design, devtools, porting to mobile OSs, etc.


I'm fairly sure the revenue attributable to Firefox dwarfs what any of the bigger competitors are spending on their browsers, so it's not really for lack of resources if Firefox falls behind. But it's the cash cow that subsidizes pretty much everything Mozilla does, some of which is great and some dubious. Everyone can make their own case for which projects are great and which are iffy.

Personally, I would think an e-mail client is the perfect complement to browser development, and something that could/should be another significant source of revenue instead of a burden.


> I'm fairly sure the revenue attributable to Firefox dwarfs what any of the bigger competitors are spending on their browsers

I suspect you would be wrong at least for the cases of Google and Microsoft. I don't have a good feel for how many people Apple has working on Safari and WebKit, but both the Chrome and IE teams are significantly bigger than the Firefox team from what I can tell, and probably more expensive unless you think Google and Microsoft pay developers less than Mozilla does. Let me ask you this: how many people do you think Mozilla, Microsoft, and Google each have working on their respective browsers? Ballpark figures, of course.

Also, estimates are that the money Google spends annually just _advertising_ Chrome in the last few years (TV ad campaigns with Justin Bieber and Lady Gaga, worldwide ads on public transit, etc) is comparable to the entire annual revenue attributable to Firefox.
