I'm just going to keep running Thunderbird until there are gaping security holes. People say Thunderbird is bad, but everything else is worse.
Repo for Thunderbird:
I too had heard development had stopped and misconstrued that to mean all development.
Now I'm wondering if there are any issues I might be able to help fix. I still use Thunderbird so if I can help keep it alive, I'd like to.
As parent pointed out, most of the alternatives are worse.
There are always issues to help out on. Here's a bit more info on contributing. I imagine it won't change much as the project is officially handed off to the community. It's largely led by the community anyway at this point.
To really make Nylas compelling over Gmail, and also to replace Thunderbird, here's what I think needs to happen:
1. Bundle Sync Engine and N1 into a single program.
2. Implement local search. Faster than IMAP search and doesn't rely on a network connection.
(As it stands, Nylas requires an internet connection to work. As long as that's the case it's no Thunderbird replacement at all, and will have a very hard time winning over Gmail users.)
The good news is that both Sync Engine and N1 are GPLv3, so it's totally possible. The bad news is complexity:
* Sync Engine is written in Python and uses MySQL for persistence
* N1 is built with CoffeeScript + React + Electron, and uses SQLite for persistence
So we need to simplify. We can't just bundle all that into an app, it would be a monstrosity. Here's how I think we could do it instead:
* Use Sync Engine as the basis for a Node library for IMAP sync
* Use the SQLite full-text feature (fts4) for search
That way you'd have a single Electron app with all persistence in SQLite.
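To make the local-search idea concrete, here's a minimal sketch of SQLite full-text search using Python's built-in sqlite3 module. The schema and sample messages are invented for illustration, and an fts4-enabled SQLite build is assumed (most standard builds include it):

```python
import sqlite3

# In-memory DB for illustration; a real client would persist to a file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE messages USING fts4(subject, sender, body)")
conn.executemany(
    "INSERT INTO messages (subject, sender, body) VALUES (?, ?, ?)",
    [
        ("Quarterly report", "alice@example.com", "Numbers attached."),
        ("Lunch tomorrow?", "bob@example.com", "Thinking tacos."),
        ("Re: Quarterly report", "carol@example.com", "Looks good to me."),
    ],
)

# MATCH goes through the full-text index: no network, fast at scale.
hits = [
    row[0]
    for row in conn.execute(
        "SELECT subject FROM messages WHERE messages MATCH 'quarterly'"
    )
]
print(hits)  # ['Quarterly report', 'Re: Quarterly report']
```

The same table can back both search and the app's regular message cache, which is what makes the single-SQLite-store design appealing.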
I'm prototyping this now. Thoughts? Let me know if you have ideas or want to help!
Search is definitely an area we can make dramatic improvements, and we're working on it.
Having a server communicate with IMAP (which is slow), cache the results, and export a more efficient/log-structured alternative probably makes syncing client computers much faster.
The sync engine handles all the compatibility quirks of dozens of IMAP and Exchange servers, and uses a variety of heuristics for real-time notifications. It also does the heavy lifting for the huge amount of data in most people's mailboxes. The N1 app can cache your entire archive in about 10% of the disk space it would take to download all the unprocessed headers and attachments.
Right now our search implementation just sends a proxy request to the IMAP search, so you'll get the same performance as Gmail web, Mac Mail, etc. It's not ideal. We're working on our own faster search system, but it's tough to build a huge distributed search cluster that handles tens of terabytes. (And is growing quickly!) If anyone here wants to work on that, we're also hiring. ;)
N1 is built on Electron, which under the hood is Chromium+NodeJS so that comes with a bit of weight. But the app itself is pretty manageable. You can check out the source here: https://github.com/nylas/n1
I believe the sync engine translates between IMAP and a custom JSON protocol they've created. I'm not sure what the performance difference is, but looking at the API docs, it looks a lot nicer to work with than IMAP: https://www.nylas.com/docs/
It also seems to more accurately represent modern semantics for working with email. For example, it allows using OAuth for authentication, and it has separate fields for folders and labels, depending on the backing email service. (And unlike IMAP, the labels field is actually useful.) You can also download a transaction log to make sync faster, which I don't think has an analog in IMAP.
Subjectively, I've found the performance to be rather snappy in most cases.
(You can also run a local client of the sync engine, if you don't want to trust their hosted copy: https://github.com/nylas/sync-engine)
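For a sense of how a client might consume a transaction log, here's a hedged sketch in Python. The entry shape below is invented for illustration and is not the actual Nylas delta wire format; the point is just that folding create/modify/delete events into a local cache is a straightforward fold:

```python
import json

# Illustrative entries only: the real transaction log differs in detail,
# but the client-side idea is the same fold over ordered events.
log = json.loads("""[
  {"cursor": "1", "event": "create", "object": "message", "id": "m1",
   "attributes": {"subject": "Hello", "unread": true}},
  {"cursor": "2", "event": "modify", "object": "message", "id": "m1",
   "attributes": {"subject": "Hello", "unread": false}},
  {"cursor": "3", "event": "create", "object": "message", "id": "m2",
   "attributes": {"subject": "Invoice", "unread": true}},
  {"cursor": "4", "event": "delete", "object": "message", "id": "m1",
   "attributes": null}
]""")

def apply_deltas(state, entries):
    """Fold log entries into a local cache; return the last cursor seen,
    which the client would persist and send on its next sync request."""
    cursor = None
    for e in entries:
        if e["event"] in ("create", "modify"):
            state[e["id"]] = e["attributes"]
        elif e["event"] == "delete":
            state.pop(e["id"], None)
        cursor = e["cursor"]
    return cursor

cache = {}
last_cursor = apply_deltas(cache, log)
print(cache)        # {'m2': {'subject': 'Invoice', 'unread': True}}
print(last_cursor)  # 4
```

Compare this with IMAP, where a client has to re-examine flags and UIDs per folder to discover what changed; a cursor into an ordered log makes incremental sync much simpler.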
- FAST. Mutt lets you process thousands of messages in short order. Mutt is directly responsible for helping me dig out of a 50k deep email hole brought on by years of GMail's approach (archive, never delete) in a few days of on-and-off cleanup.
- Plays well with others. Uses a bog standard Maildir format that anything sane can read (including Thunderbird, so migration should be easy), provides a very powerful hooking system which can do anything from verifying PGP signatures to checking your spelling before sending to displaying attachments.
- Sane defaults. Doesn't require a lot of in-depth customization to provide the basics, and the bells and whistles are hardly out of reach.
- You have a backup of your mail for free (by using OfflineIMAP)
- Not hard to learn (at least for the crowd here). If you can use Vim, you can be using Mutt at full speed in less than a week.
- Search capabilities that blow nearly every other project out of the water (regexes with some very cool niceties)
- Secure by design. Mails are rendered as plaintext, HTML is decoded by piping messages through a program like w3m. Tracking bugs can't do their job, spammers can't see that you've looked at their stuff, and you're immune to whatever image decoding bugs crop up. (Though you can still view images, it's an explicit process). Also, builtin GPG support.
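For anyone curious, here's roughly what that hooking and plaintext-rendering setup looks like in a muttrc. This is a sketch to adapt, not a canonical config; the mailcap line and spell-checker choice are examples:

```
# ~/.muttrc (sketch)

# Render HTML parts as plain text; requires a mailcap entry such as:
#   text/html; w3m -dump -T text/html %s; copiousoutput
auto_view text/html
alternative_order text/plain text/html

# Verify PGP signatures automatically
set crypt_verify_sig = yes

# Spell-check messages before sending (aspell here; pick your own)
set ispell = "aspell -e -c"
```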
I followed Steve Losh's guide on setting it up.
I also recommend an indexer like notmuch or mu to gain Gmail-like search and agenda capabilities.
Some quick googling makes it look like you indeed do that first thing, and then set up a hook to load a different configuration file when you enter the folder for a different mailbox.
Unless you actually need to do things like move entire e-mail threads between accounts, it works well enough.
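A sketch of that per-account setup (account names and paths below are invented):

```
# ~/.muttrc: switch settings when entering each account's Maildir
folder-hook ~/Mail/personal/ 'source ~/.mutt/personal.rc'
folder-hook ~/Mail/work/     'source ~/.mutt/work.rc'

# ~/.mutt/work.rc might then contain:
#   set from      = "me@work.example.com"
#   set smtp_url  = "smtp://me@smtp.work.example.com:587/"
#   set spoolfile = "+work/INBOX"
```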
While I like command line apps, and am a die hard vim user, there are just some applications that I really want to use a mouse for, and email is one of them. I've tried mutt a few times before and it's just never clicked with me. It's not that I can't learn how to drive it --- it's that I don't want to have to. Finding little-used commands is so much easier if there's an actual menu.
The little-used commands can be found by pressing "?" in any mode. You get a nice list including any custom macros you've set up (I don't use many, but a few are essential). It's far simpler than clicking all over menus and dropdowns and hovering inscrutable icons...
I've been a Fastmail user for almost a decade. I've never been happier with any email provider (or any provider, to be honest). You pay for the service, but it's worth the money and more. You are treated like a valued customer and the support, if needed, is rapid and professional. I cannot recommend Fastmail enough.
(Fellow FastMail user here. They're really great.)
EDIT: Well, I see her/his comment about Webmail albeit as a second choice. It seems the preference is for a desktop client, but I shouldn't assume.
If I recall correctly, Fastmail cannot be self-hosted, and I really don't feel like trusting some random service with my personal and business email. Or, for that matter, paying them money for something I currently get for free.
I used to run all my own stuff, but when my children came along, I didn't want to spend all my time chasing issues with servers.
Look the Fastmail guys up. If anyone is doing email correctly, it's Rob, Bron, and the guys at Fastmail. They actually write some of the code for the Cyrus IMAP server, so if you were in doubt as to how talented these guys are, don't fear. I would now NOT trust my email to anyone else. They are that good. Yes, I know, I'm some random guy on the Internet, but I'm an IT guy, which in itself says nothing, but I'm nothing if not extremely picky about my own IT. These guys can fix any issues you may have and probably give you some awesome suggestions. They are approachable, something you will never get from any other company. They are based in Melbourne, Australia, but use NYI here in the US. Worth a call if you value good service and top-notch know-how from guys who know email better than anyone else I'm aware of. They routinely post in the Email Discussions Web site forum.
Mhm, 2 of the "Five Eyes" countries. Yeah, you're really selling me on this one.
I'll admit I'm being a bit ridiculously paranoid, but it's cheap paranoia for me, Thunderbird works well as a client and I spend maybe 5 minutes per month doing mail server admin stuff. I'm really not willing to switch to any hosted solution for this.
Just looking for another mail client option if/when Thunderbird goes down hill, not a mail server.
I'm curious because we'll have to migrate to a better email provider in a month or two, and Gmail enterprise seemed to be the best fit for us (price, storage, good iOS app to leverage push notifications, etc).
On push, Google Apps gives you ActiveSync as a protocol, as an alternative to IMAP. But the integration with iOS Mail is weird and I could not use it.
But FastMail on the other hand, in a twisted turn of events supports push through Apple's own APN, getting the same treatment as iCloud: https://blog.fastmail.com/2015/07/17/push-email-now-availabl...
FastMail also has a native app that does push notifications, but it's just the packaged web interface with push notifications added. This is both good and bad. It's good because the mobile web interface is very decent, compared to Gmail, and this means you always have a decent UI on whatever OS you have. It's bad because it doesn't feel native, but then there's nothing more native than the iOS Mail app.
I've been a Fastmail user for almost 10 years. Never an issue that was not solved in very short order and most professionally. Fastmail offer a modern product with old world service and charm. A win-win.
* even after you've paid for the product, the only way to get support is to pay $10 for every question you want to ask, however basic or advanced it is - and you'll need support, because...
* there's erratic, unpredictable behaviour in many places; is it some subtle bug in the software? Or is it some weird about:config option that you have to set to get things working as expected? Who knows! If you're lucky, you'll find a solution in the Thunderbird forums that works. Often, though, that dialog option has been removed from Postbox or that about:config option no longer does anything, and...
* the documentation is no help. It's no worse than the average startup's help pages, except in this case, as mentioned above, there's no support - so you'd expect that to be compensated with exhaustive documentation, but no such luck. The docs cover just the most basic cases and leave all the complex interactions of emailing to your own guesswork. Also...
* the devs avoid users like the plague. The only-paid-support thing is (explicitly stated as) a consequence of this, but this even extends to bug reports and feature requests - which are free of cost to submit, but the two bug reports (and one or two feature reqs) I submitted were met with only gaping silence. The interaction on Facebook and Twitter too seems limited to version announcements and rare one sentence replies.
I really wanted to like Postbox, for it to be the solution, because it was quite a good product with advanced capabilities and a quite reasonable cost. But it didn't seem like a reliable option for the long run given these limitations.
A recipe for a terrible product. I tried Postbox when it came out and found it to be unbearably buggy (like they had released an alpha version). I couldn't even get Gmail to work, which should have been user story #1 for them: set up Gmail account.
Now I understand the culture that made it such a piece of garbage and will continue to avoid it like the plague.
EDIT: Nevermind, no Linux build, that's out for me unfortunately.
What is the catch here? Granted, I'm using the integrated graphics that came with the CPU, but I have 32 GB of RAM. What gives!??
If you have enough rigid requirements nothing will ever be a good fit.
If you do try it, please let us know of your feedback in the issue tracker! :)
(That said, you can also run ownCloud with the Mail app locally. We're working on improved caching.)
It's a robust IMAP client with cross-account message rules, smart filters, external-image blocking, OpenPGP, workarounds for Google's wonky IMAP implementation, great support, and a lot more.
I think that Mutt might be a better option in that I think it is still maintained.
In a few years the all-new HTML Firefox will come out. My bet is that it will suck. It will lack a TON of features that the existing Firefox has, but hey, it's all HTML! And you won't be able to stick with the old one, because within a week or two some critical security flaw will be discovered, and eventually (like six weeks later) they'll stop patching the old Firefox.
Initially the HTML Firefox will suck. When you take an app that's been worked on for 15 or so years and then replace its UI, you're going to lose a TON of features. They'll slowly reintroduce some of the most popular features (hamburger menu will be priority #1!) but there will be a TON that they will not reintroduce. Why? Because when they were first introduced a decade ago it was a cool idea someone had, and no one knew how popular it would be, so heck, why not implement it. But now they know that only 10 million or even 1 million people use that feature, and they're only interested in 100 million user features! If Google Chrome doesn't have it, it must not be important!
As much as people complain about XUL not looking native, wait for HTML Firefox, it will take them forever to get where XUL was years ago.
They can't just kill XUL for Firefox though, they have to burn down the XUL ecosystem first so they're just releasing a new Firefox, nothing to see here.
1. They try to kill xulrunner as a project separate from Firefox. They try to move everyone to firefox -app.
2. They stop releasing binaries for xulrunner.
3. They deprecate XUL extensions.
4. They distance themselves from Thunderbird. They say it's better for Thunderbird. Yeah right! Thunderbird is built on XUL, it's not going to be rewritten in HTML any time soon, definitely not by volunteers. It's not going to be able to maintain XUL either, and when Mozilla stops supporting XUL for Firefox a few years after deprecating XUL extensions then Thunderbird will be screwed, but hey, it's not our project! We abandoned it years ago!
So when the crappy HTML Firefox shows up, with way less features than the Firefox of today, remember that this (Thunderbird) was one of the things given up to have it.
But hey, donate to Mozilla! $5, $15, $25, anything helps. Because we already make hundreds of millions of dollars and we do whatever is shiny and new, screw the "community" of existing stuff. We're fighting for an open web! (where you can use Gmail for email)
But as HTML and CSS slowly gained more features than XUL, XUL development slowed down, to the point where writing the Firefox UI in XUL became a pain because of poor tooling and sneaky bugs. More and more pieces of Firefox got written in HTML inside XUL, and factoring out code shared between the XUL pieces and the HTML ones was nightmarish.
Dropping XUL means putting those bugs and issues behind us, and focusing development on a single DOM language. You would probably be surprised by how much of the UI already is in HTML; tab groups is almost all HTML, and the DevTools' editor and DOM inspector are in HTML as well.
As for donating to Mozilla, the distinction between Mozilla Corp and Mozilla Foundation is understandably complex for outsiders, but basically only Mozilla Corp makes money from the partnerships.
Being nonstandard is completely irrelevant here. Firefox's UI doesn't need to be rendered by Internet Explorer or Google Chrome.
I'm glad to see it disappear; it's one of those things the world doesn't need. A failed experiment. And an HTML UI for Firefox makes sense in the long run.
Fun fact of the day: Did you know XUL uses DTDs to store translations? That's right, if you have a string you want to translate, you just have to create a new XML entity in a localized DTD file. Isn't that just a wonderful idea.
I worked on TomTom Home, which was implemented in xulrunner, and I developed some internationalization/localization tools that had to deal with XUL DTDs as well as several other different and incompatible file formats for storing translations. I could never for the life of me figure out why they decided to use DTDs with external entities for translations.
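For those who never worked with it, XUL localization looked roughly like this: a strings file that is literally a DTD full of entity declarations, pulled into the XUL markup via an external-entity reference (the names below are invented):

```xml
<!-- locale/en-US/myapp.dtd -->
<!ENTITY fileMenu.label "File">
<!ENTITY quitCmd.label  "Quit">

<!-- myapp.xul -->
<!DOCTYPE window SYSTEM "chrome://myapp/locale/myapp.dtd">
<window xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul">
  <menu label="&fileMenu.label;">
    <menuitem label="&quitCmd.label;"/>
  </menu>
</window>
```

Because substitution happens in the XML parser, a single missing entity in the DTD would break parsing of the whole document, which is one reason localization tooling around it was so painful.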
It was meant as a pro. Since it was their own language, they could do anything with it. They could make a better HTML.
But HTML caught up. Currently, I believe HTML is better than XUL, and making XUL great again is both a silly reuse of a political slogan and a waste of effort.
I for one would rather see efforts made to allow CSS styling of all input elements in HTML.
That you say that is a testament to how well the meticulous CSS styling of the XUL elements—which applies equally well to HTML (try it in a browser chrome shell!)—worked.
It's relevant in that much of the work has to be duplicated (documentation but also layout implementation & the like) and none of the new and improved web development tools can be used for XUL.
No, it's not irrelevant. Standards are also about documentation (it's easier to document standardized things) and familiarity (people are more familiar with standardized things).
Given that there are more developers familiar with HTML development, it may lower the barrier to developing plug-ins, or providing patches.
Also, there's a generational shift underway. You and I might find it crazy that people openly choose to use IDEs built on HTML/CSS/JS, but that's what a lot of young folks are doing (Atom, VSCode, etc.). That's their world; that's what they like. An entire generation now exists who learnt to code from web scripting rather than C or BASIC. They have taken over. It's just how it is.
(this said, I agree that donating to Mozilla feels a bit silly, looking at how much money they make from commercial agreements. It's like donating to Ubuntu or RedHat.)
No one has listed the ten awesome features that we're going to get from HTML Firefox (cause there ain't many) or the 1,000 features (tons of little details) that will be lost. If users listed their 10 biggest problems with Firefox I doubt any of them would be solved by moving to HTML.
Imagine if instead of writing VSCode from scratch and releasing it alongside Visual Studio Microsoft had rewritten the Visual Studio UI in HTML, abandoned all the nonessential features, and abandoned the old native Visual Studio.
One might say that Mozilla will wait to release the new Firefox till it has all the old features of the old Firefox, but that's not been my experience with how teams work. They'll get frustrated with the rewrite and want to get it out the door. "We can add those features later" they will say, and then they'll never get added.
Ah, but maintaining XUL means working on old code (which is boring), but moving Firefox to HTML means working on new shiny code (which is exciting).
Instead, they plan to render the UI natively, with only "some" parts in HTML.
So, instead of XUL + HTML, we’re going to get GTK + WinForms + Cocoa + HTML. Great, eh?
And we lose the ability to style it with addons – your themes can only change the background image of the header bar, that’s it.
And the remaining addons can’t modify the UI (tree style tabs, bottom tabs, etc) anymore either, instead you can only modify page content.
I’m seriously pissed off now, because Firefox was the last browser where I could actually customize it how I liked it.
I hope the person who made this decision has to use software without any config options and with horrible defaults, like GNOME, for the rest of their life. May their car always have either 60°C+ heat or -20°C AC, and may their phone's screen always be either too dark or too bright.
Official statement from the Mozilla post in the discussion regarding removal of support for "heavyweight" themes. Emphasis mine.
That’s a pretty clear statement that it won’t be 100% HTML.
Also, the fact that "arbitrary styling and scripting" won’t be possible is another issue.
Tell me how I am supposed to write an addon that adds tab-previews as thumbnails when you hover over a tab like Vivaldi is doing it: http://i.imgur.com/vqysJs1.png ?
How am I supposed to write an addon that colors the navbar and the current tab in the theme color given by the HTML, or, if not existing, the favicon?
With current addons I can do that, with the new addon system, I’m seriously fucked.
The Palemoon developers understand the folly of ditching the flexibility of the XUL interface. They won't be removing it.
So, a fork is a rough solution and will have maintenance issues for an app this size.
But these rants are just silly. XUL is a technology that needlessly duplicates what HTML/CSS do these days. And if you ever want a smooth transition to Servo, which solves real, deep problems, having an HTML UI is going to be dramatically important.
As long as Moore's law provides enough lift under our wings, RAM usage is one of the less important aspects of an application. However, I fear the trend of building application UIs in HTML/CSS, because invariably it will lead to wildly inconsistent look, feel, and behavior.
Sigh. I feel old.
Realistically, getting to feature parity after rewriting a core part of any application is going to be nearly impossible. You end up with different features, hopefully better ones, but not exactly the ones you had before you started. You can't step twice in the same river.
I'm 25 and I feel too old for this industry already.
It's annoying. I miss Web 1.0. Add just a bit of dynamic functionality plus broadband and it would be fine for 80-90% of cases. And FAST!
Instead, we get Web 2.0, 3.0 (4?) which makes pages on 40-50Mbps load like my old 28Kbps modem on AOL. Seriously...?
It should be easy to find performance numbers showing how Firefox 42 renders old, pre-CSS pages slower than, say, Netscape 4, then.
Netscape 4 didn't have a JIT, didn't use hardware accelerated layers, trapped into kernel mode for GDI calls, didn't use accelerated SIMD for painting, and didn't have HTTP 2. It barely had any optimizations for dynamic restyling, so tons of stuff would get reflowed when it didn't have to. This is just the tip of the iceberg.
Browsers have gotten more complex, but the complexity is often in the service of making things faster.
And I never mentioned Netscape: it was called Netscrape then and hackers despised it. I used Opera and IE mainly.
In the late '90s, the time frame this thread is about, it was well known that the layout engines at the time were not dynamic: they could not in general reflow only parts of the page. Everything else I mentioned in the post that triggered this subthread is obvious simply based on browser engine and OS history.
> (Netscape 4/Opera/IE) didn't have a JIT, didn't use hardware accelerated layers, trapped into kernel mode for GDI calls, didn't use accelerated SIMD for painting, and didn't have HTTP 2. It barely had any optimizations for dynamic restyling, so tons of stuff would get reflowed when it didn't have to.
The features of a program = functionality.
So we're getting our wires crossed due to word choice.
In any case, I'll admit this has dragged on long enough. Truce?
It was just faster, better looking, and skinnable. Plus, malware kept targeting IE, so there was that too.
Disabling JS has given me the fastest turnaround on page-load times. It's usually not the browser, but loading 15 different unoptimized JS libraries that causes the problem.
I must say: Atom Editor is great. Using HTML/CSS/JS to build desktop/mobile apps really makes sense to me, especially given that:
- You only have to support one rendering engine.
- You have access to the latest Web Components/ES6/CSS3 features.
- You can rely on native Node.js modules when needed.
It's good for portability, HTML/CSS became better at UI, JS becomes a better language, current IDEs are great, live debugging tools are great.
- There are far fewer native components, leaving accessibility up to the app developer, which in practice makes it nonexistent.
- Platform integration is impossible, which means there is no way for the framework, for example, to create widgets differently on OS X, Windows or Linux (these platforms have many different conventions)
- Theming globally becomes impossible. If you want to write a dark theme for your desktop, you go from writing a theme once for GTK and once for Qt to once for each and every app you run. Uuuurgh.
But only supporting one rendering engine, yay! Much better than the 6 different engines we have to support in Qt (huh?).
And having access to all the latest JS additions! ... that were copied from other languages you could develop desktop apps in, because JS is a terrible hack.
And relying on native Node.js modules, yay! As opposed to native modules for literally every other better language out there.
I'm not sure what you're actually comparing this workflow to. Maybe one day writing HTML apps will be great, but today is not that day. Today, writing HTML apps is only beginning to be an idea that doesn't completely suck. But for the user, it does massively suck. Massive apps that ship their own copy of webkit/blink/whathaveyou, with security flaws that won't get patched, disgusting performance on low-end hardware, atrocious battery usage and decades of UX knowledge thrown out of the window just because the app developer doesn't have the knowledge to see it.
I don't look forward to this. And I'm younger than you.
So web apps can't have reusable components now?
> "And having access to all the latest JS additions! ... that were copied from other languages you could develop desktop apps in, because JS is a terrible hack."
WebAssembly will take over for web apps eventually.
> "decades of UX knowledge thrown out of the window"
UX knowledge isn't toolkit dependent, UX knowledge is just as applicable on the web. You can make a dog of an app with any toolkit, native toolkits offer no guarantees for good UX, you can only hope that designers choose to follow best practices.
I'm not saying they can't, I'm saying they don't. You go ahead and try to fix that, create widget#772981 that still won't support typed-selection, or will break with large text or what not... I've seen too many of those, most of them bad, and none of them standard. So far, React is the only thing that even comes close to a sane model for a contender to UI development on the desktop, and it still mostly fails the accessibility checkbox.
> WebAssembly will take over for web apps eventually.
More eventualism. Do you have evidence for that? Do you even have evidence that it'll be better than what we have now in other languages if it does take over?
> UX knowledge isn't toolkit dependent
A lot of it is. You'd be surprised just how much UX is crammed into Qt Widgets for example. Years of experience making them more accessible, more usable, more consistent with the platform they're running on, etc.
They already do. Electron, Web Components, etc...
> "Do you even have evidence that it'll be better than what we have now in other languages if it does take over?"
Compare the performance of vanilla JS vs. asm.js. It is clear from what developers have stated that WebAssembly performance will exceed the performance of asm.js, the threading improvements alone should offer noticeable benefits.
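For context, asm.js is a statically-typed subset of plain JavaScript: the "use asm" pragma plus `|0` and `+x` coercions give a validating engine enough type information to compile ahead of time, while the code still runs as ordinary JS everywhere else. A toy sketch of the pattern (illustrative only, not guaranteed spec-valid asm.js):

```javascript
function AsmSum() {
  "use asm";
  function sum(n) {
    n = n | 0;  // coerce the argument to int32
    var i = 0;
    var acc = 0;
    // All intermediate values stay int32 via the |0 annotations.
    for (i = 0; (i | 0) < (n | 0); i = (i + 1) | 0) {
      acc = (acc + i) | 0;
    }
    return acc | 0;
  }
  return { sum: sum };
}

var mod = AsmSum();
console.log(mod.sum(5)); // 0+1+2+3+4 = 10
```

WebAssembly replaces these annotations with a real binary format and typed instruction set, which is where the further performance headroom comes from.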
> "Years of experience making them more accessible, more usable, more consistent with the platform they're running on, etc."
So what are we looking at to replicate that? A theme per platform? Some accessibility work? What else?
It should be noted that the web isn't starting from zero with UX either, we've already had 20+ years of refinements to the web user experience.
And nobody can agree on which to use. It's not standard if it's just "some set of components some people reuse". A far cry from standardization.
> Compare the performance of vanilla JS vs. asm.js.
That's not what I'm comparing. I'm comparing the performance of native toolkits vs web toolkits on asm.js. It pales in comparison, and the battery usage is through the roof. ymmv?
> So what are we looking at to replicate that?
1. Standardization of components (developer does not have to build their own scrolling system, context menu, etc)
2. Themability of components at the application level (developer can style components not to clash with the style of the application)
3. Themability of components at the platform level (user can style the application not to clash with the style of their own desktop)
4. Performance needs to shoot way, way up. Apps can't rely on a performant GPU, it's unreasonable to ask that of every device at this point in time. Some day maybe every device will come with their own high performance GPU, but eventualism cannot excuse bad coding practices and unnecessary layering.
The rest should follow. But I still don't see us getting any of those things, any time soon. These are not easy problems to solve.
Atom uses Electron, VS Code uses Electron, Light Table uses Electron (starting with v0.8).
As for the web side, web apps are a young field, what else would you expect?
There are certainly popular UI elements, Bootstrap for example.
If you want to make an web app that looks native, using React Native could be a good solution.
This is lost in a sea of replies now, but I'm sure pissed off that Mozilla is completely losing its root mantra of fighting for the free web. Persona and Thunderbird: two critical components of a "free web", free global authentication and a free email client. I'm sure at the next MozFest the same suits as every year will talk about how proud they are of "keeping the web open". What a crock of crap. Firefox isn't even that good of a browser anymore.
This is one of the reasons why iPads have stalled - pro developers can't make money, for well documented reasons. The web equivalent will be the starving of open communication, thought and creativity, leading to homogeneous noise as a poor substitute for ground-breaking content.
Mostly I think it is an effort-vs-reward thing. Thunderbird doesn't have the user base, and doesn't have a revenue stream the way Firefox sells the search box. I wonder if anyone over there has thought about cleaning it up and selling it as a white-label email client.
So, developer convenience trumps user experience? Who cares about your battery run time, how hot your PC runs, the bandwidth overhead, the massive attack surface from all the useless components shipped, how badly the webapp integrates into your OS, as long as we can ship faster and faster?
As a developer, I feel that it's easier for me to create good user experience (nice UI, easy non-blocking I/Os by default, etc).
I'm not saying it's flawless; I'm saying it's now becoming a very good alternative.
Is all that bullshit really worth… whatever we're saving? (Our company is mainly saving in developer salaries, because we can force kids fresh out of school to work for minimum wage, instead of hiring experienced developers… we'll see how long this keeps working.)
For some less positive feedback on Atom, which reinforces some of creshal's points, see the Atom section of this text editor rundown:
I've also frequently seen complaints to the effect that Atom is much too slow, even on machines only a few years old. Even my 486 laptop back in 1997 could run a text editor with syntax highlighting (specifically, JPad for programming in Java). What are editors doing these days that needs so much CPU power?
This review (6 months ago) is based on an early version of Atom Editor (0.204.0); the latest stable (1.2.4) is way more mature and provides very nice packages.
node-webkit apps are cheaper to develop. Far cheaper.
Everyone can write a webapp, a native application requires professional developers. The payroll looks completely different.
And then the devs who have only worked with node-webkit don’t know how much better they could have it. I know devs who refuse to use map, reduce and filter, because "it’s black magic and we always used for".
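For what it's worth, the "black magic" amounts to very little. Here's a quick sketch in Python (chosen just for illustration) showing the same computation written with a plain for loop and with map/filter/reduce:

```python
from functools import reduce

nums = [1, 2, 3, 4, 5]

# The "we always used for" version: sum the squares of the even numbers.
total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n

# The map/filter/reduce version: same result, no mutable accumulator.
evens = filter(lambda n: n % 2 == 0, nums)
squares = map(lambda n: n * n, evens)
total_fn = reduce(lambda acc, n: acc + n, squares, 0)

assert total == total_fn == 20
```

Nothing magical, just a different way of expressing the same pipeline; once you see the two side by side the refusal looks a lot more like habit than caution.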
edit: I'll take my downvotes with pleasure. The fact that I am able to use Atom or Nylas N1 or Nuclide on linux with 0 problems alone is enough to welcome proliferation of web tech on desktop.
Look at Code Academy, for example. They add new programming languages and technologies every now and then, but basically it's always the same. Just like their audience. They won't add C to that list, because that wouldn't work for this average not-nerdy-enough-for-real-programming audience.
Real programmers (like me) use C.
Seriously, no. Just no.
And if I wanted to prevent people from writing "the most disastrous code in the whole universe", teaching C instead of JS would be much, much lower in the list than teaching how to split code into modules/libraries, write testable code, etc.
I'm all for modern approaches, but I can't help feeling many developers have lost touch with what writing clean code means. It's not making modules, libraries, or even tests; it's making sure that what you're doing is as simple as it can be, and efficient at it. Sadly, most modern web stacks fail at that, hidden in a mumbo jumbo of modules and dependencies no one really needed or asked for, often created by people who never questioned the purpose of what they were doing, or whether the whole internet needed it. (Just because you're at Google and have found a neat way to deal with your huge JS stack doesn't mean the whole web needed it too, or that you needed to spend a whole lot of effort making people adopt it.)
As much as you make fun of C, learning and writing C will teach you to keep your programs simple and efficient, because the language requires it. And that's coming from someone who started programming with Perl, then PHP, and only learned C later on.
"Make things simple, but not simpler" should be the cardinal rule of programming, not "modularize and test everything". Those are situational; the former applies all the time.
I didn't make fun of C. I made fun of a comment posted by a C programmer, which is very different. I have absolutely nothing against C.
> C will teach you to keep you programs simple and efficient, because the language requires it
From what I've read, the OpenSSL codebase is definitely not simple, and I'm not sure it's efficient either; it would depend on how you define efficiency. So your claim seems factually incorrect.
Other than that, I agree with your post. Simplicity is awesome. No point in using Angular to build a landing page if static HTML can do the job just as well. (Edit: let me take that back. There can be a point: the pleasure of experimenting and learning something new.)
C teaches you how stuff works. It's important.
For instance, you might create a model class, and then you know that the model should have a name, shouldn't be able to be saved without a name, and that the name should be limited to a certain number of characters.
Then you can write a series of tests that makes sure that, regardless of how the model implements that name, all your assumptions about what that name should look and act like don't change without throwing a red-flag up to whoever is changing that model.
Making code into reusable modules is generally a good thing, but it's even better if you build them in a way that helps you test those chunks independently as well; that is testable code.
Then of course you need some way to automatically run these tests, but that's usually provided by IDEs or standard libraries these days.
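The model-plus-tests idea above can be sketched in a few lines of Python; all the names here (Model, MAX_NAME_LENGTH, etc.) are illustrative, not from any particular framework:

```python
import unittest


class Model:
    """A model whose name is required and length-limited."""
    MAX_NAME_LENGTH = 50

    def __init__(self, name=None):
        self.name = name

    def save(self):
        # Refuse to save unless the name assumptions hold.
        if not self.name:
            raise ValueError("name is required")
        if len(self.name) > self.MAX_NAME_LENGTH:
            raise ValueError("name is too long")
        return True  # stand-in for real persistence


class TestModelName(unittest.TestCase):
    """Pins down the name assumptions, regardless of how the
    model stores the name internally."""

    def test_save_requires_name(self):
        with self.assertRaises(ValueError):
            Model().save()

    def test_save_rejects_overlong_name(self):
        with self.assertRaises(ValueError):
            Model("x" * 51).save()

    def test_save_accepts_valid_name(self):
        self.assertTrue(Model("inbox").save())
```

If someone later changes how the model handles names, these tests are the red flag that goes up when the contract breaks.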
Edit: most curious downvote ever...
Sometimes it feels like everything was already invented in the 60's.
Reading up on the capabilities of early mainframes is eye-opening if you (like me) grew up on Pentiums.
Close enough... I grew up on XTs, i286, i386, so on :) All of them way weaker than my phone (Not sure if that was what you were referring to).
However, in terms of architecture, algorithms, programming language features... it feels like we haven't advanced much. Actually, the opposite: we are now encouraged not to be too clever in our programming, because processing power and memory are close to being commodities and clean code is more important (which is fine, but less fun).
I gave examples of the features the old systems had in the essay below, with the first link mentioning the specific systems for further inspection:
Note: B5500, System/38, and Ten15/FLEX are all especially worth considering. Two were basically HLL machines with type-safety and interface safety enforced at hardware level. System/38 was object-based with HW- and SW-level protections plus portable microcode layer.
I'd say Channel I/O counts as one that kicks modern systems' asses. Servers have been copying it bit by bit over the past ten years, maybe even exceeding it, but server OSs are inherently inferior in usage given that mainframe OSs are designed for I/O offloading at the core. The near interrupt-less architecture with acceleration engines makes many apps scream with performance. And it would only cost $10 per CPU on desktops, but would require Windows & Linux rewrites. (sighs)
However, the change to the AS/400 and POWER cost it one of its greatest features: hardware-enforced integrity at the object level. That's the feature that would still be giving hackers hell if it were widely deployed. The Intel i432 APX and i960MX had a similar property. Interestingly enough, IBM has actually prototyped secure CPUs and even sold them to select customers. It would be great if they integrated one with IBM i at the microcode, compiler, and OS levels. That, plus an optional interface for new customers without the legacy crap, would be a huge differentiator that might give it new life.
AMD dropped rings in their 64 bit architecture which Intel was forced to adopt, so they're becoming a historical curiosity.
Not quite. Hypervisors are operating in Ring -1, SMM is equivalent to another ring above that, and I can't find anything about AMD64 dropping Ring 1/2? Ring 0/3 at least are still in use.
I tried looking at the latest 4.2 kernel tree, but the assembler/C code that sets up and handles syscalls has been rather refactored, so I'm not entirely certain it's ring 0 and ring 3 for both 32-bit and 64-bit, but I think so? (From a quick glance, no sections stand out as calling out ring 3 explicitly when talking about returning to user-land; granted, I didn't do any searching over the code.)
A couple of minutes with Google only found hints that confirm my memory WRT AMD64 and rings, and/or Intel not copying a segmentation feature added to later versions of AMD's chips.
There are already cross-platform toolkits that can deliver everything a browser engine can, faster and safely.
Face it, "web technologies" are not winning because of any massive technological advancement, just like C++ wasn't this huge advancement over C. They just managed to achieve enough critical mass to make everything else look less popular. In the '90s, OOP did that through academia and commercial push (in what was a much smaller tech sector); html/css/js did it through the accidental monopoly that is the web browser. The end result is basically the same.
Nothing with the reach of web technologies. The closest I can think of is Qt, but there are issues with deploying Qt apps on some platforms.
• Native applications that give users excellent UX and performance
• Web apps that are equally quirky and slow on every system
To give an analogy, it's like programming languages. It's possible to write very fast code with assembly languages, yet their portability to other architectures is practically non-existent. Part of the reason higher level languages like C/C++ are used is because they are much more portable.
Note: And some that are laughably obvious and available today lol.
So, yes, what people use today is an accident of history. That includes COBOL, C, C++, OOP languages, HTML/CSS/JS, HTTP-centric everything, and especially whatever crap is being built on them next.
If you're curious, here's the history I put together on C language and UNIX in numbered list form. You'll see how IT evolution often works in practice to give us lowest common denominator. And afterwards people swear it was product of good design and great achievement. (rolls eyes)
Office Suites had rich documents with smart(ish) widgets, but no security - a macro in Excel had access to all your spreadsheet data. Web apps don't really have any good encapsulation either -- so we'll likely repeat the macro-virus era with web virus era (I'm not sure if we already are or not, there's certainly been a few self-replicating ones, that eg spread via facebook updates etc. Not sure if they generally live in the phone-apps or various web-apps. Probably both).
We already had the future of web applications within reach years ago - but apparently no-one cared: http://lively-kernel.org/
Depends what you're looking at as an equivalent. Browsers didn't exist, but markup languages have been around since the '70s (first mentioned in the late '60s).
Maybe an equivalent to express GUIs as resources? Yeah, we called it RAD back in the day, and it was quite popular in some circles. Does anybody remember HyperCard? Or even the first versions of Visual Basic?
Whoa! They have syntax highlighting! And intellisense! And a package manager!
Kind of like Notepad++ or half a dozen other editors?
I'm not saying "we should NEVER use HTML for native dev", but it's not there yet, and I don't understand the hype around these two tools.
And it nearly killed them.
They could still be doing it wrong.
When someone shows me/I find a better way to meet a requirement I will adopt it (static site generators, Go routines, ...). I won't doggedly stick to the first thing I learned; I don't expect the world to adapt to suit me.
Why is that crazy? Makes perfect sense to me, especially as the performance of the programming language element improves. Ultimately these IDEs are sure to make use of WebAssembly, which should take away the remaining performance concerns.
They said the same about Java in the 90s. Java accelerated CPUs never really took off.
> "Formatted copypasting"
Do you need this for a coder's editor?
> "usability features"
> "network features"
Which network features do you need for a coder's editor? In the case of Atom/VS Code, I can't think of a single one that a browser engine doesn't already provide. You're not going to need things like AD integration, etc...
> "The OS will become little more than a very expensive pixel pipe."
If that's what people want, then so be it. I see no problem with simplifying the OS, most of them are already too bloated.
> " But that's what people like, because C++ is hard, native widgets are hard to customise, and everyone loves designing interfaces, so that's where we're going."
The main advantage is the cross platform compatibility. If certain OS vendors didn't make it hard to build apps that utilised a common base then there'd be much less drive to produce web apps. It has very little to do with the complexity of C++.
As I've said a number of times now, the final target with web apps won't be JS, it'll be WebAssembly.
Also, 'expensively' is debatable, I'm sure it'd be possible to have reusable accessibility components, wouldn't necessarily have to reinvent the wheel for each new web app.
Not really, you still need a browser that implements all this stuff. The only difference is open tech vs. proprietary.
My point is Web techs aren't a silver bullet.
From the same source, apparently the Mozilla Foundation does a lot of important work that the Mozilla Corporation can't/won't do but most of the development is done by the Mozilla Corporation currently (possibly due to not enough donations to the Foundation?).
I haven't really verified that, but thought it was worth pointing out.
Hmm, that's how Firefox (then Phoenix) started in the first place in comparison to the bloated Mozilla Application Suite.
Please recall that Phoenix/Firebird/Firefox was born as a lightweight, fast alternative to the bloat of Netscape Communicator/Mozilla Suite. Dropping features from an old bloated application is what launched Firefox to fame in the first place.
This isn't progress.
> As much as people complain about XUL not looking native, wait for HTML Firefox, it will take them forever to get where XUL was years ago.
And isn't Mozilla/Netscape literally the poster child for how to destroy your product with a rewrite? They only regained their market share because Microsoft let IE stagnate to a ridiculous degree.
> But now they know that only 10 million or even 1 million people use that feature, and they're only interested in 100 million user features! If Google Chrome doesn't have it, it must not be important!
I've pretty much only stuck with Firefox because of its extensions and the quirky little features it has. The more they focus on aping Chrome, the more they decrease the friction for switching away to it.
My hope is that Mozilla will take the direction of using Servo to build something Vivaldi-like (instead of everybody running with WebKit/Blink) and start to restore the old APIs and frameworks from old XUL-Firefox that all the best old addons relied on (things like NoScript, Session Manager, Vimperator, uBlock, Tab Mix Plus, etc.), instead of just sticking with a black-box rendering engine and settling for Chrome addon API parity.
What makes you think they'll drop XUL as soon as the first release of the HTML-based UI? It's pretty obvious they'd want to support both until the HTML UI was close to feature complete. You're finding problems where there aren't any.
It happens all the time. Do you really need me to point out examples?
I'm not advocating a rewrite for the sake of novelty, but they're part of a product life cycle.
You mentioned that it would mean dropping some features. I think that's a good thing; an opportunity to cut the fat and make sure only what is proven and useful makes its way back into Firefox.
You seem to think volunteers won't pick up the maintenance of Thunderbird; if so then why should Mozilla care about it?
Now, I don't think HTML, CSS and JS are the best technologies. They each suck in their own way. But they are winning, and Mozilla is merely embracing that.
Ah yes, thank you for the reminder!
Now that I actually have a career and money I have no problem giving back. I'm glad I can donate to Ubuntu and Wikipedia nowadays. I am grateful for everything Firefox has given me for well over 10 years...
I really don't know what XUL is (intermediate language between HTML and FF UI?), but I guess I do feel sorry for people who have been using Thunderbird. I hope it's significant enough a property that people will want to continue, write it in Rust?
Things that are not broken do not need rewriting.
You need to support that claim. The people who actually work on Firefox have a list of reasons why they made this decision:
Why are we to believe that you are better qualified to make that call?
Not 'broken' in a traditional sense but still a valid reason to rewrite.
Joel was right and wrong.. Firefox was a huge success (or at least in my mind it was) but some might say the Netscape company was trashed during the process (I don't know if I agree).
Starting over isn't always a bad thing but I agree that is also highly overrated.
So, there was a failure, years of struggling changes, several new audiences, and another big change before it made it. Not seeing it as a counterexample as much as good luck for a project that seemed doomed to failure.
One thing I've read about the rewrite is that Netscape acqui-hired (back before that was a word) a failed company, and put its failed managers in charge, who I guess were good at what really counts in the short term (looking out for themselves). An obvious corollary to "don't do a rewrite" is "if you're going to do one, employ really good people to do it".
Only thing left is maybe to redo Mosaic in Ruby or Java to see if it's technically feasible to slow its rendering down any further.
A one-page email app running on Servo, inspired by GMail's interface, would be great.
At the moment I use Thunderbird, Outlook 2010 and GMail. Outlook has good calendar integration, and GMail has the better tagging, search, and UX.
Opera 12 had an inbuilt HTML+JS based email client. Moving to Chrome/Blink with Opera 15 they stopped shipping the inbuilt email client.
They're starving over at Mozilla. They took in a mere $323 million in 2014. How can you expect them to properly fund multiple projects on such a pittance?
Even just keeping up with all the standards, and contributing to them, is an enormous undertaking at this point in the web's evolution. Let alone UI design, devtools, porting to mobile OSs, etc.
Personally, I would think an e-mail client is the perfect complement to browser development, and something that could/should be another significant source of revenue instead of a burden.
I suspect you would be wrong at least for the cases of Google and Microsoft. I don't have a good feel for how many people Apple has working on Safari and WebKit, but both the Chrome and IE teams are significantly bigger than the Firefox team from what I can tell, and probably more expensive unless you think Google and Microsoft pay developers less than Mozilla does. Let me ask you this: how many people do you think Mozilla, Microsoft, and Google each have working on their respective browsers? Ballpark figures, of course.
Also, estimates are that the money Google spends annually just _advertising_ Chrome in the last few years (TV ad campaigns with Justin Bieber and Lady Gaga, worldwide ads on public transit, etc) is comparable to the entire annual revenue attributable to Firefox.