It's a myth that open source tools are more secure than closed source tools.
Use whatever tool is put out by the team with the most expertise and the least reason to do something untrustworthy (or alternatively: the most to lose). A backdoor can persist in open source tools just as easily as it can persist in proprietary ones, and teams which make open source tools are frequently smaller and underfunded for secure feature engineering when compared to teams which develop closed source tools.
> It's a myth that open source tools are more secure than closed source tools.
It's a myth that they are always more secure, but my experience has been that they are generally more secure. I'd love it if we had some statistics behind this.
I would argue that FOSS tools have an inherent interest in being more secure, while closed-source tools have an inherent interest in keeping their flaws hidden.
The "open source" development model is based on the claim that it produces superior software. Of course that claim will be made. But I agree that it is a fallacious claim. _However_, there are strong benefits to anyone being able to audit the code, rather than only privileged entities (see below).
Security is also about trust. I can't trust non-free software as a matter of principle---the developers, by keeping something proprietary, are hiding something from their users. The intent is irrelevant---"trade secrets" or not, the fact is there.
Back to code audits: a common rebuttal is that certain proprietary software can be audited by signing a nondisclosure agreement. The trust then merely shifts to the auditor, who is performing an extremely complicated job (depending on the software). Nobody else can verify their work, or even its integrity.
Certain types of changes can be "audited" by chance in Free systems: submitting malicious commits or patches to free software is an extremely risky operation, because there is always a chance you will be found out. You can set yourself up for plausible deniability, but being caught still hurts your reputation, and the reputation of the project.
Since trust is an important aspect of security, non-free software is by default less secure _to me_. It isn't even an option.
> A backdoor can persist in open source tools just as easily as it can persist in proprietary ones
Just as easily? Citation needed. Especially when we are talking about a simple security device that can ship a tiny, well-written, well-documented codebase.
> teams which make open source tools are frequently smaller and underfunded for secure feature engineering when compared to teams which develop closed source tools
This is beside the point. The same code, developed by the same team, can be released as open source, or simply made publicly auditable.