* Modern phones (and all the flagship phones) have had separation between their basebands and APs for years; a modern smartphone baseband is essentially a USB peripheral.
* The two largest smartphone vendors have large, world-class security teams that do things like audit their basebands. Has Purism?
* A modern flagship smartphone will have some kind of secure enclave. Apple's has dedicated silicon, and an encrypted memory bus linking it to the AP. How does Purism's hardware security model compare?
* I don't know how much Apple and Google spend annually on outside security research for their flagship phones, but it's a lot. Who has Purism engaged to evaluate their designs and spot flaws?
If you want to use a niche phone as a fashion or political statement, more power to you. But if you try to market that phone as "transparent code is the core of secure systems", I'll take issue with that; it's neither a necessary nor a sufficient condition for security.
This phone may very well be more "fair" or "ethical" than an iPhone. But if it's not as secure as an iPhone, it's unethical to claim otherwise.
Since when? I'm going to need evidence for a claim like this -- as far as I was aware, Apple is the only one that does this, because they make their own SoC and then use a physically separate Qualcomm/Intel(?) modem.
All the other flagship smartphones use a Snapdragon eight-something with everything on a single die. How could you possibly claim with a straight face that it's "only" (really, truly, pinky swear) a USB peripheral, with no other route to main memory, which oh-by-the-way is also on-package?
PCIe would give the baseband full access to main memory unless the IOMMU is set up correctly (there's plenty of evidence that phone manufacturers have been lazy about this), and even then you're assuming the IOMMU-based protection actually works, which is next to impossible to verify.
(And I think tptacek's "essentially a USB peripheral" might be intended to be read as "metaphorically a USB peripheral" -- i.e. with only the capabilities you'd expect a USB peripheral to have, even when hooked up over PCIe, because of the IOMMU.)
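To make that metaphor concrete, here's a toy model in plain Python (nothing like real kernel code) of what an IOMMU buys you: device-initiated DMA only reaches the buffers the host driver has explicitly mapped in, and everything else faults instead of silently reading host RAM. Addresses and buffer names are made up for illustration.

    class Iommu:
        def __init__(self):
            self.mappings = {}                     # device-visible IOVA -> host buffer

        def map(self, iova, host_buffer):
            self.mappings[iova] = host_buffer      # the driver exposes exactly this much

        def dma_read(self, iova):
            if iova not in self.mappings:
                # unmapped accesses fault instead of reaching host memory
                raise PermissionError(f"IOMMU fault at {iova:#x}")
            return self.mappings[iova]

    iommu = Iommu()
    iommu.map(0xA000, "rx ring buffer")            # the one buffer the driver mapped

    print(iommu.dma_read(0xA000))                  # fine: the baseband sees its ring buffer
    try:
        iommu.dma_read(0xB000)                     # kernel memory simply isn't reachable
    except PermissionError as e:
        print(e)

Without the IOMMU (or with it misconfigured, as described above), the second read would just succeed against whatever happens to live at that address.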
They're not even going to bother configuring the IOMMU, because there won't be a perceptible difference in behavior to their boss or to the average user. It has nothing to do with how long the feature has been around; their priorities are just fucked.
You're either an iPhone guy or a Samsung Galaxy guy.
Or, if you want to save money, then you probably have a OnePlus or maybe a Huawei.
I'm personally hoping my 5X will survive and be updated for a long time, because I haven't seen anything in my preferred price range that speaks to me in a long time.
I certainly don't trust any of those players to even ship relatively secure or even modern code to boot the ARM cores, let alone properly protect memory.
With the Purism stance of ensuring the baseband is on separate silicon, on a relatively limited (and easily hardware isolated) bus, they're able to effectively treat the baseband as an untrusted device.
If, when you say "ROM", you're talking about the persistent storage holding the modem's firmware, then absolutely -- that firmware lives in a "vendor" partition on the same eMMC as the OS.
When talking about "RAM", things get a bit more complex. Most USB and even PCIe modems are going to ship with enough built-in (that is, on the same die) RAM to get the job done. They have absolutely no need for the quantities or qualities of external DDR (where your OS is running from most of the time).
That said, when the modem is embedded in the SoC, all bets are off -- it's really convenient to share as much as possible, and it's entirely possible for the modem cores to depend on the same DDR memory as your ARM cores.
They only sat on this one for two years before fixing it.
The Five Eyes have just finished re-dedicating themselves to strong-arming and backdooring any popular platform and company product they can get their hands on. What point is there to your nicely built Apple device then? That's rhetorical; there is no point having a secure device that your government can pry open at its whim - be it hardware or software backdoored.
This seems silly, you can easily flip this on its head and say "Well the iPhone is closed, so there's no way to verify it has any security". You can't separate the ethics of security advertising from the ethics of everything else about how the device works.
However, in the context of ethical security conversations, the two are anything but orthogonal. The security expectations of someone buying an open phone and the security expectations of someone buying "private" access into a proprietary platform will be different. Sure, my baseband could be compromised, but if I'm more worried about leaking information to ad networks, what I really need is a firewall and a hosts file, not an isolated baseband unit. If we're talking about an ethical security conversation, maybe we should mention that this is the security Apple has to control what code runs on your phone, not the security a user has to control the flow of their data. Though presumably you can have both goals; Apple shows interest only in the former.
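For what it's worth, the hosts-file part of that is roughly this much code. A minimal sketch, run as root; the domains below are placeholders, not a vetted blocklist.

    BLOCKLIST = ["tracker.example.com", "ads.example.net"]   # placeholders
    MARKER = "# --- ad/tracker null-routes ---"

    def add_null_routes(path="/etc/hosts"):
        with open(path, "r+") as hosts:
            if MARKER in hosts.read():          # read() leaves the cursor at EOF,
                return                          # so the writes below append
            hosts.write("\n" + MARKER + "\n")
            for domain in BLOCKLIST:
                hosts.write(f"0.0.0.0 {domain}\n")   # null-route the domain

    if __name__ == "__main__":
        add_null_routes()

Crude, but it's the kind of control over data flow that the closed platform doesn't hand you.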
My point being the two phones are extremely different products, especially regarding ethics, so the straightforward comparison and verdict struck me as extremely narrow-minded about how people use their phones.
I bring it up because the project makes a very big deal out of how much security their ethical approach adds. From what I can tell, their approach nets them materially less security than either flagship phone.
Most users care much more about the decisions that Apple and Google make with their data than they do about $random_blackhat.
What's more, a lot of the security features you mentioned are very difficult for an open source phone to achieve, because the hardware ecosystem is so endemically closed source and proprietary. We can fix that by pushing hard for open source hardware that starts to make inroads and break those barriers down.
When Apple decides to take away an app I paid for, I have no security to block them. When Apple decides to quit posting security updates, I'm out of luck. When Google reaches in and takes my location data without permission, I can't stop them.
With open software, there's no guarantee I'm more secure. But when some security issue does come to my attention, I'm more likely to have some say in how I respond. Somebody claims to have a more secure or privacy-respecting driver? Strongly recommended by experts I trust? I'll look into it. With Apple, I can't. There is no such alternative.
The SSL lib has known vulnerabilities? Wait for Apple? Or install some shim until there's a fix? Oh wait, on Apple devices, you don't have an option.
Apple is not only closed. It's walled. As the owner I am kept on the wrong side of the wall. And there are parties on the other side of the wall that I don't want to be on that side.
This is a security model that is hard to lose to. If Purism lets me control the wall, I'd say that's a welcome change.
> When Apple decides to take away an app I paid for, I have no security to block them. When Apple decides to quit posting security updates, I'm out of luck. When Google reaches in and takes my location data without permission, I can't stop them.
This is not security. This is some weird definition of security you're using here.
> With open software, there's no guarantee I'm more secure. But when some security issue does come to my attention, I'm more likely to have some say in how I respond. Somebody claims to have a more secure or privacy-respecting driver? Strongly recommended by experts I trust? I'll look into it. With Apple, I can't. There is no such alternative.
Software security doesn't much matter with phones when the hardware security is crippled or non-existent.
Software security means fuck-all if you can pull from the phone's memory via a cable, via the modem, or via JTAG. The iPhone is the most secure hardware of any phone on the market, by quite a long shot.
Reading through the comments here is frustrating because tptacek seems to be talking about hardware and all of the responders are talking about software. It's not like your personal computer where you "need to get physical access in a powered on state". Getting access to an adversary's phone is _beyond trivial_.
I'm just trying to emphasize that your point is sound on its merits, not for HN personality-cult reasons.
Understanding means seeing beyond security design to security principles. If one company magically decides to move your data to a government-controlled cloud without any public discussion of why, or of whether they even agree with it, the design is not going to matter. When the next internal decision negatively affects security in some undiscussed, opaque way, you won't be able to point to security audits to explain away the folly of human nature.
For instance in this case it seems like the iPhone has (considerably) more hardened hardware for many scenarios where you lose control of your phone. That’s not the only sense of security people care about. From the perspective of securing the phone from doing unwanted things without your permission, you have to trust Apple entirely to make the right decision. If you don’t trust Apple to make these decisions, the phone does not reflect a secure interaction.
If the software I use to verify the closed source software is closed, then I will need another piece of software to verify that the verify-ing software is what it says it is and so on until we reach a piece of software that I can verify with my own eyes (or a friend has verified with their eyes).
We all eventually have to place our trust somewhere, but we shouldn't have to agree with where you decided to place yours.
For fun I just compared objdump's entry point function assembly of a downloaded Pokemon Go IPA to the Hopper representation, and, no surprise, they are identical.
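A rough version of that check, for anyone curious. This assumes you've already exported the two listings to text files yourself; the file names are placeholders.

    import difflib

    def load_listing(path):
        # normalise whitespace so purely cosmetic formatting differences don't count
        with open(path) as f:
            return [" ".join(line.split()) for line in f if line.strip()]

    objdump_lines = load_listing("entry_objdump.txt")   # placeholder file names
    hopper_lines = load_listing("entry_hopper.txt")

    diff = list(difflib.unified_diff(objdump_lines, hopper_lines, lineterm=""))
    print("identical" if not diff else "\n".join(diff))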
Quite the opposite: it's axiomatic in the industry that you need capabilities to verify closed source.
No one knows what nasties are hidden in these blobs that run on EVERY phone out there. I'd actually go so far as to say cell phone processing modules are completely insecure. Unlikely to receive updates. Poorly documented. Security standards are likely to be weakly implemented, with gaping holes.
As for proof, I’d point at the lack of source code as a starting point. Open source doesn’t guarantee security but it at least lets interested parties try to the degree they want or need to.
Much better proof would be frequent exploits through the vectors you consider examples of glaring insecurity.
I’m sure various agencies are quite frustrated by their inability to use the cellular modem as an entry point into Apple’s phones. That by itself is another pointer.
I didn’t claim to have proof other than that.
Why is a trillion-dollar company that designs its own silicon only two steps ahead of some hackers? Instead of sitting on $200 billion in the bank, unspent, shouldn't those devices be reviewed and redesigned until they're impregnable?
Computing devices are usually insecure due to limits on experience, time and cost. None of those applies to Apple in any meaningful manner.
Personally I'll just stick with cheap Android phones running custom ROMs and treat them as insecure, disposable terminals.
Also, it's not just some hackers. There's quite a big community out there looking for bugs, and every time, such 'hacking' requires an unlocked phone.
The problem is the Mythical Man-Month. Apple's devices are built on a mountain of ancient C code. XNU is a bizarre Mach/BSD hybrid, built for the sake of expedience. (This is not to bash Apple; most other devices aren't much better.)
Apple has been working to improve the situation, but there's a limit as to how rapidly all those millions of lines of code can be changed.
As governments push more and more for back doors (or side gates, here in Aus), it is becoming very important to have fully open code, as it is the only way to get around the vulnerabilities governments will be requiring phones to have in them.
Source code is merely convenient for the security community in cases like this. But it's hardly a requirement at all, and you can absolutely build a secure system if you design for it.
All that said, I think source code for a device as personal as a phone is largely more of a question of ethics and this is what Purism heavily leans on as an angle for their marketing. I think that's a good angle, independent of any security aspects: owning the code and having the right to it (even if you don't ever touch it) is an important aspect of owning a device as deeply personal as a phone. People like us who believe that should totally advocate for it -- I think it's a fairly simple case to make. But it really has little to do with _security_ on its nose, if you ask me.
(Also, if your government mandates backdoors, they will not be thwarted so easily by simply recompiling AOSP or whatever and loading a new ROM, if they do not wish to be. Any belief to the contrary is a fairy tale that people like us enjoy telling themselves. But I'll give you a hint: writing a Makefile cannot get you out of jail, and you are not that good at covering your tracks.)
Realistically, though, a manufacturer or intelligence agency will be the one paying security people for thorough reviews. Anyone doing this in the open would be doing it to build a reputation or for their own security, and will need much more time to review something that first needs to be reverse engineered. They will almost certainly not be able to look at everything.
It's not a logical argument to say that we security people are used to reading obfuscated binaries anyway and conclude that therefore no fewer bugs will be found compared to the situation in which it is open source.
If a developer does not reveal the source of a program, then surely a user of this program is not secure against that developer, who can make that software do anything without easily being discovered.
Therefore, if we are talking about security from the user's perspective, transparent code is definitely not sufficient for security, but it is a precondition.
If you make the (IMO wrong) assumption that a developer on principle can never be a security threat to a user, then sure, you can claim that something non-free like iPhone is "secure". I guess it's secure in that Apple can be very secure you won't use it to run GPL'ed software?
It will just be a more pleasant tool to use, without crap getting in my way.
> So does all this mean Open Source Software is no better than closed source software when it comes to security vulnerabilities? No. Open Source Software certainly does have the potential to be more secure than its closed source counterpart. But make no mistake, simply being open source is no guarantee of security.
(and some other points that, I think, miss the point)
Once the baseband is isolated as a USB peripheral and the radio modules are physically pluggable modules ... and have kill switches ... the appropriate comparison is no longer to other phones. The appropriate comparison is to a PC, like a laptop.
... has the shape of a phone, but that sure looks like a PC to me.
I am happy running a PC without a secure enclave, with a USB cellular modem attached, and with only hazy knowledge of the baseband running on that (in my case) USB stick.
And as proven by the set of Linux Security Summit 2018 talks, there is still lots of work to be done regarding security on Linux distributions.
If you use Qubes OS then it is another matter.
There's also social ethics, which the company Fairphone tries to tackle with their products (Fairphone, Fairphone 2, and the upcoming successor). They're transparent about their ethics, which covers things like sourcing ethical cobalt, the supply chain of the displays, and so on.
Do they, or is it a convenient claim to argue against? (honest question as I'm unsure about what they've claimed compared to iPhone specifically). By using volume and money spent as security metrics, it's quite easy to see why large players would be the best.
For many, I suspect that principles drive some of their security assumptions. Companies that are opaque and do things behind users' backs are always going to appear less trustworthy than those that are open about their software and processes. While that doesn't translate to empirical security, it's enough of a paradigm shift from secretive approaches that I hope that facet alone helps them succeed.
At its inception you could make the same argument for most open-source projects which usually don't start decked out with cash. But most trusted and respected projects of today (especially in comp sec) are open-source as a prerequisite.
Spending a lot on outside security research for flagship phones is neither a necessary nor a sufficient condition for security.
I like what Purism does, and those firmware blobs are one of the most significant issues I have with current smartphones.
I'm not familiar with the mobile microprocessor space, so curious about any details.
The price of having "a large team of ______" is either closed hardware or complete loss of privacy.
It pisses me off so much when I read well reputed people like you say ridiculous things like this. How big was Linux's security team when it started out? Look at duckduckgo and protonmail competing with google search and gmail. Are you saying they won't be able to afford a dedicated security audit team once the product starts profiting in the tens of millions?
> If you want to use a niche phone as a fashion or political statement, more power to you. But if you try to market that phone as "transparent code is the core of secure systems", I'll take issue with that; it's neither a necessary nor a sufficient condition for security.
I will make that fashion statement. And you know what, security is all about risk (and I know you could teach me an entire course on this). As an individual, hardware or software that cannot receive an independent review, whether due to lack of openness or lack of transparency in the development process, carries an enormous amount of risk. Even someone as talented as yourself cannot unilaterally audit Android's codebase; you can't even start with iOS (and even if you could, you'd be breaking a fine print somewhere). The fact that others can audit the code and design is a relevant factor when assessing risk and evaluating security. More transparency == less risk == better security.
> This phone may very well be more "fair" or "ethical" than an iPhone. But if it's not as secure as an iPhone, it's unethical to claim otherwise.
What happened to threat modeling? As an individual, I care much more about Google tracking my activity and sharing it (I'm sure you read the recent AP piece revealing Google doing just that even when location was turned off) than I am worried about Russians using a 0-day kernel RCE or the FBI trying to decrypt my phone. Many of us are just normal people seeking the basic dignity of privacy and property ownership (as in freedom over the phone you own). Their product has a very real and significant impact on reducing the amount of risk I have to accept as an individual who has to use a smartphone; I don't see why you need to belittle their work with obviously non-constructive criticism.
Edit: for all the '*' points you mentioned, how exactly and practically would Purism's lack of those features impact the privacy and security of Purism's users under real-world threats? Also, in case it wasn't clear already, Google and Apple are considered threats to privacy and security by many who support works like Purism's.
A phone needs to be secure and protect privacy.
Having a security team doesn't mean you get closed hardware, it means you have a security team.
>How big was Linux's security team when it started out?
Not very large, and nowadays there are lots of people working on Linux security, and they discover a lot of CVEs that are fixed in the kernel and then backported. Early versions of the Linux kernel weren't very well guarded against outside attackers (in fact, we're only just seeing the tail end of the vulnerabilities introduced at the beginning of the git history, around 2.6).
>Are you saying they won't be able to afford a dedicated security audit team once the product starts profiting in the tens of millions?
I think, and this is pure speculation based on the text of the OP, that the statement they made is intended to highlight that security teams cost money, and ProtonMail and DDG aren't able to afford as good a security team as Google, as a simple function of "how much money can we spend on it".
>As an individual, hardware or software that cannot receive an independent review either due to lack of openness or transparency in the development process carries an enormous amount of risk. [...] More transparency == less risk == better security.
Sing along kids: "It all depends on your threat model!"
But seriously, this conflates "open source" with "security", which isn't remotely true and can be easily disproven by adding a backdoor to any open source project and sending you the result. How many people out of 100 would even understand the code, how many of those are able to find the backdoor, and how many of those can patch it out and compile it without the backdoor?
Open Source gives security for people who understand code, not for the people I meet at work that have difficulties opening Facebook if you remove it from their homepage setting.
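To make the backdoor point concrete, here's a hypothetical example: a login check that reads like an ordinary comparison but isn't, and quietly accepts any password for an account whose stored hash is empty (say, a row inserted by the attacker's "migration"). How many people would spot it in review?

    import hashlib

    def check_password(supplied: str, stored_hash: str) -> bool:
        digest = hashlib.sha256(supplied.encode()).hexdigest()
        # backdoor: substring test instead of equality -- "" matches everything
        return stored_hash in digest          # should be: stored_hash == digest

    print(check_password("hunter2", hashlib.sha256(b"hunter2").hexdigest()))  # True, as expected
    print(check_password("anything at all", ""))                              # also True: the backdoor

The source being open doesn't help the people who can't read it; it only helps the (much smaller) group who can and do.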
>What happened to threat modeling? As an individual, I care much more about Google tracking my activity and sharing it (I'm sure you read the recent AP piece revealing Google doing just that even when location was turned off) than I am worried about Russians using a 0-day kernel RCE or the FBI trying to decrypt my phone.
Same trap as before. Lots of people don't worry about facebook tracking and don't worry about russians using a 0day on them. Most people won't buy a phone on the premise that google won't track them, they buy phones based on the features it offers and the big numbers vendors print on the spec sheet. Of course lots of people still don't want tracking but together with the previous group they form the majority of people who might not want tracking but ultimately can't be assed to care enough to spend more than a couple dollars per year on avoiding it.
>For all the '*' points you mentioned, how exactly and practically would Purism's lack of those features impact the privacy and security of Purism's users under real-world threats?
First point: separation of concerns is good here. A separate baseband means that if for any reason the baseband is compromised (and even open source gets compromised), it cannot damage the phone itself. This makes hacking the phone from the outside through the telephone network rather difficult.
Second point: auditing by security teams improves code security. As mentioned above, the Linux kernel and many other high-profile projects receive a shitload of auditing by security professionals combing through the code, because if they didn't, the world would spontaneously combust about 32 seconds later.
Third point: a secure enclave is very useful. Even some open source projects have one, such as the U2F Zero, because it lets the software operate on a zero-knowledge principle: it cannot leak your private key if it has no access to the private key. Similarly on a phone, your storage encryption key can be a very safe 512-bit key, and your password is compared by on-enclave software that protects the key itself. This way a state actor or malicious Mossad-level actor can't get your phone's encryption key (though Mossad would just replace the phone with a uranium bar and kill you by cancer, because Mossad doesn't care), since the software will delete the key or simply not grant access if it detects manipulation. (There's a toy sketch of this idea after these points.)
Fourth point: getting third parties to evaluate your design is helpful. Again, as mentioned, a lot of high-profile OSS projects have third parties scanning the code, because two pairs of eyes are better than one.
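Here's the promised toy sketch of the third point, in plain Python. It's only an illustration of the key-never-leaves-the-enclave idea, not Apple's actual SEP design; the SHA-256 keystream stands in for the real symmetric cipher, and the lockout counter stands in for real anti-hammering hardware.

    import hmac, os
    from hashlib import sha256

    class ToyEnclave:
        """Holds the storage key; callers only ever see plaintext, never the key."""
        MAX_ATTEMPTS = 10

        def __init__(self, passcode: bytes):
            self._key = os.urandom(32)                   # never exported from this object
            self._passcode_tag = sha256(passcode).digest()
            self._failures = 0

        def _keystream(self, n: int) -> bytes:
            out, counter = b"", 0                        # stand-in for real AES inside the enclave
            while len(out) < n:
                out += sha256(self._key + counter.to_bytes(8, "big")).digest()
                counter += 1
            return out[:n]

        def seal(self, plaintext: bytes) -> bytes:
            return bytes(a ^ b for a, b in zip(plaintext, self._keystream(len(plaintext))))

        def unseal(self, passcode: bytes, ciphertext: bytes) -> bytes:
            if self._failures >= self.MAX_ATTEMPTS:
                raise PermissionError("enclave locked out")
            if not hmac.compare_digest(sha256(passcode).digest(), self._passcode_tag):
                self._failures += 1
                raise PermissionError("wrong passcode")
            self._failures = 0
            return self.seal(ciphertext)                 # XOR keystream is symmetric

    enclave = ToyEnclave(b"123456")
    blob = enclave.seal(b"filesystem key material")
    print(enclave.unseal(b"123456", blob))

The application processor only ever gets sealed blobs and unsealing results; pulling the raw key out requires attacking the enclave itself.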
Not true; even the most insecure phone is secure against some subset of attackers. If you don't trust the maker of your phone, all other security is useless. Imagine going to war while suspecting your body armor and vehicle have been booby-trapped by your own side...
> Having a security team doesn't mean you get closed hardware, it means you have a security team.
Didn't claim otherwise. Closed hardware is needed to control the market well, the alternative being to control user data and open up the hardware and software. Having a security team dedicated to a baseband audit means you're making a very large profit and you're already successful...
> Not very large, and nowadays there are lots of people working on Linux security, and they discover a lot of CVEs that are fixed in the kernel and then backported. Early versions of the Linux kernel weren't very well guarded against outside attackers (in fact, we're only just seeing the tail end of the vulnerabilities introduced at the beginning of the git history, around 2.6).
The same can be said about Windows security; the point was the lack of dedicated security teams early on. Even making $10M profit a year, you'll find it difficult to hire one security guy and dedicate resources to support his work. My whole rather obvious point, which you missed here, was the correlation between adoption of a product and the ability to dedicate resources like a security team.
> I think, and this is pure speculation based on the text of the OP, that the statement they made is intended to highlight that security teams cost money, and ProtonMail and DDG aren't able to afford as good a security team as Google, as a simple function of "how much money can we spend on it".
Somewhat; I understood it as "more money means more security, and transparency is much less valuable". Which I disagree with. Transparency and good security hygiene are much more important than throwing money and bodies at it.
> But seriously, this conflates "open source" with "security", which isn't remotely true and can be easily disproven by adding a backdoor to any open source project and sending you the result. How many people out of 100 would even understand the code, how many of those are able to find the backdoor, and how many of those can patch it out and compile it without the backdoor?
I made a point of security being a measured evaluation of risk. Transparency and being open source are variables, just like having skilled developers, a good security process, good project management, and resources like money and time. You need some of each, but completely ignoring a variable means everything it multiplies is also 0. Open source helps improve security, but only as a variable.
> Same trap as before. Lots of people don't worry about facebook tracking and don't worry about russians using a 0day on them. Most people won't buy a phone on the premise that google won't track them, they buy phones based on the features it offers and the big numbers vendors print on the spec sheet. Of course lots of people still don't want tracking but together with the previous group they form the majority of people who might not want tracking but ultimately can't be assed to care enough to spend more than a couple dollars per year on avoiding it.
Most people didn't have sex with condoms either until sex ed came along. After Facebook's recent fiasco, something like 42% of their users either stopped using FB or dramatically reduced their use. People have no real choice but to buy Apple or Android, so you're purely speculating here. Most people don't know exactly how bad things are; try showing someone you consider average their Google activity and location history, then offer them a Purism phone, and prove me wrong. Most people want a fancy, featured phone, but they also believe their hard-earned money should be enough of a price to pay. They would buy the privacy-enabling phone with the same looks and features for a higher price.
As for the rest of what you said, it seems you ignored the part where I said 'real-world threats'. Name one real-world compromise using baseband firmware and I'll donate $100 to a BTC wallet of your choosing.
As far as I can tell, the flagship phones (i.e. everything except the iPhone) are pretty heavily invested in the Qualcomm integrated baseband CPU + general-purpose CPU. And unless something has changed radically in recent years, the baseband CPU has had direct DMA access to the same memory as the main CPU, and thus any vulnerabilities or backdoors in the baseband CPU have the ability to directly access memory.
With the rising prevalence of devices like the LimeSDR putting the ability to intercept and communicate with the baseband CPU into more hands, vulnerabilities in the baseband like this one are even more of a risk than before.
I don't think anyone is arguing that Purism is going to have produced the world's most secure software, but the design space they've put themselves in allows them to be audited internally and externally - and while you say "Apple and Google have spent a lot of money on it", I can't really verify that it's led to a quality product. As the Intel Management Engine fiascoes have shown, even heavily audited code can have terrible flaws. The thing people don't like about the current approach with cell phones is that if the phone is too old, you just have to throw it away because no one will update it. Purism is offering you something where you can throw away just the vulnerable modem or wifi card and keep your phone. Even if you don't know of a flaw, you could purchase a different vendor's M.2 4G LTE card and swap it in, making your attack surface different from other owners of the Librem 5.
There are other things which Purism will doubtless be way worse at catching/auditing, but honestly this is going to be like Linux: the benefit in terms of security is going to be that you are one of maybe 1,000 people using that device in that specific configuration, and you won't be worth exploiting.
Librem 5's design makes it relatively easy to safely disable and not use the part you think is vulnerable, but you can't really replace them - that would put this project into a completely different budget category.
That'll teach me to post after I should be asleep!
This really does not resonate with me. In most of these chips there is a functional or partially functional firmware in ROM, and then the OS applies a RAM patch to provide full functionality or address functional or security issues. I'm not sure how I would be more free or secure if Broadcom or Intel placed the full firmware in the ROM and never updated it, than if they continued to supply updated firmware blobs.
The firmware for these devices has historically been riddled with security issues; just recently this CVE affected most of the Intel AC WiFi cards.
Also, Redpine supports firmware blob updates with some versions of their hardware, so I'm not sure if they are just playing word games here by saying it will WORK without extra blobs, while expecting everyone will really still use the blobs to stay up-to-date.
>The host MCU can update the RS9113 module FW over UART using the kermit protocol or over SPI using the proprietary binary protocol. The default UART baud rate is 115200 kbps 8N1 with xon/off or no flow control. For baud rate >900000, hardware flow control must be used. Refer to the WiSeConnect PRM for more info - the "Upgrading the Firmware in Module" section discusses the SPI mode and "Firmware Upgradation" section discusses the UART mode.
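Just to sketch what that host-side UART path could look like, based only on the quoted PRM text: open the port at 115200 8N1 with software flow control and stream the image in chunks. This is an assumption-laden sketch, not Redpine's tooling; the port and image names are placeholders, and the real Kermit packet framing, checksums, and ACK/NAK handling are omitted entirely. Requires pyserial.

    import serial  # pip install pyserial

    PORT = "/dev/ttyUSB0"          # placeholder
    FW_IMAGE = "rs9113_fw.rps"     # placeholder

    def push_firmware(port=PORT, image=FW_IMAGE, chunk=1024):
        with serial.Serial(port, baudrate=115200,
                           bytesize=serial.EIGHTBITS,
                           parity=serial.PARITY_NONE,
                           stopbits=serial.STOPBITS_ONE,
                           xonxoff=True, timeout=5) as uart, open(image, "rb") as fw:
            while True:
                block = fw.read(chunk)
                if not block:
                    break
                uart.write(block)   # real Kermit would frame, checksum, and wait for ACKs
                uart.flush()

    if __name__ == "__main__":
        push_firmware()

The point is just that the update path is an ordinary host-driven serial transfer, which is exactly why "it ships without blobs" and "nobody will ever update the blob" are two different claims.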
And I kind of agree with them. If code doesn’t have DMA or network access, I’m pretty happy to treat it like the code in my microwave.
There’s firmware blobs in almost everything. I’d prefer them to be FOSS, but we all need to get things done :)
The FSF guidance on the co-processor exception is breathing room. I haven't read the fine print on what they mean by co-processor. My conjecture is that it was written at a time before these sorts of architectures, and referred more to traditional co-processors like a GPU over PCIe, so that non-essential stuff could be excluded from the requirements. In this case, you have single silicon and single RAM. The RAM needs a blob for init. You claim that if you squint, you can see the single silicon as multiple processors, and voila, the RAM-init quandary is solved. That is Purism's gymnastics.
- either you give your users the ability to upgrade the firmware, so they can RE it, and write and use an open replacement
- or you don't give your users the ability to upgrade, which doesn't help anyone, since they can never be sure the chip manufacturer didn't put some undocumented interface somewhere that even you weren't aware of; so now only the chip manufacturer can upgrade the firmware and the user can't
Realistically, in the context of mobile phone components, there's hardly such a thing as "not updatable firmware". You can never be sure.
Encrypting and signing is trivial, so nobody is going to RE Intel ME or microcode updates (a sketch of the signing side follows below).
Updateable firmware with a physical switch would be a much better solution than completely non-updateable firmware.
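Here's the sketch of the signing half, using the Python 'cryptography' package; the keys and the "mask ROM" check are obviously stand-ins, not any vendor's actual boot chain. Once the device only trusts images signed by the vendor's key, a community-built replacement is rejected no matter how well it was reverse engineered.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Vendor side: the private key never leaves the vendor.
    vendor_key = Ed25519PrivateKey.generate()
    firmware = b"\x7fELF...official blob..."          # placeholder image
    signature = vendor_key.sign(firmware)

    # Device side: only the public key is baked into the boot ROM.
    device_pubkey = vendor_key.public_key()

    def boot_rom_accepts(image: bytes, sig: bytes) -> bool:
        try:
            device_pubkey.verify(sig, image)           # raises on any mismatch
            return True
        except InvalidSignature:
            return False

    print(boot_rom_accepts(firmware, signature))                 # True
    print(boot_rom_accepts(b"community replacement", signature)) # False

Which is the whole point above: the user's ability to flash is worth little if the chip will only ever boot what the manufacturer signed.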
It means that there will be just one version of firmware shipped. Just one version of firmware that the whole community will focus on analyzing and patching (only for the Librem phone; it's pure chance if that helps any other device), and if a patch is needed:
* Redpine can either tell the entire community how to flash the firmware, or else the egg is on their face for shipping a broken firmware. *
(There's basically no chance that the Redpine chip is actually not field-upgradeable if you have the right vendor tool and probably also a windows box to run the tool on.)
The rest of the world seems to think, "flash it every time it boots up and I don't really care where that binary blob came from."
I'll also just add, I can respect your position. Thanks for supporting Free Software, and I get it that there's value to being able to upgrade. That's kind of what I'm hoping happens later, if (cough, cough, when) there's a bug or exploit or worse.
If purism waited until a manufacturer offered a perfectly open device, they'd have to wait until GNU Hurd ships on a phone... j/k
Openmoko already went through that route, although AFAIR there was no formal RYF programme yet back then. Freerunner was all fine from FSF PoV until Openmoko released a firmware upgrade for the Calypso baseband that fixed some ugly bugs, suddenly making it not free enough for FSF. Which is especially funny considering that it basically opened the way to run OsmocomBB on that thing.
I wish the FSF RYF requirements explicitly allowed a later release of a flash tool. (It seems like this only improves the user's freedom - suddenly you can treat your device as programmable, but only if you want to.)
But I don't think people will be nearly as upset if the Librem 5's RYF certification becomes void later.
I think the impact will still be large if it has RYF certification on launch.
Don't want to support conflict mining, but are okay with Android on a Qualcomm SoC = Fairphone
Don't want to run a non-free OS or integrated baseband, but are okay with manufacturing in Shenzhen = Purism Librem
Which you shouldn't, for obvious reasons.
Ethical products don't spy on users. Fairphone uses Android, so it isn't entirely ethical.
Because customers want Google.
The bootloader is unlocked. If you want to install Fairphone Open then you can. That's an Android version sans Google/OpenGapps. 
If you want, you can also install different firmware. Examples: LineageOS (with or without OpenGapps), LineageOS + microG, SailfishOS, UBTouch.
FWIW, the SoC is a SD801.
Leaked LG LK code shows that's definitely where their bootloader unlock keys lived [at one point, a couple of years ago].
Since the Librem device is a different beast, though, I wonder what their bootstack looks like? Super curious now!
A phone where you could get creative and manipulate every aspect of it, without the artificial [security/functionality/SDK] limitations imposed on apps you can write for Android or for that other company's phone OS I like even less.
A phone that will not get planned obsolescence.
A phone with an OS that can be managed just like any other Linux distro, where you can write apps in any of the readily available languages, etc.
Failing this, it's just an overpriced wannabe Android clone.
I sometimes wonder how much Purism actually cares about all this security, trust, ethics rhetoric and how much of it is a bunch of geeks who just want to finally have a smartphone which can run an actual Linux distribution and figured "hey, consumers are pretty riled-up about this whole government spying thing - and it's pretty easy to audit open source software for that sort of stuff - maybe we can work an angle..."
I guess that's true, but these people probably will not be the driving force of the community around the phone. They'll be helping sustain the vendor financially, which is also valuable though.
I already have a mobile device to play with that has a similar design to this phone: it has a separate 3G modem and can also be made to make calls. It's a bulkier 7" tablet. It will never get an open source GPU driver, and suspend-to-RAM is also out of the question for now. But it's fun to play with and explore.
Recently, I had fun adding touch UI support to u-boot, to implement a boot menu and a battery indicator.
Also, if you drop the typical Linux userspace, replace systemd with a custom init binary, and create a DRI-based UI app, it's possible to get cold boot times from pressing the button to a usable GUI down to 1-2 seconds. So I'm excited about what will be possible if the hardware of that phone ever materializes.
4G has gotten really good these last few years; I've stopped using a landline connection and just use my phone as a wifi hotspot for everything. But this also means I don't have NAT and thus don't have a local network, which is super annoying when I want to transfer big files from one computer to the other (not that I don't have enough options between USB keys, Bluetooth send, or cloud sharing, but it's just not as convenient as scp).
Can't wait to finally be able to properly configure my phone as a router.
I initially checked whether there was any possibility for my spare router (a D-Link DSL-3782) to connect to my phone's wifi and then allow other devices to connect to the router, but it had no such feature (plus, if I got that correctly, you would need two wifi cards for that: one in access point mode and one connecting to the other access point).
My router has a USB port, so it may just be that I could use it to flash the software, install OpenWrt, and then share the phone's connection with the router through USB. Worth a try. (And it sounds like a lot of fun :P )
EDIT: sadly, that model is not supported by OpenWrt.
So the other day I hit this article on planet.kde.org about KDE Itinerary, an application that can store your boarding passes and offer some additional services, such as calendar integration or notifications in case your destination has a different socket type, they drive on the left side, etc. It seemed quite useful, and some parts are novel. Maybe there is a future for phones with just free software.
OSM is nowhere near as good as Google Maps at most things, but it's still perfectly usable.
The average person probably won't be happy with a purely FOSS phone right now, but if it's something you care about, then it can be done fairly easily.
There was an article recently here about someone using a RasPi (with Go) to build a custom GPS mapping solution. I've been doing something similar (though using a RasPi connected to my head unit via composite video), as well as pulling sensor info from OBD-II, and I'm currently looking into CAN bus.
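In case it helps anyone, the OBD-II part can be roughly this simple, assuming a cheap ELM327-style serial adapter. The port name, baud rate, and the skipped AT setup (echo off, protocol auto) are all assumptions on my part; the 41 0C reply format and the (256*A+B)/4 scaling for engine RPM are standard OBD-II.

    import serial  # pip install pyserial

    def read_rpm(port="/dev/ttyUSB0"):     # placeholder port
        with serial.Serial(port, 38400, timeout=2) as elm:
            elm.write(b"010C\r")                            # mode 01, PID 0C: engine RPM
            reply = elm.read_until(b">").decode(errors="ignore")
            for line in reply.splitlines():                 # expect e.g. "41 0C 1A F8"
                parts = line.strip().split()
                if len(parts) >= 4 and parts[0] == "41" and parts[1] == "0C":
                    a, b = int(parts[2], 16), int(parts[3], 16)
                    return (a * 256 + b) / 4.0              # standard OBD-II scaling
            return None

    print(read_rpm())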
"Don't do evil" hit me the same way. I assume Google is well-intentioned, but there are many, many areas in which Google and I have moral disagreement regarding the way they operate. That's fine. Principled people can differ.
The same is true with Microsoft and Kroger's and Costco and lots of other brands I deal with. I know for a fact these companies support causes I believe to be immoral. I suspect in turn they disagree with some of the causes I support. But they don't rub my nose in their superiority with smarmy phrases like "don't do evil" this or "first ethical" that.
In the case of Purism, a much quicker way to my heart and wallet is to say it's completely open, for the following reasons. That's enough for me. I don't need your fundamentalist preacher bloviating on top of everything else.
To push the argument to the extreme: even if you find a radio component that matches your requirement of freedom, it's still going to talk to a radio tower that you don't control, running a software stack that you don't approve of. This problem never ends, unless you imagine some end-to-end channel that you control, and then you don't care about the lower layers' lack of openness.
Does that make any sense?
> We will continue to evolve as we understand better how this subset of the industry works. The success of our phone is critical, as it will provide us with the legitimacy and leverage we need to bend hardware suppliers to our way of thinking by showing them that we have the potential to be market leaders with ethical products that respect users.
Purism’s style in press releases is pretty annoying, but I really appreciate the blog posts.
I backed the phone. I’ve backed most GNU+Linux replacements for my iDevices.
If it half-works it will have wildly exceeded my expectations!
I wish them the very best.
Ok, so in the dev kit/final product will there be a physical switch to turn it off?
Or at least a CLI command? Possibly a GUI with a big toggle labeled "Turn off the insanely complex unauditable OS that I must run to live in the 21st century because patents"?
As I understood it, yes.
It is at least an order of magnitude quicker for the user to toggle a kill switch that resides at all times on the phone. Negligence due to physical exhaustion is thus eliminated-- as long as the user can physically access the phone they have sufficient energy to click the kill switch. And as long as the user is able to reach the phone they can access the kill switch, which isn't necessarily true for the tin foil.
Additionally, there are a variety of social engineering attacks that can cause a user to become reluctant to wrap their phone in tinfoil, reluctant to retrieve the foil, and even reluctant to carry their fanny pack with them. None of those attacks exist for clicking a toggle, especially one designed as an elegant part of the phone's facade (which itself may go unnoticed by the would-be attacker).
Finally, toggling an unobtrusive switch on one's phone doesn't invite onlookers to come over and start discussing unwanted topics like PGP keysigning parties, onion sites, and how the average life span would have increased by 20 years if everyone were still using Gopher.
Details are in the hardware report (as of today).
I’m not sure what ethical means when a corporation says it. Purism at least state what they mean and how they apply it.
What I’m looking for in a mobile phone:
Good screen. iPhone X quality.
16 GB main storage.
Headphone jack nice but not required.
Linux OS. (Android is ok)
Ability to write my own apps for custom accessories.
Some kind of AppStore. (Debian apt repo is ok)
Cellular is not directly on main system bus.
Fast enough to play video with background tasks.
Too bad LG quit doing removable batteries, their phones with that design were relatively repair-friendly.
Not that you can really escape those realities the way things currently are.
If there was a phone made in the USA by unionized FTEs I’d buy 20, but I’m not sure that the factories even exist in 2018.
‘Assembled’ maybe. Their laptops are assembled in South San Francisco.
Disclaimer: Views expressed here are my own and do not have anything to do with my employer.
I'm just saying, it would be a huge advancement for the Linux desktop to fit Linux desktop apps in a mobile context. (And a huge step for reactive design, if done well)
It's all open source BTW. Look at https://gitlab.gnome.org/Community/Purism
So why the difference?
I love the idea of an open source phone which could be used as a phone & laptop replacement. Even if I'm still likely to carry an iPhone as well (but probably as a wifi-only device, if PureOS can handle my phone needs).
There are easier ways to frustrate the tech industry. Oh and if you live in the US mobile networks will still sell you out no matter what brand your phone is. They all have to connect to cell towers.
How do they compare when explicitly turned on?
Did some upgrades, 1TB NVMe, 2TB SSD, and 16GB of RAM.
Also do not use PureOS.
This is the only bit I read; the rest was blah blah. The first entirely predictable delay of many, if I might add. They are on a long road to "Sorry, we tried, here are some discount coupons for Purism laptops". Well, it won't be that long really, unless a badly needed funding angel swoops in.
Their delusion is in some sense laudable, as any startup should believe its own bullshit. But when you know better than they do about their chances, it's still hard to watch folks put themselves through this.
It doesn't look like they're a startup as far as I can tell, they're a laptop manufacturer with 2 products on the market for a few years, doesn't look like there's any VC money in the mix either.
Note: I'm generously estimating their profit at around $1M year. ~$100M was the cost to bring up the Essential phone.
It will be tough for them to find investors, the phone industry is a ruthlessly competitive market.
The CEO is very headstrong. That's great when you are an expert in hardware projects, but he's not. In my opinion, they are trying to do too many things, and aren't learning from their mistakes or listening to the advice of others.
The best thing they could do right now to maximize their chances would be to lay off their software team and focus entirely on the hardware. Let the community supply that part of it.
Would this hurt their marketability somehow? Wonder if anyone from there is tuned in here...
I agree that this delay was entirely predictable, though. Some time ago, when I heard on their Matrix channel that "we're still targeting January", I was all like "wait, what?". A plan that gives you just 2-3 months from sending out devkits to final shipping simply has to fail; I'm glad they didn't stubbornly try to deliver and just delayed, although I'd rather have them delay to May or June already, as it's IMO still a bit risky.
OTOH, I think they'll have trouble attracting enough of a community to sustain the project if they just throw such expensive, initially useless hardware out there.
But at least people will have something to play with. Better than promises and sorries.
The HW must be appealing though, even if only by being dirt cheap and with plentiful options. But it still takes years for a community to produce something useful.