
Open source is a development model and doesn't have magical privacy and security properties. An iPhone is going to remain the best overall option for privacy and security for the near future, especially for users that aren't very technical. That's not really in spite of it being almost entirely proprietary; openness is quite orthogonal to it. GrapheneOS aims to provide a much more private and secure option down the road, but it's trying to do that based on merit rather than by claiming that being open source makes it better.

Either way, any ARM SoC is going to be a massively complex set of proprietary hardware / firmware / microcode. An open hardware SoC wouldn't provide inherently better privacy or security, and unlike with software you wouldn't even be able to reproduce the build and verify that it matches what it's supposed to be. In reality, that kind of verification provides very little even for software, because sources are full of vulnerabilities and being open source doesn't magically fix them. A maliciously inserted backdoor designed to be stealthy would be indistinguishable from those, and it's nearly impossible to know how many of the vulnerabilities being fixed in software were intentionally inserted backdoors, if any. A sophisticated attacker in a position to insert a backdoor into hardware or software could just use the existing vulnerabilities, and if they did insert a backdoor, how would you distinguish it from one of those?

Components with DMA can be contained by IOMMU, and that's the industry standard today. However, you seem to be implying that backdoors are being inserted into non-CPU SoC components, and it's very hard to understand the threat model you're applying to this. Why would there even be a backdoor inserted into an SoC component like the image processor, which is contained by the IOMMU, rather than the CPU? These SoC components aren't third party components. They're on the same die as the CPU and come with it. That doesn't mean they can freely access all memory... but it does mean that supply chain attacks targeting them would generally be able to target the CPU instead.
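To make that containment concrete, here's a minimal sketch in Linux kernel style of how a driver maps a buffer for device DMA through the kernel's DMA API. The driver, function name, and buffer size are hypothetical, but with an IOMMU enabled this mapping is what limits the device to the mapped buffer rather than all of physical memory:

    #include <linux/device.h>
    #include <linux/dma-mapping.h>
    #include <linux/errno.h>

    #define RX_BUF_LEN 4096    /* hypothetical buffer size */

    /* Map one receive buffer for device DMA. With an IOMMU enabled,
     * dma_map_single() installs a translation covering only this buffer,
     * so the bus address handed to the device can't be used to reach
     * any other physical memory. */
    static int map_rx_buffer(struct device *dev, void *cpu_buf,
                             dma_addr_t *handle)
    {
        *handle = dma_map_single(dev, cpu_buf, RX_BUF_LEN,
                                 DMA_FROM_DEVICE);
        if (dma_mapping_error(dev, *handle))
            return -ENOMEM;
        return 0;
    }

Without an IOMMU, or with it misconfigured, that handle is effectively just a physical address on most systems, and nothing stops the device from reading or writing outside the buffer.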

If a hardware component is compromised, an attacker would target the driver and gain code execution in the Linux kernel via an exploit. The Linux kernel is a weak target (monolithic, with no internal security boundaries, fully written in a memory-unsafe language) and drivers are rarely well hardened against attacks from hardware, since developers tend to trust hardware and don't apply the adversarial model towards it that they apply to userspace. Hardware components don't need unrestricted DMA access, and proper IOMMU setup keeps them from having it. Having DMA does not mean having full control over all memory. Not having DMA doesn't mean that the component is well isolated. Whether or not the component is on the same die is totally orthogonal to whether it has DMA access. These are common misconceptions, and they're being abused by dishonest marketing to trick people.
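As a sketch of what applying an adversarial model towards hardware looks like in a driver, consider bounds-checking a device-reported length before copying. The descriptor layout and names here are hypothetical, not from any real driver:

    #include <linux/errno.h>
    #include <linux/kernel.h>
    #include <linux/string.h>
    #include <linux/types.h>

    #define RX_BUF_LEN 2048    /* hypothetical DMA buffer size */

    struct rx_desc {           /* illustrative descriptor layout */
        __le16 len;            /* length claimed by the device */
        u8 data[RX_BUF_LEN];
    };

    /* Copy a received frame out of a DMA buffer. The length field was
     * written by the device, so a compromised component controls it;
     * bounds-check it before use instead of trusting the hardware. */
    static int copy_rx_frame(const struct rx_desc *desc, u8 *out,
                             size_t out_len)
    {
        size_t len = le16_to_cpu(desc->len);

        if (len > RX_BUF_LEN || len > out_len)
            return -EINVAL;

        memcpy(out, desc->data, len);
        return len;
    }

A driver that trusted desc->len here would hand a compromised component a straightforward kernel memory corruption primitive, which is exactly the escalation path described above.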

> Android can give you privacy and enough security for most people.

Some of that is due to the improvements landed upstream based on the work in this project.

> This can't add much more as long as it's running on the same devices.

I don't agree with that at all. It can't improve the security of the firmware directly, but it can certainly improve its isolation by auditing and improving the IOMMU configuration along with hardening the drivers. It also won't be supporting devices without decent IOMMU support and firmware security updates. The project has also reported various firmware security issues to the relevant companies over the years, which is an indirect way of improving them.

A large portion of the project will also be app-layer projects like https://github.com/GrapheneOS/Auditor that are usable on the stock OS and other operating systems. Auditor / AttestationServer support the stock OS on a bunch of devices, along with CalyxOS and GrapheneOS. Other apps will generally be more portable, but in this case the app has to have a database of the verified boot key fingerprints and other device properties. The verified boot key is the only information included in the signed hardware attestation data that it can use to distinguish between devices, which it needs to do in order to show the device model and apply different checks based on the device. That's why devices need to be added to Auditor one by one, based on users submitting sample attestations with the app.




"They're on the same die as the CPU and come with it. That doesn't mean they can freely access all memory... but it does mean that supply chain attacks targeting them would generally be able to target the CPU instead."

I don't disagree with your overall post. I do want to add that there's a good reason not to put the backdoor in the CPU: it's the main place people will look, with plenty of people capable of spotting it. The guy that taught me about hardware subversion years ago preferred hiding stuff in the analog parts of mixed-signal ASICs. He said digital people neither saw it nor understood it. He and others taught me how the two can interact in invisible ways, where analog or RF portions might pick up leaks. So, deniability is maximal if the backdoor is in some analog or RF part of a chip. He claimed to have never found backdoors, but said that he and others used this for I.P. obfuscation a lot.

I do like the IOMMU and firmware work. There's a lot of custom I.P. to build before being competitive with one of the high-end SoCs. One thing I considered about trying to make an open phone is whether a company with money could just pay for the Snapdragon I.P. to be integrated with RISC-V cores. Modify the RISC-V core to use microcode for security updates and product enhancements. Put security barriers in key places so the Snapdragon I.P. is a little less dangerous or can even be powered off component by component. Then, if the agreement gets them more data on the hardware, use that with secure development practices to make robust drivers. Include a method for secure boot and update that still allows the user to put their own stuff on the phone if they choose.

What do you think?

EDIT: In case it wasn't clear, I know there's stuff like IOMMUs in Snapdragon. I'd just prefer an independent, security-focused company to be making those components. Sort of a check against incompetence or malice on Snapdragon's end.


> Open source is a development model and doesn't have magical privacy and security properties.

This is misleading.

> A maliciously inserted backdoor designed to be stealthy would be indistinguishable from those

This is a false equivalence. In most closed-source systems the vendor does not need to put effort into designing a stealthy backdoor.

The vendor just adds tons of code that spies on the user and calls it features. The amount of phoning home done by Android, iOS, and Windows is staggering.

Not to mention the ability to push an update containing a backdoor to a specific, targeted device without the user being aware of it.




