[flagged] We see Project Pegasus, we must have Open Source Smartphones: let’s discuss (theguardian.com)
54 points by fullmoon888 on July 19, 2021 | 47 comments



"Please use the original title, unless it is misleading or linkbait; don't editorialize."

https://news.ycombinator.com/newsguidelines.html


When we tech people warned about this, the answer was always "I've got nothing to hide". Now that it affects journalists, they are all shocked and write articles. We'll need to wait until it hits politicians; then we can get new laws.


In India it is hitting politicians, including the leader of the opposition. Sadly, not much is going to be done; it is too late for any real change in India to come through the legislature.


Past warnings that went unheard don't change the situation we live in. Agreed, we need this to hit someone on the upper floors.


Please link to an article where the journalist's point of view on surveillance tech was what you claim.


Let’s start to discuss where we stand with open source smartphones, both in terms of software and hardware. I'm really worried about privacy and human rights. We cannot trust Apple and Google on this.


The article says that Pegasus is installed via a zero-day (presumably via a phishing attack or similar). I love open-source software, but it is not a magic bullet that stops the likes of zero-day attacks. In other words, using a fully open-source stack would not prevent something like Pegasus if they exploited a zero-day.

Purchasing phones which ensure frequent software patches for a number of years is a far better tactic IMHO. For example, I recently purchased a Nokia X20 (https://www.clove.co.uk/products/nokia-x20) which comes with a promise of 3 years of OS upgrades...something I've not seen from other manufacturers.


Partially true, but moving away from proprietary hardware and software, and especially cloud services, would still greatly reduce the attack surface. It's something I committed to fully half a year ago by purging Google/Play services from my phone, having already wiped Windows off my drive five years ago and replaced it with Arch.

This still wouldn't protect me from a targeted surveillance attempt like Pegasus, but it does protect me from automated mass surveillance in the cloud, and at least partially reduces the attack surface, by getting rid of unvetted, unreviewable, backdoored proprietary software.

0 regrets, only privacy vibes ever since.


The current state is not great; I keep trying out the distros for my PinePhone, but everything is pretty much still unusable. Apps crash a lot, including the window manager, and there is a very large number of bugs.

The hardware seems fine, but, as said above, it seems that critical hardware (and software for that hardware) is closed.

What is great: a replaceable battery, and DIP switches to physically turn every module in the phone on and off.


So you don't trust large US tech companies. Fair.

But why do you expect people can trust a group of anonymous developers building open source smartphones?


Isn't that a large part of the open-source nature? Audit the code or hardware designs yourself, determine their trustworthiness from that. It's much harder to trust something when you can't examine the inner workings of it.


Ok thanks, I get that. But how many people can do that? Like 0.001% of the population?


I think it's fair to call it 0%. Auditing a large, modern code base is going to be impossible for a single person. For example, the Linux kernel is 27.8M lines of code (as of Jan 2020 [0]). Yes, a lot of that code is for drivers you won't use, or platforms you aren't running on. But still, no one person is going to be able to get through all of it with enough attention to detail to notice things like subtle race conditions, especially if they were inserted maliciously.

[0] https://www.phoronix.com/scan.php?page=news_item&px=Linux-Gi...


How many people can authenticate a dollar bill? How many people can validate a cryptographic signature? How many people can direct a blockbuster action movie?

The point is, right now, nobody can audit these things. Once someone -- anyone! -- can, everyone else can benefit.
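
For what it's worth, once the tooling exists, "validating a cryptographic signature" is something a machine does for you. A minimal sketch in Python (assuming the third-party "cryptography" package; the message here is made up for illustration):

  # pip install cryptography
  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
  from cryptography.exceptions import InvalidSignature

  private_key = Ed25519PrivateKey.generate()  # signer's key, normally kept secret
  public_key = private_key.public_key()       # what everyone else uses to verify

  message = b"release-1.2.3.tar.gz"
  signature = private_key.sign(message)

  try:
      public_key.verify(signature, message)   # raises if the signature or message was tampered with
      print("signature OK")
  except InvalidSignature:
      print("signature INVALID")

Almost nobody does this by hand, but everyone benefits from the few tools and people that can.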


There's also the matter of traceability.

Even if there is no direct audit of the code, once a vulnerability is discovered it can be traced back to the person(s) who introduced it.

With a closed system, only the owner of the source code history can do that. With open source, any person in the world can, and can start a discussion to understand whether it was malicious or not, if the person(s) should be banned from pushing code, new code security standards to be adopted, etc. You lean on the world's expertise at that point.

Bad things happen. It's important to have the ability to understand why and mitigate for the future.
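
To make "traced back" concrete: with an open repository, anyone can ask the history who introduced a given change. A rough sketch in Python, wrapping git (the file name and search string are hypothetical; it assumes git is installed and the repo is checked out locally):

  import subprocess

  # Find the commits that added or removed the suspicious string (git's "pickaxe" search).
  subprocess.run(
      ["git", "log", "-S", "suspicious_function(", "--oneline", "--", "src/parser.c"],
      check=True,
  )

  # Show who last touched the affected lines, commit by commit.
  subprocess.run(
      ["git", "blame", "-L", "120,140", "src/parser.c"],
      check=True,
  )

With a closed code base, only the vendor gets to run the equivalent of these commands.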


Nope, it can be traced back to a random nickname on the Internet, using a computer somewhere on the globe.


That may be so, but it's usually been through code reviews, pull requests and whatnot - which means a maintainer somewhere has approved that code.

In any case, "a random nickname on the Internet, using a computer somewhere in the globe" is a lot more information than none.

Finding out that that's the case for a given project is part of traceability.


How do you ensure it wasn't a malicious maintainer?

That information is meaningless if it traces back to an empty room.


If the project has a malicious maintainer, it's easier to find out if it's in the open - and then to either force a change or not use the project at all. It's impossible to do that when you have no access to that information in the first place.

It's not perfect but it's something vs nothing. I'll take something every time.


How do you track down a malicious maintainer who introduces a back door slowly over the course of a year, one little change at a time, given how long CVEs in OpenSSL have gone unnoticed, for example?


I guess you have never had commit rights to any Linux distribution or such?

You don't get commit rights as a random person, so yes, a commit can usually be traced back to a person. Sure, the committer could have received a patch from an unknown person, but then they're still responsible for the commit.


I guess you are a security expert that knows how every single FOSS project that might be used as dependency works.


That's not what I was trying to say. It's up to you as a user to do your due diligence and make an informed decision on whether you want to use the software or not.

Any serious project would have some form of web of trust and know who has commit rights. It's up to you to decide if you trust their web of trust.

I guess from your comments that you are not actually interested in contributing to the discussion, since you just spout single-line comments with no information at all.


I contribute experience, instead of FOSS ideology that is detached from how things actually work.


No, you throw out nonsense single-line sentences that say and contribute nothing.


>How many people can authenticate a dollar bill? [...] nobody can audit these things.

USGOV has a pretty comprehensive guide on how to validate them:

https://www.uscurrency.gov/sites/default/files/downloadable-...


There are also plenty of books and plenty of documentation out there to learn coding from and start auditing, if you want to.

Detecting fake currency is not much like coding: to catch a really well-made fake you need years of experience seeing all sorts of fakes, while in coding you only need enough experience to recognize good code to be able to catch most issues.


> in coding you only need enough experience to recognize good code to be able to catch most issues

If that were true, the software industry would have a much smaller problem re: bugs and errors than they currently do.


There would be fewer bugs if the industry wanted them gone and paid for it.

Sadly, the problem is that "good enough" is how the industry sees everything: constant cost cutting, offshoring, replacing senior talent with fresh graduates, and an inadequate focus on security. Technical debt is all too common, and unless/until something affects the bottom line there is no pressure.


Please note the entire absence of the dollar bill from that document.


> Please note the entire absence of the dollar bill from that document.

A fair point. I took "dollar bill" to be the generic "US currency" rather than specifically "the $1 bill". But this page covers everything from $1 to $100 (although it seems the $1 and $2 have barely any security features).

https://www.uscurrency.gov/denominations/


You clearly don’t understand the transparency and power of open source code. Fair.


Helping me to understand would be appreciated and in the spirit of HN. Mocking me? Not so much.


Open source is like open courts or the Right to Information.

Just as anything going through secret courts is bad for judicial integrity, and just as RTI laws can help keep a government somewhat honest, open source can help like any other transparency framework.

Transparency alone is not a magic solution; open source alone is not going to solve everything. It is just one among many controls we need.


Misleading title - the article is about Pegasus and doesn't once mention open-source anything...


That's definitely a twist made by the person who posted the link here.


As I understand it, the GSM and Bluetooth modules are closed source because of patents etc.? We would need open hardware for the entire phone; then the software would come.


That's the opposite of how patents work; if there were patents involved they'd be public record and we could look them up.

No, these modules are closed because it's simpler than making them open. Someone is getting ready to type "FCC and other regulatory bodies prohibit consumer reconfiguration of specific certified radios" but that has nothing whatsoever to do with openness. Being able to monitor something and being able to configure it are not the same thing.


But, so, in short, we are screwed for practical open radios? Or is there a way out?


We can make open systems from scratch: https://osmocom.org/projects/baseband

or we can try to reverse engineer existing basebands, but I'm not aware of any successful projects working toward that.


I want privacy. I also sort-of buy the "nothing to hide" argument - as another comment below says, for most people risk-adjusted cost of privacy loss is greater than the cost of maintaining it.

But this article writes about the very people who have plenty to hide (for good reason!). I think it's a bit misleading to say investigative journalists have "nothing to hide" - confidential sources, on-going stories, contacts, whereabouts etc. Mixing this up, in my eyes, is not helping the "privacy for the masses" adoption.


More than having nothing to hide, you and I have nothing worth exploiting our devices over - yet.

However, as exploits get cheaper and mass surveillance gets easier, that statement becomes less true for more and more people.

In the NSO target list for India I am seeing all sorts of people, like virologists and journalists, whom I wouldn't have thought were doing anything important enough to be tapped. That surprised me more than the tapping itself.

Sooner or later, either we will be worth slightly more than the cost, or the cost will become cheap enough.

However, at that point it will be too late. As the infamous quote goes, "first they came for the communists/Jews".


Won't help unless we have open chat protocols.

All the open phones in the world won't help if you use closed-source WhatsApp / Facebook. And you kinda have to if you want to talk to your less tech savvy friends and family.


And open chat protocols won't help because we don't have open source smartphones :) Maybe this battle is on different fronts and there are many factors in play.

In the EU there is a law in preparation that will force big players in chat networks to open up to third parties: The Digital Markets Act (DMA): https://ec.europa.eu/info/strategy/priorities-2019-2024/euro...


We could also just jail the buyers, and the NSOgroup and FinFisher.


That sounds great, but given the tribal nature of open source and the disparity in development effort, I will hold on to my cash.


Right to repair, to me, starts with unlocked bootloaders so that we can run non-broken OSes. Having an open source system would be a huge upgrade in repairability.



