
I just can't authorize an app to have full control of my phone if it's not open-source.

What guarantee do I have that you are not selling all my user data?

DigiPaws has the headlining feature of the app advertised here, and is open source.

https://github.com/nethical6/digipaws


Not that I suspect maliciousness in the case of digipaws or OP, but does the app's code being open-source actually guarantee any security? Is there anything forcing the app I download to be consistent with the repo on Github?

The readme clearly directs the reader to the F-Droid package, which is built on F-Droid's build servers and signed with their APK keys. This doesn't answer the security question directly, but it's the same model as, say, Debian repos: there are eyeballs on it from independent third-party packagers, who use code scanners and manual review to detect malfeasance, and who often have to tweak builds and code to strip unwanted things present in some upstreams.

Even better: if the build is reproducible, that guarantees the binary distributed by F-Droid was built from exactly the source code in the repo.

It doesn't guarantee any security, but it is necessary for you to be able to gain confidence in the security in a reasonable time frame. And if you need a guarantee that the source matches the binary, you can build it yourself.

Not really. I guess to be 100% sure you need to build the app yourself. I don't think build attestation exists on the Play Store. You'd probably need to openly build and upload the app via a CI runner, print all the hashes inside that runner, and then the Play Store would also need to display those hashes before you download; but that doesn't exist for Play Store downloads yet.
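If you did build it yourself, the check would boil down to comparing digests of the two APKs. A minimal sketch in plain Java (the file paths are hypothetical, and note that reproducible-build verification in practice compares APKs after stripping the signature blocks, since your local signing key will differ from the publisher's):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ApkDigest {
    // Hex-encoded SHA-256 digest of a byte array.
    static String sha256Hex(byte[] data) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(data);
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical paths: the APK you built locally vs. the one you downloaded.
        String local = sha256Hex(Files.readAllBytes(Paths.get("app-local.apk")));
        String store = sha256Hex(Files.readAllBytes(Paths.get("app-store.apk")));
        System.out.println(local.equals(store) ? "MATCH" : "MISMATCH");
    }
}
```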

I understand your point.

The short answer is that, indeed, it comes down to trust, and I really understand and respect your perspective.

The long answer is that it's very unlikely this trust would be broken. Let me explain:

Firstly, the accessibility service doesn't provide anything close to "full control." It's just an API provided by Android that delivers accessibility events, like changes in the screen layout, together with the UI nodes present on the screen, which the app uses to infer the type of content being shown (Reels in my case). You can check online for details on accessibility events. It's nothing like a constant screen recording where the app gets all your data.
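To make that model concrete, here is a plain-Java sketch of that kind of inference. This is not the actual ScrollGuard code: the view resource IDs below are made up for illustration, and on Android this logic would live inside an AccessibilityService's onAccessibilityEvent callback rather than a standalone class.

```java
import java.util.List;

public class ReelsHeuristic {
    // Hypothetical view resource IDs that a Reels-style screen might expose.
    private static final List<String> REEL_MARKERS = List.of(
            "com.instagram.android:id/clips_viewer_view_pager",
            "com.instagram.android:id/clips_video_container");

    // Given the resource IDs of the UI nodes reported by an accessibility
    // event, decide whether the screen looks like a Reels feed. No pixels,
    // audio, or keystrokes are involved: only layout metadata.
    static boolean looksLikeReels(List<String> nodeIds) {
        for (String id : nodeIds) {
            if (REEL_MARKERS.contains(id)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(looksLikeReels(
                List.of("com.instagram.android:id/clips_viewer_view_pager"))); // true
        System.out.println(looksLikeReels(
                List.of("com.instagram.android:id/profile_header"))); // false
    }
}
```

The point of the sketch is that the service only ever sees node identifiers and layout change notifications, which is much narrower than "full control."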

Also, Google is very strict with these permissions. When you publish an app on the Play Store, you need to clearly disclose why you're using those permissions. If you do something wrong or try to abuse this, they will take your app down. Anyone who values their reputation wouldn’t attempt something like this just to sell some user data.

Lastly, ScrollGuard doesn't need to connect to any server to work; all the detection happens on the device. So if you want to be extra cautious, you can always go to your phone settings and block internet access for ScrollGuard. It will still work, and without internet access it is impossible for it to export any data.

If you want even more control and just need a solution for Instagram, you can modify the app yourself. I wrote an article a couple of years ago on how to do this here: https://breakthescroll.com/block-reels-instagram/


What guarantees do you have that open source code faithfully reflects what is in the compiled binary?

The idea is that you download the source, review, and then build it yourself.

It’s easier for security researchers to check

must be a rather useless device you have there then...

The device has many eyes on it. Random apps don't.

> full control of my phone if it's not open-source.

Bro is using social media that listens and records tons of data in the background.

https://www.hipaajournal.com/jury-trial-meta-flo-health-cons...


I understand the position, but I think that's a silly concern here. This is an app that stops you from using social media features that absolutely farm every bit of data out of you they possibly can.

Feels a bit like being afraid to install a smart lock on your front door, so instead you leave it unlocked all the time.


This is a bad take. As much as I don't use social media at this point, people need access to good tools to curb its use, and in this case, "good" means "open."

Can you elaborate why? It sounds to me like we agree. People need access to good tools to curb use, and all else equal, open is definitely better than closed. I'm just saying that I'd rather have an effective closed tool than no tool at all.

It does sound like we agree, but my main issue is the further shifting of the (for lack of a better word) Overton window around when closed software is acceptable.

For all its flaws (and despite my general ire towards them), the FSF has done one thing really well over the years, and that's keep the conversation alive around open-source software (which, in turn, has landed us at what I consider to be a really good compromise of a ton of high-quality source-available software).

The FSF isn't pulling as hard as it used to for a variety of reasons, but I think it's important to keep the pressure on and in cases like this, it's really easy to take the stance that at least source-availability shouldn't be compromised on, since the app presumably needs very broad permissions and capabilities from the OS.


Social media apps don't have the same level of permission to detect scrolling even when they aren't being used. This app does have that higher level of control (accessibility service) and so should be subject to more scrutiny.

I am afraid to install smart locks. Too much goes wrong with software. I would install a regular lock instead.

I got locked into my (100+ y/o) house due to a smart lock soon after purchase. It got promptly removed. I'd much rather leave the door unlocked.

A lot of the discussion is about the security of these devices (resistance to false open states). But most of the time the safety side (false closed states) has even higher stakes associated with it. Having to wait because some API server is slow is annoying, but it can quickly become life-threatening in a different context. Fail-safe vs. fail-secure is (imo) often overlooked and probably just as important as the actual implemented security.

Wait, are there smart locks that depend on the availability of some API service to even open the door? I'd rather call those stupidlocks instead. I mean, just because you're an IoT device, it doesn't mean you are smart, ffs.

I have been locked out of more than one Airbnb because a lack of cell service meant I couldn't get the codes for the locks. It is very annoying and dumb.


