Hacker News

The challenge is trust: can you trust that a third-party ROM hasn't been tampered with? The third party maintaining the build for your device is often some guy on the internet you probably only know by a pseudonym. It might be the latest version, it might have some modifications that 'distro' of ROM applies to all their builds, plus some required per-device changes to make it work, but what degree of confidence is there that nothing else changed and the OS isn't lying to you?

The same applies to most open source efforts, but I think it's understandable for institutions whose apps have real consequences (like handling money and bank accounts) to opt out of potentially being undermined by the OS when they have the means. There are also elements of phones being general-purpose computers now vs. locked-down appliances, GPLv3 vs. Tivoization, and so on.



But if you're going to worry that a custom ROM might be compromised, shouldn't you also be worried that the stock ROM is known to contain unfixed critical security vulnerabilities? Wouldn't any reasonable argument for forbidding the former also forbid the latter?


There are cases every day where an employee of a company can demonstrate that A is at least as good as B, but getting permission to do A is still impossible.

There's simply no universe where an employee at a typical F500 company is going to be able to convince the IT team to permit custom code on phones, even if the logic of the argument is sound. Even if they found the right person, persuaded them, and got them to persuade their management, in-house legal teams, compliance teams, etc., I seriously doubt the MDM solutions (e.g. Intune) would even have a checkbox to override the policy on one phone. To say nothing of the ongoing cost of tracking that one employee's special setting.

This also plays a bit into the argument of why integrating your personal device into the corpo ecosystem of your employer is not a decision to be taken lightly.


> There's simply no universe where an employee at a typical F500 company is going to be able to convince the IT team that they should permit custom code on phones even if the logic of the argument is sound.

That's my point: I know that's how it is today, but the reasons for it being that way are illogical.


I'd say when something like a bank is involved, they need something more substantial to point at for their insurance arrangements if/when something goes wrong. With "our app allowed a $5000 transfer on the user's stock Samsung", it's far easier to grasp the state of the system they're playing in than with a user-modified one.

Also, knowing what security risks are in play with an N-year-old OS lets them decide whether to allow the user to proceed or to work around it. Some applications are more aggressive about what OS version they need, and that's a driver for upgrading phones, for pushing the OEM to keep a phone in support longer, or, if this catch-22 weren't in the way, for third-party ROMs.


> Some applications are more aggressive in what version OS they need

I've seen plenty of apps that need a minimum major version of Android, but never any that have needed a minimum security patch level.
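For what it's worth, the OS does expose enough for an app to do that check: since API 23, Android reports the device's patch level as `Build.VERSION.SECURITY_PATCH`, a "YYYY-MM-DD" string. A minimal sketch of what such a gate could look like, with the Android call replaced by a plain parameter (the method name and the cutoff date are my own illustration, not from any real app):

```java
import java.time.LocalDate;

public class PatchGate {
    // On a device you'd pass Build.VERSION.SECURITY_PATCH here; it's a
    // plain parameter in this sketch so the comparison is testable off-device.
    static boolean meetsMinimumPatchLevel(String devicePatch, String minimumPatch) {
        // Android formats both as ISO dates ("2023-08-05"), so LocalDate
        // parses them directly and the comparison is a simple date ordering.
        return !LocalDate.parse(devicePatch).isBefore(LocalDate.parse(minimumPatch));
    }

    public static void main(String[] args) {
        System.out.println(meetsMinimumPatchLevel("2023-08-05", "2023-06-01")); // true
        System.out.println(meetsMinimumPatchLevel("2021-01-01", "2023-06-01")); // false
    }
}
```

So the rarity of patch-level gating looks like a product choice (it would lock out users on abandoned but otherwise stock devices), not a technical limitation.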

