There are a ton of products on the market that are vastly more dangerous than computers: guns, cars, motorcycles, bicycles, chainsaws, table saws, cigarettes, alcohol, junk food. Yes, consumers do sometimes harm themselves by using these products. That's the price of freedom. I think it's bizarre that we treat computers as the most dangerous products in the world that for some reason demand paternalism, when none of these other products are locked down by the vendor.
The reason that computers are locked down by the vendors is not that computers are somehow more dangerous than other things we buy. The reason is simply that it's technically possible to lock down computers, and vendors have found that it's massively, MASSIVELY profitable to do so. It's all about protecting their profits, not protecting us. We know that the crApp Store is full of scams that steal literally millions of dollars from consumers, and we know that the computer vendors violate our privacy by phoning home with "analytics" covering everything we do on the devices. This is not intended for our benefit but rather for theirs.
If protection of the casual user was an argument, there would be an easy option to unlock your system, be that phones or desktop computers.
But on many systems these options do not exist because the vendor likes people dependent on them. This is why devices like Chromebooks or all mobile phones are more or less e-waste in the making. In my opinion it is a waste to spend any development capacity on these systems, beyond treating them as consumer devices for the next shitty app that hopefully always stays optional.
We even have dysfunctional laws that require banking apps to only run on these shitty systems. In my opinion, these errors need a quick correction.
Also, most scams still work as they did before, and the exfiltration of information, e.g. tracking and "diagnostic data" by bad operating systems, is an additional security problem on top of that.
> If protection of the casual user was an argument, there would be an easy option to unlock your system, be that phones or desktop computers.
Making it easy to unlock could make it easy(er) for scammers to get it unlocked:
> I received the same type of call a little later in the day. They were very adamant they were calling from the Bell data centre, on a terrible line and I made them call back three more times while I considered their requests. They wanted to have me download a program that would have given them control of my laptop. […]
> Making it easy to unlock could make it easy(er) for scammers to get it unlocked
Making laptops that weigh two pounds instead of 40 pounds could make it easier for thieves to steal them. Making computers less expensive could increase the number of spammers who can afford one and make it easier to send spam. Making encryption widely available could make it easier for bad actors to communicate.
But these things have countervailing benefits, so we do them anyway and then address the problems by a different means. When someone insists on doing it in the way that "incidentally" provides them with a commercial advantage, suspect an ulterior motive.
Easy doesn't mean without any warning; it just means that the device is unlockable by design and without the OEM's approval.
It would be reasonable to (a rough sketch follows the list):
- factory reset the device before unlocking it to protect existing data (like Android phones require)
- display warnings, for example "if someone's asking you to do this, it's probably a scam"
- allow the owner to permanently disable unlocking, e.g. the commonly cited example of someone setting the device up for their elderly parents
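For concreteness, here's a rough sketch of what such a flow could look like, in plain Kotlin (hypothetical names; a simulation of the policy, not any vendor's actual bootloader code):

```kotlin
// Sketch of the unlock policy above: honour an owner-set "never allow
// unlocking" flag, show a scam warning, and wipe user data before unlocking.
data class DeviceState(
    var unlockPermanentlyDisabled: Boolean = false, // set once by the owner, e.g. for elderly parents
    var bootloaderUnlocked: Boolean = false,
    var userDataPresent: Boolean = true
)

fun requestUnlock(device: DeviceState, ownerConfirmed: Boolean): Boolean {
    if (device.unlockPermanentlyDisabled) {
        println("Unlocking was permanently disabled by the device owner.")
        return false
    }
    println("WARNING: if someone is asking you to do this, it's probably a scam.")
    if (!ownerConfirmed) return false
    device.userDataPresent = false   // factory reset first, as Android's OEM unlock does
    device.bootloaderUnlocked = true
    return true
}

fun main() {
    val phone = DeviceState()
    println("Unlocked: ${requestUnlock(phone, ownerConfirmed = true)}")
}
```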
> allow the owner to permanently disable unlocking, e.g. the commonly cited example of someone setting the device up for their elderly parents
This opens a wormhole that warps us back to one of the core issues / battlegrounds in computing: ownership, and specifically, the balance of power and responsibility between the owner and the user, when they're not the same person.
Unfortunately, the same means and the same arguments cited in case of "someone setting the device up for their elderly parents" also apply to employers "setting the device up" for their employees (where "setting the device up" may just mean letting it access the company network), vendors "helpfully" "setting the device up" for the customers (this is basically the whole history of mobile phones - bootloaders now, SIM locks before), etc.
I don't know what the good answer is. I'm personally strongly biased towards the "end-user should always be the owner" perspective, and while I recognize there are strong examples where this isn't the case, I can't figure out how to cleanly separate "legitimate interest" from for-profit or for-power abuse.
The balance of power between a company and you, vs. you and another person, is vastly different, which IMO makes the issues distinct. Presumably the elderly parents are willingly relinquishing control to someone they trust, and they can always go out and buy their own phone if they don't want that.
In that scenario I think employers should have the right to make this decision since they own the device and it likely contains sensitive data and credentials belonging to them. But vendors selling devices to retail customers shouldn't be allowed to make that decision unless the customer explicitly asks for help.
I think it's pretty consistent: whoever legally owns the device should be allowed to decide what is and isn't allowed to run on it.
Yes, my point is that in practice, this gets abused. In particular, the possibility enables vendors to invent business models that rely on denying users ownership, and those happen to outcompete the fair, honest models.
It seems relatively simple to me. Only the owner is allowed to make these decisions, more or less. Employers can only do it to devices they own and provide. Phone vendors cannot do it, and cannot make any services contingent on doing it. A family member in this situation is reasonably acting on the owner's behalf.
And while we're at it, let's not allow apps to refuse to run because of rooting.
> factory reset the device before unlocking it to protect existing data (like Android phones require)
I never understood this point. What threat is it protecting the data from? Surely a thief should not be able to unlock a device without first typing the correct PIN/password, and if they can do that, they can access the data regardless.
In principle I agree, but the edge case I think has to be accounted for is that many people have weak PINs protecting highly sensitive apps (financial, banking) on their phones, and those apps could be backdoored with root access.
There have been times when I really wished that I could OEM unlock my Android device without wiping but overall I think I sleep better knowing that my PIN isn't sufficient to extract all of its data.
Unlocking should require a physical modification, like soldering a jumper or flipping an internal switch requiring disassembly. That would filter out basically all scam victims. If a scammer can teach a complete novice how to do micro soldering, they've earned their pay.
The Chromebooks that require removing a single internal screw are a fairly civilized example of this approach (might be a little harder to execute in a phone).
The Prusa Mini required you to snap a part of the PCB off to flash custom firmware. I actually like this approach: you have to very deliberately break a part of it to signal that you know what you are doing.
> But on many systems these options do not exist because the vendor likes people dependent on them.
Dependence is not exactly the right explanation here; lower support costs probably is. If a vendor gives out root access, and that root access can brick a machine, then you will get a small percentage of very high-touch broken devices coming back as returns. Customers like this are in the 'dangerous enough' but not 'good enough to do it correctly' stage of hacking. They will then not claim any responsibility for breaking it, as they are hoping you just fix it for free.
I had one customer who would randomly change out stored procedures in our code, then yell at our tech support about things not working or being broken, wasting hundreds of hours of time until we realized what he was doing. Locking him out was very appealing. Instead we sold him and his management on 'we will do the work for you for a fee', which was more along the lines of 'you do this again and we will fire the customer'.
Damage caused by the customer isn't covered by any warranty anyway, and realistically, how many people would tinker with root access as long as the device worked as intended?
I'd be really surprised if the number was more than 1 in 100. And if 1 in 20 brick the device in the process, that's 1 in 2000.
According to [1] the average warranty claims rate for consumer electronics is 1 in 100. I doubt the difference in support load would even register on the scale.
> Damage caused by the customer isn't covered by any warranty anyway
Exactly. We charged the guy for what he did. We gave him 'sa' access to the database and he tried to burn us.
I think you may be assuming people act rationally? They do not. Most will, but you will always get 'that guy', especially at scale. People will lie about what they have done, or not even realize what they did goofed things up. In my example the guy was asking us to pay them back for defective software (millions of dollars), right up until we proved he had broken it on purpose. I later found out he did it on purpose (confirmed by former coworkers of his: 'he likes to mess with vendors'). He was not even alone. At least 3 other people tried that trick on us at different companies.
Most service requests are 'easy': small tweak/reship and off you go. But someone who has really broken something can range from 'ship them a new one' to weeks of trying to figure out why a device has suddenly started acting out of spec. That means at least 1-2 people working on something for a period of time. That costs money.
> I'd be really surprised if the number was more than 1 in 100.
It is the time you have to put into figuring out why you ended up with a defect that is not a defect. The margin on some of these IoT devices is in the couple-of-bucks range or smaller. Having to dedicate 2 guys for 3 months to figure out what went sideways can eat the entire profit margin of the whole run.
I was just saying I can see why a company would withhold the info. I did not say I agree with it, especially for things that are out of warranty. I think companies are using it to basically have no support, leaving what would be a decent customer hanging and hoping they can convert them to another sale. There is no 'one reason'; there is a list.
It does seem like there ought to be a reasonable split between personal software and business stuff. I mean, you guys had a big contract; it is some negotiated thing between two peers, so it could be reasonable to negotiate root in some subsystems and not in others. In the end you can't really trust anything a system tells you if somebody has full root on it. It seems like you guys keeping control of the logging would be a reasonable give for them, if they expect support. (But why would you guys have planned around a downright adversarial customer? That guy is weird.)
Also, doesn't this seem like… basically some kind of fraud? I wonder if your annoying user expected to be able to add whatever savings he got back from the support contract to his "value to the company" somehow.
For personal customers who are just buying smartphones, we don’t really have giant support contracts to screw around with.
It was frustrating, as it was me who got to spend weeks figuring out what this guy did. My group figured out the root cause, though: the software was not doing what he wanted. So we made up a new group to sell that custom service to others. Everyone eventually came out ahead there, because someone in his management chain realized that we had a good breach-of-contract case. 'Weird' is a nice word for what he was doing. He was being a jerk because the stuff was forced on him; it broke the small empire of spreadsheets he was holding the company hostage with. Our 'mistake' was assuming our customers were rational. Many are. But you always have a handful that seem to just be in a bad place in life and they like to take it out on others.
For IoT devices/cell phones/etc. it could be 'bad', from a company PoV, to give out the root password, as there are so many devices out there with the exact same password across several thousand units (poor security, but you can image a thousand devices in a few hours). Once given out, it gets written down in some wiki and everyone has it (welcome to the botnet). So if you get one, change whatever you were given and assume everyone else has it. Or maybe the 'secret sauce app' is under some random user account; give out root and that special secret account is bypassed, and then it is off to China somewhere to be ripped apart and resold under a new brand name at half the cost.
Then on top of that, let's say you are a nice company giving the thing out. That means you will need some sort of training for your support guys, documentation on how to do it, and so on. Those things cost money for an EoL product you no longer make anything on.
Like I said, there is a list of reasons not to do it. There is also an interesting list of reasons to do it. But the upside for the company in allowing it is low. I wish more companies would do it, but it is rare.
If people want companies to do this, the companies have to be incentivized to spend any time/money on it. If people can make this an upside for the companies, rather than 'shame' and 'you broke the law', the companies will help.
It's an old anecdote, but years ago Samsung refused a warranty claim for a _failed USB port_ that would no longer charge the phone _because I had rooted it_ and the fuses were burnt. I think this was unreasonable of them, but it's not like I had any recourse. If vendors were really worried about this aspect, they would/could implement such draconian policies.
That only explains why a company wouldn't want to provide free software support for software they didn't write. There are at least two alternatives to that. First, sell hardware the user can replace the software on, or that doesn't even come with software, and then don't provide software support at all. Second, provide software support and bill by the hour, in which case the customer messing up their stuff and calling your support is the opposite of a problem.
You can even combine them if you want. Free support for the software that comes with it but if you replace the boot loader then support calls are billed hourly. There is no excuse for not allowing it -- it's leaving money on the table.
Unless the reason is that locking the user out of the device has the purpose of monopolizing ancillary markets, which should be an antitrust violation.
Before we put all the blame on vendors, I submit to you, ladies and gentlemen, this: the public finds this tradeoff (privacy for entertainment) completely acceptable. With all the outrage, privacy-centric solutions are out there and relatively easy to find, so how come they don't get more traction? Including among the HN crowd?
There is nothing inherent to the benefits that these companies tout that require them to lock us out of our own devices.
What you are describing is not a tradeoff but a magnificent bribe. They bribe us with measly benefits in order to accept the deal that is incredibly favourable for them.
I'd argue the general population doesn't even know this trade off exists (not helped by the pros being advertised to users and the cons purposely not mentioned). Even then the minority (us) shouldn't be stopped from doing what we want with our stuff just so some company can make more money.
> privacy-centric solutions are out there and relatively easy to find
Really? Please name them. Over the past 10 or 15 years, I've never seen anything other than the iPhone/Android or Mac/Windows duopoly for sale in any retail store. I've never seen any advertising for other than those duopolies. The HN crowd may be aware of obscure options, but for the vast majority of consumers, they don't exist. And since we as developers make money catering to the vast majority of consumers, we're kind of stuck with the duopoly too, at least as far as our work is concerned.
And as for "why are they not selling this in every retail store?", the answer is the same - because if they were, no one would buy them. I find the situation curious: while everyone complains about it, only very few people are trying to do anything about it. Perhaps the breaking point has not been reached yet, and something big has to happen to change people's perspective.
> And as for "why are they not selling this in every retail store?", the answer is the same - because if they were, no one would buy them.
That's purely hypothetical. How could anyone prove or disprove the assertion?
The general point, though, is that consumer awareness is essential for sales. People won't buy things that they don't know about. As an indie developer myself, I'm painfully aware of this. It doesn't matter how great one's product is if nobody knows about it. Advertising is very expensive, so it requires vast capital outlays in order to get your products into the minds of consumers and onto the shelves of stores. The big established brands have a massive advantage, making it difficult for competitors to break into the market. Apple itself leveraged its existing brand, with Mac and iPod, in order to promote iPhone. And Apple's primary competitor is Google, who also was already an established brand via search and Chrome.
Remember that back in the day, Microsoft almost destroyed the entire desktop OS market. They almost killed Apple too. Only the Department of Justice put some kind of brake on it, and Microsoft let Apple live in order to provide antitrust cover. If MS had for example simply withdrawn its apps from Mac—Office, Internet Explorer (remember that Internet Explorer was originally the default web browser on Mac OS X before Safari!)—Apple likely would have died.
It's not just about familiarity. People are willing to try new things. The actual problems are network effects and vendor lock-in.
The hardest part about switching from Facebook isn't installing some other app or anything like that, it's getting everyone else you know to switch from Facebook.
The hardest part about switching from Windows isn't installing Linux, it's getting e.g. game developers to target Linux before it has significant consumer market share.
That isn't to say that doing these things is impossible, but it certainly isn't trivial, so anyone wondering why it hasn't happened already can't seriously think the only explanation is that nobody cares. It's like saying nobody cares about high healthcare costs -- of course they care, the question is what do we have to do to fix it?
I'm glad that Fairphones are available in stores right next to Xiaomis, but they cost three times the price for half the specifications. It may plausibly be cheaper to buy a Xiaomi phone and then personally sue Xiaomi to get it unlocked than to buy a Fairphone.
I've never even heard of that before, and I'm terminally online.
Anyway, desktop computers aren't really the main problem here. For example, Apple Macs offer vastly more personal freedom than Apple iPhones. If iPhones behaved like Macs in that respect, then we might not be having this debate. To the extent that Macs have been increasingly locked down over the past 15 years, it's mostly just copying the iPhone, porting the "features" over from one platform to the other.
FWIW, I have no idea if this is any good. My point is, I found this after maybe 3 minutes searching. If we were to spend 30 minutes, we would definitely find something reasonable.
I'm using a GNU/Linux phone (Librem 5) as a daily driver, and it has a lot of rough edges. Root access is a no-brainer (it basically runs Debian), but a small company making them can't possibly provide the Apple experience.
That's fair. What kind of rough edges did you find? I think I'm OK without any Google services, because I can simply keep another phone just for those and banks/trading platforms.
Thanks! Looks like some of the concerns are legit. I guess I'll carry two phones if I buy this one. The web browsing experience is the most concerning one -- if that's bad then I might as well use a dumb phone.
I have no data to back this up. So what follows is purely my personal opinion.
I think the reason people don't care is because they don't know. The average person either doesn't know or barely knows that anything deeper than what they see in the user interface is happening on their system.
We humans are very much an out of sight out of mind type of creature. If we can't see it, it's hard for us to imagine that it exists.
People know: Facebook and Google getting crap for all their tracking is evidence enough.
The reason people don't care is because digital freedom/privacy is largely irrelevant to most people's lives. You can't convince someone to care about something that doesn't affect their life, they're too busy for that.
Exactly. Even the people who complain about these things immediately get defensive when you call them out on their uses: "Well, I can't switch because what about my banking app?" or "Well, games don't count as software to me." or "It won't make any difference to the big tech companies if I'm the only one who switches, so why bother?"
I believe GP is referring to things like privacy-centric de-Googled Android phones, which definitely are an option. I would not classify those as "least bad" or even bad.
GP is correct about Apple products; even among the HN crowd they are likely the most popular devices. I think this is because most readers aren't trying to die on the hill of openness. They're more concerned with software and ubiquity, two areas where Apple is doing very well.
You do get many here enthusiastic about open access to your own hardware, but I think we're talking about a Venn diagram; we're not all the same. (I'm an Android user.)
Actually, I was disagreeing with the GP specifically about Apple products. I'm an Apple user, but very much because they're the "least bad" option. De-Googled Android phones still have awful audio latency (I'm a musician who makes a music app on the side), very limited messaging and notification features, and integrate poorly with desktop OSes. For how I use my devices, open or no, Android simply isn't a viable option.
The thing about all this is, Apple's products being well-integrated and well-designed doesn't require them to be locked down the way they are. The EU move to force them to use USB-C/Thunderbolt over Lightning is a perfect example of this. It unilaterally improved things for users, and iPhone 14 vs. 15 sales reflected that pretty clearly.
So I'd especially describe Apple as the "least bad" rather than "completely acceptable." They're specifically what I had in mind when saying that.
> Apple's products being well-integrated and well-designed doesn't require them to be locked down the way they are.
That's definitely true, and it's what has made me favor Google over Apple for decades now. Google's deal has been free software for the price of your user data, but I've accepted that deal because Google has never practiced predatory lock-in. Apple makes claims to value your privacy (I wouldn't know) while making predatory lock-in fundamental to everything they do. Denying access to your own device is part of this.
The irony is that I loathe the data economy. I think it has gone far beyond what Google ever envisioned (for years it seemed they had yet to discover a way to make money at all). The privacy aspect matters, but I also hate the way it makes companies and their products behave; the way it feels like every click results in an attempt to directly advertise to you. And it's all clumsy and broken. How often are ads even correctly targeted? I feel about conglomerated user data the way I feel about meme coins: it's all built on speculation, hopes, and dreams, and has less to do with people actually buying your product. I can't wait for the bubble to burst and/or for a global ban on the sale and purchase of user data.
I think we're very much in agreement on most of these things, and our "platform loyalty" led us to perceive different options as the "least bad" - that's totally okay, though! I was an Android user from 2009-2020 because I agreed with you, up until I started working on my own music software, which pushed things the other way for me.
For your last sentence, though... user data and its utilities are arguably not a "bubble." And as we've seen with AI training, use of data being illegal doesn't really stop companies from doing it. I think we'll have better actual results from governments forcing Apple to let us run our own software on the hardware we buy, as opposed to governments trying to prevent Google, Meta, et. al. from abusing customer data.
A lot of this has to do with the fact that the former is about regulating our rights with hardware, while the latter is about software. Hardware is just easier for governments to regulate. When you try to regulate software, companies will do things like the deliberately-annoying cookie popups we got after GDPR/CCPA, because it's cheap to produce lots of bullshit to experiment with ways around those regulations.
This isn't about privacy. Not directly anyway. This is about your right to have control of your own property.
You make a fair point though; the case does need to be made as to why this is a market failure and not just consumer choice working as expected. Why _do_ consumers tolerate manufacturers retaining ultimate control of consumer's property after the sale? It certainly doesn't seem to be that important to them. Maybe greater awareness of the issue would help somewhat?
> Why _do_ consumers tolerate manufacturers retaining ultimate control of consumer's property after the sale?
Just my opinion from many conversations with normies about this: It's because most of them don't know (the marketing material from these companies certainly doesn't advertise it), and the ones who do know don't care because they wouldn't be able to (technical knowledge) or want to root/unlock and utilize the capabilities.
> the ones who do know don't care because they wouldn't be able to (technical knowledge) or want to root/unlock and utilize the capabilities
This is a good point. Some of that is perhaps self-perpetuating: Why root if there's nothing you can do with root? And why develop stuff you can do with root if there's nobody who can use it? If there weren't so much active suppression of software freedom by manufacturers maybe the situation would change and the benefits of consumers having full control of their devices would be more apparent.
And ironically, it was the jailbreakers who demonstrated to Apple why the company needed to add third-party apps to its platform that originally didn't allow them.
Agree fully. Don't know why you're being downvoted. I accept the risk or tradeoff of Apple or MS spying on me. It's not that, but the right to repair, to tinker, to hack. Those things have brought us so many interesting, wonderful things. My entire generation (millennial) has superior tech literacy to both those who came before and those who came after (no shade to the older gen - some of you are better than us, but with millennials it's so much more widespread than e.g. Gen X). Many in the younger generations never use "real" computers (only tablet & phone). The gilded age was an anomaly, and is over.
> the case does need to be made as to why this is a market failure and not just consumer choice working as expected
I swear this consumer choice navel gazing will be the death of innovation. The US is obsessed with this narrative, that the magic market hand will self-correct, without any justification or scrutiny. Yes, consumer choice is necessary, but not sufficient. Just look at the developments in tech over the last decade+. I don’t have the solution but anyone who’s not entirely lost in dogma should be able to see the failures.
Market failures do happen, so I'm not claiming consumer choice is the perfect solution in every case. But consumers aren't stupid either; if this _were_ a mainstream concern the market _would_ self-correct. But it hasn't self-corrected on this issue, because most consumers don't really care that much. So I think you have to carefully consider why that is before you start thinking you know what they want better than they do and eliminating certain choices by government decree.
There are costs to any regulation, and lots of possible unintended consequences. So even though I'm personally a strong advocate for user control and software freedom, I'm wary of acting without strong justification and careful consideration of the underlying reasons behind the status quo.
> I accept the risk or tradeoff of Apple or MS spying on me.
For what it's worth, I do think this issue has indirect effects on privacy. If you have ultimate control of the software on your device, you can use that control in ways that help protect your privacy. Otherwise you're limited to whatever protections the manufacturer decides to grant you.
There are lots of similar positive possible downstream effects of software freedom, which is why I think this is an issue worth serious consideration despite my misgivings.
> if this _were_ a mainstream concern the market _would_ self-correct.
The underlying premise here is that the alternative is available for consumers to choose, i.e. that you can buy something which is otherwise equivalent to an iPhone but supports third party app stores or installing a third party OS. But that isn't the case.
What you get instead is e.g. Fairphone, which has the specs of a $200 phone but costs $800, and if you actually have root, your bank app might break, etc. And still many people buy it. So all you can conclude from this is that the price the mass market places on freedom is less than $600 plus some non-trivial usability issues, not that they value it at zero and don't care about it at all.
On top of this, it's a threshold issue. If the median phone was rooted, people would develop apps that need root. When the percentage is some low single digit, if not a fraction of a percent, they don't, and then taking the trade-offs of a phone that can be rooted isn't buying you what it should, because you need a critical mass in order to achieve the expected benefits, but you need the benefits in order to achieve the critical mass. This is the sort of situation where a mandate can get you over the hump.
> There are costs to any regulation, and lots of possible unintended consequences.
A good way to handle this is through anti-trust, because then you can do things like exempt any company with less than e.g. 5% market share. That means not Apple or Google or Samsung, but if there is any major problem with the rule then the market can work around it by having 20+ independent companies each provide whatever it is that people actually want. Meanwhile that level of competition might very well solve the original problem on its own, because now a couple of them start selling unlocked devices without any countervailing trade offs and that's enough to make the others do it.
Was that when they said "instead of uploading the images to our servers to do the CSAM scan, we'll do a quick once-over in the privacy of your own phone to see if we can allow-list your photo"?
And then the whole world suddenly went apeshit, so Apple basically shrugged, said “fine, we’ll do it just like everyone else and put your photos in the relatively unprotected server domain to do the scan”. Sucks to be you.
Understand that at no point was there an option to not do the scan on upload; like all cloud providers, Apple scans any uploaded photos for CSAM to stay out of any government grey areas.
A server is someone else's device. Your phone is your own device. So no, doing the scan on your own device and making your device your potential adversary is not better than doing it on the server. You can always choose not to use the server.
Apple only ever scanned images being uploaded to the server. They were only ever going to scan images (even if it was done on the local device) if they were uploaded to the server.
On the one hand you have:
- do the scan in private, get a pass (I'm assuming we all get a pass), and no-one outside of your phone ever even looks at your images.
On the other hand, you:
- do the scan on upload. Some random bloke in support gets tasked with looking at 1 in every 10,000 images (or whatever) to make sure the algorithm is working, and your photo of little Bobby doing somersaults in the back garden is now being studied by Jim.
If you never uploaded it, it was never scanned, in either case.
So yes, you've lost privacy because faux outrage on the internet raised enough eyebrows. Way to go.
It also significantly hampers progress and the utility of tools themselves.
This is Hacker News after all. What made the computer great was programs. What made the smartphone great (smart) is applications. It's insane to me that these companies are locking down their most valuable assets. The only way this works is if you're omniscient and can make all the programs users could want yourself. This is impossible considering both individuality and entropy (time): time marches on, and you have neither the time nor the infinite resources to accomplish all that. I mean, we're talking about companies that didn't think to put a flashlight into a phone, yet it was one of the first apps developed. You could point to a number of high-utility apps, but I'm also sure there are many you all use that you're unlikely to find on most people's phones.
We can also look at the maker community. It's flourished for centuries, millennia even. People are always innovating, adapting tools to their unique needs and situations. To some degree this is innately human, and I'm not embellishing when I say that closed gardens and closed systems are dehumanizing. They limit us from being us: the person obsessed with cars who makes a sleeper Honda Civic, the person who turns trash into art, the person who finds a new use for everyday objects. Why would you want to take this away? It even hurts their bottom lines! People freely innovate and you get to reap the rewards. People explore, hack, and educate themselves, dreaming of working on your tech because of the environment you created. By locking down you forgo both short-term and long-term rewards.
I also want to add that we should not let any entity that does not create open systems claim to be environmentally friendly or climate conscious, no matter how much recycling they do. Because it is Reduce, Reuse, Recycle. In that order. You can't reuse if your things turn to garbage, and reusing certainly plays a major role in reducing.
This!!! Sustainability is a huge aspect that seems to be getting lost in the broader discussion. Locked devices are leading to an incredible amount of e-waste, and it's entirely preventable.
People living in "bad neighborhoods" have to spend more energy and money on locks, fences, security cameras, self-policing as to not go out alone after dark, etc.
Problem is, the Internet (and the international phone system, to a lesser degree) makes everything so much closer that scammers from halfway around the globe are "local" for all intents and purposes. Thus, online, every neighborhood is a "bad neighborhood".
And there are plenty of laws in many countries on how to use them: seatbelts, helmets, chain gloves, plastic covers, minimum ages, licensing exams, ...
Failure to obey them might mean jail time in those countries if you're caught disobeying, or a hefty fine, not counting what the misuse itself might bring, regardless of the country.
By "we" I meant online commenters debating the issue of tech company device lockdown.
I didn't mean "the law". To the contrary, the submitted article author was proposing that we pass laws giving greater individual consumer rights over their devices. But the big tech companies have been viciously fighting against consumer rights, such as the right to repair.
This is also strange, as the commenters don't propose measures that would correspond to viewing computers as more dangerous than guns (lockdowns aren't that), but unlike with the law, I don't have a good simple illustration of that.
Vendor lockdown is that. Defenders of vendor lockdown argue that computer users need to be protected paternalistically from themselves.
For some reason we accept that for computers, but nobody would accept refrigerators and ovens that only allow you to eat healthy foods, nobody would accept homebuilders controlling the doors of your house and having to approve anyone who comes in, etc. Why do computers get this special treatment of vendor lockdown, but not any other product?
Wow, OK. If you think this is on par with the lockdowns that the commenters support for guns (which I've previously proxied as roughly the existing restrictions), then I'm not sure what to say.
> Why do computers get this special treatment of vendor lockdown, but not any other product?
Of course they don't; plenty of other products are treated much more seriously by "us" (supporting lockdowns that limit your own use without supervision), some of which you've already listed.
You appear to be conflating two different things: legal mandates and vendor lockdown.
There are legal mandates regarding the sale and use of certain products. For example, you have to be a minimum age to buy cigarettes and alcohol, stores in some localities can only sell alcohol during certain hours, bars have to close at a certain time, you can't drive drunk, you must wear a seatbelt, you can't exceed the speed limit, etc.
But there are no vendor lockdowns in this regard. A cigarette will allow anyone to smoke it, a container of alcohol will allow anyone to drink it, your car still works if you're drunk and don't put on your seatbelt, etc. If your car made you take a breathalyzer test whenever you wanted to drive, or it didn't allow you to exceed the speed limit, that would be vendor lockdown.
I discussed the issue in another comment: "The equivalent would be if you could only use specific brands of replacement chains, blades, tires, or bullets that are approved by the manufacturer, for which the manufacturer gets a cut of the sales of those replacements." https://news.ycombinator.com/item?id=42684134
A cigarette will not allow anyone to smoke it if the vendor has locked down sales and can't sell it to you, so you don't have it in the first place.
This is a much more serious restriction of consumer freedoms than if anyone can buy and use, but can't smoke when drunk and in bed (fire safety) due to some other lockdown mechanism built into the cigarette itself. More people are affected, and the effect of full exclusion is stronger even though the "lock" mechanism is built differently
So here is your mistake: you only accept something almost literally identical to computer lockdown (same with your fridge example), while brushing off more serious usage "lockdowns" that don't exist for computers.
> The equivalent would be if you could only use specific brands of replacement chains, blades, tires, or bullets that are approved by the manufacturer, for which the manufacturer gets a cut of the sales of those replacements.
Yes, this exists and is common in complex mechanical things; e.g., you lose the warranty if you use unapproved parts, or for some parts there is actually not even an alternative, so the manufacturer is the only one getting a cut.
So again, there is nothing unique or "most dangerous" about computers in either reality or people's prescriptions
Although since your argument isn't about real restrictions, but about what commenters support, you'd need to ask them which of these existing restrictions they support versus which they support for computers.
> A cigarette will not allow anyone to smoke it if the vendor has locked down sales and can't sell it to you, so you don't have it in the first place.
You're equivocating on the word "vendor". You know full well that in this context, the vendor means the manufacturer of the computer, for example, Apple, and not the retail store selling the computer, which may not be Apple but rather Best Buy, for example. Likewise, in my analogy, vendor lockdown of a cigarette would mean lockdown from the manufacturer of the cigarette, for example, Philip Morris, and not the retail store selling the cigarette.
> This is a much more serious restriction of consumer freedoms than if anyone can buy and use, but can't smoke when drunk and in bed (fire safety) due to some other lockdown mechanism built into the cigarette itself. More people are affected
This is actually false, because the only restriction on the sale of cigarettes is that you can't buy them if you're under age 18. Anyone age 18 or older is free to buy and smoke as many cigarettes as they want. Adults have full, unrestricted freedom. And that's what they should have for computers too. For better or worse, children have a huge number of legal restrictions on them.
Computer vendor lockdown affects all adults, no matter how old. Indeed, some people claim that the point is to protect your grandma, yadda yadda.
This is actually my point about being "dangerous". That is, we seem to consider computers the most dangerous product for fully grown adults who have no age-related restrictions on purchasing things, because nobody is proposing or defending manufacturer lockdowns on other products for fully grown adults. We think that fully grown adults get to decide whether to smoke cigarettes, drink alcohol, eat junk food, etc., but for some reason fully grown adults can't decide to install software on their own computer.
> So here is your mistake: you only accept something almost literally identical to computer lockdown (same with your fridge example), while brushing off more serious usage "lockdowns" that don't exist for computers.
I wasn't "brushing off" legal restrictions. I was merely distinguishing them from restrictions that come from the manufacturer.
The difference, of course, is that computer vendor lockdown is not legally mandated, and thus they don't have to lock down the devices. They're doing it totally voluntarily, and I believe the reason is increased profit rather than increased security.
> Yes, this exists and is common in complex mechanical things; e.g., you lose the warranty if you use unapproved parts, or for some parts there is actually not even an alternative, so the manufacturer is the only one getting a cut.
And this malicious practice is being challenged by "right to repair" laws.
> So again, there is nothing unique or "most dangerous" about computers in either reality or people's prescriptions
You're missing the entire point here. There are a lot of people who defend computer vendor software lockdown, in the name of "security", but there aren't nearly as many people who defend the warranty practices you just mentioned.
I find the limitations of your analogy artificial and thus irrelevant. Other people thinking about the trade-offs aren't bound by whether you decide that in the whole supply chain only the manufacturer's limits should be considered. So while you're free to arbitrarily limit your thinking, that won't help you answer questions like "Why do computers get this special treatment of vendor lockdown, but not any other product?"
> reason is increased profit rather than increased security.
That's fine, but we shouldn't rely on vendor motivation anyway, so the validity of your assessment doesn't help us decide when the increased security is worth it
> You're missing the entire point here
You've cut your quote off to make it seem so. I've explicitly mentioned the perception in the very next sentence
> but there aren't nearly as many people who defend the warranty practices you just mentioned.
That would depend entirely on the specific tech involved and other factors. Are you sure people defending software vendor lockdowns would not defend some limits for parts for nuclear plants? For guns?
Also why did you skip the "for some parts there is actually not even an alternative" practice?
Would fewer people defend the right of a manufacturer to also manufacture parts for sale (forcing some kind of divestment so that the "vendor" doesn't get an extra "fee" from the parts business)?
Prior to modern AI, only one of these could be done at scale; now I suppose both can, which may change my calculus on this one. I hadn't thought about that until your comment. Thanks!
This scam is much older than the Internet or even computers. It was called a Spanish Prisoner scam in the 19th century but I would be surprised if it wasn't happening in the ancient world via cuneiform tablets.
> Exactly how many people have fallen for the scam, out of all computer users
Who the fuck knows ? And how is that even remotely a useful question to ask - it's not answerable, those who commit the scam are the only people with the figures, and there's no "register of fuckers who scam other people" where they have to tell you how well they do.
> how exactly does device vendor lockdown stop this particular scam
Premise 1: All (for a suitable definition of "all") computer users are clueless when it comes to internet security
Premise 2: You are not trying to help any given individual's security, because some of them violate premise #1. You are trying to raise the bar for the clueless hurting themselves.
Premise 3: It is not about "personal freedom". It is about preventing the clueless (by no fault of their own, this shit is complicated) becoming drones and mules for attacks on others. It is an attempt to increase the greater good at the expense of placing restrictions on what any individual can do on their own phone. Those restrictions can be mitigated mainly by coughing up $100/year, which is a sufficient bar to prevent bad guys from doing it en-masse, but not so high as to prevent the people who want to do stuff from doing it.
Stopping people doing stupid stuff because they don't know any better is the goal, and that inevitably gets more and more restrictive as time progresses, because an arms race is instituted between the truly evil arseholes who prey on the clueless, and the manufacturers who don't want their products seen as vehicles leading the clueless to the slaughter.
Personally I don't give a crap. The iPhone is fine for me as-is, I can install my own software on my own phone, and sure it costs $100/year. That's not a big deal IMHO, in terms of outgoings it barely registers above the noise floor. YMMV.
> Who the fuck knows ? And how is that even remotely a useful question to ask - it's not answerable, those who commit the scam are the only people with the figures, and there's no "register of fuckers who scam other people" where they have to tell you how well they do.
Um, why do crime statistics have to come from the perpetrators rather than from the victims? The victims report the crimes, duh.
Anyway, you spent a lot of words avoiding my question, which is how exactly does vendor lockdown stop the Nigerian prince scam? You're arguing that vendor lockdown is supposed to protect consumers, but you can't seem to explain how or how often.
> Um, why do crime statistics have to come from the perpetrators rather than from the victims? The victims report the crimes, duh.
You asked for (quoting) "Exactly how many people have fallen for the scam, out of all computer users". Not every crime is reported, duh.
> Anyway, you spent a lot of words avoiding my question
Nope. I can't answer the question because it's non-answerable. If you believe that nobody has ever fallen for phishing, Nigerian-prince, etc. etc. scams, well, I don't know what colour the sky is on your world, but it's not the same as on mine...
If you further believe that allowing everyone root access to devices that are also linked directly to their bank accounts, social security numbers, driving licenses, etc. etc. would be harmless, then again, sky colour becomes an issue.
You seem technically savvy. I do not believe you are typical of the average phone user. I think the restrictions in place are a necessary response to a tragedy of the commons, to prevent the destruction of trust in the system as a whole.
As I said, YMMV, and I'm not saying I particularly like the situation, just that I think it's necessary, and opening up everything to everyone is a foolish, idealistic, and hopelessly naive idea.
> You asked for (quoting) "Exactly how many people have fallen for the scam, out of all computer users". Not every crime is reported, duh.
Not every crime is reported, but it's indisputable that a lot of crimes are reported. So give me a statistic, any reported statistic.
> If you believe that nobody has ever fallen for phishing, Nigerian-prince, etc. etc. scams, well, I don't know what colour the sky is on your world, but it's not the same as on mine...
How do you know this, except from reports by victims? That's what I'm asking for.
And once again, you haven't explained the mechanism by which vendor lockdown prevents this scam. However many or few victims there are of the scam, precisely zero of them are helped by vendor lockdown. I'm not going to stop asking you to explain how vendor lockdown is even relevant here.
> If you further believe that allowing everyone root access to devices that are also linked directly to their bank accounts, social security numbers, driving licenses, etc. etc.
This is hand waving, and it's not clear how root access by the owner of the device somehow exposes userland data to criminals. Moreover, all of this data is on desktop computers, and it's mostly fine.
As I said, I don't care about the current OS situation, I think it's actually pretty well reasoned out. I'm not spending my time tracking down statistics for you to "prove" some point to some other person on the internet.
$700k a year as an excuse to lock down over a billion smartphones? Not to mention that once again, this is an email scam, and thus vendor lockdown is irrelevant and doesn't prevent it.
It appears that you're the one believing whatever you want to believe, despite the empirical facts. The problem is that proponents of vendor lockdown always make gross exaggerations to defend it, pure fearmongering.
> There are a ton of products on the market that are vastly more dangerous than computers
The thing with chainsaws and motorcycles is that they look and feel dangerous, and people have an intuitive understanding of how to approach those dangers.
If you ask a random person on the street about safe motorcycle riding, they'll probably tell you about respecting speed limits, wearing protective gear, only doing it when sober, not pulling stunts / showing off etc. I've never been on a motorcycle, have 0 interest in them, and I know those things.
Computers don't work that way. People can't distinguish between a real banking app and a fake banking app that looks real, an update pop-up and a fake "you need to update Adobe Flash Player" pop-up on a phishing website etc.
I've done plenty of "helping non-technical people out with computers" during my middle / secondary school days. That was when people still used Windows a lot, as opposed to doing everything on their phones. Most computers I saw back then had some app that hijacked your start page, changed your search engine to something strange, would constantly open random websites with "dpwnload now free wallpapers and ring tones for your mobile now", etc. You didn't even have to fall for a scam to get something like that, plenty of reputable software came with such "add-ons", because that's how you made money back then.
I feel like that era of "total freedom" has somehow been erased from our minds, and we're looking at things through rose-tinted glasses. I, for one, vastly prefer the world of personalized ads and invasive surveillance over one where I constantly have to be on alert for my default browser being changed to Google Chrome for the hundredth time this year, just because I decided to update Skype.
> If you ask a random person on the street about safe motorcycle riding, they'll probably tell you about respecting speed limits, wearing protective gear, only doing it when sober, not pulling stunts / showing off etc. I've never been on a motorcycle, have 0 interest in them, and I know those things.
How does this matter? People may know these things, but they nonetheless ignore speed limits, don't wear helmets, drive drunk, pull stunts, etc. And the motorcycle manufacturer can't stop them. They have the freedom to harm themselves.
> Computers don't work that way. People can't distinguish between a real banking app and a fake banking app that looks real
Guess what, people can't distinguish between the real and fake apps in the crApp Store either. Let's stop pretending that it's safe, when we've seen over and over that it's not.
> That was when people still used Windows a lot, as opposed to doing everything on their phones.
People still use Windows a lot. Smartphones have not replaced desktop computers but rather added to them. Almost every desktop computer owner also has a smartphone. I believe that desktop computer sales are as high now as ever; I know that's true for Apple Macs, specifically.
> I feel like that era of "total freedom" has somehow been erased from our minds, and we're looking at things through rose-tinted glasses.
It hasn't been erased. The desktop never left. It's been surpassed in volume by smartphones, of course, but let's not pretend that desktops were somehow made obsolete and removed from the Earth. The people who have enough money buy smartphones and desktops. Many even have a smartphone, a desktop/laptop, and a tablet. The choice is not about security, it's about money and form factor. When I leave home, I put a phone in my pocket. When I'm on the couch, I use a laptop. When I'm reading an ebook, I use a tablet.
> You didn't even have to fall for a scam to get something like that, plenty of reputable software came with such "add-ons", because that's how you made money back then.
That's why you never blindly clicked "next" in installers. Everyone got one of those IE toolbars accidentally at some point, but it usually only took doing it once to learn the lesson.
> There are a ton of products on the market that are vastly more dangerous than computers
An irrelevant "whaddabout" argument.
It doesn't change that we need security and privacy for our information handling devices, as well as personal control. The real conversation is about how to best balance these.
> It doesn't change that we need security and privacy for our information handling devices, as well as personal control. The real conversation is about how to best balance these.
An irrelevant false dichotomy argument. There's no inherent conflict between security/privacy and personal control. I would argue that a device which has to phone home to the vendor to get approval for everything results in both less privacy and less personal control.
I guess people are unaware of the various malicious rootkits that have cropped up?
If you're serious about this stuff binary thinking is a mistake. It's not a question of whether rooting is possible or impossible. It's a question of under what circumstances it can be done, and under whose control.
Also, "conflict" is the wrong word here. It's a question of competing concerns not conflicting ones.
We probably want root access to be under the end-user's control, but in such a way that minimizes the ability of malicious parties to exploit it.
e.g., one way would be to allow anyone to easily install any root they want, but to disallow software from, say, the Apple App Store from running on such "rooted" devices. While that gives end-users control and would mostly prevent malicious actors from getting things they want, it's probably not what most users would want. They probably want to run all their regular software alongside the root software.
Another way would be to allow people to easily install software as root, and allow software from popular app stores to run on it. That gives users max control, but is pretty easy for malicious actors to exploit too. People aren't going to be too happy with this when some coupon-clipping app starts emptying people's bank accounts.
These are just examples to give the idea of the range of possibilities. The real answer needs to be a lot more nuanced than this. The point is, pretending there aren't issues doesn't get us anywhere. You might as well have no opinion on this.
I just don't have this paternalistic instinct to try and protect people from rootkits. Even if I did, this is certainly the wrong way to do so: you need to hold companies accountable for the flaws in their software (for which we have basically no legislation at the moment), or they have no incentive to make the protections meaningful. Otherwise you just end up with vendors shipping hardware that's still insecure but checks the right regulatory boxes, still restricts people from using the hardware they bought, and still offers no way to remediate when something inevitably does slip past the regulatory controls.
> It doesn't change that we need security and privacy for our information handling devices, as well as personal control.
I can do online banking on my PC as root user if I so choose, but I cannot do online banking on my phone because my bank's app employs a rooting detector SDK that as of now even Magisk+a host of (questionable) modules can't bypass.
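For anyone wondering what those detector SDKs actually look at: the sketch below is only a toy illustration of the naive, userland layer of such checks (the paths and the build-tag test are the usual suspects, not any particular bank's SDK). The part Magisk modules struggle with is the hardware-backed attestation layered on top, which can't be faked from userspace.

```kotlin
import android.os.Build
import java.io.File

// Naive, purely illustrative root heuristics: look for an `su` binary in the
// usual locations and for a debug-signed ("test-keys") build. Real detector
// SDKs combine dozens of such checks with Play Integrity / key attestation.
fun looksRooted(): Boolean {
    val suCandidates = listOf(
        "/system/bin/su",
        "/system/xbin/su",
        "/sbin/su",
        "/vendor/bin/su",
    )
    val hasSuBinary = suCandidates.any { File(it).exists() }
    val debugSignedBuild = Build.TAGS?.contains("test-keys") == true
    return hasSuBinary || debugSignedBuild
}
```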
"The reason that computers are locked down by the vendors is not that computers are somehow more dangerous than other things we buy "
It makes sense to allow the _buyer_ to responsibly lock out others. This is common in other products that could be dangerous. But allowing the _seller_ to lock out others, e.g., competitors or the buyer, is a recipe for malfeasance, at the buyer's expense. Interestingly, with computers and pre-installed software, there is no option to lock out the sellers such as Apple or the companies that partner with sellers and pre-install software on the computers, such as Microsoft, Google, etc.
"It's all about protecting their profits, not protecting us."
It is interesting that the "protections" are not optional. It is assumed that _every_ buyer wants the protections from others _and also from themselves_ enabled by default, and also wants protections from so-called "tech" companies to be _disabled_ by default. A remarkable coincidence.
Perhaps if buyers were given the option to log in as single user and change the default protections, some (not all) might disable phoning home to Silicon Valley or Redmond. They might block unwanted access to their computers by so-called "tech" companies who sell them out as ad targets. The so-called "tech" companies and their customers (advertisers) might be locked out of other people's computers.
Indeed letting buyers lock out whomever they choose might diminish the profits of so-called "tech" companies.
In the past HN commenters often sidestepped the question of these "protections" as self-serving and argued that so-called "tech" companies serve the "majority" of computer users and in fact these "protections" are what computer users want even though these users were never asked or given the choice to opt-out. If that were true then allowing a "minority" of users to control the protections themselves, i.e., operate as root, would only affect a minority of profits.
Actually, chainsaws, table saws, cars, motorcycles, and even guns all have safety mechanisms installed by the manufacturers and tampering with them voids the warranty.
Nobody is arguing that computers shouldn't have safety mechanisms. But the manufacturers of those devices don't have remote control over what I do with them. There's no equivalent of a "curated App Store" (and one that requires a 30% cut to the manufacturer, which is the real point behind it).
The equivalent would be if you could only use specific brands of replacement chains, blades, tires, or bullets that are approved by the manufacturer, for which the manufacturer gets a cut of the sales of those replacements.
Tampering with safety mechanisms on your car voids the warranty on the safety mechanism, not on your whole car. Otherwise using third party mechanics would be impossible.
> I think it's bizarre that we treat computers as the most dangerous products in the world that for some reason demand paternalism, when none of these other products are locked down by the vendor.
That's because there are people behind every product, and the people behind computers tend to be the paternalistic, nanny-state type. Just read through the histrionics in any HN thread about leaf blowers, they want every landscaper locked up and their tools of the trade taken away. Someone once suggested they should be forced to use rakes. Imagine if some landscaper insisted what laptop you should use.
Just as you wouldn't expect to find many in-the-Army buzz-cut guys roaming the Google campus the way you would at a gun company, you wouldn't expect some blue-haired, face-pierced sales engineer to be selling you table saws.
Thanks, but no. I'm never buying a device with easy root access for a non technical family member ever again. Freedom is great, and I'm using this freedom to buy something with exactly the capabilities I need.
So they'll never use a PC or laptop or anything of that ilk again?
To use the same logic, they shouldn't be given anything with a visible screw, or are you going to tell me they _wouldn't_ take a screwdriver to an appliance because that would be silly for someone who doesn't know what they're doing in there?
If there were a multi-billion dollar industry of scammers always trying to trick them into taking the screws out of things so they could steal from them, then no I probably wouldn't buy them anything with visible screws.
Which is probably fine, that's not the same as taking away everyone's screwdrivers.
The problem is that a line is being drawn in an arbitrary place; if scammers are the worry, don't let them have a phone, or internet or email either, in fact just don't let them talk to any strangers in person or otherwise, but that would be awfully inconvenient for them.
Everyone is willing to make a compromise somewhere so long as the compromise isn't something they care about. Some readers probably think the suggestion of taking away their phone or email is absurd to protect them from scammers, and I'd place preventing root-access in the same category; not disabling it by default, I'm ok with that, but preventing it entirely.
My opinion is that everything should be secure by default, but when it's something you own, there should be reasonable, measured steps to "unsecure" it, whether that's removing a couple of screws, or accepting a disclaimer to gain root access to the device you own.
If I don't own it, let's cut the bullshit and tell me I'm merely licensing or renting it, and we'll adjust the price I'm willing to pay accordingly.
This is a strange argument, kind of like, "We can't defund the police, because look at all of the crime out there!" If there's so much crime occurring already, then what in the world are the police doing?
To an extent, crime can't be eliminated. You can't even eliminate crime by instituting a strict authoritarian regime, because power corrupts, and those in power become criminals themselves. That's why turning big tech companies into paternalistic device authoritarians doesn't work. The big tech companies have become massively corrupt, demanding a 30% cut of everything that happens on your devices, in return for what? Some low paid, low skill reviewer spending a few minutes to approve or reject a third party app submission? That's not security, it's security theater.
There were phone scams before there were smartphones, before there were mobile phones, when everyone had a landline. There's no technical solution for crime and scams, much as tech people want there to be. Education and vigilance have always been the only effective resistance. Unfortunately, the big tech companies don't want to do education; to the contrary, they want consumers to be eternally technically ignorant (despite the increasing importance of computers in our lives) because that's more profitable. At least with cars, we have mandatory driver's education.
It doesn’t have to be easy enough to let through a person who doesn’t understand what they’re doing (aka blindly click through the annoying popups - that’d be bad).
And non-owners shouldn’t be able to have access solely based on their physical possession - quite the contrary, owner should have means to fully use hardware security features for their personal benefit, locking their own device as tight as they want (within the device’s technical capabilities).
I take it you mean an easily unlockable bootloader, not really out-of-the-box root access, which no phone has.
I have taken the opposite stance on that. Never again will they be left with some Samsung bloatware which hardly receives any Android updates, when phones such as Nexus, Nokia and Nothing cost the same and have excellent LineageOS support.
Lineage is stable, bloat-free, self-updating, and requires no maintenance from my side.
And here is (in effect) a completely legitimate reason for manufacturers to wall off root access. They did not want to sell and support a full-access, general-purpose computer. Nor provide liability coverage for anyone who reprograms their toaster and starts a fire.
Because it isn't at all reasonable. There is no argument for not allowing root access. You don't have to use it, and perhaps most users would be safer with a conventional user account, but it is not reasonable to outright deny full system rights to the owner of a device, since there are so many disadvantages connected to that.
My thinking is that if I have a device that doesn't allow me root access, then what I have is more than likely a device designed to keep making money for the company that made it or wrote the software for it.
I'm willing to stand corrected, but I can't think of a single smartphone on the market from a reputable manufacturer that is sold with root access. If I want a smartphone I have to accept that the manufacturer will have the bootloader locked down, I don't have a choice.
I have zero experience in the android world, but a quick search tells me that Xiaomi Devices, Google Pixel Phones, OnePlus Devices, Redmi Note 4, Samsung Devices and MediaTek Devices at least are rootable, with some rules with various degrees of freedom for the procedure (in particular warranty is voided pretty much all the time when device is rooted).
Google Pixels are among the few devices that let you not only unlock the bootloader but also flash your own keys and still have secure boot, together with full kernel source availability (which is why GrapheneOS only supports them, as far as I know).
As far as I know, MediaTek (and vendors that use those chips) are usually not good with regard to GPL compliance, which means no LineageOS if kernel sources are not available...
That's because the opinion presents a strawman position. From the linked-to page:
> I agree with the premise that consumer devices, such as mobile phones, should be as secure as they can by default. This can even go so far as shipping new devices with locked bootloaders and blocking access to root. ..
> But this shouldn’t come at the expense of being able to make an informed choice to unlock these privileges to install any software you want, even if that means adopting a higher level of risk.
One does not require "easy root access" to make that informed choice - complicated root access (within reason, as pulling out the soldering iron might be a step too far) should be enough for tasks like installing a new OS because the company no longer supports the hardware.
My fuel-injected chainsaw has a data port, but luckily my backwoods repair shop showed me the computerless, seasonal re-tune procedure that only requires a stopwatch. Works a charm.
As to other devices... phones: we need a whole rewrite of the privacy and publishing laws to allow each person to regulate themselves, with an ultra-basic "standard" setup for the masses who do want to be entertained while having buying "opportunities" presented to them. But it has to be consensual, and basics like a phone number, email address, and personal/commercial web space should be an inalienable birthright. Ban utopian concepts outright, and get back to being the quarrelsome and somewhat violent species that we are.
I am starting to wonder: is the root cause of all the ancient civilisations lying in their own dust the same thing we are doing now, and is the vast echoing silence from the stars the same?
You've left out one important player here: it's not just about the manufacturers and the masses yearning for entertainment, but also about the surveillance industry. Phones in particular, but computers in general, are increasingly important for surveilling the population in novel ways that AI opens up. Giving people root access on their tracking equipment would jeopardize its surveillance functions, because people might elect to give themselves privacy.
This is a very popular HN opinion; but not a very popular real world opinion.
The average customer wants a device that works consistently, every day, that is easy to use, with a collection of 3rd party apps that won't steal their life savings.
Windows failed to deliver this; the average customer never downloads an EXE from a newer publisher without terror. The average consumer is literally dozens of times more likely to trust a new smartphone app than a new desktop app.
We can also see this in the console market. Windows exists; old gaming PCs exist; the locked down console market will be with us forever because even Windows can’t deliver a simple experience that reliably works.
The average customer wants a car that doesn't explode because you installed a sketchy spark plug. Does that mean the manufacturers should install locks on the hood of every new car, with the threat of jail time if you pick the lock and look underneath?
A sketchy spark plug does not have the ability to make a car explode, so the analogy is pointless.
On that note, even if someone stole your car, at least your car does not have access to your bank account, your passwords, your messages, and even your sexual history. The personal and reputational cost of losing a car is not comparable.
Many people would probably prefer to have their car stolen than to have the contents of their phone made public.
I think a more accurate comparison would be to an electrician. In Australia, doing your own electrical work is a crime even for the homeowner, because it can cause physical death and is too likely to be done wrong. Yes, you will possibly go to jail for replacing $2 light switches. I assure you that most people's phones contain things they would prefer physical death over having publicly distributed.
> On that note, even if someone stole your car, at least your car does not have access to your bank account, your passwords, your messages, and even your sexual history. The personal and reputational cost of losing a car is not comparable.
You're conflating vendor lockdown with device encryption. The latter does not require the former.
"The very worst offender is Nissan. The Japanese car manufacturer admits in their privacy policy to collecting a wide range of information, including sexual activity, health diagnosis data, and genetic data — but doesn’t specify how. They say they can share and sell consumers’ “preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes” to data brokers, law enforcement, and other third parties."
> In Australia, doing your own electrical work is a crime even for the homeowner, because it can cause physical death, and is too likely to be done wrong. Yes, you will possibly go to jail for replacing $2 light switches
And do you find this reasonable, and a good thing to expand to smartphone use?
There's a lot of it about, mate. The other day I had an American tell me with a straight face that we can get jail time for flying a Union Flag here in Blighty - I guess there's a big industry for convincing people that everywhere else is a hellhole over there.
It’s not a crime to do your own electrical work in Australia, it just invalidates your insurance unless you get the work signed off. The websites saying it “could be illegal” strangely never reference the actual law you’d breach.
> I think a more accurate comparison would be to an electrician. In Australia, doing your own electrical work is a crime even for the homeowner
In this comparison, Google and Apple play the role of the government. If you believe that argument, it also implies that you believe they should be broken up under antitrust law.
Sounds a lot like "We don't need free speech because I have nothing to say".
Just because you don't need or want it, doesn't mean it's not an important right to protect. Considering the influence of computers these days, the right of general purpose computing is probably at least as important as the right to free speech.
The problem is not the ability to unlock your phone.
The problem is that 90% of people unlocking their phones will either be for piracy (against the company’s interests), or against the customer's own interests (stalkerware, data extraction, sale of stolen devices).
There is a reason malware is over 50 times as prevalent on Android.
> The problem is that 90% of people unlocking their phones will either be for piracy (against the company’s interests), or against the customer's own interests (stalkerware, data extraction, sale of stolen devices).
Why would you think that?
Many Android phones can be unlocked, so it's not a hypothetical situation. It does not enable software piracy, since piracy doesn't depend on root. I know a few people who would install all sorts of shit on their phone, including obvious malware, and they lack the knowledge to root their phones.
The data extraction problem happens today on unrooted phones in a "legal" way, it's done by your regular friendly companies like TikTok, Google or Meta. Rooting enables limiting this which is likely why they are against it.
If you look around on forums that discuss the topic of unlocking/rooting Android phones you will see that there is little discussion of piracy and people seem mostly driven by the will to control their own machine instead.
Given that the vast majority of Android devices aren't rooted, bootloader unlocked, or even installing apps from outside the store(s) that they ship with, what exactly do you think is the reason for more malware on Android? (Taking the claim at face value)
>The problem is that 90% of people unlocking their phones will either be for piracy (against the company’s interests), or against the customer's own interests (stalkerware, data extraction, sale of stolen devices).
The first point is irrelevant once I've bought a thing. Once I own a thing it is mine to do with what I want, and the company's interests ought to be irrelevant. As for your second point, that is mitigated by making the process sufficiently annoying (e.g. hiding it in the developer menu).
Why do I give a shit about the company? I bought the phone, it's mine, I should be able to unlock it. If I catch malware, I'm an adult and I'll live with my choices.
> There is a reason malware is over 50 times as prevalent on Android.
What's the reason for that bogus-sounding statistic?
Let's say for a second it was accurate (It's probably not), perhaps it's because Android has a far higher market share globally, and it's much cheaper and easier to get started writing apps (or malware) for Android than say iOS.
You also don't need to buy a single device from Google to get started. You can take the PC you're at and get started right away, and publish that app (or malware) without spending a penny (though I don't recall whether they still charge that nominal fee to get a developer account).
Saying 90% of people root for piracy is hilarious, I rooted every Android device I owned until the last one or two, and I've never pirated anything, why did I root? Mainly for customisation and host-based ad-blocking.
I can't understand the thought process of these people who think the things you own should be locked down to protect you.
We should stop selling screwdrivers too in case someone's granny tries to open their toaster and electrocutes themselves; after all, a screwdriver is the pre-tech root access to all things electrical and electronic. I suspect those same people who argue in favour of locking these devices down would also say "don't be silly, my granny wouldn't open her toaster with a screwdriver, because she's not an engineer".
Yeah, agreed. This "I don't want to own my things because I want Big Brother to protect me" attitude is really frustrating, especially when you can have protection without Google holding all the keys. GrapheneOS isn't less secure than stock Android.
It's a kind of madness people only have towards our (technology/IT) industry.
I don't know if it's because they don't understand it, and that's scary, so they think it's safer for the big boys to hold the keys, but imagine if people acted the same in other contexts?
"The bank should keep hold of the keys because otherwise I might accidentally lock myself out, or lose my keys, or leave the door unlocked for a bad guy to come in and steal my stuff".
That's fine if you can't trust yourself to look after them, let someone else handle your keys for you, perhaps someone "trust worthy" could offer it as a service, but I'll keep my keys in my own pocket thanks.
Where does the 50x figure come from and what types of malware does it include? It doesn't really match either my experience or pricing on the grey exploit market.
Malware has a wide definition however, and if you include all the spyware included with phones that aren't sold outside China and to a degree also India, you could probably hit that mark. But as they aren't allowed to access Google services or the official Play store, it's also a bit misleading.
> The average customer wants a device that works consistently, every day, that is easy to use
And it can only be achieved with fully locked-down hardware?
Of course not. A modern OS achieves system security through permissions and isolation, which don't require a locked bootloader to work. In fact, they keep working even after the device is unlocked and rooted.
> Windows failed to deliver this; the average customer never downloads an Exe from a newer publisher without terror
Windows (and Linux for that matter) is not a modern OS in this sense. They're classic OSes that offer the entire computer as a playground for the programs running on top of them. That's why Windows can be compromised by a single malicious EXE, but Android or iOS can't.
OSes are not all the same; don't try to muddy the water that way.
Still, those permissions are standard Linux permissions. So the argument that Linux is less secure than Android is a little hard to understand. A little more specificity might help.
They're definitely not "standard Linux permissions." Yes Android does use many of those (such as standard user IDs, file system permissions, and now SELinux) to implement some of its permissions, but it adds a ton of permissions on top that are not part of Linux.
They're part of Android. Android is not Linux and Linux is not Android, anymore than a car is a wheel and a wheel is a car. Don't confuse the foundation with the building.
Here's the API reference if you'd like details [1]. They are very much not just standard Linux permissions. Android includes a huge set of APIs on top of Linux.
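To make the distinction concrete, here's a minimal sketch of the Android-level flow, using the CAMERA permission purely as an illustration (the activity and method names are made up). The declaration lives in the app manifest and the grant is mediated by the framework's permission service at runtime; some permissions are backed by Linux GIDs or SELinux underneath, but the request/grant API itself is Android, not stock Linux.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts

// Manifest side (not Linux DAC): <uses-permission android:name="android.permission.CAMERA" />
class CameraGateActivity : ComponentActivity() {

    // The framework shows the system grant dialog and records the decision
    // per app; no chmod/chown or user-ID juggling is involved at this layer.
    private val askCamera =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) openCamera() else explainWhyCameraIsNeeded()
        }

    fun ensureCameraAccess() {
        if (checkSelfPermission(Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
            openCamera()
        } else {
            askCamera.launch(Manifest.permission.CAMERA)
        }
    }

    private fun openCamera() { /* start the camera use case */ }
    private fun explainWhyCameraIsNeeded() { /* show rationale UI */ }
}
```

The point is just that the grant/deny decision is stored and enforced by Android's own services, which is why "Android permissions are Linux permissions" undersells it.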
There's nothing wrong with wanting that, but as the author said those of us who want to opt-out should have the choice to do so.
If I buy an iPhone, I should have the option to completely disconnect it from Apple and be able to replace the OS with whatever I want. If I do not have the option to do that, do I REALLY own the device? The answer is no, because what I have is a device that I can only use the way Apple allows. When the phone is obsolete and Apple stops updates, all I can do is send it off for recycling, since Apple won't allow me to repurpose it with new software.
You are putting a lot of trust in the manufacturers as well. For example, they have the technical capability to kill the second-hand market in their devices if they simply decided to refuse to allow a new user to log in to a device. Sure, you could still sell the hardware, but it wouldn't be much use if the manufacturer stopped it from connecting and authorizing. I know this is an extreme example and no sane manufacturer would implement it, but I think it demonstrates why having the option to disconnect is a good thing.
The same applies to all other devices that are locked down, things like smart TVs, IP cameras and appliances. Just look at how many early smart TVs are now dumb because the manufacturer stopped updating the on-board apps. There is no reason why the owners of such devices shouldn't be allowed to do whatever they want with them to try to bring them back to life.
> Windows failed to deliver this; the average customer never downloads an Exe from a newer publisher without terror. The average consumer is literally dozens of times more likely to trust a new smartphone app than a new desktop app.
Yet that trust is, for the most part, unfounded. There's a ton of malware in app stores - you can assume any app that contains ads is sending data about you to some shady server, for example. You can't even trust the most popular apps not to be malware [0].
> with a collection of 3rd party apps who won’t steal their life savings.
This is blatant unempirical scare mongering. How many desktop computer users have had their life savings stolen by 3rd party apps? Citation needed.
> The average consumer is literally dozens of times more likely to trust a new smartphone app than a new desktop app.
This is a false dichotomy. Almost all desktop computer users have a smartphone too. The people who have enough disposable income buy both smartphones and desktop computers. There's no inherent conflict between the two.
> the locked down console market will be with us forever because even Windows can’t deliver a simple experience that reliably works.
That's a completely ahistorical interpretation. Originally, the gaming consoles had no third-party games: the games were all written by the vendors. The first third-party game development company was Activision, a group of former Atari programmers who learned that their games were responsible for most of Atari's revenue, but Atari refused to give them a cut, so they left and formed their own company. There was a lawsuit, and it was ultimately settled, allowing Atari to get a cut of Activision while allowing Activision to otherwise continue developing console games. It had nothing to do with "reliability" or "security" or any kind of made-up excuse like that.
> You’re kidding, right? You seem to have completely forgotten, or put the drunk glasses, on what living in the 2000s was like.
Again, citation needed. I made it through the 2000s just fine, thank you.
> What a stereotypical HN comment. Cite something that only applied to the 2nd generation of consoles to prove me wrong, even though my point spans almost all console generations.
No, I was explaining the historical origin of the game console business model. Of course the business model continued, as these things usually do, through a combination of monetary incentives and inertia.
> Again, citation needed. I made it through the 2000s just fine, thank you.
Playing devil's advocate: banking trojans used to be really common here in Brazil back in the pre-smartphone era of the early 2000s (smartphones already existed, but weren't very common; most people who used online banking did it through their home computers). They're the reason why, for a long time, it was hard to use online banking on Linux: banks required (and still require) the use of an invasive "security plugin" in the browser (nowadays, there's also a Linux version of that plugin, which IIRC includes a daemon that runs as the root user), which attempts to somehow block and/or detect these banking trojans.
What does this even mean? Do you stand behind what you say? If so, then just say it without hiding behind the devil. And if you don't stand behind what you say, then why in the world are you saying it?
This is a silly criticism. After all, as we all know here (right?), Atari itself fell on hard times. I was talking about the business model, not a specific business. Vendor lockdown and taking a cut of 3rd party software is clearly quite lucrative for vendors, and that's why they do it. There's of course no guarantee of success, but it's obvious why other vendors have emulated that business model.
It may be only for historical reasons that desktop computers aren't completely locked down too. It's a lot easier to lock down a new device class, like smartphones, than it is to lock down an existing open device class, without causing consumer outrage and rebellion.
I worry about the long term health of general-purpose computing. It's not going anywhere today, but I fear for future generations that will likely eventually never know the joy of bending a computer entirely to their will, because they'll have never known computing without guardrails.
If you explain all details about the advantages and disadvantages to them, I am sure they would think differently.
These days there are far more "hostile" smartphone apps that exfiltrate your data and sell it to the highest bidder than there are compromised executables. Also, outside of industrial espionage, there are more profitable scams than compromising a PC.
PCs, in contrast to consoles, have always been a question of cost and usage. The difficulty of operating a PC isn't significant, and doing so heavily increases the user's digital competency with computer systems. If you really don't want that, you have other options.
that's a, frankly, stupid argument. the conclusion doesn't follow from the premise.
then don't root your phone or download an .exe. having the ability to do something doesn't mean you are forced to do it.
not safe enough for you? fine! make the current status quo comfortable walled-garden-of-illusionary-fake-safety the default. for example, there's no reason windows needs to by default allow unsigned code to run. hell, even make it really annoying to turn off.
but the "safety" and "easy to use" arguments against right-to-repair, digital rights, ownership, etc. is simply nonsense. there is literally ZERO negative safety or usability impact to anyone else's device because i'd like to own mine.
it's also an insulting and disingenuous argument to hear anyone on this forum make: our careers and entire segment of the economy would not exist if it were not for open systems. and it's insulting to basically say "bubba/granny is too dumb to be trusted" with owning their own device.
We're heading the opposite way of not being able to buy anything "dangerous" thanks to consumers that you're describing. I've been using a Xiaomi phone that stopped receiving updates in 2020, and have since been running LineageOS, which was made possible by the unlocked bootloader. Xiaomi has since changed its policy and it's basically impossible to unlock the bootloader on newer devices.
If not for the "dangerous" unlocking, I would have to run with dozens of severe vulnerabilities right now, all five years worth of them. A decent phone costs large amounts of money here, the hardware on mine is still very good, and so I would have used it regardless. (Yes, I understand that the firmware does not receive updates, but it's still much better than nothing.)
My guess is that you're assuming, wrongly, that vendor locked devices are "safe" and unlocked devices are "unsafe".
All computers that are connected to the internet are unsafe in some ways. The most dangerous apps on your computer are the vendor's own built-in web browser and messaging app.
Also, the vendor-controlled software stores are unsafe cesspools. You will never find a more wretched hive of scum and villainy. Moreover, the vendors deliberately make it impossible for you to protect yourself. For example, iOS makes it difficult or impossible to inspect the file system directly, and you can't install software such as Little Snitch on iOS that stops 3rd party apps—as well as 1st party apps!—from phoning home.
In any case, most computers, including Apple computers, have parental controls and the like, so you can lock down your own device to your heart's content if you don't trust yourself, or you don't trust the family member that you're gifting the device.
Today, yes, I can lock down the iPhone I give to my son, but if it can be unlocked to run arbitrary software then he can in theory unlock it. Yes, it is on me to continue to monitor the device to make sure he hasn't done it, but the point stands
And the assumption you refer to, there are varying definitions for "safe". Is a device with a locked bootloader 100% safe in all use cases and all circumstances? Of course not. But me being able to reasonably trust that someone hasn't put a compromised version of the OS on the device, or, won't be able to put a different firmware on the device to brute force my encrypted contents is a bit of safety in a certain set of circumstances that I want in my device
If Apple, or anyone else, were precluded from locking the boot loader yes, I would be forced to buy a device that the FBI or anyone else could in theory poke around on enough to try to get at my data
> Today, yes, I can lock down the iPhone I give to my son, but if it can be unlocked to run arbitrary software then he can in theory unlock it. Yes, it is on me to continue to monitor the device to make sure he hasn't done it, but the point stands
You're scared of the wrong thing. The greater danger isn't arbitrary software but rather your son running up massive App Store charges on IAP of exploitative games and other scams. And if you think Apple will refund you, think again. Locking the device to the crApp Store isn't the solution. To the contrary, the solution is to enable parental controls to prevent access to the crApp Store.
> But me being able to reasonably trust that someone hasn't put a compromised version of the OS on the device, or, won't be able to put a different firmware on the device to brute force my encrypted contents is a bit of safety in a certain set of circumstances that I want in my device
These are possible without vendor lockdown. Devices can be and are designed so that the consumer can lock the device down and prevent modification, etc. Of course you can't constrain yourself, if you have the credentials to unlock the device, but you can constrain everyone else, whether they're children on the one hand or thieves/attackers on the other.
> but if it can be unlocked to run arbitrary software then he can in theory unlock it.
I'm effectively the admin of several machines with many users on them. I have root access. I'm not at all concerned that they'll gain root access. Just make yourself admin on your child's phone; I don't see the issue. Apple and Google can even make gaining root access require some technical (but documented) methods. Look at the requirements to gain root on an Android phone currently. You should be comfortable going into a terminal and using ADB. I'm not worried about the average user doing this, nor even the average smart child. Hell, follow Apple's lead and require a 1hr lockout if you're really concerned about someone getting root on your device. How often will that happen if it requires being connected to a computer for an hour?
> Yes, consumers do sometimes harm themselves by using these products. That's the price of freedom.
"Freedom" is also a terrible argument for this. What does it even mean? Freedom from what? Freedom to do what? It's such a meaningless word you're going to lose half your audience just by bringing it up.
"Freedom - the condition of being free; the power to act or speak or think without externally imposed restraints"
When the context is "digital devices", it becomes pretty clear what it means. You should be free to use it however you want, without externally imposed restraints.
Locking down the device so much that users cannot run applications they've written themselves without the approval of the company who made it isn't "freedom", as the required approval from the company breaks the "without externally imposed restraints" part.
> "Freedom - the condition of being free; the power to act or speak or think without externally imposed restraints"
Yea, this is a nice thought if you don't live in society. However, it falls apart pretty rapidly once you realize "your freedom to stops at my freedom from". So it's a non-starter.
> Yea, this is a nice thought if you don't live in society
Well, democracies are societies, and you have much freedom in (most of) those. Not sure where you live, but if possible, you can always try to vacation in one to experience it yourself :)
> "your freedom to stops at my freedom from"
I don't understand what this means, nor how it relates to having root access on your digital devices. Could you possibly explain this again? I want to understand.
I detest Google, but I do think they made the right call with Android devices and Chromebooks. You can unlock either as long as you are willing to totally wipe the device first and start over as a new device under a new security context.
This removes the risk of this being abused to compromise the data of stolen devices or evil maid attacks unless a user that knows what they are doing has explicitly opted themselves into that risk.
I contacted Google through the BBB. I made the statement that the lack of ability to install and configure a kernel-level firewall, edit the HOSTS file, and remove unwanted bloatware reduces the security of the product. Google agreed their actions do this and said they find the lack of security acceptable. Having a firewall like Little Snitch should be possible, so you know where the phone communicates, with whom, and how to prevent it.
Re-imaging with a rooted image is not acceptable, because this also reduces the device's security by preventing OTA updates!
The gated-community model is broken when the end user cannot improve the security of the device above and beyond the lax policies of Google and Apple. For instance, there should be no reason my device ever communicates with organizations I do not support, such as Facebook or X-Twitter. X-Twitter is often used as a command-and-control service in plain sight.
It is not just outward communication that needs monitoring but inward too. I've used ZoneAlarm in the past at an international company to help find the infected servers and computers that were serving up viruses and other malware.
*I would argue that the "gated community" analogy is flawed. A real-world gated community still allows the homeowner to improve security by installing cameras, a security system, and guards. Apple & Google prevent such actions.
There are indeed software firewalls on Android that use the VPN functionality to implement something like this, so they don't even require root; I believe GlassWire offers one.
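For the curious, this is roughly how those no-root firewalls work: the app registers itself as the device's VPN, so every outgoing packet is handed to it locally and it can simply decline to forward what it doesn't like. A heavily trimmed sketch follows (the class name is made up, and a real implementation also needs the VpnService.prepare() consent flow, packet parsing, and per-UID rules):

```kotlin
import android.content.Intent
import android.net.VpnService
import android.os.ParcelFileDescriptor

// Sketch of a "firewall" built on VpnService: traffic is routed into a local
// tun interface owned by this app, which then decides what to forward.
class LocalFirewallService : VpnService() {

    private var tunInterface: ParcelFileDescriptor? = null

    override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
        tunInterface = Builder()
            .setSession("local-firewall")
            .addAddress("10.0.0.2", 32) // private address for the tun interface
            .addRoute("0.0.0.0", 0)     // capture all IPv4 traffic
            .establish()

        // A worker thread would now read packets from tunInterface's file
        // descriptor, apply allow/deny rules, and forward only the traffic
        // the user permits. Dropping a packet is how "blocking" happens.
        return START_STICKY
    }

    override fun onDestroy() {
        tunInterface?.close()
        super.onDestroy()
    }
}
```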
It does create an interesting choice, though. For example, certain apps will enforce attestation based on the bootloader status. Even if the user wipes their device and relocks their bootloader with their own keys, this doesn't count as fully secure per the bootloader status. Only Google's keys count. Of course, it is also almost prohibitively difficult to deliver yourself OTA updates after this point. I worry that one day I will have to keep two mobile phones; one for bank apps, which has not been altered from the vendor's security defaults, and one for everything else, that I am actually allowed to modify.
At the moment, I just run GrapheneOS and don't bother with any modification. It is not worth the hassle. I've already had my bank account locked out because a Google Store-bought Pixel phone was flagged as "stolen", probably due to some attestation measure (they could not tell me why). They recommended that I purchase a new phone.
Right now, although it's possible to use Android with either root or a third party ROM, attestation breaks all sorts of little things. Today this is mostly banking apps, and anything that involves NFC, but this isn't where it's going to end.
Attestation requirements are only going to become more prevalent. I predict that in a few years basically all proprietary software for Android will require attestation.
So... you may still be able to unlock the device and make it yours, but you'll also be locked out of the ever expanding and ever-more-isolated walled garden.
If you can live off of GrapheneOS and F-Droid, that's great, but for a lot of users this won't be a real choice, because you increasingly need proprietary software for access to real things in the physical world (e.g. I needed to install a special app for event tickets recently).
The problem with bootloader unlocking on modern Android devices is that they have a hypervisor that you don't get to ever unlock but that will snitch on you and make some apps, like some banking ones, refuse to work because the "integrity" of your device could not be verified. In other words, because these apps can no longer be certain they are able to hide data from you the device owner.
Magisk exists, yes, but it's a flimsy temporary solution. It only works because it's able to lie to Google that your device doesn't support hardware attestation. As soon as Google starts requiring that all devices support hardware attestation, it will stop working.
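Concretely, the "snitching" is hardware-backed key attestation: an app asks the secure element/TEE to mint a key with an attestation certificate chain that encodes the device's verified-boot state, and a server checks that chain against Google's roots. Below is a minimal sketch of the client side (the alias and function name are just for illustration; the actual verification happens server-side):

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import java.security.KeyPairGenerator
import java.security.KeyStore
import java.security.cert.Certificate

// Generate a hardware-backed key whose certificate chain attests to the
// device's boot state. The challenge comes from the relying party's server;
// because the signing keys live in the TEE, the chain can't be forged by
// software running as root.
fun requestAttestation(challenge: ByteArray): Array<Certificate> {
    val alias = "attestation-demo" // illustrative key alias
    val spec = KeyGenParameterSpec.Builder(alias, KeyProperties.PURPOSE_SIGN)
        .setDigests(KeyProperties.DIGEST_SHA256)
        .setAttestationChallenge(challenge)
        .build()

    KeyPairGenerator.getInstance(KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore")
        .apply { initialize(spec) }
        .generateKeyPair()

    val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    // The leaf certificate carries the attestation extension (boot state,
    // bootloader lock status); the chain should terminate in a Google root.
    return keyStore.getCertificateChain(alias)
}
```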
If software doesn't want to run on your hardware because it can't make sure you're not tampering with it, why is that wrong? You're not necessarily entitled to the ability to run the software, right? I understand that the implications this has for one's ability to create custom operating systems are troubling (e.g. this could destroy desktop Linux), but at the end of the day I guess it is just a choice the developer is allowed to make. It's not like they distribute the binary with no strings attached.
And there are some real strong reasons why you benefit from this sort of ability, such as preventing folks from cheating in competitive games. I can't say that all uses seem to have good reasons behind them, but that seems like more of a vote-with-your-wallet sort of situation. Perhaps the Play Store should also have stricter requirements on acceptable use of attestation and ensure they are upheld.
> If software doesn't want to run on your hardware because it can't make sure you're not tampering with it, why is it wrong for doing so?
It's not the software, it's that the hardware itself, that I bought to own, still serves someone else in a way that's detrimental to my interests, and that can't be overridden because those stupid encryption keys used to sign attestation reports are burned into the silicon and only accessible to that TrustZone hypervisor that can't be unlocked.
> And there are some real strong reasons why you benefit from this sort of ability, such as preventing folks from cheating in competitive games.
Maybe playing such games on general-purpose devices is a bad idea to begin with. You know, consoles are already locked down pretty tight. But then there are PCs that have no hardware roots of trust at all yet you can play anything on them and sometimes even compete with console players. So go figure.
Because in some countries you must run some government sanctioned apps that require a "blessed" device, or you are a de facto non-citizen?
If Americans had anything like BankID or MitID which would refuse to run on their devices and they would be prevented from paying a bill, transferring money, buying tickets, or reading their mail they would go apeshit in 5 seconds.
Some apps are no longer optional in the world we are living in.
They require hardware certification for the Pixel Screenshots app... and for anything that uses Gemini Nano (Call recorder summary, weather, pixel screenshots, etc).
Lol, I've had my Pixel 9 Pro for a month but I forgot about that pixel screenshots app. The other features are unavailable in my country anyway, especially anything that has to do with calls.
I agree useful rooting should be easier, but it's definitely possible and not super hard to hide rooting.
I'm typing this on a rooted phone where all (banking) apps work just fine. All it takes is downloading an app (magisk) and add apps to a list that need to have rooting hidden.
> it's definitely possible and not super hard to hide rooting.
Worth noting that this could change with every update. It's an unstable situation right now, which is undesirable.
For that reason, e.g. the GrapheneOS team isn't employing measures to fake compliance at all. They'd really like to get SafetyNet compliance for their operating system (you need that to get Google Pay/Wallet to work), but fundamentally can't get it. Right now, they could just fake it, but that's not guaranteed to work reliably, forever (and doing so would probably threaten their official BasicIntegrity compliance).
Magisk only works because Google still supports devices that don't support hardware attestation. Very soon you won't be able to fool Play Integrity without hacking the TEE
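For reference, the client side of a Play Integrity check is only a few lines against Google's public library; the nonce and the server-side verdict verification are assumed to exist elsewhere. This is what more and more apps are wiring in, and it's why spoofing has to fool Google's servers rather than just the local OS:

```kotlin
import android.content.Context
import com.google.android.play.core.integrity.IntegrityManagerFactory
import com.google.android.play.core.integrity.IntegrityTokenRequest

// Ask Play services for an integrity token. The token is an opaque,
// encrypted blob; the app's backend decrypts it via Google and reads the
// device/app verdicts (e.g. MEETS_DEVICE_INTEGRITY) before serving the user.
fun requestIntegrityVerdict(context: Context, serverNonce: String) {
    val integrityManager = IntegrityManagerFactory.create(context)
    val request = IntegrityTokenRequest.builder()
        .setNonce(serverNonce) // base64 web-safe nonce issued by the backend
        .build()

    integrityManager.requestIntegrityToken(request)
        .addOnSuccessListener { response ->
            // Send the token to the backend, which calls Google's decode
            // endpoint and decides whether to trust this device.
            sendTokenToBackend(response.token())
        }
        .addOnFailureListener { error ->
            // e.g. Play services unavailable; treat as "cannot verify".
        }
}

// Hypothetical helper; a real app would use its own networking stack.
fun sendTokenToBackend(token: String) { /* ... */ }
```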
> We don't just need root access, we need undetectable root access.
At some point the argument morphs from 'I should be able to do whatever I want with my device' to 'I should be able to access your service/device with whatever I want'.
The fact that Google allows this shows that
1. Apple could do it with zero security impact on anyone who doesn't opt in
2. They could keep any service-based profit source intact
But they still would never do it. Because it's not only service based profit they want to protect. They want to restrict customers from running competitor's software on their hardware, to ensure they get their cut.
> At some point the argument morphs from 'I should be able to do whatever I want with my device' to 'I should be able to access your service/device with whatever I want'.
I'm not demanding to be able to log in to your service/device and replace IIS with Apache on it. I'm just demanding to be able to access it as a normal user with Firefox instead of Chrome.
Agreed, that's a good solution. I can root my phone immediately when I buy it, or I can leave it locked if that's my choice. That's the best of both worlds.
I would argue that the best of both worlds is being able to add your own keys and then relock the bootloader. Which Pixel devices also do:) Not sure about Chromebooks; I kinda think you maybe could reflash the firmware and then put back the write-protect screw?
The reason why this will never happen is simply due to things like DRM.
Right now we have an ENCRYPTED signal going from our computers to our displays (not just computers, but phones too) SIMPLY to prevent people from dumping the raw data.
All of that extra processing is done just so you're allowed to, for example, watch Netflix at a resolution higher than 720p. Then, comically, there are Chinese capture cards that you plug your GPU into, use mirroring mode, and completely bypass it.
DRM is just one example; there are many more motivations, such as preventing paid apps and pay-for-currency games from having these things given away for free. This is the primary reason why iOS devices make significantly more money than Android: it's near impossible for an average user to pirate / hack / crack.
I think in the not-too-far future, every electronic device will be locked down. Laptops and desktops can be locked down too. The technology is already there. They can also throw in AI for recommendations, i.e. lock down users' minds too. Think what this is going to do to the next generation if they start using electronic products from age 6.
For example, if anyone is interested, check out the computers Chinese government agencies are using right now. They are basically large mobile phones running some sort of Linux, but the whole thing is locked down. Fortunately things are OK on the commercial side, but again it's more and more difficult to root or unlock a device.
And now the Western states are following suit, except it's the corporations that are leading the charge.
If they achieve this, and wipe out all commercial electronics distributors such as Mouser, then we need another underground railroad movement to teach people to scavenge and build computers in that Dark Age.
I'm not joking. This could be real. It's already shaping.
I suspect DRM will eventually be self defeating. For example, I prefer to torrent content just so that I can get stuff to play using my media player of choice (and the instant seeks) without any hassle. Most normal people probably aren't even aware this is an option.
But with cryptocurrencies normalizing, it's only a matter of time before a paid piracy service emerges that is cheaper, simpler, and better than Netflix or any other streamer. Some arguably already have.
DRM was being broken for years without even a monetary incentive, with one it won't stand a chance.
I'm a senior person who looks after content protection and anti-piracy at a major streaming company.
The idealism of those who want to see the demise of DRM doesn't actually hold up in the face of reality. Even when we remove restrictions and give global access to content, for free, pirates don't give up. One of the reasons is that many pirate sites get ad revenue, piracy is a business for many folk and they get the benefit of not paying for the most expensive part. They also don't have legal/regulatory compliance, taxes and will often operate their infrastructure using stolen credit cards or accounts (we can see this).
Then you have people who are selling legitimately and trying to provide the best service for customers, but who have to pay for the content, competing with people who don't have any such responsibilities. So, customers take the cheap deal.
Some folk are also under the assumption that streaming services are money grabbing. Except when you actually look, most streaming services are running at a loss, or barely profitable.
I'm just working to protect our company and reduce losses, ultimately I am not preventing people getting access to fresh food or water. I am protecting premium goods from being illegitimately exploited and protecting the jobs of my colleagues when we're already under significant cost pressures.
One reason I post about these things on the internet is in the hope that one day we might have a constructive dialogue about how to balance freedoms AND enable commerce. But at the moment we have extremism, libertarian ideals against company lawyers.
I'm sure you are aware that there are groups (scenes) which break your DRM as a hobby, they sacrifice device keys for 4K HDR content. And they do it for just the reputation.
More money than ever flows into piracy these days.
Even with complete monolithic control (which is an unlikely objective) over the entire chain from distribution to display there will be a way to obtain good quality output from a hijacked LCD controller if nothing else. There is no win condition for you.
Perhaps it imposes some restrictions, like using TPMs, but I don't think it excludes what the author is suggesting, which is the ability to run as root.
Case in point: every popular desktop PC lets you run as root, and also watch DRM content. They aren't totally mutually exclusive.
You can't play 4K Netflix on Linux, period. Because of DRM. Before you say "this is just a Netflix issue" - you can't play 4K Prime Video on Linux either. Nor 4K Disney+. And many other services. Piracy is the only way to watch most 4K streaming content on Linux. You may have the most capable and up-to-date hardware on the market, you still can't.
Yeah, that realization is what killed my attempt at replacing my Nvidia Shield TV with my home-built SteamOS box. I got everything "working" in the most technical sense, but I was limited to 720p on Hulu, and that ended up driving us crazy. I know that the box is capable of streaming 4K video just fine, because I was able to stream my 4K Blu-ray rips from my Jellyfin server just fine, so this limitation is purely artificial.
I did do some experimentation with VMs and emulation and whatnot, but I never got anything that worked consistently enough to use full-time, so I bit the bullet and plugged my Nvidia Shield TV back in.
There is nothing stopping anyone from selling an HSM (hardware security module) that can decode their protected video without forcing the control into the computer itself.
These are sort of prevented by signing the hardware: you have a module on your computer that creates a web request identifying that this module is present.
OP here. Really glad to see others engaging with this topic, I wrote up this post because I felt like there wasn't anything out there that was advocating for unlocked hardware as part of the discussion on "right to repair".
As someone that works in security, I fully understand the need for sane defaults that protect the average user. I even advocate in the article that we should keep these defaults in place for the most part.
What I tend to not understand is the argument that there should be no option for more enterprising users to access their hardware at the lowest levels because we need to protect the average consumer. It may be a footgun for some, but that's sort of the point. I expect to be able to modify something I own, whether it's to my detriment or not.
My argument isn't that root access should be the default, but at the very least it should be an option. I just don't think it's right that we've normalized corporations blocking the ability to load / inspect software, which often is marketed as a safety or privacy thing, but is arguably more a business decision meant to protect profit margins.
The way to balance security and freedom is with a hardware switch. By default, keep secure boot etc. But if someone opens the case, takes out the battery, and moves a little switch on the board? Start with a fresh, unprotected context. Because it's a hardware switch, it can't be remotely hacked. An adversary who gets the hardware anyway can get control (are we going to pretend otherwise?). So just do the right thing and make it easier for people to take over their own hardware.
> I believe consumers, as a right, should be able to install software of their choosing to any computing device that is owned outright.
While I agree, I think even legislation will not fix this, because what is a computing device, and who decides what is and what is not? I'm sure Apple will argue that nothing they sell should be considered a computing device, while the hacker will consider anything they can trick into running arbitrary code to be one (is your fridge a computing device?).
If we go the legal route, I think the only way is to grant the right to flash the firmware of _ANYTHING_ that has programmable bits, and that's probably not going to fly either, because lots of legislation already dictates that users should be prohibited and prevented from doing exactly that.
> While I agree, I think even legislation will not fix this, because what is a computing device, and who decides what is and what is not ?
If there is legislation, it will contain a definition of what is a computing device and what isn't. It will be imperfect, and the edge cases will be contested in courts. Courts deal with blurry boundaries all the time.
That's how it always is with legal matters, and doesn't mean we have to demand that anything with a firmware must be flashable.
What I mean is that I think this is the fastest way to end the era of widely-available general-purpose-computing devices that we are currently in (and that is currently ending, but at a glacial speed).
It's not that hard to imagine a version of the world where computers as we know them do not exist, but are mere appliances (like tablets and smartphones), and if companies feel threatened that they might be forced to open up their computing devices, they will be quick to make them not fall under the definition.
Instead of a smartphone, you will get a "can telephone and access Facebook and Instasnap" device with whatever technical crippling is needed to make it not a computing device and exempt from the law. And as the general public and the justice system are pretty ignorant with regard to technology, it's going to be pretty resource-intensive to convince a judge why every gadget around that suddenly identifies as "not a computing device" is in fact one anyway.
Just scope the law to any device that can run code, and have the criterion for control be "the user must never have less control over code execution than the manufacturer has after the sale".
So, for example, if someone buys a phone from Apple they will get full control of the entire device (SEP/TEE included) because Apple has the ability to exercise post-sale code execution control to that level (they hold the private keys required).
Does that apply to those biometric readers issued to me by the government? If not, why not? Can I have a root on my car to disable ISA? Why not, if not? Do you see the problem?
I don't really see the problem. I find it perfectly acceptable for people to be able to change every single thing about their cars. If it's an illegal mod you can let the law deal with it.
German here - I do believe this legislation already exists: the owner of a thing has full rights of disposal, and no other entity is allowed to interfere (except for the state itself). This is part of common property rights. AFAIK, property rights in the US are even stronger.
But I wonder why these rights do not seem to be enforced on computing devices. Either everyone is failing to assert their property rights, or I am in the wrong here. Probably the latter.
I have talked about this before. In my opinion the issue goes further and starts to affect property rights themselves. In particular, locked-down hardware undermines the owner's right of exclusion, which is, loosely, the right to include or exclude something from using one's property. When the hardware is locked down, the owner can no longer make those decisions alone. In the case of something like an iDevice, Apple makes those choices instead of the owner, by only allowing code it has signed or signatures it permits.
Thanks for sharing. The loss of property rights is an aspect I hadn't considered… it would be great to brainstorm further with an actual lawyer on these topics.
The problem is larger than that, it's the IT industry's obsession with denying users the ability to evaluate their own risks and take their own responsibility. You do that all the time every day in most other areas of life, but somehow interacting with technology is different. The manufacturer always knows better. Don't want to have a time component to your biometric authentication because you know your risks? Too bad. Google and Apple know better. Password is required to unlock Touch ID.
I doubt any Apple engineer is really against the idea of an iPad user rooting it. It's more the legal and financial mindset: Legal doesn't want trouble and can shoot anyone it doesn't like with law bullets; Finance just wants more $$$.
Yes, please!
Unfortunately, your smartphone doesn't (really) belong to you. It's shared property between the hardware maker, the low-level software producer (Qualcomm or Apple), the OS owner (Google or Apple), and maybe, finally, you.
Undocumented hardware plus closed source drivers for almost everything make all this possible.
The preference has been stated thousands of times. There's nothing to debate. They won't give you root and power. The only question is what you will do to change things:
Do you:
- Buy open devices?
- Sponsor development of open devices?
- Start open device companies?
- Develop open software that competes with walled gardens in quality and ease of use?
- Sponsor open software?
- Use open software?
- Engage in lobbying?
- Drop exploits (that would be worth a pile of gold) to let people jailbreak devices?
At the very least there should be the right to unlock and use a device after it loses support. A whole ecosystem of software could exist (and does in some cases) to help support or repurpose old devices. If the hardware is still good for something, let it be used! I'm still using my MacBook Pro 2013 and it is fine. I worry I will not be able to do this with Apple's newer laptops. In addition, I want to be able to use my Sonos hardware after Sonos inevitably discontinues support. More realistically I'll eventually have to stop using my Sonos speaker, and realizing this I will never buy another Sonos product.
Fully agreed. I was thinking of something similar, only I was calling it the "right to execute", analogous to the "right to repair". I'm buying a general computing device. It's ridiculous that I'm artificially limited in using it, mainly so that someone's shareholders can get rich.
Ideally I'd add a mandatory toolchain to that. At least a C compiler which should be able to target a device I own.
> I believe consumers, as a right, should be able to install software of their choosing to any computing device that is owned outright.
I agree with this, as well as most (or all) of the other stuff mentioned in that article.
However, sometimes it might be reasonable to have a switch inside the case that you must open it up (using a commonly available screwdriver, rather than an obscure one) to flip (and later be able to flip back), in order to enable some functions (e.g. upgrading the EEPROM, or bypassing the secure boot loader). If the user puts glitter over the screw, they can detect tampering, while remaining secure and retaining full control.
In the back of the device there's a sticker over a screw hole. You'll need to poke a screwdriver in there, but inside is not a screw. It's an electrical contact that makes the ROM read-write, which your screwdriver needs to bridge, and stay bridged while the firmware is flashing (it is harmless if the contact is lost in the process, but I don't know whether it would be safe to abort at that point). Pretty hard to convince someone to do that process, yet doable by anyone with a flathead screwdriver.
I guess nowadays it becomes Samsung's e-fuse approach, where flashing custom firmware blows an e-fuse and the status of that fuse is detectable in software. Then apps can refuse to serve people just because the custom-firmware fuse was blown.
No. This does not happen with anything else except phones. If you brick a laptop it is a 30 dollar fix at any local shop. The same cannot be said of any locked-down device.
That's a slippery slope, because they list devices as sold, not as rented. So they can't claim that. Some still try, especially using copyright on software on the devices as leverage.
More than likely, they’ll instead just start actually renting you the products. See Apple’s 2-year rental program for a (relatively) non-predatory implementation, or NZXT’s for a very predatory implementation.
If locking the bootloader and comparing signatures against keys burned into a secure enclave allow Apple to make certain security guarantees that helps them sell products, I'm all for their freedom to do so.
Why doesn't OP merely champion competition, instead of encouraging regulation of what software others can write, what hardware others can ship?
I too am afraid of general purpose computing going by the wayside, and I have the Precursor phone and Raptor Talos PowerPC machines on my wishlist, just as soon as I wrap my head around secure boot chains in general before having to implement one myself. But niche hardware is expensive to produce, so we're likely left with what AMD, Intel and Apple provide us.
I guess one quirk that IMO is fair to criticize is that it's not necessarily consumers who are demanding to be locked out of their administrator privileges (the average computer user is of course not aware of the distinction of signed vs unsigned binaries), so I don't know where the pressure for secure enclaves really comes from. Is it the data centers buying thousands of chips that don't want to be pwned? government customers who refuse to buy a single die if they can't verify the bootloader? Or just patriotic engineers sensitive to a cybersecurity regime that demands we keep our guard up against enemies, foreign and domestic?
> so I don't know where the pressure for secure enclaves really comes from.
The pressure comes from shareholders. User control means users can use their device in a way that benefits them, e.g. blocking invasive tracking. This benefit provides zero or negative shareholder value.
> so I don't know where the pressure for secure enclaves really comes from.
In my experience, security engineers who see them as finally solving the “root of trust” problem. Generally (ime) it’s security engineers/teams that have been pushing for things like ssl/tls, global certificate stores, signed updates of those stores, signed kernels validating those updates. But if you break the kernel (or compromise the bootloader or EFI/BIOS) then it’s all for naught. A secure enclave solves that problem (unless you find a bug in it/its implementation) - your bootloader is validated, which validates your kernel, which validates all the userland components you care about. Security teams rejoice.
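For readers unfamiliar with the mechanism, here is a minimal sketch (in Python, and not any vendor's actual scheme) of the chained verification described above: each stage checks a signature over the next stage before handing over control, so trust only needs to be anchored once, ideally in hardware.

```python
# Sketch only: real chains use vendor-specific formats, and the root key
# lives in fuses or a secure enclave rather than being generated at runtime.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Stand-ins for the bootloader, kernel, and userland images.
stages = [b"bootloader image", b"kernel image", b"userland image"]

# Each stage's key signs the next stage's image.
keys = [Ed25519PrivateKey.generate() for _ in stages]
signatures = [keys[i].sign(stages[i + 1]) for i in range(len(stages) - 1)]

def boot(root_public_key):
    """Walk the chain and halt on the first signature that fails."""
    trusted_key = root_public_key
    for i in range(len(stages) - 1):
        try:
            trusted_key.verify(signatures[i], stages[i + 1])
        except InvalidSignature:
            raise SystemExit(f"stage {i + 1} failed verification; refusing to boot")
        # In a real chain the verified image carries the next public key;
        # here we simply advance to the next signer.
        trusted_key = keys[i + 1].public_key()
    print("all stages verified")

boot(keys[0].public_key())
```

The point being made above is that if the bottom of this chain can be broken, everything above it is moot, which is why security teams want it anchored in something like a secure enclave.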
Because the phone market is a duopoly that was impenetrable even for Microsoft. This "why don't you just make your own Apple?" argument is ridiculous; it will not happen no matter how much you "encourage" it as a consumer. You even contradict this yourself by saying "we're left with what <megacorps> provide us".
I'm actually confused about why banks are so aggressive in denying users the ability to use their apps while rooted. Unlike Google and Apple I can't think of any financial incentives for this, and the security argument is quite obviously nonsense, as I don't think there has been a single person in history who managed to fall for a scam that made them follow the complicated procedure of rooting a smartphone. Nevertheless there is a clear continuous effort in developing new root detection methods to keep me from using their apps.
I believe the root detection is a form of security-by-obscurity. Bank applications are required to be obfuscated, so you can't simply statically decompile them. The other way to do that is to run the app and set runtime breakpoints, which you can't do on production firmware.
Once the application is decompiled, the attacker can then proceed to pentest the bank's backend, or find frontend-only security measures to bypass. One attack I heard about in the local news wasn't even a hack at all: they simply wrote a script that used the mobile application's API to automatically move money between sock-puppet bank accounts. Once a victim gets scammed, the money moves around quickly. For privacy reasons, banks do not provide information about unrelated cross-bank transfers, so even the police can't easily trace the multiple hops. That specific bank made the news for that "weak security".
Security of banking shouldn't depend on the client software; it should be enforced at the interface the clients use to talk to the bank. It shouldn't matter whether the banking app can be disassembled or not. As much as I detest browser-based authentication in general, online banking websites got it right: you just use a browser (and it's in your best interest to use a trusted browser -- one trusted by you), but all the bank cares about is that the user has the necessary pieces for authentication, be it numerical codes, passwords, or 2FA tokens. The browser doesn't have to be a bank-signed edition of MS Edge; it can be Firefox or even a browser you wrote yourself. But a banking app is basically a black box that you have to allow to run on your system in order for the bank to talk to software the bank itself trusts.
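As a minimal sketch of what "the bank only cares about the authentication pieces" can look like on the server side (a standard RFC 6238 TOTP check, not any specific bank's scheme), note that nothing below depends on which browser or app submitted the code:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over the counter, then dynamic truncation."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_totp(secret: bytes, submitted: str, step: int = 30) -> bool:
    """Accept the current 30-second window plus one step either side."""
    now = int(time.time()) // step
    return any(hmac.compare_digest(hotp(secret, now + drift), submitted)
               for drift in (-1, 0, 1))
```

The client can be any software at all; the server's assurance comes from the shared secret and the account's other credentials, not from attesting the client binary.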
> to fall for a scam that made them follow the complicated procedure of rooting
If you are unable to imagine how a 3rd party might root a device without the principal being aware of it, then maybe it is a shortcoming of your risk survey, not theirs.
Rooting an Android device generally requires completely wiping it and reinstalling the OS. It's quite impractical to do secretly!
I think in any scenario where a third party can do that without you noticing (which means things like reinstalling and logging you back into all your apps, logging the device into your Google account successfully, restoring all your device settings, re-adding your fingerprint or device PIN to unlock the device, etc.) then it's game over regardless. If they can do that, they could get into your bank app anyway, or they could easily just replace your phone with another one entirely, and now you're just logging into your bank on a stranger's phone.
Barring a _very_ major Android zero-day (which probably would evade attestation anyway) unexpected rooting of your device is really not a plausible attack scenario.
I'd like the capability to run any application in a tight container that sees absolutely nothing of what's on my phone. I could give it a real, fake, or filtered network if needed, and anything else the app sees, like contacts or files, would look like a real phone but originate from a fake null source. There's mutual distrust between users, manufacturers, and application vendors, and technology can solve that.
Namely, that's what I do with proprietary software on my desktop. Nothing that's closed runs with access to my files. Further, a banking app shouldn't need to know I'm running a rooted device. For some reason, I can do banking with an open source browser on a rooted phone just fine. It's just the proprietary blob that comes with TPM shackles, and I think I should be the owner of those shackles because I own my phone.
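One way to do this on a Linux desktop is a sandboxing wrapper such as Firejail; a minimal sketch is below (the binary path is made up, and the parent commenter may well use a different tool):

```python
import subprocess

# --private gives the program a throwaway home directory, so it never sees
# my real files; --net=none removes network access entirely.
subprocess.run(
    ["firejail", "--private", "--net=none",
     "/opt/example-proprietary-app/app"],  # hypothetical binary path
    check=False,
)
```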
What's even worse is that some won't even let you take a damn screenshot. "Disabled by your administrator." If that doesn't scream that my device is in fact owned by someone else, I don't know what does.
And I'd go even further and limit the ability of devices to not be "owned outright", since that sounds like a loophole. I do not want a EULA interfering with these rights.
You don't have to enforce it for every single manufacturer and every single product. If you enforce it for Apple, Samsung and a couple other big Android brands, you'll get most smartphones.
Now whether it is a good idea or not, I don't know. But if you wanted to enforce it, I think it wouldn't be too hard. Banks get a ton of regulations and have to apply them all the time. Surely Apple or Samsung could do that, too.
I think a definition would exclude µCs, DSPs, ASICs, FPGAs, etc. While most of those can be reprogrammed in-system, not all parts support such features.
And for external programming you need programming devices, which would also need physical interfaces for which no standard connector exists.
But a definition of general-purpose computing devices wouldn't be harder to craft than the legislation that requires my banking app to run in a certified shitty app environment.
Right -- and that I don't agree with. My concern is laws that make it illegal to modify products you own or to offer advice or tools to people who wish to modify products they own. Similarly, arrangements like leases and limited licenses that convey some aspects of "ownership" but not complete ownership should also be severely restricted.
In general if there are no legal recourses to go after people who modify their devices, then it makes it more appealing for manufacturers to allow a supported opt-out method because that reduces the incentives to compromise their products with out-of-band methods.
So I'd prefer to take an indirect route of empowering customers rather than requiring manufacturers to do additional work to free up their products.
> The main exception to this, I believe, would be for critical systems where compromising operation through software modification presents too high a risk. Examples I'm thinking of include:
> certain medical devices, such as implants and insulin pumps
> subsets of electronic control units for cars
These are precisely the opposite of what should be exceptions. If you have a pacemaker implanted in your body, you need the right to replace whatever software is running on it before the manufacturer goes out of business and takes the signing keys with them.
There exists no thing where the owner of the device should not have the right to replace the software it runs, and the more safety-critical the device the more important the right.
Disclaimer I don't have a pacemaker: As a biohacker, I think this is a really bad take. I regularly put things [1] in my body of questionable provenance, and then cut them out of myself without anesthesia when they don't suit me anymore, but I like being alive too much to mess with a medical device like a pacemaker. Pacemaker hacking sounds hardcore and like, respect to anyone who does it but I don't think it should be easy to flash one of those things because I'd really prefer nobody do that to me in my sleep.
You pretty obviously don't want to mess with it frivolously. But if there's something wrong with it, and you have to fix it? That seems better than the alternative where you can't. Note that the right to modify it doesn't imply that you're required to in the absence of any reason to.
Also, if someone wants to kill you in your sleep, they... don't need you to even have a pacemaker. And the security of medical devices is notoriously bad, so if you're worried about that sort of thing, be more worried that the status quo doesn't allow you to fix the existing remotely exploitable wireless security vulnerabilities.
That's just it though, in my opinion being able to flash the thing at all would count as a remotely exploitable wireless security vulnerability. The first thing I'd do if mine was flashable is lock it down to make sure it was no longer flashable. Does that make sense? I might not be articulating myself well here.
Removing the ability to flash it seems like a bad idea. Suppose they find a bug and release an official patch. You want it so you don't die, right? The alternative to flashing the one that's in you is that you need chest surgery again to replace it.
If it has a mechanism to flash it then they can give you the password for yours so that you can always do it yourself (or have someone do it) in the event that the manufacturer goes out of business before anyone finds the bug.
And if you really want to remove the ability to flash it, you could use your right to flash it to remove that feature, whereas the status quo is that it supports it -- insecurely -- and you aren't allowed to change it.
Reminds me of how certain macOS APIs are gated behind the paid Apple Developer Program, so you can't, for example, write a macOS app using Network Extensions on your own Mac until you join the program (100 USD/year in the US). I understand why they do it, but it does feel weird that I can't write certain code for my own Mac unless I pay for a subscription (on top of already having paid for the Mac).
Not sure if you're in the US, but we can't even get net neutrality. Unfortunately the likelihood of this is a hell-freezes-over situation.
I would start with "laws should be logical and informed" and go from there... the number of prerequisite changes required to come even mildly close to this is unreal, including but not limited to: copyright law, insurance law, patents, contract law, federal vs state law, an agency competent enough to enforce this, lobbying from the most powerful companies in the world, and more.
I agree states can change laws more quickly, but that's a far cry from what's proposed here. Can you imagine enforcement of such a law? Can you imagine a state employee checking for root access? You break at least five California laws an hour there just walking around, as the bureaucratic machine consumes any reasonable enforcement. This would be considered absolutely niche, and no politician would likely care because it won't get them reelected. Call me cynical, I guess, but our track record of change for the better is pretty poor in this arena.
In my dream we abolish copyright altogether, but alas we live in a profit-oriented society, not a knowledge-oriented one.
Let's not have more of a single state passing de facto laws for the entire country, please. You're right that this happens, but it's an awful thing that we need to fight, not promote more of.
Computers have sharp edges, and people often throw hissy fits when things break. Manufacturers infantilize consumers to dodge lawsuits. Then companies like Apple take it too far by building walled gardens where devices don't have proper file systems, locking you out of your own files.
I used to think this way but then I saw how non-techy people use their devices.
Something like this would inevitably be abused and result in a wave of malware so massive that it would render the internet too hostile for all but the most careful, knowledgeable, and paranoid users.
How do you propose we limit access to general-purpose computing without doing so across the board for everyone? How would you block bad actors from general-purpose computing? Require licensing to use development tools, and track what they're used for so you can react when someone crosses that "line"? Congratulations, you've just destroyed computing.
While obviously not perfect, what we have today is
> limit[ing] access to general-purpose computing without doing so across the board for everyone
I imagine the vast majority of people on this site run (or at least have used) Linux, *BSD, etc. on a daily basis. No average person is going to set up Arch on their main PC, but lots of us do. Your average person enjoys their locked-down Samsungs and iPhones, while we enjoy unlocked problem-ridden (but personally solvable) Linux environments.
Would it be better if every person who could buy an iPhone was sufficiently technically competent to not install malware on their easily-rootable phone? Yes. But that’s not the world we live in. Maybe I’m succumbing to the “First They Came...” mindset, but until I can’t run Linux on a PC I’ve built with components of my choosing, I don’t really care if my phone is a semi-closed semi-black box. I used to enjoy jailbreaking iOS devices, but eventually decided I’m happy with my phone just being a phone. If I want to tinker, I’ve got a rack full of servers and 3 different workstations I have the freedom to break whenever I want. It’s nice to have a phone that can just do phone things 100% of the time - I haven’t had an iOS system crash (or any bug preventing use) in about 5 years, which was the last time I had a jailbroken phone.
Even with access controls, people do things like download Chrome from random web sites, then do their banking with the result. If the fake Chrome requested admin access, then you'd never be able to trust anything on that computer ever again. Even re-installing the OS wouldn't fix it.
Would you prefer for your technologically illiterate relatives (think grandparents/etc getting their first computing device):
- A computer that is compromised by malware
- A computer that doesn’t permit the user to install malware, and as a consequence, possibly alternative operating systems
Your phrasing implies that “it would no longer be our computer” is equivalent to “one that’s not ours from the start.” As far as I know, Microsoft and Apple aren’t going to ransomware your computer/phone to make a few bucks. You just can’t root an iPhone. Equating the two is arguing in bad faith, at best.
> A computer that doesn’t permit the user to install malware
Let me know when that's ever invented. It's certainly not iPhone. The crApp Store is full of scams and ripoffs, costing consumers millions if not billions of dollars.
It's also worth noting how much goodware users are not permitted to install, due to vendor lockdown and arbitrary restrictions, often motivated purely by the desire to squelch competition. Security, or in this case security theater, always has tradeoffs. Unfortunately, consumers often don't even know what they're missing, because the vendor restricts what they're allowed to see, but we developers know what kind of software we can't make on locked-down platforms.
Vendor lockdown is a tool for authoritarian regimes that enables censorship. For example, these regimes force the vendors to remove VPN apps from their "curated" stores, and since sideloading is forbidden, there are no alternatives for the poor users under these regimes.
> Would you prefer for your technologically illiterate relatives (think grandparents/etc getting their first computing device):
Ah, the good old "think of the grandma".
HN's version of "think of the children". And just like the conservative pearl clutchers, I don't think there's much sincerity in those thoughts. You are more afraid of the possibility that there would be less users to squeeze through adtech and crapware sold on appstores. Most HNers make a living from making the world an even worse place.
There's a lot of devs here very happy to have a captive audience of people too ignorant to know better, either exposing their eyeballs to constant onslaught of ads on "free" apps, or paying for really basic functionality.
I will never forget the Show HN post about selling a clone of HandBrake, a program that exists just to set a few flags on FFmpeg, and making a living out of it, because the Apple audience has been brainwashed to take out their wallet for the dumbest of things.
The average value add of people constantly bemoaning the idea that there could be regulations to stop them from enslaving new generations to tech is lower than that of scammers.
No, I don't think that's true. We got used to insecure systems, and then accepted Big Brother as a security model. We can have secure devices that aren't owned by a corporation, but judging by the comments section here, nobody knows that.
How might that work? You personally have the keys to the TPM? Then some confidence trickster will tell a naive user that to make big$buck$ on the internet you'll need to hand over your TPM key. And people will.
If the user is doing banking on fake-chrome then admin is pointless; https://xkcd.com/1200/
> then you'd never be able trust anything on that computer ever again. Even re-installing the OS wouldn't fix it.
Why not? If the hardware is under user control, they just reimage the firmware, reimage the OS, and then it's clean. (Or, in practice, perhaps they hire the local computer shop to do so, but I don't think that changes anything.)
Let me clarify: if you have an unlocked device, then software vendors should be able to ensure that their software is non-functional on such a device.
Given that, anything very useful would be rendered non-functional, resulting in the device probably being useless.
Because it’s their software? It is well within your bank’s rights to deny you access to their online banking system for pretty much any (technical) reason they choose; why are you entitled to run their app on what they deem to be an insecure platform? If you don’t like it, either pick a different bank or deal with not having access to their software.
Freedom cuts both ways here; if you want absolute freedom to do whatever you want with your device, why should software vendors not have absolute freedom to choose what platforms their software is permitted to run on?
> why are you entitled to run their app on what they deem to be an insecure platform?
For starters, because they're wrong, and they're wrong in a way that makes their users less secure. Allowing the use of Windows or Android with open CVEs while blocking completely up-to-date Linux or aftermarket Android ROMs clearly shows that this nonsense is contrary to security.
> If you don’t like it, either pick a different bank or deal with not having access to their software.
And that's the next biggest reason: Customers don't have the same amount of power that the companies have, so it's perfectly reasonable to tilt things in the customer's favor.
They do have that right, as long as they do not sabotage the integrity of the system. Personally, I would quickly switch to other software developers and banks. Competition would sort that out quickly.
Today, banks don't want to be liable for user errors that lead to scams. They adhere to such policies out of fear, but the policies don't help, as the most prominent victims of scams are still the same people, open system or not.
The vendor can install a Trojan horse on every phone, then activate it silently when its customers do something suspicious. It's also a very useful feature for law enforcement agencies and foreign spies.
From a consumer rights and environmental standpoint I agree. But it's important that only the actual owner of the device can do this, and not just the person in possession of the device. You don't want to make stolen devices be anything but paperweights. But so long as I (the owner) select the passphrase for unlocking it I don't see a problem.
Personally, I think everything should be hackable, however...
Limiting the ability to _easily_ modify what's running on a system is more about public cyber-health than the individual's freedom. Viruses + malware much more easily infect systems when they are running outside of a sandbox.
The argument that an iPad runs substantially the same OS and hardware as a MacBook weakens the author's case instead of strengthening it.
You can buy from Apple a computer that's locked down (an iPad), or a computer that's not locked down by the author's definition (a MacBook). It's a matter of consumer choice, not the company insisting on control of your devices.
The non-locked-down machines come in a different form factor than the locked-down ones - they usually have a physical keyboard and a larger form factor to accommodate that. This is partly for historical reasons, but very largely also for consumer choice - and also it makes sense that on the more flexible machine users are more likely to need a keyboard. This is all fine, you can't expect every company to sell their products with every combination of features.
I find more convincing the arguments about e-waste, but they need to be framed like that: sometimes we should mandate consumers get something they don't particularly want, for the greater good.
I don't really agree with this. There's no shortage of computing platforms in a variety of form factors (including tablets) for which root access is possible. When you buy an iPad, you do so knowing what you're getting and what you're not getting. It's a truly optional purchase, because no one really needs an iPad.
I'd be concerned with a move away from root access across the board, but that doesn't appear to be happening.
Every time this issue comes up, an army of people who've never unlocked a phone comes out of the woodwork to talk about their theoretical fears of malware and piracy and rain falling from the sky if you dare to own your device.
If you've never unlocked a phone, please educate yourself on how the process works before opining. It's really not as terrifying as you imagine.
> Root access refers to the highest level of privileges a user can be granted to a computer system.
This is no longer true. You might have root access on your smartphone, but you still don't have access to the TEE (on ARM this is implemented using the "TrustZone" "feature").
Also, AVF (the Android Virtualization Framework) is coming to Android, and protected VMs won't work with an unlocked bootloader... so expect the situation to deteriorate further once manufacturers make use of pVMs.
Yeah, I mean there's not even any way to really know what you're communicating with without taking the thing apart and following the traces with an oscilloscope. If it was worth obfuscating to a determined company, it would be hell to figure out.
You could just be root in a sandbox that is mapped to a guest user in a higher namespace.
There's not nearly enough awareness of this. Even with root access, on modern Android you have to set up a virtual USB connection just to get at the files in the data folders of Android apps, if you want to, for example, sync the savegames between your mobile emulators and your desktop emulators. It's fucking disgusting. With every new release they shave off a little more user agency.
A PC can access those folders, but even with root and "all files access", Android file managers can't on recent versions of Android. Shizuku or apps like it allow file managers to access those folders by pretending to be a PC, folders like the contents of Android/data for each app. Without it you just see empty folders. It's ridiculous.
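For context, the "pretending to be a PC" trick works because the adb shell user still has this access; a rough sketch from a desktop (the package name is hypothetical, and exact paths and behavior vary by Android version):

```python
import subprocess

# Over adb, the shell user can still list per-app folders under Android/data,
# which is roughly the level of access Shizuku lends to on-device apps.
pkg = "com.example.emulator"  # hypothetical package name
subprocess.run(
    ["adb", "shell", "ls", f"/sdcard/Android/data/{pkg}/files"],
    check=False,
)
```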
If I buy my grandma a device that is locked down in a way that NOBODY (including her, under the patient guidance of a scammer on the phone) can root, that is my choice. If you want a device that can be unlocked, the world is full of android tablets. Go wild. Some of us want an un-rootable device!
> Devices that are locked become e-waste once a manufacturer stops supporting them. This keeps happening like clockwork:
> Spotify’s Car Thing
Contrary to the author's claim, Car Thing is a great example of what can happen with abandoned hardware. The device did not become e-waste when the manufacturer stopped supporting it. There is a lively community of people modifying and updating software and doing really interesting things. I lack one only because I missed the $15 price nadir and they are relatively high priced in the secondary market.
What the author calls "double standard" could be reframed as "consumer choice." A vendor sells different devices. Some have provisions to make changes to the bootloader and some don't.
The phrase "have provisions to make changes" is on purpose. It has not yet been proven to me that a change to an iPad's bootloader function is impossible. It certainly isn't as easy as that of Mac, but the skill/effort required is a gradient.
This is similar to "soldered" storage. This was commonly thought of as impossible until it was demonstrated that a Mac will happily accept updated storage changed out with a hot-air rework device. This method is certainly higher skill/effort/risk than remembering when to terminate 50-pin SCSI, but it shows that when a hacker has a will, there is a way.
Is it ironic that as computing devices become easier to use, they also require higher skill to fix and modify? No, more likely there is an iron rule that when a device's external complexity is contained, the inner workings of that thing become more complex. Complexity did not decrease but was hidden.
Does a grandfather clock and a tourbillon wristwatch encapsulate the same/similar general principles of timekeeping? Sure. If one has the skill to update parts of the grandfather clock, are those same skills sufficient for the wristwatch? Probably not. Should wristwatches be banned because people who update grandfather clocks do not have the skills to modify them? Surely not, that would be absurd.
Likewise, demanding root in a form you find acceptable is absurd. If you can't take root on a device you possess, it's a skill issue for you to address, not the vendor's.
Contrarian take: you bought the device, which you knew did not provide that, from a company that has priced in not having to support rooted devices and has priced in your future revenue from extras. The company can't complain if you find a way to root it (and they don't), but they're under no obligation to add this extra feature you're asking for. If you want a mostly-open handheld device, they're for sale; you should buy one of those.
I agree. The gist of the main arguments I see in this thread are that average people should be trusted to choose whether or not they will unlock their device. Yet, a group of (presumably) some of the most technically savvy people on the internet can’t figure out how to buy open products? If you don’t want a locked-down computer, don’t buy a Mac. Clearly, many, many people do, so why should they have that taken away from them because of someone else’s ideals?
It is possible for something to still be very good despite restrictions (e.g. m series MacBooks). It’s also reasonable to “complain” if the device you purchased isn’t truly in “your” control (eg forced OTA updates).
Also a minor nit:
> Yet, a group of (presumably) some of the most technically savvy people on the internet can’t figure out how to buy open products?
This isn’t necessarily fair. When I bought my iPad (my first apple device) I could try it at an Apple store where there’s only so much you can realise. How are the speakers? Is the screen bright? How quickly does YouTube/websites open?
What I did not realise until much later was that even something as “basic” as downloading a couple of MP3 and playing them is bizarrely hard on iOS. Or that alarms have no option to gently rise, and are designed to give you a heart attack. Or that you can’t even set your own song/music as an alarm (unless you’ve paid apple money for their music services, very conveniently).
Would I have still bought an iPad if I knew all these? Maybe. But maybe I’d have gotten a basic model. Or maybe I’d just use an android tablet - the details are irrelevant here. But there’s a lot of things about an OS that are obscure.
Correct me if I'm wrong, but the base AOSP Android ROM allows bootloader unlocking by default. OEMs need to go out of their way to restrict it.
Also, most manufacturers consider the warranty void if the bootloader has been unlocked. There are definitely no service costs for them on that front, and I'd be amazed if even 0.1% of all customers unlocked the bootloader (for it to be big enough for any company to notice).
> you bought the device, that you knew already did not provide that, from a company who has priced in not having to support rooted devices, and who had priced in your future revenue from extras.
This argument falls apart when 99% of the desirable devices do not have this option. It's not even about compromising on optional-but-important features, like having access to your bank - the "1%" devices usually do not even have a secure hardware element that handles full-disk encryption, leaving your most-personal data (that you carry with you everywhere you go, thus exposing it to additional risk) vulnerable.
> If you want a mostly-open handheld device, they're for sale, you should buy one of those.
Yeah, almost none of those are actually compelling. There's the Pixel series and GrapheneOS, but these devices are huuuge (they simply don't fit in my single hand!), and I don't want to give even more money to Google :S
In an ideal world, you should be able to simply root any of your devices on demand, in exchange for wiping the storage clean, losing your warranty, and any expectation of protection/privacy/extra features. Then it's up to you (and/or a third-party OS provider) to take care of that yourself.
The problem with that route is that only a tiny fraction of users are actually interested in that, there's value to lose in accidental rooting (users get angry about lost features), and there's no value to gain.
I'm always amazed at the number of folks who are really, really mad that there exist products and services not intended for them.
If you are a very technical person, and you want to have root on every device you own, then iOS is not for you. That's ok. Android exists!
But iOS as an appliance-level, walled-garden environment is absolutely perfect FOR MOST USERS. And that's fine.
Nursing a grudge because Apple makes products that include choices that YOU PERSONALLY don't like is incredibly weird and entitled. Just buy something else! There are options!
A libertarian and consumer-friendly right-to-repair (RTR) / right-to-own (RTO) governance model could be something like the Orthodox Union or UL, but for consumer devices.
The Right-to-Repair Union (RTR-U) could be a simple authority with access to the keys to unlock the device if the vendor breaches certain commitments. Various levels of commitment could be offered, similar to copyleft. The basic / lowest level would be "can unlock if the company dies". Higher commitments could be "will unlock if":
- the company starts telemetry
- the company changes licensing
- the company stops providing timely firmware updates
This way consumers are guaranteed a certain quality of service and access on their devices. Then vendors get a stamp of approval (like OU or UL) with a level of certification such as RTR-open, RTR-private, RTR-long-term-support, etc.
This governance operates within private enterprise while consumers are offered the option to buy into vendors who commit to right-to-repair and right-to-own.
We should thank all the major copyright holders for that. They are the reason many devices have been crippled, because for them the real enemy is the user - that is you, my dear friend. They assume you to be a criminal by default, you know, like some crazy feminists assume all men are rapists just because they have a penis. Chop your penis off to prove you are not a rapist! Use your device without root access to prove you are not a filthy pirate who is "stealing" money from poor copyright holders who can barely afford the women, yachts and coke they are used to.
The amount of hand-waving in this comments section is truly outstanding. Then again, it shouldn't really surprise me considering how many HN users work for the very companies that profit from vendor lockdown. "You will own nothing and you will be happy!"
That is absolutely not what the author is saying here, just that users should have the ability to install their own software on their own hardware, and that locked bootloaders and the like should not be allowed.
Yes, that is exactly what the author is saying, wanting government regulation. For one, the government won't simply say "keep the device open for the end user" (or a "smart" government wouldn't), but even if they did, you've now opened that device to any attacker, law enforcement included.
So because I can sudo on my computer, my computer is open to any attacker? What's wrong with this comments section? Has anyone here used a computer before?
What does the operating system have to do with an attacker? What's wrong with this comments section? Has anyone seen Israeli offline attacks of iPhones before?
I want to install my own software on my smartphone. However, I don't want others with physical access to be able to do this... and here we hit a problem, because if the device allows extracting the data, brute force becomes feasible.
Also, I don't want others to be able to use my phone after stealing it. Here the FRP lock helps me, but in order for it to work it must also limit how I can use the phone.
I wish we stopped falling for this technical trap and finally focus on the substance - hardware/software companies get away with anti-user features that run on the user's device. This shouldn't be allowed.
I don't need an unlocked bootloader, I just don't want the preinstalled Google spyware. Google should not be allowed to hold my device hostage like this.
For some, the absolute locked down-ness is a selling point. Why should those who want to buy something that can't be messed with not be able to?
If you don't want to buy something you can't install whatever you want onto, don't buy it. 100% the ability or inability to modify the firmware of a device should be disclosed, but if it's disclosed the seller should be able to set the policy to whatever they want
This is an extremely weak argument, and I'd like to stop seeing it perpetuated. If you don't want an unlocked bootloader, just don't unlock your bootloader. Why should we remove the ability to unlock the bootloader entirely just because some people don't want to use it?
Because the fact that it can't be unlocked gives me reasonable assurance that the software running on it comes from the vendor of the device.
It's the same reason I don't want "the good guys" to have decryption keys to my messaging service, because even if I did trust the FBI, the fact that there is a backdoor at all means it could be exploited by someone I don't trust
Again, if you don't want to use a device that has a locked bootloader, don't buy it. I fail to see how this business model should be legally foreclosed upon. You'll always have the option to buy a device that can be unlocked, someone will always sell such a device. But if you can't lock them, then I can't buy one even if I want to
Phones with unlockable bootloaders aren't going to be sold for much longer just like dumb TVs aren't sold anymore. There's just too much profit to be earned by corporations locking devices, plus banks and governments want to lock down phones. And once they lock down phones they'll go for desktops as well.
Maybe in the US, but not in my country. I tried looking for "signage displays" but all I could find was Samsung professional monitors that still had the smart stuff
Yeah, this is just a fundamental misunderstanding of how bootloader unlocking works. The people repeating this argument seem to think that their bootloader will unlock if they look at their phone wrong, when in reality the bootloader unlock process can be made such that the user must consent. If some malware can bypass that, then it could bypass your bootloader in the first place.
It's not just about malware you might accidentally download, it's also about adversaries that may have physical access to your device and can provide that consent
No matter how convoluted you make the Rube Goldberg machine to bypass the cryptography, if there's a way to bypass it, it will be bypassed.
There are ways to do it so that 'bypass' means you effectively wipe the device. If that's not good enough, how do you protect against them just replacing your device with a compromised one that looks similar?
> it's also about adversaries that may have physical access to your device and can provide that consent. No matter how convoluted you make the rube goldberg machine to bypass the cryptography, if there's a way to bypass it it will be bypassed
You claimed that an adversary with physical access to your device can compromise your unlockable phone, but presumably this won't happen with a phone that can't be unlocked. Is that not what you claim? If so, please detail how.
I was talking about a device with an unlockable bootloader, not one that cannot be unlocked
Wanting an uncompromisable bootloader is about more than just protection against malware that might modify the software on the device; it's about protecting a phone that can be unlocked from having its software modified by someone with the ability to provide the consent that the end user would normally give. For example, when I hand my phone over at customs, or if it's seized by the police. If my bootloader is not unlockable, I haven't provided them with the keys to unlock the software, and those keys are reasonably strong, then I can be reasonably confident they haven't compromised my device.
But, if they can unlock the bootloader for whatever reason, I have no idea now what is running on the device or what was run on it even if they restore it back to a locked condition
This is why I had mentioned in another comment that it might make sense to require opening the device with a screwdriver to enable/disable some features, and that you can add glitter or something like that if you want to detect physical tampering.
Every device I've ever unlocked warns you on boot that it's unlocked. So if that's your threat model, just reboot the phone after the maid hands it back to you and see if you get a scary warning.
At least historically, that wasn't always fool-proof :-) – I know at least some Motorolas from around ten years ago where the bootloader warning was simply an alternative boot animation, so you could suppress that message by overwriting the "bootloader unlocked" animation with the regular boot animation.
> If you don't want an unlocked bootloader, just don't unlock your bootloader.
That kind of logic cuts both ways: "If you don't want a device with a locked boot loader, just don't buy a device with a locked bootloader".
Unfortunately, as consumers, we're trapped between a rock and a hard place. On the one hand, I would want 100% freedom to use my device exactly as I see fit and run any software I want, without any form of curation from the manufacturer.
On the other hand, there are plenty of software companies who do shitty things when given absolute freedom over what to do in a user's device (tracking / spying / etc) and I welcome buying a device where the manufacturer helps me fight some of that.
So I can absolutely see both arguments. And I think both types can coexist. I am happy my iPhone doesn't allow Meta to say "to use WhatsApp, you must install the MetaStore®, give it root and install it from there".
I would not be happy with those restrictions on my desktop.
> I am happy my iPhone doesn't allow Meta to say "to use WhatsApp, you must install the MetaStore®, give it root and install it from there".
I think the inverse is a much more credible threat, though. "Sorry, you can't sign in to your bank because you are using Linux. Please try again on Windows 11 with Secure Boot turned on" doesn't seem far-fetched at all.
> "Sorry, you cant sign in to your bank because you are using Linux."
That's not a hypothetical for us here in Brazil: online banking was Windows-only for quite some time, because there was no Linux version of the invasive "security plugin" banks require for online banking (the current version of that "security plugin" does have a Linux version).
Not that I would agree with such a policy (I currently do online bank using Linux), but why is it not within the bank’s rights to make that restriction? If they determine (with whatever degree of accuracy) that online banking from Linux/rooted androids/jailbroken iphones is too risky, why should they be required to allow it?
I don't think I asserted that it isn't within their rights. But this is the direction things are headed, and it is a threat to free and libre computing.
> I am happy my iPhone doesn't allow Meta to say "to use WhatsApp, you must install the MetaStore®, give it root and install it from there". I would not be happy with those restrictions on my desktop.
You fix that by making root access inconvenient enough that companies can't rely on the average random user having it enabled.
For example force you to wipe the device to unlock it as another person said in another comment. Or make it so that if you don't unlock it within 7 days of the device purchase and first boot, you cannot unlock it anymore.
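As a toy model of the policy described above (all names and numbers here are illustrative, not any vendor's actual behavior): unlocking requires explicit physical confirmation, always wipes user data, and is only offered within a fixed window after first boot.

```python
import time

UNLOCK_WINDOW_SECONDS = 7 * 24 * 3600  # e.g. one week after first boot

class Device:
    def __init__(self):
        self.first_boot = time.time()
        self.unlocked = False
        self.user_data = {"photos": [], "messages": []}

    def request_unlock(self, physically_confirmed: bool) -> bool:
        if time.time() - self.first_boot > UNLOCK_WINDOW_SECONDS:
            return False  # window expired; no remote path around this
        if not physically_confirmed:
            return False  # e.g. requires holding a hardware key combo
        self.user_data.clear()  # mandatory wipe, so unlocking never exposes data
        self.unlocked = True
        return True

    def boot_banner(self) -> str:
        # The consequence can't be hidden: every boot shows the state.
        return "WARNING: bootloader unlocked" if self.unlocked else "verified boot"
```

The combination of the mandatory wipe and the persistent banner is what keeps a socially-engineered unlock from silently compromising an already-provisioned device.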
> You fix that by making root access inconvenient enough that companies can't rely on the average random user having it enabled.
AI TikTok voice “Hey guys, if you just bought a new iPhone, make sure you remove Apple’s restriction locks so they can’t control what you install. Just follow these easy steps, but make sure you do it as soon as possible, since you’ll have to set up your phone again!”
With the comments filled with people talking about how terrible Apple is for locking down their phones, everyone’s an idiot for buying such a locked down phone so they better at least unlock the bootloader, etc.
This is not a far-fetched scenario based on some videos I’ve seen sent to me by friends.
Don't forget in the video to tell them that it will allow them to install apps that get them more performance, better battery life, better cell signal, etc.
I would also be happy with those restrictions on a traditional PC-class computing device (laptop or desktop). Would I personally buy one? Probably not, but I'd feel a whole hell of a lot better if my non-techie wife or mother or brother were using one and they were no more susceptible to some kind of exploit on their PC device than they were on their phone
I could see Microsoft saying "we're only allowing apps installed through our 'store', for safety/security reasons, unless you opt out" (gated by some scary warning that doing so is unsafe).
Even if they never charged a fee for running the store, I bet this would raise a lot of eyebrows.
They are on your desktop. Have you tried installing any game you bought through Steam lately? They all install a custom launcher / updater / stuff that ends up in startup.
Literally zero games I own on Steam have any startup items. Custom launchers, yes, but not startup items. A few games have kernel anticheat, but they all start with the game.
The exception is FaceIt for counter strike, but that’s not distributed through steam and is entirely third-party.
I think that a suitable operating system design can help avoid some of these problems (and others mentioned elsewhere); I have mentioned some of my ideas about operating system design before on Hacker News. In combination with this, there is also hardware design to consider (including considerations having to do with the instruction set). You can also have a package manager with a package repository where whoever manages the repository verifies the packages, something that is already done in many systems, although the verification that is done today is often not good enough in some ways. This repository management is not actually necessary for the security features of the system, but it makes it more difficult for authors of programs to work around those security features.
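A minimal sketch of the package-verification part (using an Ed25519 detached signature; this illustrates the general idea, not the parent's specific design): the repository maintainer signs each reviewed package, and the client refuses to install anything whose signature doesn't check out.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Repository side: the maintainer reviews a package and signs its bytes.
maintainer_key = Ed25519PrivateKey.generate()
package = b"...package archive bytes..."
signature = maintainer_key.sign(package)

# Client side: only the maintainer's public key ships with the OS.
def install(pkg: bytes, sig: bytes, repo_public_key) -> None:
    try:
        repo_public_key.verify(sig, pkg)
    except InvalidSignature:
        raise SystemExit("refusing to install: signature does not match")
    print("signature ok, installing")

install(package, signature, maintainer_key.public_key())
```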