Microsoft Tells FTC Repair Poses a Cyber Risk (securepairs.org)
177 points by walterbell 9 months ago | 118 comments



Right to repair is just one of the many cases of "security" being used as an excuse to take away freedom, and it's one of the ways the corporations and governments (nearly the same thing, really) gradually build their dystopian vision of complete control over the population. It's so alluring, because "who doesn't want to be safe and secure?" On the surface it's appealing, but if you think about it, much like "a world without (cyber)crime", the ultimate result of striving for perfect security is dystopia.

Then, much as perfect freedom is unattainable, so is perfect security; and I think it's time we rebelled against this desire to achieve it. Unfortunately, it seems desiring any sort of security is only going to give the companies more leverage to use it against us, so the way to go is to reject completely their notion of (centralised, authoritarian) security, and make freedom the highest priority. Insecurity is freedom, and that is what we should fight for.

That infamous Benjamin Franklin quote has really taken on a deeper meaning recently.


Security against institutions is just as important as security against individuals, because big institutions aren't any more moral than the average individual. Everyone thinks about security against criminals, but the reason we don't live in a third world dictatorship is our constitutionally enshrined security against the police.


> Security against institutions is just as important as security against individuals

This is very well phrased.

It gets across something I was having trouble putting into words -- that security and freedom aren't always just opposite goals that conflict with each other. Freedom isn't an ideal, it's a practical system designed to guard against tangible harms.


I want perfect security, but I don't want perfect "security". Please allow me to explain.

I find it particularly frustrating that in conversations about security it's difficult to differentiate between the different uses of the word.

There's user security against other users (same system), criminals (remote), criminals (physical access), large entities (ie governments and corporations), and probably a few others.

Then there's owner (or admin?) security, for example IT vs employees who do dumb things and other similar scenarios.

There's also vendor security against user control of devices. Sometimes this is to protect a walled garden, but other times it's just a cheap way to prevent damage to their brand if clueless users do dumb things and then publicly blame the company for their own negligence.

Arguably the first category (user security) is a good thing, but those designing the devices fall almost exclusively into the second two (I would argue mostly bad) categories.


>Then there's owner (or admin?) security, for example IT vs employees who do dumb things and other similar scenarios.

I only see this argument made by Apple fans, and it is always related to branding, something like "we don't want someone's phone or laptop to catch fire and destroy Apple's perfect brand". The ironic thing was that after we had this discussion on HN a few days back, we had the news that genuine Apple laptops catch fire anyway.

Would Microsoft like not being able to repair the servers they run in their cloud? Not being able to run whatever code they want on those servers? Being forced to run the server hardware manufacturer's code so the manufacturer can make sure MS is not running the servers incorrectly and making them explode?


> Then there's owner (or admin?) security, for example IT vs employees who do dumb things and other similar scenarios.

I think the best way to balance this against security is something analogous to sudo: design for people who don't know what they're doing by default, and if they break their system with superuser control, well, they can't say they weren't warned.


Children are also used in this same disgusting persuasion tactic.


Technology needs an immune system. 'Tamper evident' is the right approach, but achieving this in a way that consumers can use is hard and requires building blocks that don't exist yet.

How can you verify that the motherboard you sent out is the one you get back and it doesn't have extra hardware inside the case?

msft isn't wrong that creating a trusted supply chain for hardware is difficult, but I think that 'how can I trust the repair shop' and 'how can I trust MS' are the same problem.


> How can you verify that the motherboard you sent out is the one you get back and it doesn't have extra hardware inside the case?

It is certainly not by obscuring the contents and revisions of that motherboard, such that end users can only trust it as an opaque blob.

In software land, we're smitten with this assumption that security can be made absolute. While this is true for software (Curry-Howard), it is not true for anything else (eg it's easy to cut a car's brake lines). Real world safety depends on local control, post-facto enforcement, and a general lack of maliciousness. Whereas with software we're dealing with remote access and extreme scalability (eg remotely disable the brakes on all cars of a certain make/year for the lulz, untraceably).

Attempting to extend the software security paradigm (public key authoritarianism) onto hardware is completely wrong. Device security needs to be focused on mitigating software's own "winner take all" failure mode, rather than attempting to move even more eggs into one basket.


If you cut a car brake line, as long as you're allowed to look under the hood/vehicle then you'll see the brake fluid, and see the reservoir is low.


The real threat is not tampering; the real threat is from the manufacturer and the corporations that already have the keys to the data kingdom and control the entire stack. I have yet to hear of a single case of hardware tampering, as software tampering is much, much easier and infinitely more effective. And even if hardware tampering happens, it's an extreme edge case and almost irrelevant.


This kind of blurs the line between hardware and software, but there have been recent attacks where groups have written attack firmware to the motherboards and hard drives of cloud-computing bare-metal servers.


It's still software. It's just software that most application programmers ignore by inclination.


Are there known cases of hardware tampering exploits, and how many of them are done by state actors as opposed to private ones? Also, how many of the known cases are indeterminate in that way? (state vs private) Restricting the public access to hardware will encourage more state hardware tampering.


Are you asking if things have gotten more subtle since 1945? https://en.wikipedia.org/wiki/The_Thing_(listening_device)


> How can you verify that the motherboard you sent out is the one you get back

Have your TPM cryptographically validate the components it's attached to. Coupled with a robust blinding implementation, you're only really trusting the chip manufacturers (and their tamper proofing of the silicon) at that point.

Add an ownership model to the TPM and you've solved anti-theft. Then allow for the addition and removal of authorized components by the owner and you've preserved the right to repair.

> and it doesn't have extra hardware inside the case

This is the real issue I suppose. I prefer choosing to trust a particular repair shop over gluing it all together and hoping for the best.
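A toy sketch of the component-validation idea above (hypothetical logic only; a real TPM would use PCR measurements and signed quotes rather than a bare key, and would never expose that key):

  import hashlib, hmac

  def fingerprint(components):
      # Hash the sorted component identifiers into a single digest.
      return hashlib.sha256("|".join(sorted(components)).encode()).digest()

  def seal(owner_key, components):
      # The owner authorizes the current component set (e.g. after a repair).
      return hmac.new(owner_key, fingerprint(components), "sha256").digest()

  def boot_check(owner_key, components, sealed_mac):
      # At boot, verify the installed parts still match what the owner sealed.
      mac = hmac.new(owner_key, fingerprint(components), "sha256").digest()
      return hmac.compare_digest(mac, sealed_mac)

  key = b"owner-secret"  # stand-in for a secret a real TPM would guard
  sealed = seal(key, ["screen:SN123", "battery:SN456", "board:SN789"])

  assert boot_check(key, ["screen:SN123", "battery:SN456", "board:SN789"], sealed)
  # A swapped part changes the fingerprint, so the check fails (tamper-evident):
  assert not boot_check(key, ["screen:EVIL", "battery:SN456", "board:SN789"], sealed)

Adding or removing authorized components after a repair would then just be re-running the seal step under owner authentication.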


The argument they're making is obviously true, both in theory and in fact (look up how Best Buy repair technicians act as informants for the government).

>if Microsoft wants to make devices that nobody can service and repair without breaking their security model, they’re entitled to do that. They can make Surface Pros so hardened and tamper proof that merely opening them will destroy them.

>What they can’t do is make devices that are repairable, and then lock out everyone but their own service technicians.

There is no distinction here. The tamper proofing is what locks out everyone but their own service technicians.


Will look this up. And not surprised.

Anecdotally, I went to Best Buy for an iPhone screen replacement (a year ago in nyc), the tech wanted my pin code, I of course declined. Their reasoning was that they needed it for their test fixture to make sure the replacement worked. I declined again. The tech said that in that case they would have to wipe my phone, I said that’s fine and much preferred to the alternative.

When I came back a few hours later to get my device, it had not been wiped, and the new screen worked just fine.

I was always curious why the heck they would need my pin!

I imagine >95% give their pin.


The last time my family repaired an iPad, I backed up the thing and wiped it, only setting up enough stuff to keep it from being stolen.

That's a lot of work and takes quite a bit of time.


> The argument they're making is obviously true, both in theory and in fact (look up how Best Buy repair technicians act as informants for the government).

I can also look up how repair technicians have affixed GPS trackers to cars or defrauded consumers during repairs. But I'm still glad that I can take my car to a non-manufacturer repair shop.

The "TPM's are all going to fail" argument is the "terrorists will win" argument, except for HN instead of regular citizens. In most cases this is an unnecessary conflict -- compromising the TPM wouldn't be necessary for many of the repairs consumers want to make.

From the article:

> After all, TPMs are in Dell computers. Dell makes diagnostic software and diagnostic codes and schematics available for their hardware and I haven’t heard Microsoft or anybody else suggest that a TPM on a repairable Dell laptop is any less secure than the TPM on an unrepairable Microsoft Surface.

We could make a lot of progress on right to repair without ever coming in conflict with existing TPMs. If that's Microsoft's concern, then fine -- I'm OK starting with the noncontroversial, unsecured stuff first, like keyboards, glass, and batteries.


  We could make a lot of progress
  on right to repair without ever
  coming in conflict with existing
  TPMs.
Maybe not TPMs - but the iPhone Secure Enclave is going to crop up pretty early in any such discussions, after iOS 9 disabled iPhones where a third party had replaced the home/Touch ID button [1] - often because a broken screen had been replaced. A later Apple update walked this back to merely permanently disabling Touch ID and Apple Pay on such phones.

[1] https://www.macworld.co.uk/feature/iphone/what-is-iphone-err...


You're right that iPhone is an interesting case, but even with iPhone we could still make a lot of progress.

To be clear, I do not have a lot of sympathy for Apple's arguments around the Secure Enclave. As far as I'm concerned, right-to-repair is a tangible issue that affects nearly everyone, and having the government install a tampered Touch ID button is a largely theoretical risk for most people. But even the iPhone has parts that aren't affected by the Secure Enclave -- so if it takes this argument off the table, I'm willing to let it go.

We still have Apple blocking replacement parts as counterfeits when they enter the country; and we still have a bunch of diagnostic information, specs, and tools that are only available to authorized technicians.

It's also worth noting that Error 53 wasn't forcing consumers to make a choice between the fingerprint reader and a third-party screen replacement -- it was "bricking" the phone. That Apple was willing to roll that back and change their design so that the Secure Enclave only disabled fingerprints says to me that even Apple has admitted the Secure Enclave doesn't need to monitor every single part of the phone to keep consumers safe.


They could also make them merely tamper-evident. There are plenty of ways to allow device owners to unlock their own devices without also letting others do so. Manufacturers just aren't interested in that, and are all too happy to lie and claim it's impossible.


> There's plenty of ways to allow device owners to unlock their own devices, without also letting others do so.

Such as?


My Android phone checks software integrity on boot, but I can override it if I want to install a custom ROM.

The setting can only be changed from within the OS, so an attacker or service technician can't do it without my permission. Some[0] Android phones even allow you to re-lock the bootloader afterwards by replacing the chain of trust, so as far as I know there's no technical reason why we couldn't have the best of both worlds.

[0]: But not all of them. Don't blindly re-lock your bootloader unless you know for certain that your phone supports it, or you'll risk basically bricking your device.
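For concreteness, the unlock/flash/re-lock flow looks roughly like this. The "fastboot flashing unlock/lock" commands are real, but the custom-key step is Pixel-specific (AVB) and the filenames are hypothetical; this is a sketch, not a guide:

  import subprocess

  def fastboot(*args):
      # Thin wrapper around the fastboot CLI; raises if a step fails.
      subprocess.run(["fastboot", *args], check=True)

  fastboot("flashing", "unlock")                     # wipes user data by design
  fastboot("flash", "avb_custom_key", "my_key.bin")  # enroll your own root of trust (Pixel-specific)
  fastboot("flash", "boot", "custom_boot.img")       # boot image signed with that key (hypothetical file)
  fastboot("flashing", "lock")                       # only safe if the device verifies
                                                     # against the enrolled key -- see [0]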


The mentioned tamper-evident-ness. E.g. the device can display at boot time a warning if any changes were made, or if keys were added to the trusted list. Adding extra keys can easily be locked behind a password that the owner can change.

Note that this is not hypothetical - lots of unlockable Android phones will display a warning that they were rooted on boot.
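A minimal sketch of such a boot banner (hypothetical, not any real bootloader's interface; the point is that the firmware reports changes rather than refusing to boot):

  SEALED_KEYS = {"vendor-key"}  # trusted key list at the owner's last sign-off

  def boot_banner(current_keys, image_verified):
      warnings = []
      if not image_verified:
          warnings.append("WARNING: OS image is not signed by a trusted key")
      added = current_keys - SEALED_KEYS
      if added:
          warnings.append("WARNING: keys added since last seal: %s" % sorted(added))
      return warnings or ["Verified boot: OK"]

  # An owner who rooted the phone and enrolled their own key still boots,
  # but the device says so on screen:
  print(boot_banner({"vendor-key", "owner-key"}, image_verified=True))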


> There is no distinction here.

Yes there is. The question is now "how does MSFT decide _who_ gets the right to repair their devices?"

If it's a reasonable and non-discriminatory process, then that might be fine, if they're arbitrarily deciding, then that's likely an issue the FTC _should_ investigate.


I need to get a repair authorized in 20 years. Can I still do it?


Deciding who gets the right is the very definition of discrimination.


Microsoft does exactly that, they're called Secure Access Workstations, and they're used for all sensitive work. You can find some details here: https://docs.microsoft.com/en-us/windows-server/identity/sec...

They're described as PAWs in the document, but they are called SAWs internally (as one of the notes explains down the page).


> There is no distinction here. The tamper proofing is what locks out everyone but their own service technicians.

Then what locks out their own service technicians? Does Microsoft not comply with government requests, and give some promise of that in an enforceable way? Would a court even enforce such a promise?

I fail to see any reason why someone should trust Microsoft any more than Best Buy. And even then what forces you to give your device to Best Buy instead of someone you do trust?


Bought a Surface Pro 5th generation. Less than 8 months later the battery barely holds a charge. (<5 minutes) You're basically stuck because everything is glued together so you don't have any options other than to buy a brand new device. Feels a little like the forced Windows 10 updates. Good luck running Linux unless you are willing to live with all kinds of fail, due to poor camera support.

In the end it was far easier to just switch over to a Thinkpad.


The one-year warranty should have covered your Surface.

I do support right to repair for the record. I just think this isn’t a great example of how the right to repair would have actually helped someone


The thing being asked for with right to repair, and why these companies are against it, is that devices not be so convoluted that you cannot repair them. We would see bigger devices that can be repaired, because not everything would be soldered together. At least that's what I remember reading a year ago. It would force companies to redesign their devices, which is sensibly why they are against it all.


IANAL, but my understanding of the legislation behind it:

Right to repair essentially requires that the product be repairable in a reproducible way. It doesn't state how that repair needs to be done or what exactly that means.

The big ones are repair *guides and instructions* and *availability of repair parts*; those matter most, so that a repair shop or savvy consumer could indeed do their own repairs without pushback from the manufacturer. It also means the specifications for those repairs and parts would be open(ish).

Having to openly document repair procedures and provide open parts availability is what I think companies are, wrongly, trying to avoid, because third parties could manufacture spec-compliant parts that wouldn't void the overall warranty of a product.

I think redesign is an insignificant factor.

EDIT: I just want to note that in the right to repair debate, planned obsolescence in product lifecycles is actually a legitimate argument for right to repair. I don't think devices are built to live longer than a product generation's lifecycle (for instance, an iPhone has a typical generation lifecycle of 3-5 years, I believe).

Right to repair would reverse that trend at least somewhat


Battery warranty is one of the more grey areas of warranty coverage. Depending on local legislation and the specific coverage for the model you can get 6-12 months (it's considered a consumable) and the warranty might start when the battery was manufactured rather than the sale date (like Dell did some years ago, not sure if it's still a practice).


Consumables are readily replaced at low cost. Batteries (in phones/tablets) are essential central components.

If you need tools or instructions then the parts aren't really consumables.


Part of the issue is the trend of integrating parts that used to be easily replaceable. My first phone/laptop each had batteries that could be swapped in under 10 seconds with no tools. My latest do not.


Whole device is a consumable. Six months of usage and off to a landfill it goes.


In my experience it's always tagged to the device at time of sale like any other warranty. Usually this comes into play if the device and the battery somehow have different warranties.

However, I am hardly an expert on this. I can speak to the Surface though (we use them at work), and it most definitely is covered.


For those who thought Microsoft was an alternative to Apple with regard to right to repair: think again. The mid-range and high-end HP laptops (not Lenovo) are the most easily serviceable these days.


I use an HP EliteBook 8470p I got for 100 USD from some guy's garage, and it is simply amazing. No screws, not even one to get the bottom cover off! You just slide it off and bam, your CPU/heatsink, fan, RAM, and HDD are accessible on pretty much the same layer.


There are some sweet spots of old laptops which are serviceable. Some are even Macs. There are also some which have an Intel ME that can be disabled. Some are even x86-64. It's getting less common though.


My wife had an HP once that had the thermal paste dry up. Fixing it would have required taking apart everything from the keyboard to the monitor clips, top and bottom. We just bought a new laptop.


Lenovo ThinkPads at least up to the --30 generation are very repairable/user-serviceable, more so than HPs.

And, unfortunately, at least for certain purposes, X230s/T430s are the most recent reasonable laptops anyway: after that it's currently impossible to replace any of the Intel/AMD firmware with something inspectable.


I was also quite impressed by the construction of Dell's 2-in-1 devices.


So you mean HP isn't an alternative?


> If the TPM or other hardware or software protections were compromised by a malicious or unqualified repair vendor, those security protections would be rendered ineffective and consumers’ data and control of the device would be at risk

Then make it so the consumer can always tell if the security became ineffective.


I'm getting Vista flashbacks.


Ah, there's the old Steve Ballmer Microsoft we all know and love.


To be fair, Steve's Microsoft wasn't the one that introduced and popularised invasive telemetry, "treat the user like an idiot" design (including things like forced updates), nor the feverish desire to lock down the PC platform in the name of "security", but it may have just been a matter of time.


So, about the same as the Steve Jobs Apple then?


What about Tesla? IIRC they also prevent owners and third parties from working on their cars.


I can understand the rationale that modified software could be a major danger in their cars.


We've had modified cars and the associated culture for over a century, and no major danger has arisen from it (no more than the usual attributed to bad drivers, anyway.)


Mechanics are not software engineers.

Now I do believe we should be able to modify the software, but the infrastructure is not there right now.


The batteries are fairly new technology though, and do pose a serious hazard. I have no doubt that as the field matures it'll become safer, with established procedures. Glad to see people like Rich Benoit are blazing the trail already.


Are the battery packs really more dangerous than playing with nitrous injection or race fuel? I don’t need any certifications or permission to do either of those in my shed.


Is there a lot of information online relating to those subjects? I'd guess it's a lot easier to find knowledge in these areas than lithium ion batteries for cars, which obviously makes it more dangerous because there's nobody to help if something goes wrong.


Remember not to buy Microsoft if you have alternatives


You forgot to include Apple in your do not buy list.


Is there a way to replace board firmware with a hacked one that hides its alterations by emulating ROM accesses? It could then also patch the secure boot checks out of the OS during system boot. Such a firmware would only be detectable by desoldering the flash chips and using a dedicated hardware reader.

If that is the case, every computer could be fundamentally compromised if you left it out of sight for long enough.


You don't even have to replace the board firmware. You can just replace the board, or the entire device, with one that does whatever you want. Like emulate whatever the real device does until the user enters their PIN and then send it back to the attacker via wireless.

There are a lot of different threat models here. The problem is that for the serious ones, like a state-level attacker, anything but continuous physical security is hopeless. But for the less serious ones, all of this faux spycraft is nothing but an excuse for anti-consumer behavior because the path of least resistance for your kids to get into your phone is by shoulder surfing your PIN, not using 0-days to install custom device firmware etc.


Is this the "new, open-source friendly" Microsoft?


Microsoft isn't a monolithic entity. Within orgs of that size and legacy there can be (and regularly are) business units at extreme odds, if not in open philosophical and economic combat, with each other.


Doesn't matter. Right now the evil portion seems to be in control of policy advocacy and that should rightfully be punished.


Seriously. It's hard to understand why Microsoft has been getting so much fanfare from the OSS community recently.


There has also been a lot of backlash, but anti-Microsoft people are called tinfoil hats and similar condescending stuff. Just like before PRISM, all the people saying the NSA surveilled everyone were called conspiracy theorists.


New generation of programmers, many of whom would have been very young or not alive when Microsoft was at peak anti-OSS/EEE?


I am "one of them" Microsoft still has some old corporate people still in charge versus some of the new projects which are not managed necessarily by the same groups. Until all these types leave it will feel like Microsoft is passive agressive.


Blame it on the "old people", but soon you will see that "young people" will pull the same nonsense as soon as they have to hit quarterly numbers. Big companies can't be your friend.


Microsoft's new strategy has not happened to the detriment of quarterly numbers, on the contrary.


Looks like a marketing strategy, more than lots of real genuine grassroots enthusiasm. Don't get me wrong, they are doing better in some respects recently, but I use Microsoft Windows because I have to, not because I love it. I won't be trusting them until the updates and telemetry are fully transparent and fully controllable.


I think so. A big part of the reason Microsoft looks friendly is because the rest of the industry caught up to their scale of evil. Worryingly, now that technology's effects on society are widely appreciated, the clueless control freak "lock everything down for security" mindset is better poised to enforce their broken paradigm. See: all that whinging about Huawei devices, when they're all insecure by design.


Heh, well just look at how open Open Compute is...


I knew Microsoft had become openly contemptuous of tinkerers and power users when they removed granular control of Windows updates.


As usual, DRM proponents try to sell DRM as a security feature. In practice it's the opposite. DRM is the security risk.


This sort of statement is one I made back in the late '90s and early '00s. As an absolute statement it was wrong and still is wrong, and I think the enthusiast tech community's adoption of it has done a huge amount of harm. We reflexively opposed things like TPMs outright, rather than recognizing that the real question is who controls putting keys there. It would have been (and still would be) better to fight to make sure the standard is that the end people in charge of the device have the right to control the master keys on it. By going for a blanket "no chains of trust at all", we gave up a lot of useful capabilities and left the door open for companies to come in making use of them, but with unpleasant and unnecessary extras tacked on.


> This sort of statement is one I made back in the late '90s and early '00s. As an absolute statement it was wrong and still is wrong, and I think the enthusiast tech community's adoption of it has done a huge amount of harm.

It's absolutely a correct statement from a user perspective.


TPM itself is neutral, of course. Devices that lock their users out of control already are not. So when I refer to DRM, I refer to the latter. Users should have the ability to break and remove DRM; that's the point of "right to repair".


This isn’t quite DRM.


I see no difference between Microsoft’s position and that of Apple. Except that Apple is a trendy brand.


Yes they're both on the wrong side of right to repair.


Microsoft love Open Source.

User freedom, not so much.


Microsoft isn't entirely wrong, and "right to repair" advocates' conspicuous failure to acknowledge this tradeoff is one of the weaker and more irritating parts of the movement. Note the word "tradeoff": it's not as if good things don't come with it too, nor does it mean the same weighting applies to everyone. It doesn't mean that those who want to shouldn't be able to hack on their devices. A similar tradeoff applies to pure software signing, where I am very much in favor of a legal requirement that hardware "possessors" [1] should be able to load their own master signing keys into any cryptographic roots of trust, and thus be able to run software of their choice with no further relationship with the manufacturer, even in a system that requires full signing.

But the power to do that is also the power to screw it up. If any third party can repair something like biometric hardware, any third party could also compromise said hardware. Some people may find the benefits of the former outweigh the latter, but I do think it should be a choice in the law, and that liability should adjust accordingly. I.e., when buying a device you can request up front that it not enforce cryptographic requirements for hardware repairs (this could be controlled by a permanent fuse or many other ways), but alternately you can also request that this continue to be a requirement. In my case, for example, I do not want to buy an iPhone that is "right to repair enabled". I don't want the magic circle of who can mess with it expanded any further than the minimum, which is Apple themselves, who are by definition inside the tent anyway.

I also think the "right to repair" movement is a symptom rather than a cause, and obscures the real problem for the vast majority of the market. The true core issue is that, in America in particular, legal standards for fitness for purpose are simply way, way too loose. The reasonable expectation most people have when buying something expensive is that it'll be in working order for a proportionate number of years. Not forever of course, but a $200-400+ bit of electronics shouldn't be dead in 13 months either. A $500-1k one should probably last at least 4-5 years without further cost, etc. The retail price should reflect whatever it takes to make that happen. Manufacturers are the ones in a position to deal with that: they have the best stats on failure rates, and can make decisions about tradeoffs between cheapness of repairs, spending more on QA, cases where more expensive repair requirements might reduce the need for repair even further in the first place (or provide other feature value), etc. It simply shouldn't be the consumers' problem at all, beyond evaluating the retail price. Yet it's completely standard to have something ludicrous like a single-year warranty.

Essentially, it's a classic market externality problem. Everyone expects their devices to last, but each customer is forced to roll the dice on whether they specifically have bad luck and have to pay for expensive rare repairs all by themselves. The manufacturer gets to advertise an artificially low price by externalizing the failure rate onto the customers, and even charge extra for what should be standard. Instead, warranties should be adaptive, something like "1 month for every $12 of retail price, up to a maximum of 5 years", say. Then let the market sort it out from there. I'm worried "right to repair" will ultimately be camouflage that lets manufacturers skip out on their real responsibilities in some cases. Right to repair could make out-of-warranty repairs cheaper, but you shouldn't have to pay at all within a reasonable normal lifespan.
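That sliding scale is trivial to pin down; a sketch using the example numbers above (the $12-per-month rate and 5-year cap are just this comment's proposal, not a real rule):

  def warranty_months(retail_price_usd):
      # 1 month of mandatory warranty per $12 of retail price, capped at 5 years.
      return min(retail_price_usd // 12, 60)

  print(warranty_months(300))    # 25 months for a $300 tablet
  print(warranty_months(1500))   # 60 months (capped) for a $1,500 laptop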

----

1: which would be defined as something along the lines of "anyone who has paid, either upfront or on an ongoing plan basis, for hardware and possesses the right to control access to it." Lawyers could make it watertight, but basically a definition which would explicitly not allow any sort of "oh, we were only leasing it to them!" loopholes.


> "right to repair" advocates conspicuous failure to acknowledge this tradeoff

Rossmann, the YouTube figurehead of this movement, talks about this at length in every single rant of his that I've listened to on the subject.

iFixit, the corporate figurehead of the right to repair movement, has this to say regarding Error 53, one of the first major invocations of security by the anti-repair lobby to excuse their behavior:

> Obviously, the Touch ID sensor is an important security measure...

So: what are you talking about? The RtR movement doesn't only acknowledge the tradeoff, they tackle it head on, and thanks to Error 53 they have extensive first-hand experience informing customers of the issue and surveying their opinions.


You are deluding yourself if you think cryptography can be the solution to an attacker with access to the physical machine. No crypto in the world can help you if I simply bug the fricking physical keyboard!


>You are deluding yourself if you think cryptography can be the solution to an attacker with access to the physical machine.

Of course it can be part of a solution. Responses like yours, which display zero consideration of threat scenarios, time/info/resource costs, or any potential improvements are somewhat tiresome these days.

>No crypto in the world can help you if I simply bug the fricking physical keyboard!

I'm sorry, can you point to that "fricking physical keyboard" on my iPhone? I'll wait.

And even if you want to talk purely about PCs and attached keyboards, there is no inherent reason those couldn't be locked down too were it a general problem, and have a system refuse to trust any peripheral by default. HSMs can also be part of an overall plan that can mitigate some damage even from a keyboard bug. Furthermore, breaking into a private residence or business is a significantly more time/information/resource-intensive problem than subverting a centralized repair place through which lots of hardware passes and is left unattended, and which will not in fact generally include peripherals (if my computer is broken, I bring just the computer in, not the keyboard/mouse/monitor).


To get it out of the way, I disagree with the GP.

If you've looked through any number of the NSA or CIA leaks (Vault 7, for example) you'll have seen a device that looks like a standard Ethernet magjack + USB connector but has a tiny processor embedded inside of it (it leeches power from the USB power rails, IIRC) that can be used to exfiltrate packets and the like.

It's not impossible to get bugged when you send your PC in for repair. It's also probably unlikely, considering that there are many other, easier ways to grab your data.


>If you've looked through any number of the NSA or CIA leaks (Vault 7, for example) you'll have seen a device that looks like a standard Ethernet magjack + USB connector but has a tiny processor embedded inside of it (it leeches power from the USB power rails, IIRC) that can be used to exfiltrate packets and the like.

I don't see how that's related? PCs right now don't make use of the kind of cryptographic lockdown iPhones do. In the case of desktops that's likely because it's assumed physical security can be taken as more of a baseline, though in the case of notebooks it's probably more just inertia. The whole "right to repair" movement is precisely because this has been changing. I mean, what you wrote directly argues the case for locking down of hardware, encrypting everything along the busses between chips or having destructive tripwires or both, and so on.

>It's not impossible to get bugged when you send your PC in for repair. It's also probably unlikely considering that there's many other easier ways to grab your data.

The nature of security though is that it's worth being forward looking to what will come down the line in the future. Particularly if we're talking about legislating it and putting the force of law behind making one decision or another, which tends to then be quite difficult to change. Isn't it worth being cautious about?


> I'm sorry, can you point to that "fricking physical keyboard" on my iPhone? I'll wait.

Well, you have the digitizer, but since the buttons are highlighted when pressed, the display would work as well.


At the end of the day, there is a piece of flex carrying the raw signal into whatever chip will postprocess it. This is equally true for your touch input as it is for the physical keyboard. Your failure to understand this dooms any hope you have of making the case for cryptographically secure hardware.


So what you're saying is that you don't think there needs to be any right to repair at all, right? Because by your very argument here, none of it can possibly get in the way. Which means that no legislation is required and companies can keep on locking things down as hard as they want, because you say that can't possibly ever work.


> If any 3rd party can repair something like biometric hardware, any 3rd party could also compromise said hardware.

As pointed out in the article, you have to trust any third party you hire. This isn't unique to electronics.

A cleaning service could secretly install cameras in my house. An auto mechanic could put a GPS tracker in my car. The local pizza parlor could deliver using bugged cardboard boxes!

Or perhaps you're worried about a random stranger at the local coffee shop dismantling your phone, installing a bugged fingerprint reader, and fully reassembling it, all in the time it took you to use the restroom and with no one noticing? Yeah, I didn't think so.

We don't need to lose our right to repair. Just don't take your electronics to disreputable service centers.


>As pointed out in the article, you have to trust any third party you hire. This isn't unique to electronics.

But the whole point is: what about third parties I don't hire? The point of "right to repair", at least in the formulations I've seen so far, is that the manufacturer must offer privileges to any third party to perform repairs. How well, though, will all those third parties that can now perform repairs verify that it is, in fact, the owner requesting them? How does the owner really verify things? In the real world, amongst normal people, how well can that really be expected to work, versus the first party, which is the one with the most easily understood incentives?

As I said in my original post, I do not oppose people who want to be able to grant third parties the right to repair. But I also want to make sure that, like right now, owners can also forbid anyone from having the cryptographic rights to alter their device hardware except the original manufacturer. As in: even if it's stolen from them, or they later decide they "want" to give permission (which isn't distinguishable from a social engineering attack). That's why I propose making it a build-to-order option, with some small external identifier on the case. I don't simply want Apple or whomever to be forced to give keys away to anyone who asks.

>Just don't take your electronics to disreputable service centers.

This is a very shallow, snobbish tech-elite answer, and the same one our whole industry gave (and in many cases still gives) to people when it came to malware and the like. "Oh, just don't go to disreputable websites. Don't go installing software willy-nilly. Haha, dumb lusers, PEBKAC." And then Apple came along and offered something different: the shocking concept that maybe it shouldn't be regular users' fault, that they're not fundamentally stupid, that why shouldn't they just be able to browse anywhere and install absolutely anything that catches their eye and never have to deal with any long-lasting unknown effects? Even when Apple's execution has failed, the fundamental question there has been pretty successful.

Of course, Apple has also destroyed a lot of valuable freedoms along the way, many of them unnecessarily, I think. They clearly also have cross incentives to misuse their control, and have. But I can't help but think that a lot of that is our fault, our industry's fault, for not trying to find a better balance of answers and making it standard long, long before the iPhone.


> But the whole point is what about 3rd parties I don't hire?

What exactly is the threat model you're trying to protect against here? Real life isn't a Hollywood spy movie. If you didn't hire them, then how do they have your device and why haven't you noticed that it's missing?

Granted that a robust hardware based anti-theft solution would be nice to have. That's largely unrelated to the current topic though, provided that it's implemented in a reasonable manner.

> or they later decide they "want" to give permission (which isn't distinguishable from a social engineering attack)

This is the classic argument that users are too stupid to be provided freedom. Outside of technology, this is typically recognized as authoritarian.

> > Just don't take your electronics to unreputable service centers.

> This is a very shallow snobbish tech elite answer

No, it's literally a practical answer. Just take your device to Best Buy or the local equivalent. Reputation is literally how the entire market manages to function on a day to day basis. Tech isn't even slightly unique in this regard.

My point here is that a local brick and mortar repair shop has invested a significant amount of capital to set up their operation, so they presumably care about their reputation. Furthermore, committing crimes under such an arrangement would be ill advised due to how easily they could be tied back to the responsible party.

In contrast, the malicious web site or software example you bring up isn't even remotely comparable. In your example the bad actor exists outside of the local legal jurisdiction, the "storefront" is cheap and easy to set up (ie there's a low barrier to entry), and the responsible party can't be easily tracked down. Low risk, high reward, and a low barrier to entry. Coupled with inexperienced users this is a recipe for disaster, and indeed malware still shows up on even well curated app stores from time to time.


If I upgrade my phone every 12 or 18 months, I shouldn't have to pay extra for a 3 year warranty. It should be optional to purchase that warranty, just like it is today.


Yes, you should. If you choose to throw that phone in the garbage after 12-18 months, you should pay a price for wasting something that should last much longer. If like a normal person you instead just wipe it and sell it (or pass it on) then you will recover remaining value anyway.


Nothing needs to be thrown in the garbage. My point is only that insurance costs money, and not everyone may see the need to pay for it, so it shouldn't be built into the cost of everything. I myself currently use an iPhone 6 and recently paid to have its battery replaced.


not from a warranty though; some kind of pollution tax instead


God, I hate this argument (even when I worked at Apple). There are many valid (i.e. remotely plausible) reasons, but "haxors cybering your security" is not one of them.

[Autocorrect wanted to change haxors to hackers. I'm not sure if that's good or not.]


People seem to hate this comment.

Here's what I mean by plausible: "If we filled it entirely with resin and glue it is less likely to break and hence less likely to need repair, but doing that makes repair impossible"

I do not mean good reasons; I am specifically saying that there are "reasons" that are more plausible than "security".


It's time we also remove the "Windows"-button (edit: the branded concept, not the physical button) from all the keyboards. We should have the right to repair in the form of also removing corporate advertising from our devices.


You can buy keyboards without it or pop the button off. I don't think this is a great example of being unable to repair something.


How would you go about rebranding the button on a common laptop? Stickers wear down, look ugly, and break backlighting. Using acetone will leave an ugly transparent spot or ruin the texture. There's basically no nice way to fix many of these keyboards. Imagine if every single one of your PCs had to show the Windows logo on startup; right now it's just on your keyboard.


Buy a laptop with the button you like.


Can you please provide a link?


Imagine if every single one of your PCs had to show the Windows logo on startup

My keyboards have a dollar symbol on them, for a currency I don't use, have never used, and that no major country on this continent (Europe) uses. It's high time we had the right to remove this nationalist/capitalist advertising from our devices. Imagine if every single one of your computers had to show a dollar symbol on startup; right now it's just on your keyboard.


Yeah, and that dollar symbol isn't the only symbol on the key; it also tells you what the key does and lets you type it, and it's used in many programming languages in addition to countries with the currency. The super key with the Windows logo, on the other hand, serves no practical purpose, doesn't describe what the button does, and is absolutely meaningless when you're not using Windows. The logo is just advertising; the dollar symbol is not.


You can get little Linux themed stickers to cover the Windows logo. Also Linux distros use the button for things...


You can use the button, but it shouldn't be "the Windows button", we should remove the branded concept.


I agree in principle, having scraped their logo off my former favorite keyboard, including the big one with "Designed for Windows 98" on the bottom. But I also remember that we can now buy dozens of gigabytes of RAM for much less than the GDP of a small country, partly because MS helped bring up and keep up the demand across decades, justifying so much R&D. Now that key's name is just an annoying side effect of the way history went, and it's not the one I choose to focus on anymore.


I agree. I like calling it the super key myself.


Buy a Chromebook; they don't have a super button. Shop carefully to make sure you get one that's well supported by third-party firmware so you can run the OS of your choice.


I was skeptical and checked some images. Wow.

No number pad on laptops was bad enough (but understandable). Now no super, home, or end keys. Caps lock (probably the only key I never use) seems to have been replaced by a search button. An external keyboard is now an absolute requirement for power users I guess.

Oddly enough, there does seem to be some sort of Google branded key on the left side of the Pixelbook keyboard. Not on any of the others I checked though.


Come again? The “Windows”-button is just a super/command key that exists on many other keyboards and OSes outside of Windows.


As I understand it, it's only about removing the Windows logo from the button, which is long overdue on officially Linux-supporting laptops.


It exists on many keyboards outside of Windows, but it almost always has Windows branding (except on Macs, basically). It's anticompetitive to force such branding.



