My tractor is from 1996, my hay baler from the early 70s and my hay cutter is even older. These all work just fine even if a little slow. The challenge of not being able to work on my equipment currently outweighs the marginal speed gains I would see on a farm our size. The same goes for nearly all of my neighbors.
The last "new" tractor in the local area was bought a couple of years ago by one of the older guys. He was buying the last tractor he will ever own and thought it would be nice to have a top-of-the-line new tractor just once.
FWIW we've got an '81 ford for similar reasons (also new tractors are effin expensive and hold their value very well).
We were having this very discussion today while picking up some hay to help us get through till spring, with a guy who cuts about 500 acres of hay three times a year. He just keeps rebuilding his equipment rather than getting sucked into the new stuff he can't work on.
This is becoming another huge issue that affects smaller farms more than it affects larger corporate type farms. When a piece of equipment goes down it's needed back in service in hours to days, not the potential weeks that hauling it to a repair depot can cause.
The embedded field is strange because the lock-in is generally on the side of the hardware design and tooling. Nobody is going to try to clone a tractor because they have the source code for it. They also need the CAD drawings, machine paths, tooling, supply chain, etc... Yet just about every company in the space feels that their code is absolutely sacred above all else. Just about everything you own running embedded software that has some update functionality has a bootloader containing cryptographic keys such that the firmware patches/images cannot even be disassembled. For a few devices like routers, some toys, a couple of handheld radios and the like, people have been able to modify them through exploits that cause firmware dumps from the uC's memory, but otherwise it's typically prohibitively time-intensive to rewrite new firmware from scratch without having access to the device schematics (and even then). Worse, just about all these devices have some access to JTAG ports such that you could easily program them yourself if you wanted to and had something useful to flash them with. I've run into so many products I use on a regular basis with bugs or simple but badly needed features that I could have easily fixed/hacked in with access to the source code but alas, I cannot.
"Worse, just about all these devices have some access to JTAG ports such that you could easily program them yourself if you wanted to and had something useful to flash them with. I've run into so many products I use on a regular basis with bugs or simple but badly needed features that I could have easily fixed/hacked in with access to the source code but alas, I cannot."
IMO, it is a massive waste. Short-sighted, short-term thinking.
There are people who for whatever reason do not want you to have access to the software ("firmware") and to read/fix/improve upon it.
Some of these people comment on HN. They become active during debates over whether users would derive any benefit from being able to read and edit source code.
Big old companies have CEOs and boards that see embedded software in the same way many legislators and judges see it. Firstly they don't really understand it. But in any case it's part of the product and people have no reason to tinker with it. Now software is being used to enforce what manufacturers could previously have only dreamed of: monopolizing repairs.
With hindsight it seems obvious that things converged onto the embedded computers of consumer products: think microwaves and TVs rather than general purpose computers. The problem with tractors is that all the farmers I know have a much better idea of how a tractor works and how to fix it than the average motorist does about their vehicle.
I am reminded of the era when windscreen-mounted GPS navigation systems started becoming mainstream. Most of the models had very similar hardware but the software was all that differed. I never looked into it but I suspect almost zero had source code published.
To the more typical consumer, the unit is as a whole and the software comes with it. If it routinely takes you the wrong way down a one way street or tries to take you down a set of steps, it's defective and needs to be replaced.
To the more technical consumer it's clear that either the software or the data is at fault and either could be fixed with the right resources. I still feel a sadness at this, yet this waste is insignificant compared to a tractor or any other vehicle.
Because it is? I mean, a lot of the automotive embedded industry is moving toward separating software from hardware (the AUTOSAR Runtime Environment), which is not working as well as expected, because separating software from embedded hardware is tricky. But the goal is to have more suppliers for software than for the hardware. The only trade secret these companies will hold is the software IP.
I think farmers should be able to fix their own equipment, but software being a trade secret is a fact, not something only big old stuffy companies see that way.
The other thing is there are many regulations to comply with. Companies don't want to get sued for some hacks by customers.
Lots of companies in other industries prosper in that situation, without locking everything down. RHT, for instance, has performed really well over the last year. These software subs could likewise go to a service model or literally anything other than "our software must be installed in crippled fashion". It's not as though e.g. Honda is going to decide to "pirate" their shit.
The embedded team where I work struggles with the same thing. Frankly this should be included with the first delivery of any industrial system. Under NDA if they absolutely must, though I don't like it.
I also don't like consumer systems being locked up, but at least it's somewhat defensible. But treating farmers as dumb, inbred "hold my beer and watch this firmware hack" hillbillies, and not as the educated industry professionals that they are, is beyond condescending and disrespectful, and is completely indefensible.
Exactly who is deemed qualified to hack farm equipment firmware is something for the farming trade associations and their insurance companies to work out, but as far as John Deere and the regulators are concerned, we're talking about professional embedded developers writing code to run on their equipment to support an industrial process, something that is completely uncontroversial in many other fields. That you're not supposed to intentionally or inadvertently convert your combine into a man-eating killbot is surely already covered by insurance policy clauses, labour laws and public safety laws.
I work for a robotics company making indoor self-driving vehicles, and we worry about exactly this issue. From an ROI perspective, our product is a slam dunk, so its safety and reliability are the first and second most significant questions which have to be answered during the sales process. Since we invest substantially in associating our brand with those values, we really can't afford the possibility that an unauthorized user modification compromises them in a way that's outside of our control. Hence closed source, FDE, and all the rest of it.
And obviously before electronic control, there were still a hundred ways you could break any given device and have it generate undeserved bad press for the manufacturer.
But I contend that the scope of things that can go wrong when you mess with the software, and the degree of damage that can be done, are both worse than with purely mechanical modifications. There's huge potential for unintended consequences, and bad press puts the original manufacturer in the awkward position of having to disavow their own product, bearing their symbol, because "the user modified it in an unsupported manner."
All this gets even worse if the manufacturer actively supplies source code and an SDK for making such modifications. How do you delineate supported vs. unsupported mods?
Sounds great, but it's something the abandonware scene has been fighting for over the past 20 years, to no avail.
So give it another 40 years.
This is the funniest thing I've read in ages. It's also 100% correct, and this whole situation shows, perhaps more clearly than most, how accurate Stallman's warnings were.
In the automotive industry, people and companies will do this (at the component level). Enabling features that the OEM has disabled as part of its market segmentation strategy, via aftermarket components, is done (and would be done to a greater extent if OEMs did not protect the part of their diagnostic stack that handles feature activation).
For example, XCP or UDS security access modules are sometimes distributed by the Tier 1 as a DLL so that even the OEM has no source code for the concrete authentication mechanism that is used.
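The handshake those modules implement is conceptually simple; what's guarded is the key-derivation algorithm. Here is a rough sketch of a UDS SecurityAccess-style seed/key exchange, with a made-up XOR-based derivation standing in for the Tier 1's proprietary one (everything here is invented for illustration):

```python
import os

def derive_key(seed: bytes, secret: bytes) -> bytes:
    """Hypothetical key-derivation function. Real ECUs use a proprietary
    algorithm, often shipped only as a compiled DLL."""
    repeated = secret * (len(seed) // len(secret) + 1)
    return bytes(s ^ k for s, k in zip(seed, repeated))

class Ecu:
    def __init__(self, secret: bytes):
        self._secret = secret
        self._seed = b""
        self.unlocked = False

    def request_seed(self) -> bytes:
        # UDS 0x27, odd sub-function: ECU hands out a random seed
        self._seed = os.urandom(4)
        return self._seed

    def send_key(self, key: bytes) -> bool:
        # UDS 0x27, even sub-function: tester sends back the derived key
        self.unlocked = key == derive_key(self._seed, self._secret)
        return self.unlocked

# Tester side: unlocking only works if you know the secret algorithm/constant.
SECRET = b"\xde\xad\xbe\xef"
ecu = Ecu(SECRET)
seed = ecu.request_seed()
print(ecu.send_key(derive_key(seed, SECRET)))  # True
```

Distributing only the compiled `derive_key` is what lets a supplier keep even the OEM from seeing the authentication mechanism while still letting the OEM's tooling unlock ECUs.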
awesome! my botnet malware that exploits and roots your smart meter or set top box can now install new firmware with a permanent backdoor and nothing will complain because all the malware binaries are signed - by the manufacturer's key, no less - so they must be secure, right?
People are asking for keys to their exact units so they and only they can sign software for them.
It's not similar at all. The equivalent would not be the certificate but the session key, and you do get your own session key in order to prevent what the person above is describing. My HN session key is useless for decrypting your HN password even if I could intercept the traffic.
There is no technical reason why the devices can't ship with not only the manufacturer's public key, but also a key pair generated for each unit that comes off the assembly line for the customers to use to sign their own firmware images (and if they wish, delete the manufacturer's public key). But they will never be able to sign images for any device but their own because they simply don't have any signing keys that can produce a signature that will be accepted by any device but their own.
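There's nothing technically exotic about that. A toy sketch of per-unit image acceptance, using an HMAC over the image as a dependency-free stand-in for the asymmetric signature (e.g. Ed25519) a real bootloader would verify; the point is only that a signature accepted by one unit is rejected by every other:

```python
import hashlib, hmac, os

class Device:
    """Toy bootloader. A real one would store a per-unit public key and
    verify an asymmetric signature; HMAC keeps this sketch self-contained."""
    def __init__(self):
        # Per-unit key generated on the assembly line; the owner's copy
        # is what lets them, and only them, sign images for this unit.
        self.unit_key = os.urandom(32)

    def accepts(self, image: bytes, signature: bytes) -> bool:
        expected = hmac.new(self.unit_key, image, hashlib.sha256).digest()
        return hmac.compare_digest(expected, signature)

def sign(key: bytes, image: bytes) -> bytes:
    return hmac.new(key, image, hashlib.sha256).digest()

a, b = Device(), Device()
image = b"owner-built firmware"
sig_for_a = sign(a.unit_key, image)
print(a.accepts(image, sig_for_a))  # True: my key works on my unit
print(b.accepts(image, sig_for_a))  # False: and on nobody else's
```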
I've had OEM's:
- Refusing to provide interface specs for a device of which we had just ordered hundreds of units. (But hey, here is the binary Windows (!) driver for this embedded device.)
- Refusing to provide the calculation method for a checksum value their device returned.
- Stopping software development (tooling, firmware, drivers) on embedded products, even when there are known issues.
- Dropping support for a device, even when hundreds of them are still being used in the field.
It's not just tractors, it's everywhere.
From a business standpoint, it's just so dangerous to work with hardware vendors who don't offer implementation details or source code. If they drop support or stop existing, you're screwed...
For new ag machinery, embedded control systems are absolutely essential for the machine to do its job. The ECU on the engine is generally reliable and has a long life. The other stuff, I'm highly dubious of its long-term maintainability. On our farm, we ran a lot of older equipment and did a lot of repairs ourselves. For new equipment, that will be near impossible, as the electronic control systems are black boxes. Even the cabling is complicated and generally not documented. After the machine gets a decade or two old, good luck fixing it.
I suppose you could just drive the machine into the junk pile. However, when a new combine harvester costs around three quarters of a million USD, that seems a bit wasteful. I don't know what the solution is, though. The farmers buying new equipment don't care so much; their resale value is not suffering too badly. The manufacturers are looking for ways to sell new equipment, and adding more features via electronic controls is a relatively cheap way to enhance the product.
Most farmers say they don't want all these new electronics on their machines. When Deere announced their S700 series combines, they had a video bragging about all the new improvements:
If you study it, you will notice that most of the new features come from software changes or electronic control systems. Little of what the machine actually does to harvest the grain (metal parts, etc.) has changed. These automation features are nice when they work but are a disaster when something goes wrong. The operator will have trouble determining what the machine is doing (never mind trying to figure out how to fix it).
However, even though many farmers say they don't want it, the new machines are selling well. So maybe you could argue the market is working. To me, it feels like there could be some externalization of costs going on. These new machines lose far more value as they age than the previous era of farm equipment did. I guess the same thing has happened to automobiles. You can take a car from the 1940s and fix it up into perfect condition. If you take a 2017 car loaded with electronics, would you have any hope of restoring it to new condition 50 years from now?
On top of all that, the manual outright lied about features. It stated that oil could be checked by holding down a button on the instrument cluster, but after trying it then trying all the other buttons, none of them actually did anything outside their normal use (e.g. swapping between total and trip odo). There was no way for the owner to check the oil in this car. Additionally, it is heavily implied that the owner can't even change the oil themselves, though that could be done (with relative difficulty -- the "correct" way was with one of those expensive oil vacuum systems that no one actually owns).
The manual also described Bluetooth features that the car did not have, with no disclaimer like, "This feature may not be available in all models." The car also had all the buttons for it and generally looked like it had Bluetooth, but none of the buttons actually did anything.
It's really no wonder she traded her BMW for a Camry.
The same will probably happen to tractors and whatnot. The cost side of the equation is far different for businesses, and the problem has already been solved in the automotive space, so it may happen quicker once it starts.
So we eventually tracked down someone who had it, and we integrated as best we could.
Basically they were just putting up a wall to people creating integrations, which is the entire method by which their software is used. So... if you can't contract someone to integrate, why would anyone choose that product going forward?
And yet they do.
Also, you will be sued if you hire one of those companies, as you will be breaking the many NDA's you had to sign to be allowed to use the hardware in the first place.
I've seen this happen, more than once.
You hack or change software on a vehicle and cause the auto-guidance to run over and kill someone. Or your software changes unknowingly circumvent a safety check, and a spinny sharp thing that should have stopped doesn't, and it hurts or maims you. Or calibration changes cause the engine to overheat and burn up. Generally, the company is held responsible for it.
As much time is spent testing software as writing it. All our testing is there to make sure the software works as expected. Millions of dollars are spent on test equipment, and millions of man-hours are spent making sure the software works as expected.
I happen to work for a major agricultural company and I'm a lead software architect for their guidance + autonomous vehicles + precision farming embedded devices. We are one of John Deere's major competitors.
But it's utterly fallacious. It's quite clear that if certain changes result in different behavior, then whoever made those specific changes is responsible for the resulting behavior. About the only leg the argument can stand on is if the code is so sloppy that someone attempting to make a simple change tickles a bug somewhere else, but enabling such lack of quality to persist is the exact wrong approach for a safety-critical system!
Furthermore, regardless of its actual merit, this is precisely the kind of Schelling point collusion that is in the collective interest to bust up. "Selling" someone a piece of capital equipment that is deliberately designed to prevent its own servicing is plainly fraudulent.
You'd be amazed at the amount of carnage running a brushhog at 1000 rpm will do when it was designed for 540 rpm. It will happily throw the 3-4 foot blades right through 1/2 in. steel, up to 300-400 ft in whatever direction the wind takes them.
When that process is controlled by software, all it takes is flipping the wrong high-order bit in the PTO controller. On a manual tractor it's very clear whether the PTO lever is in 1000 rpm or 540 rpm (and it's usually on a separate lever from engagement), versus an opaque software stack.
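To make the "wrong bit" point concrete, here is a hypothetical sketch (the register layout is invented purely for illustration) of how a single flipped bit in a PTO control word silently selects the faster speed:

```python
# Hypothetical PTO control word, invented for illustration:
# bit 0 = engage clutch, bit 1 = speed select (0 -> 540 rpm, 1 -> 1000 rpm)
PTO_ENGAGE = 0x01
PTO_SPEED_1000 = 0x02

def pto_rpm(control_word: int) -> int:
    """Return the commanded PTO speed for a given control word."""
    if not control_word & PTO_ENGAGE:
        return 0
    return 1000 if control_word & PTO_SPEED_1000 else 540

safe = PTO_ENGAGE                   # engaged at 540 rpm
corrupted = safe | PTO_SPEED_1000   # one flipped bit: now 1000 rpm
print(pto_rpm(safe), pto_rpm(corrupted))  # 540 1000
```

On a manual tractor the equivalent "bit" is a physical lever position you can see from the seat; in software it's one invisible flag among thousands.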
To address another objection of yours about unknown modifications, this too is an aspect where openness buttresses safety. Rather than blindly trusting that the software has not been modified (because such methods are unknown to you), the proper way to verify integrity is by being able to compare checksums or reflash, knowing there is no hidden state.
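Concretely, the integrity check described above is trivial to implement once the manufacturer publishes hashes of its official images; the hash table below is a placeholder standing in for a vendor's release list:

```python
import hashlib

# Placeholder values: in practice these would be the vendor's published
# SHA-256 hashes for each released firmware version.
OFFICIAL_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08": "v1.0",
}

def identify_image(image: bytes):
    """Return the release name if the image matches a published hash,
    or None if the image is unknown (i.e. possibly modified)."""
    digest = hashlib.sha256(image).hexdigest()
    return OFFICIAL_HASHES.get(digest)

print(identify_image(b"test"))      # "v1.0" (the hash above is sha256 of b"test")
print(identify_image(b"modified"))  # None: unknown or modified image
```

A match tells you the software is exactly what the manufacturer shipped; a mismatch tells you exactly where responsibility shifted.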
No, disagree. If I laid out our source code for you, you would have no way to understand it without all the internal documentation from which the software was architected and developed. Non-computer people always have this misconception that if they see the code, they see everything it's doing. To a degree you can see what code is doing, but understanding why certain logic exists is not always clear without the supporting documentation.
Additionally, some of our code is 3rd party licensed and that adds an entire new wrinkle to things.
For an oversimplified example, our CAN buses have thousands of signals. A single CAN frame may have 20 to 30 internal signals, and several signals across many CAN buses are examined to make decisions, and I've only scratched the surface here. There is no amount of code study you can do to understand this without some coaching or a review of the associated internal specifications.
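For readers unfamiliar with what "signals packed into frames" means in practice, here is a generic sketch of pulling one signal out of an 8-byte CAN frame; the bit positions and scaling are invented, since real layouts live in OEM signal databases (DBC files):

```python
def extract_signal(frame: bytes, start_bit: int, length: int,
                   scale: float = 1.0, offset: float = 0.0) -> float:
    """Extract a little-endian (Intel byte order) unsigned signal from a
    CAN frame and apply its scale factor and offset."""
    raw = int.from_bytes(frame, "little")
    value = (raw >> start_bit) & ((1 << length) - 1)
    return value * scale + offset

# Invented example layout: engine RPM as a 16-bit signal at bit 24,
# scaled by 0.25 rpm per count. Raw value 0x2710 = 10000 counts.
frame = bytes([0x00, 0x00, 0x00, 0x10, 0x27, 0x00, 0x00, 0x00])
print(extract_signal(frame, start_bit=24, length=16, scale=0.25))  # 2500.0
```

The code is simple; the hard part is knowing which of the thousands of signals sits where, and why each one is consulted, which is exactly the information the internal specifications carry.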
For me or any company to support someone's quest to understand the code would be a very costly endeavor that you're simply not entitled to, nor would it be a reasonable request.
This isn't a forum for "non computer people" - I've been a professional embedded developer, hardware and software.
This stance is wrong and terrifying. The code, safety critical code especially, should be clear and should itself document the quirks of whatever it interacts with. Embedded has a culture of punting on abstractions due to the pervasiveness of cross cutting and global concerns (peripheral allocation, interrupt usage, tight latencies, etc), coupled with traditionally smaller code sizes. But we're no longer talking about devices with little memory or cycle by cycle timing, but 32-bit devices with dynamic allocation performing high level logical processing. The embedded culture of fiddling with stateful hacks, verification primarily through system-level testing, and then zipping up the source tree for revision control needs to be burnt at the stake. Leave the bit banging to single-purpose 8 bit micros at the edge that cannot hide actual complexity - the core is a full computer, and needs a proper type system.
I'm not saying this change is going to happen over night, but we should be aiming for the correct approach. Every time a company says that they cannot release code because it is too complex, wouldn't be understandable, have third party dependencies, etc - what they're really saying is that their code base is a sloppy hack, they have no idea how it works, and they really do not want to own up to this state of affairs.
> For me or any company to support someone's quest to understand the code would be a very costly endeavor that you're simply not entitled to, nor would it be a reasonable request.
It might not be desirable according to a company's business interests, but what is actually unreasonable is to force purchasers and users of heavy equipment to blindly trust the software controlling those devices.
You also seem to be conflating the right to repair with the right to inspect code because you feel you have the right to do so for some reason (safety, not liking how it works, etc...). You don't have that right, just like you don't have the right to walk into an assembly plant to see how your vehicle is assembled.
"The code, safety critical code especially, should be clear and should itself document the quirks of whatever it interacts with"
Agreed, and it does, and we are audited by a 3rd party trained to look at code written according to functional safety guidelines (ISO 26262). Companies like mine don't have the resources to respond to folks untrained in such matters. And as for the non-safety aspects of the code, that's our intellectual property and our competitive advantage.
"Embedded has a culture of punting on abstractions due to the pervasiveness of cross cutting and global concerns"
Do you have data or proof of this, or are you just speaking from your personal experience? I assure you we take safety very seriously. Multiple millions are spent, as I said above, on testing to make sure code performs as expected; especially when a 3rd-party audit and billions are on the line and, more importantly, the safety of our customers. Your inspection of the same code simply can't compete with that level of testing & auditing. You're really kidding yourself if you think otherwise.
"but 32-bit devices with dynamic allocation performing high level logical processing"
Certain ASIL ratings dictate thou shalt not perform dynamic allocation. You don't appear to be up to speed on functional safety matters.
"unreasonable is to force purchasers and users of heavy equipment to blindly trust the software controlling those devices."
Not true; disagree. It's unreasonable to assume we don't take safety seriously and demand to see our code when you can look at any company's functional safety audit records (they are public). Presuming a right to change the code because you're not happy with some aspect of its functionality, or because you're concerned about safety, is not your right, and no company has the resources to support such requests from the general customer or public base. Moreover, as I already pointed out, it's our IP.
IP type code in this field is heavily guarded. Swath generation, swath acquisition, path planning are all very interesting algorithms that represent a company's IP.
When we bought our '81 Ford there were a ton of bodge jobs and half-assed repairs (which helped us talk down the price) that I could easily verify by visual inspection. Because I could see them, I knew they weren't inherently dangerous. Software doesn't have the same analogy in this space.
Someone has to do it. What makes John Deere's embedded software engineers more qualified to write software for a factory on wheels than the embedded software engineers working directly for the farms?
Especially when we can't even check the former group's work.
Simple: John Deere, probably like us, has spent more on testing the software than you could ever spend on your own.
You can't judge complex software just by looking at it. It must be tested. People who fail to understand this wind up making these uneducated comments. Go study ISO 26262 and the MTBF calculations required to pass it, and maybe you'll change your tune.
At large companies, tested software runs on real vehicles and on simulators that, as I stated, run into the multi-million-dollar price range. There is zero amount of code inspection the general public can do that would give the same assurances you get through testing.
Bottom line: just looking at code is not enough. You must test it. This is unit tests, simulations, field tests, harsh temperature, vibration, power cycle, high side/low side power distribution, code coverage, etc... None of this is possible by you inspecting code.
You can't both try and chase the premium for a professional product, while locking down the professionals using it.
This has a lot more to do with manufacturers unethical approach to closed and obfuscated opaque software than any inherent problem or limitation of the field.
You were able to recognize the problems or lack thereof for your car because a) you could actually see it, and b) you have more than passing familiarity with how cars work. There's no reason both of those can't be true for the vehicle firmware as well.
If only there weren't rules saying you have to protect your software from manipulation. But allowing manipulation leads to lawsuits alleging that the manufacturer is still at fault for any manipulations it allowed. So they don't allow any.
Umm, well people lie. Or you make the change for yourself and your fellow farmer. Your fellow farmer is killed and his wife sues the tractor company. Do you need more examples or is it clear now?
This is just my opinion. Customers are not entitled to source code or what I call the work product. You're entitled to the end product you purchased, the binary, which the source code is compiled to produce.
Yes, like the companies that turn their products into black boxes, and rely on tricked human instincts to write off the magnitude of what the "tiny chip" contains.
It's so weird when the spectre of dishonesty is used to justify more secrecy, when sunlight is the best disinfectant!
Does the checksum of the software match an official one that company has released (and not given notice of deprecation)? If not, then how did the image get into this state? That's a clear indicator of responsibility, but requires a paradigm that reifies the fact that software can be changed.
> Your fellow farmer is killed and his wife sues the tractor company
Anybody can sue anybody at any time, regardless of merit. Bringing up this one possibility is fallacious.
> You're entitled to the end product you purchased, the binary
And yet companies constantly refrain that the end user hasn't purchased the binary, but only a license. Needless to say, this is all based on commercial laws (as opposed to manifest natural laws), and are thus straightforward to change when they're no longer sensible. As far as I'm concerned, copyright should require registering the full source code - the output of a compiler is not a novel creative work!
The product the customer purchased is a tractor, part of its implementation being software. As such, the owner has the right to inspect and repair its functionality and parts. Anything less is dishonest and will hopefully not persist very long given the modern prominence of software.
This happens not-too-infrequently in the car modding/tuning world --- the former, that is; to my knowledge no one has blamed the original manufacturer, much less successfully sued, after him/herself modding the ECU of a car and blowing up the engine.
Don't forget that a lot of other safety-critical components of cars (e.g. brakes, tires, etc.) have aftermarket replacements --- many of which actually perform better than stock.
If this "safety" attitude was prevalent since the beginning, we'd have cars that can only use approved fuel, no aftermarket parts would exist at all (their manufacturers being sued out of existence), and it'd be illegal to drive them on roads not approved by the manufacturer.
I will resist the urge to post that famous Benjamin Franklin quote this time.
Tractors and Combines use DEF and I hear reports of people defeating it. Our software has checks to look for such activity.
The current situation, however, is more like the tractor bricking itself and never working again if you attempt to open the oil reservoir yourself without the prerequisite authenticated repairman keys.
Companies also use software to exert control over users and extract more money. By moving functionality from hardware into software, they can use both secrecy and laws like DMCA to restrict users' knowledge, ability, and rights to repair and modify their stuff.
The safety argument only makes any sense because so much of the system has moved from hardware into software, and this often makes the product worse, not better (e.g. your Jeep example). Compared to hardware, software is brittle, unreliable, complicated, and exploitable. However, it allows for the feature creep that marketers love and, perhaps as a byproduct or perhaps not, it transfers legal and operational control from the "owner" to the manufacturer.
If you are as committed to user freedom and well-being as you are to safety, then I urge you to modularize your vehicle design so that the safety-critical software you describe, such as auto-guidance, is completely separate from software used to, e.g., diagnose and repair mechanical problems. Then you can keep restrictions around the first kind but not the second. It sounds like, based on frustration with John Deere, you may be rewarded by the marketplace.
Keep in mind that these are factories on wheels, which means not only do we consume a tremendous amount of data, we also generate it. For example, 3D maps are built on the fly to show the customer where product has been sprayed, seeds planted, etc... On these vehicles it's quite typical to see 1 GB of data generated per hour, which for an embedded device with no traditional hard drive is a lot. Flash memory is susceptible to extreme heat/cold, so we have to be creative in this regard too (I can't talk much about this but suffice to say it's challenging).
That being one case of many, there is always a trade-off between performance and what you can reasonably keep separated, either within the same PCB or between separate modules.
Thus, as much as I would like to keep things separated, sometimes you simply can't. The current state of the art only allows for so much.
Washing machines can malfunction and cause fires, hypothetically. Just because one device is designed in a way that is particularly dangerous doesn't mean the same reasoning can't be applied to another device that is dangerous to a lesser degree.
Not just hypothetically: https://news.ycombinator.com/item?id=12610218
Mind you, those were unmodified so the maker does take the blame; but if the cause is known, modifications can make them safer too.
Very much this! I have a large agricultural tractor that has a transmission controlled by software. While it works great most of the time, there is a bug where it will occasionally not disengage the clutch for about a minute when this condition occurs. One time, due to the tractor showing no signs of going anywhere, I accidentally left the tractor in gear and left the cab. Everything seemed fine until about a minute later when it finally kicked into gear and started moving with me standing beside it.
Fortunately, I happened to be in a low gear and was just creeping along when it started moving. I was able to get back in it and get it stopped. If it had been going faster, who knows how bad the outcome could have been. This is something I would be fixing if I had access to the code.
Even the compact/subcompacts regularly maim/kill people. I can totally understand wanting to minimize every possible new vector.
The concern is legitimate.
0 - https://www.wired.com/2016/03/way-go-fcc-now-manufacturers-l...
1 - https://hackaday.com/2016/02/26/fcc-locks-down-router-firmwa...
Even if it's just refurbished pre-drm tractors.
Why wouldn't corporations, which are sociopathic "persons" defined exclusively by their avarice, want to do so? It is more profitable to prevent repair. It is more profitable to force you into buying more.
Apple have been fighting the ability to run your own code on the device since day one.
This is part of the business model of the "app store" purchase model and not very surprising.
I'm sure even you have heard that Microsoft have been anti-consumer once or twice over the years.
At the risk of citing myself:
If this weeding system is profitable and becomes popular, these farmers are screwed.
Here's a nice longread on that company. It's a bit fluffy, but I recommend it.
In this case their general lack of effort around security (see the whole IoT mess) may be a good thing, as it means any DRM is more hackable.
The flipside of that is the firmware may also be buggier, but then again, you're also more likely to be able to fix those bugs due to the hackability. The old Douglas Adams quote comes to mind: "The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at and repair."
- Support: If you want to make your product more accessible to repair, you need to provide support for that. This costs developer time (making the code nice enough to be externally visible, maybe putting together an SDK, making sure any actual secret sauce or anti-counterfeiting systems are blocked off), customer support time (you WILL get calls when somebody buys a used system with modified firmware that doesn't work), and increases the hardware costs (want to provide a JTAG port? that's PCB space and BOM cost).
- Brand reflection: We've all seen how much of a ripple the wrong person having a bad experience with a product can have on the market. Say somebody buys a modified Initech Widget (tm) that has a bug or burns some other expensive equipment that it has to work with. That person complains on their friendly local social media network that Initech Widgets have a habit of (say) turning off furnaces in the middle of winter so the user's pipes freeze. Suddenly Initech is dealing with a media storm because they allowed modification of their software, which brings us to...
- Liability. What happens if somebody modifies the software in my product and somebody dies, burns down their house, or (* forbid) makes it possible to cause harmful interference and the FCC finds out? How do I prove to that user's (or the FCC's) lawyers that I shouldn't have to pay the damages? Let's assume that, as in the case of tractors or mobile baseband modems (which seem to get brought up a lot in this case) that software is a FUNDAMENTAL part of ensuring the safety/compliance of the unit.
Every time I've been through that calculus, the answer at the end has been, "I should do everything in my power to prevent the firmware of my devices from being modified".
I can understand and sympathize with the arguments in the other direction, and as a consumer I'm both qualified and interested in modifying the software on my widgets, but it would take a lot of change in the way our society works before most companies will put themselves at risk by supporting or even allowing it for their products.