
Companies generally don't want you in the software for safety reasons. While it's true there is intellectual property around the guidance and swath-generating algorithms, companies hate lawsuits. Lawsuits tie up resources, can force stop-shipments, cost profits, and tarnish the brand. Look at the Chrysler Jeep news...

You hack or change software on a vehicle and cause the auto-guidance to run over and kill someone, or your changes unknowingly circumvent a safety check so a spinny sharp thing that should have stopped doesn't and maims you, or calibration changes cause the engine to overheat and burn up, and in general the company is held responsible for it.

As much time is spent testing software as writing it. All our testing is there to make sure the software works as expected. Millions of dollars are spent on test equipment, and millions of man-hours are spent making sure it does.

I happen to work for a major agricultural company and I'm a lead software architect for their guidance + autonomous vehicles + precision farming embedded devices. We are one of John Deere's major competitors.




This is indeed one of the specific instantiations of the generic single-actor control-delusion that has always been with us, but has been greatly exacerbated by the rise of software. And I've no doubt that it's enough to keep the engineers' cognitive dissonance stoked - "It is difficult to get a man to understand something, when his salary depends upon his not understanding it" plus a bit of ego stroking from the implication that being able to work on this particular code is really special.

But it's utterly fallacious. It's quite clear that if certain changes result in different behavior, then whoever made those specific changes is responsible for the resulting behavior. About the only leg the argument can stand on is if the code is so sloppy that someone attempting to make a simple change tickles a bug somewhere else, but enabling such lack of quality to persist is the exact wrong approach for a safety-critical system!

Furthermore, regardless of its actual merit, this is precisely the kind of Schelling point collusion that is in the collective interest to bust up. "Selling" someone a piece of capital equipment that is deliberately designed to prevent its own servicing is plainly fraudulent.


Except that tractors are more dangerous than your car/appliance/whatever.

You'd be amazed at the amount of carnage running a brushhog at 1000rpm will do when it was designed for 540rpm. They will happily throw the 3-4 foot blade right through 1/2in steel, up to 300-400ft in whatever direction the wind takes them.

When that process is controlled by software, all it takes is flipping the wrong high-order bit in the PTO controller. On a manual tractor it's very clear when the PTO lever is at 1000rpm vs 540rpm (and it's usually on a separate lever from engagement), versus an opaque software stack.
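To make that concrete, here's a minimal hypothetical sketch (the control-word layout is invented, not taken from any real PTO controller) of how a single bit can be the difference between 540 and 1000 rpm:

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical PTO control word -- purely illustrative layout. */
    #define PTO_ENGAGE_BIT  (1u << 0)   /* clutch engaged             */
    #define PTO_SPEED_BIT   (1u << 7)   /* 0 = 540 rpm, 1 = 1000 rpm  */

    static unsigned pto_target_rpm(uint8_t ctrl)
    {
        if (!(ctrl & PTO_ENGAGE_BIT))
            return 0;
        return (ctrl & PTO_SPEED_BIT) ? 1000u : 540u;
    }

    int main(void)
    {
        uint8_t ctrl = PTO_ENGAGE_BIT;      /* operator asked for 540 rpm   */
        printf("intended: %u rpm\n", pto_target_rpm(ctrl));

        ctrl |= PTO_SPEED_BIT;              /* one corrupted high-order bit */
        printf("after bit flip: %u rpm\n", pto_target_rpm(ctrl));
        return 0;
    }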


Safety is exactly one of the reasons why you'd want the complexity of software laid bare out in the open, rather than hidden as a black box of unknown size! Your own comment laments the clarity of two entire levers to control the PTO versus an "opaque software stack" - why is it a foregone conclusion that that stack must be opaque?!

To address another objection of yours about unknown modifications, this too is an aspect where openness buttresses safety. Rather than blindly trusting that the software has not been modified (because such methods are unknown to you), the proper way to verify integrity is by being able to compare checksums or reflash, knowing there is no hidden state.
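As a rough sketch of what that verification could look like (assuming, hypothetically, that the vendor publishes a checksum for every released image; the CRC-32 here just stands in for whatever hash they would actually use):

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Bitwise CRC-32 (IEEE polynomial). A real scheme would more likely
       use a cryptographic hash; CRC-32 just keeps the sketch short. */
    static uint32_t crc32_update(uint32_t crc, const uint8_t *buf, size_t len)
    {
        crc = ~crc;
        for (size_t i = 0; i < len; i++) {
            crc ^= buf[i];
            for (int b = 0; b < 8; b++)
                crc = (crc >> 1) ^ ((crc & 1u) ? 0xEDB88320u : 0u);
        }
        return ~crc;
    }

    int main(int argc, char **argv)
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s <firmware.bin> <expected-crc32-hex>\n", argv[0]);
            return 2;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 2; }

        uint8_t buf[4096];
        size_t n;
        uint32_t crc = 0;
        while ((n = fread(buf, 1, sizeof buf, f)) > 0)
            crc = crc32_update(crc, buf, n);
        fclose(f);

        uint32_t expected = (uint32_t)strtoul(argv[2], NULL, 16);
        printf("image: %08x  expected: %08x  -> %s\n", crc, expected,
               crc == expected ? "MATCH" : "MISMATCH");
        return crc == expected ? 0 : 1;
    }

Reflashing a known image and then verifying it this way only proves something if the device can't lie about what it's running, which is why the "no hidden state" part matters.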


"Safety is exactly one of the reasons why you'd want the complexity of software laid bare out in the open"

No, disagree. If I laid our source code out for you, you would have no way to understand it without all the internal documentation from which the software was architected and developed. Non-computer people always have this misconception that if they see the code, they see everything it's doing. To a degree you can see what code is doing, but understanding why certain logic exists is not always clear without the supporting documentation.

Additionally, some of our code is 3rd party licensed and that adds an entire new wrinkle to things.

For an oversimplified example, our CAN buses have thousands of signals. A single CAN frame may have 20 to 30 internal signals, and signals from across many CAN buses are examined to make decisions, and I've only scratched the surface here. There is no amount of code study you can do to understand this without some coaching or review of the associated internal specifications.
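To illustrate (the signal names, bit positions and scalings below are invented for the example, not from any real message set), even a single frame takes outside knowledge to interpret:

    #include <stdint.h>
    #include <stdio.h>

    /* Extract an unsigned little-endian signal from an 8-byte CAN payload.
       Real decoders also handle big-endian layouts, signedness, scaling and
       offsets per the message database -- omitted for brevity. */
    static uint32_t get_signal(const uint8_t data[8], unsigned start_bit, unsigned len)
    {
        uint64_t raw = 0;
        for (int i = 7; i >= 0; i--)
            raw = (raw << 8) | data[i];
        return (uint32_t)((raw >> start_bit) & ((1ull << len) - 1));
    }

    int main(void)
    {
        /* Hypothetical frame carrying engine speed, coolant temp and a PTO
           state flag, all packed into the same 8 bytes. */
        const uint8_t frame[8] = {0x10, 0x27, 0x5A, 0x02, 0x00, 0x00, 0x80, 0x00};

        uint32_t rpm_raw = get_signal(frame,  0, 16);  /* 0.125 rpm/bit (invented) */
        uint32_t coolant = get_signal(frame, 16,  8);  /* degC with +40 offset     */
        uint32_t pto     = get_signal(frame, 24,  2);  /* 0=off, 1=540, 2=1000     */

        printf("rpm=%.1f coolant=%ldC pto=%u\n",
               rpm_raw * 0.125, (long)coolant - 40, (unsigned)pto);
        return 0;
    }

And that's a single frame on a single bus; the decisions mentioned above cross-reference signals like these from several buses at once.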

For me or any company to support someone's quest to understand the code would be a very costly endeavor that you're simply not entitled to, nor would it be a reasonable request.


> If I laid our source code out for you, you would have no way to understand it without all the internal documentation from which the software was architected and developed. Non-computer people always have this misconception that if they see the code, they see everything it's doing

This isn't a forum for "non computer people" - I've been a professional embedded developer, hardware and software.

This stance is wrong and terrifying. The code, safety critical code especially, should be clear and should itself document the quirks of whatever it interacts with. Embedded has a culture of punting on abstractions due to the pervasiveness of cross cutting and global concerns (peripheral allocation, interrupt usage, tight latencies, etc.), coupled with traditionally smaller code sizes. But we're no longer talking about devices with little memory or cycle-by-cycle timing, but 32-bit devices with dynamic allocation performing high level logical processing. The embedded culture of fiddling with stateful hacks, verifying primarily through system-level testing, and then zipping up the source tree for revision control needs to be burnt at the stake. Leave the bit banging to single-purpose 8-bit micros at the edge that cannot hide actual complexity - the core is a full computer, and needs a proper type system.

I'm not saying this change is going to happen overnight, but we should be aiming for the correct approach. Every time a company says that it cannot release code because it is too complex, wouldn't be understandable, has third-party dependencies, etc. - what it's really saying is that the code base is a sloppy hack, nobody there knows how it works, and they really do not want to own up to this state of affairs.

> For me or any company to support someone's quest to understand the code would be a very costly endeavor that you're simply not entitled to, nor would it be a reasonable request.

It might not be desirable according to a company's business interests, but what is actually unreasonable is to force purchasers and users of heavy equipment to blindly trust the software controlling those devices.


The non-computer people comment refers to the folks in the video, not this forum.

You also seem to be conflating the right to repair with a right to inspect code because you feel you're entitled to it for some reason (safety, you don't like how it works, etc...). You don't have that right, just as you don't have the right to walk into an assembly plant to see how your vehicle is assembled.

"The code, safety critical code especially, should be clear and should itself document the quirks of whatever it interacts with"

Agreed, and it does, and we are audited by a 3rd party trained to look at code written according to functional safety guidelines (ISO 26262). Companies like mine don't have the resources to respond to folks untrained in such matters. And as for the non-safety aspects of the code, that's our intellectual property and our competitive advantage.

"Embedded has a culture of punting on abstractions due to the pervasiveness of cross cutting and global concerns"

Do you have data or proof of this, or are you just talking from your personal experience? I assure you we take safety very seriously. Multiple millions are spent, as I said above, on testing to make sure code performs as expected; especially when a 3rd-party audit, billions of dollars, and, more importantly, the safety of our customers are on the line. Your inspection of the same code simply can't compete with that level of testing and auditing. You're really kidding yourself if you think otherwise.

"but 32-bit devices with dynamic allocation performing high level logical processing" Certain ASIL ratings dictate thou shall not perform dynamic allocations. You don't appear to be up to speed on functional safety matters.

"unreasonable is to force purchasers and users of heavy equipment to blindly trust the software controlling those devices."

Not true, disagree. It's unreasonable to assume we don't take safety seriously and to demand to see our code when you can look at any company's functional safety audit records (they are public). Changing the code because you're not happy with some aspect of its functionality, or because you're concerned about safety, is not your right, and no company has the resources to support such requests from its general customer or public base. Moreover, as I already pointed out, it's our IP.

This kind of IP code is heavily guarded in this field. Swath generation, swath acquisition, and path planning are all very interesting algorithms that represent a company's IP.


What is your point? I have a relative that lost his arm to a combine. There was no lawsuit, he did not seek damages. Farmers are well aware of how dangerous this equipment is.


The point is I don't think allowing the modification of software in what is essentially a factory on wheels is a safe thing to do.

When we bought our '81 Ford there were a ton of bodge jobs and half-assed repairs (which helped us talk down the price) that I could easily verify by visual inspection. I knew because I could see them that they weren't inherently dangerous. Software doesn't have the same analogy in this space.


> The point is I don't think allowing the modification of software in what is essentially a factory on wheels is a safe thing to do.

Someone has to do it. What makes John Deere's embedded software engineers more qualified to write software for a factory on wheels than the embedded software engineers working directly for the farms?

Especially when we can't even check the former group's work.


"Someone has to do it. What makes John Deere's embedded software engineers more qualified..."

Simple: John Deere, probably like us, has spent more on testing the software than you ever could on your own.

You can't judge complex software just by looking at it. It must be tested. People who fail to understand this wind up making these uneducated comments. Go study ISO 26262 and the MTBF calculations required to pass it, and maybe you'll change your tune.

At large companies, software is tested on real vehicles and on simulators that, as I stated, run into the multi-million-dollar price range. There is no amount of code inspection the general public can do that would give the same assurances you get through testing.

Bottom line: just looking at code is not enough. You must test it. That means unit tests, simulations, field tests, harsh temperature, vibration, power cycling, high-side/low-side power distribution, code coverage, etc. None of this is possible through you inspecting code.
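Of that list, a unit test is about the only item a reader could even picture from source alone. A toy example against a hypothetical guard function (invented here for illustration, not our actual logic or limits) might look like this, and it still says nothing about vibration, thermal, or power-cycle behavior:

    #include <assert.h>
    #include <stdint.h>

    /* Hypothetical interlock: PTO engagement allowed only with an operator
       present and below a ground-speed threshold. Illustrative limits only. */
    static int pto_engage_allowed(uint32_t ground_speed_mm_s, int operator_present)
    {
        return operator_present && ground_speed_mm_s <= 500;   /* 0.5 m/s */
    }

    int main(void)
    {
        assert(!pto_engage_allowed(100, 0));   /* no operator -> refuse    */
        assert( pto_engage_allowed(100, 1));   /* slow + operator -> allow */
        assert(!pto_engage_allowed(900, 1));   /* too fast -> refuse       */
        return 0;
    }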


If this was a consumer product, sure. Tractors aren't consumer products and come with manuals for a reason.

You can't chase the premium for a professional product while locking down the professionals using it.


In such a scenario, if you were worried that the software had been modified, you could flash the ROM with the OEM's software.


Not if the modified code also modified the boot loader.


> I knew because I could see them that they weren't inherently dangerous. Software doesn't have the same analogy in this space.

This has a lot more to do with manufacturers' unethical approach of closed, obfuscated, opaque software than with any inherent problem or limitation of the field.

You were able to recognize the problems or lack thereof for your car because a) you could actually see it, and b) you have more than passing familiarity with how cars work. There's no reason both of those can't be true for the vehicle firmware as well.


Perhaps, but if you sold the equipment without telling the new owner it was running your code and they got hurt or killed, then, as I said above, companies don't want to tie up their resources fighting these claims or dealing with it in the first place.


There is the doctrine of 'attractive nuisance' [0] though, which gives precedent for making owners liable for damages caused to third parties by some dangerous object they did not sufficiently prevent access to... An "easily hackable by the user" marketing checkbox, combined with a sharp and pointy heavy agricultural spinning tool of death, might count?

0. https://en.wikipedia.org/wiki/Attractive_nuisance_doctrine


That seems to apply to children... I mean, you should add some safeguards to prevent the code from being modified by children messing around, but I don't think this would apply to an adult farmer trying to modify his tractor.


Very well said, but I still like the poignant "All professions are conspiracies against the laity" (George Bernard Shaw), which is likely based on Adam Smith's earlier writing: "People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices."


>But it's utterly fallacious. It's quite clear that if certain changes result in different behavior, then whoever made those specific changes is responsible for the resulting behavior. About the only leg the argument can stand on is if the code is so sloppy that someone attempting to make a simple change tickles a bug somewhere else, but enabling such lack of quality to persist is the exact wrong approach for a safety-critical system!

That would hold if there weren't rules that you have to protect your software from manipulation. But allowing manipulation leads to lawsuits alleging that the manufacturer is still at fault for any manipulation it allowed. So they don't allow any.


Are there any examples of this happening in recent years ('80s to now)?


" It's quite clear that if certain changes result in different behavior, then whomever made those specific changes is responsible for the resulting behavior."

Umm, well people lie. Or you make the change for yourself and your fellow farmer. Your fellow farmer is killed and his wife sues the tractor company. Do you need more examples or is it clear now?

This is just my opinion. Customers are not entitled to source code or what I call the work product. You're entitled to the end product you purchased, the binary, which the source code is compiled to produce.


> Umm, well people lie

Yes, like the companies that turn their products into black boxes, and rely on tricked human instincts to write off the magnitude of what the "tiny chip" contains.

It's so weird when the spectre of dishonesty is used to justify more secrecy, when sunlight is the best disinfectant!

Does the checksum of the software match an official one that company has released (and not given notice of deprecation)? If not, then how did the image get into this state? That's a clear indicator of responsibility, but requires a paradigm that reifies the fact that software can be changed.

> Your fellow farmer is killed and his wife sues the tractor company

Anybody can sue anybody at any time, regardless of merit. Bringing up this one possibility is fallacious.

> You're entitled to the end product you purchased, the binary

And yet companies constantly refrain that the end user hasn't purchased the binary, but only a license. Needless to say, this is all based on commercial law (as opposed to manifest natural law), and is thus straightforward to change when it's no longer sensible. As far as I'm concerned, copyright should require registering the full source code - the output of a compiler is not a novel creative work!

The product the customer purchased is a tractor, part of its implementation being software. As such, the owner has the right to inspect and repair its functionality and parts. Anything less is dishonest and will hopefully not persist very long given the modern prominence of software.


or calibration changes cause the engine to overheat and burn up, and in general the company is held responsible for it

This happens not too infrequently in the car modding/tuning world (the former, that is); to my knowledge no one has blamed the original manufacturer, much less successfully sued, after modding the ECU of a car and blowing up the engine.

Don't forget that a lot of other safety-critical components of cars (e.g. brakes, tires, etc.) have aftermarket replacements --- many of which actually perform better than stock.

If this "safety" attitude had been prevalent since the beginning, we'd have cars that can only use approved fuel, no aftermarket parts would exist at all (their manufacturers having been sued out of existence), and it'd be illegal to drive on roads not approved by the manufacturer.

I will resist the urge to post that famous Benjamin Franklin quote this time.


Tractors and combines are way more expensive than cars, so it's not a fair comparison. You can't use a cherry picker to replace a tractor engine... at least not on the large kind.

Tractors and combines use DEF (diesel exhaust fluid), and I hear reports of people defeating it. Our software has checks to look for such activity.


We already have tractors that are required to use only factory oil. The use of any other oil voids the powertrain warranty.


Voiding the warranty is perfectly fine. I don't think it's the burden of Deere to support aftermarket modifications.

The current situation, however, is more like the tractor bricking itself and never working again if you attempt to open the oil reservoir yourself without the prerequisite authenticated repairman keys.


I won't say the safety argument is meritless. But it is not the whole story.

Companies also use software to exert control over users and extract more money. By moving functionality from hardware into software, they can use both secrecy and laws like DMCA to restrict users' knowledge, ability, and rights to repair and modify their stuff.

The safety argument only makes any sense because so much of the system has moved from hardware into software, and this often makes the product worse, not better (e.g. your Jeep example). Compared to hardware, software is brittle, unreliable, complicated, and exploitable. However, it allows for the feature creep that marketers love and, perhaps as a byproduct or perhaps not, it transfers legal and operational control from the "owner" to the manufacturer.

If you are as committed to user freedom and well-being as you are to safety, then I urge you to modularize your vehicle design so that the safety-critical software you describe, such as auto-guidance, is completely separate from software used to, e.g., diagnose and repair mechanical problems. Then you can keep restrictions around the first kind but not the second. Judging by the frustration with John Deere, it sounds like you may be rewarded by the marketplace.


I actually agree with this statement and to your credit, a lot of the design work I do follows some of this.

Keep in mind that these are factories on wheels, which means not only do we consume a tremendous amount of data, we also generate it. For example, 3D maps are built on the fly to show the customer where product has been sprayed, seeds planted, etc... On these vehicles it's quite typical to see 1 GB of data generated per hour, which, for an embedded device with no traditional hard drive, is a lot. Flash memory is susceptible to extreme heat/cold, so we have to be creative in this regard too (I can't talk much about this, but suffice it to say it's challenging).

That being one case of many, there is always a trade-off between performance and what you can reasonably keep separated, either within the same PCB or between separate modules.

Thus, as much as I would like to keep things separated, sometimes you simply can't. The current state of the art only allows for so much.


That's a poor argument in my opinion. Should my washing machine have no serviceable parts because I might electrocute myself?


No, but both the law and morality are allowed to use thresholds for determining acceptable risk; we weigh the risk of someone hurting themselves or others against the benefit of allowing people the ability to modify the hardware. Obviously we will all have different opinions about the relative value of risk vs. freedom, but we should be able to settle on some median for how we value each. In any case, agreeing that some machines are so dangerous that we should prevent modification of their software does not force us to agree that it is right to prevent modification of software on machines that pose any danger at all.


OP was talking about tractor software and machines that can run people over and kill them, not washing machines or serviceable parts.


Shouldn't that liability come with owning the device? I don't understand how we've come to think, as a legal system, that the maker of a tool is responsible for the behavior of modifications made to their tool. Maybe that's not even true, but it seems to be widely believed. If it is, I find it preposterous. Perhaps there's some money to follow.

Washing machines can malfunction and cause fires, hypothetically. Just because one device is designed in a way that is particularly dangerous doesn't mean the same reasoning can't be applied to another device that is dangerous to a lesser degree.


Washing machines can malfunction and cause fires, hypothetically.

Not just hypothetically: https://news.ycombinator.com/item?id=12610218

Mind you, those were unmodified so the maker does take the blame; but if the cause is known, modifications can make them safer too.


> modifications can make them safer too.

Very much this! I have a large agricultural tractor that has a transmission controlled by software. While it works great most of the time, there is a bug where, under a certain condition, it will not engage the clutch for about a minute. One time, because the tractor was showing no signs of going anywhere, I accidentally left it in gear and left the cab. Everything seemed fine until about a minute later, when it finally kicked into gear and started moving with me standing beside it.

Fortunately, I happened to be in a low gear and was just creeping along when it started moving. I was able to get back in it and get it stopped. If it had been going faster, who knows how bad the outcome could have been. This is something I would be fixing if I had access to the code.


Hmm, sounds suspicious, in that the seat sensor was disabled? Even our vehicles won't move when in gear if no one is in the seat. Seat sensors cut the drive train. Even my 15-year-old el-cheap-o Sears garden tractor does the same thing: it cuts the engine off if I stand up out of the seat while in gear.


Yup, tractors and most farm equipment are fucking dangerous.

Even the compact/subcompacts regularly maim/kill people. I can totally understand wanting to minimize every possible new vector.


I agree; unlike washing machines, apparently, these tractors are made to be unserviceable.


Anything with mains power can set your house on fire.


Name two suits where someone argued that Alice is liable for what happens when Bob modifies her product.


I don't think your response is fair, because even if those suits haven't been filed yet, you can't honestly say you know for certain they never will be.

The concern is legitimate.


The possibility that a large company might have to fight and win a lawsuit is not a plausible and certainly not an acceptable excuse for destroying the repairability of their products.


It's not, and you're right. The right to repair affects agricultural equipment. Hacking the software to add features or disable things is not repairing it!


Yes I believe that is one pillar of the argument. The other is whether or not companies are using reparability (or lack thereof) to increase revenue or protect channels.


The solution to this is obviously simple: just require hacked tractors to remove all branding and void the warranty once they've been hacked. Also include a liability waiver so that if someone tampers with the software or hardware, the company cannot be held liable anymore.



