An engineer did write this code. Almost certainly, many of them, working together. Now, maybe their managers told them it was okay to do this, that legal had given the go-ahead, or that they had to do it to keep their jobs... But it's still pretty clear that this was not right. They still did it, and then didn't tell anyone. And that doesn't sit well with me.
In my opinion (and I welcome disagreement and debate), engineers have an obligation to say 'no' to writing unethical code.
Ethics is about protecting the weak from the strong, not offering up the weak to get eaten by the strong. Only heroes put the good of the many over their own needs, and you can't expect everyone to be a hero.
Only within the safe, protecting bosom of a strong professional organization can we place moral obligations on workers. But it will never happen while the idea of a guild / union for software engineers is still anathema.
> it will never happen while the idea of a guild / union for software engineers is still anathema.
Sounds like you're waiting for a union and that's the only solution.
In the interim, we can and should encourage people to be heroes. Let's face it, there are other programming jobs out there. Do you truly need to work for an unethical boss or company?
Not everyone will come up with the same answer; it depends on their individual situation. We should at least support potential whistleblowers as a community. That's one first step towards becoming a unified group.
> Let's face it, there are other programming jobs out there.
OK, let's run with that for a moment. Where else would you have expected the VW whistleblowers to get jobs? Not every engineering job is a fungible web dev position. Those skills are useful for that one industry only, and that worker would have to uproot his family and move to get a less-desirable position.
No matter how you shake it, leaving a job is a major life decision for most people not in Silicon Valley, yes, even most engineers. And even in the fungible sectors like finance and the web, word gets around.
I have wondered if a distributed whistleblowing network à la WikiLeaks, but focused on putting public pressure on the most egregious offenders, would be a worthwhile thing to build. Essentially we'd run outreach campaigns for high-impact sectors that use lots of software engineers and encourage them to spill the beans, which would be funneled to a select group of news reporters / lawyers who are positioned to take action.
I have a feeling that's not quite the right approach, but I think with a bit of refinement we could come up with something better.
Frankly, I think there is a bigger problem than job security itself: that the whistle will be ignored by the general population. Even if you are willing to sacrifice to do what is right, it may not matter, because no one wants to believe it, or the spin is so good that no one believes it anyway.
Perhaps WikiLeaks could be rebuilt. But I think we should try to learn something from v1.0, which became politicized very quickly. I think the takeaway is that you want someone who is very trusted within our community to run it: an entrepreneur with a good track record of being honest, who perhaps also has a law degree.
In reality it's just not black and white like that. I expect a lot of engineers aren't sure if they should be blowing the whistle about a particular thing. And they have no one to turn to for advice, especially not with the NDAs they will typically sign.
So the position they'll find themselves in is this: blow the whistle on a morally gray issue which may or may not justify doing so, and guarantee that you will be fired while not at all guaranteeing that the morally gray project will stop. A probable outcome is that the engineer gets fired, the ethics issues don't get escalated, and someone else is hired to continue the project.
Everything is stacked against the engineer. Telling them to fall on their swords for a maybe outcome is neither realistic nor ethical in itself.
I made it clear in the last paragraph that it isn't black and white. Everyone has their own situation. I can still support people in coming forward before they actually do it.
So support - OK - support as in how? "like" on facebook? That is not enough to make it work.
Raising awareness can and does help make it work.
If you don't believe communication is effective then I don't know the point of this conversation.
Assuming that the only consequence is needing a new job is naive. Whistleblowers find it very difficult to get hired again.
Note there may be existing support groups for this such as the ACLU. I'm not exactly sure which non-profits employ a legal defense staff and cover tech issues.
I guarantee you that there were some engineers who recognized the situation and left, or were let go, sooner. It is not just magical heroes. Normal people, men and women, routinely make decisions like "less cool company, less cool project, but it's fishy here so I'm leaving." Maybe we should praise and celebrate these sorts of decisions more, instead of automatically considering the people who make them losers. As of now, neither engineers nor the Valley really value doing the right thing, nor are they willing to discuss what the right thing is - they value winners no matter what path they took.
Eventually once you have enough money, you can create professional certification bodies. This is when you can really start ramping up your dues because then you'll be solving an actual industry problem. You could probably take a lot of money from big tech companies to help you out here once you're established.
Want your workers to work more, but you don't want to pay overtime, or you run into trouble with regulations about consecutive hours on the job? Just bump up how much work they have to get done, threaten to fire low performers, and make it clear that under no circumstances is anyone allowed to work overtime. Your workers will start working off the clock, and better yet they'll hide it from you, so you can legitimately plead ignorance if the law comes after you.
Want to cut corners on safety to save money? Tell your people that safety is the top priority but you need to see an X% reduction in costs, and it works itself out. They may fudge or falsify metrics, but if you're really lucky they'll find loopholes in the metrics instead.
(I have a friend who worked at a warehouse and fell victim to this. They were officially big on safety, which included bonuses for everyone if they went a certain period of time without any safety incidents. Unofficially, this meant that incidents wouldn't be reported unless it was unavoidable. The one way to ensure that an incident had to be reported was to see a doctor for your injury, so people were heavily encouraged to wait to see if they got better on their own before they got medical attention, which often made things much worse. I'm sure upper management's metrics looked great, though.)
As an added bonus, this sort of thing gives you a lot more control over workers. If you want to get rid of a troublemaker, do a little digging and you'll surely discover that they're violating safety rules or working off the clock or whatever. Everybody is, but selective enforcement is a wonderful thing.
The WIPP nuclear isolation site had a 15-year clean run, so management began to cut corners. Then three unlikely events all lined up and nearly got people killed. It started with a truck catching fire, which prompted operators to bypass the HVAC's filtration system. They stopped the bypass for a few days to perform maintenance on the only underground radiation detection unit. That unit gave a false alarm during testing, but was fixed and placed back into service, so they started ventilating again. Then a cask was breached at around midnight because someone upstream had used organic kitty litter instead of clay kitty litter. The operator assumed it was a false alarm due to the previous false positive and kept things running. It wasn't until the next morning that they realized they were blowing radioactive particles above ground.
Had management kept up maintenance, enforced protocol, or done more than the absolute bare minimum (i.e. installed multiple underground radiation detection units), US taxpayers could have avoided paying $500 million. And this isn't a one-off thing; there are dozens of instances just like this where the US dodged a bullet.
I would like to ask: what do you think an ordinary employee should do when they see behavior like you describe from their management? How can they efficiently protect themselves (and their colleagues) from this kind of treatment?
Much will depend on your job prospects and financial position. If you're a fancy programmer type who's constantly bugged by recruiters, move on until you find an ethical company. If you need this job to eat, you'll have to be a lot more careful.
In general, I'd say:
1. Point out the impossibility of the requirements to management. Gently if need be. It's possible they don't realize what they're doing.
2. Contact the local department of labor or whatever regulatory agency would be interested in what's going on. They may be able to take action if management is pushing violations in a quiet way like this. If not, they may be able to at least take action against the workplace if people have started breaking the rules.
3. If you can afford to risk the consequences, follow the rules as much as you can. Don't work off the clock, don't break safety rules, etc. If being fired will make you homeless then maybe this isn't an option.
4. Document everything. If regulators weren't interested originally, they may be interested once you can show a pattern. Upper management may be blissfully ignorant, and you may be able to get them involved once you can show them what's going on. Whatever happens, if things come to a head then it will probably be useful to be able to demonstrate that this wasn't your own doing.
I'd recommend the following order of operations:
Convey concern over associated risk to your immediate manager. Verbally first during the meeting, switch to written (email, a paper trail of opposition) if no action is taken.
When documenting the paper trail, simply reference the meetings in which you voiced opposition. Your notepads should also be able to back up the talking points you're referencing.
Being asked to briefly switch hours or work late is often listed in your job description, so stopping suddenly at 5 p.m. can be considered insubordinate. Time should be compensated promptly through a time-off or paid-OT arrangement. If your verbal requests go without action, again, switch to email.
Simply documenting the events as they occur makes it easy when you need to go above your immediate supervisor (a more senior manager, corporate HQ, the department of labor) for help.
The key is to be polite in all interactions. Innocent mistakes happen. Managers are under deadlines too. The paper trail should be maintained regardless of action or inaction.
Now you can argue efficacy, being able to pull the wool over the eyes of regulators etc but I think it is a good direction to head in.
All that is way beyond the industry described here, with its institutionalised cheating in tests; it sounds more like the graphics card industry than one with regulators.
Still, even such an excuse should have raised alarm bells, but I assume most developers would just shrug their shoulders and develop the feature anyway, as they would've liked to keep their nice-paying job and juicy stock options.
In reality, Facebook only recently used the DDoS protection excuse for its datr cookie, well after it announced that the cookie would be used for advertising purposes, which also happened a few years after the cookie was introduced.
I imagine whatever Facebook told developers then was even less subtle than "using it for security purposes", and that most of the developers figured it out right then that the datr cookie would be one day used to track users across the web for advertising purposes.
The conversation would be more along the lines of "We need to track people who aren't logged in, suggestions?"
* Govt research agency awards contract to study smallpox, including stockpiling smallpox. Researchers are happy, they're protecting the world.
* Govt weapons agency gets notice from govt research agency that "it's ready now."
* Govt weapons agency confiscates smallpox stockpile and data, makes more, spoons it into the tips of missiles.
The original researchers are more or less legitimate victims, whose goal of helping humanity was used to obscure the govt weapons agency's goal of killing humanity.
"...sociologist Diane Vaughan described a phenomenon inside engineering organizations that she called the “normalization of deviance.” In such cultures, she argued, there can be a tendency to slowly and progressively create rationales that justify ever-riskier behaviors... If the same pattern proves to have played out at Volkswagen, then the scandal may well have begun with a few lines of engine-tuning software. Perhaps it started with tweaks that optimized some aspect of diesel performance and then evolved over time: detect this, change that, optimize something else. At every step, the software changes might have seemed to be a slight “improvement” on what came before, but at no one step would it necessarily have felt like a vast, emissions-fixing conspiracy by Volkswagen engineers, or been identified by Volkswagen executives. Instead, it would have slowly and insidiously led to the development of the defeat device and its inclusion in cars that were sold to consumers."
"But assuming all of this is true, how could it have persisted for so long, and without more people stepping forward and speaking out?... Volkswagen was heavily invested, both financially and culturally, in producing a clean-diesel engine. That the company was failing to meet the standard required by American emissions tests would have been embarrassing and frustrating to its German engineers. Some may have seen those tests as arbitrary, and felt justified in “tuning” the engine software to perform differently during them—even as it now looks, to the outside world, like an obvious scandal."
And, IMO, yours is absolutely the correct one.
As engineers, we are in a unique position to understand the intricacies and associated risk to any one of our functional requirements we satisfy. True, we are under marching orders from a PM who may (or may not) have provided adequate backstory to understand the goals of the company, but it's our duty to express any concern of risk it may present to the company or society.
When verbal warnings don't work, go ahead and put it in writing.
In the US, part of my engineering curriculum included courses on engineering ethics case studies to drive this point home.
I've written exactly 2 of these letters in my 10-year professional career. In both cases, the PM sharply changed course, demonstrating they knew the work failed the ethical litmus test. Putting written responsibility on them to act encouraged further discussion and ultimately a better resolution for all.
I remember that I worked on a system that allowed a financial firm to change an account value outside of all the accounting routines, and a way to transfer money from a shared account to any random external account. Sounds fishy, but the explanation was reasonable: this was the way to model dividends. Actual dividend money was coming into one account, the payment/reinvestment from another, with a reconciliation done later on. This reconciliation was done by that department's operational floor from data coming from other departments, with systems upstream and downstream. I may have unknowingly created the final piece of a massive international fraud system, but I have no idea, and all the people involved had a valid business reason and nothing to gain.
Because that's the point: if I had known and had been willing to actually participate in fraud, I would have wanted a cut, and a serious one, because I would literally be providing a gun, with my real ID, from a regular shop, knowing you would use it for a bank robbery.
When this general piece of software is set up to control a new engine from an OEM, a multi-year process of calibration starts. The engine starts off failing the emissions tests drastically, and over time the calibration engineers tweak tens of thousands of parameters to try to get the emissions to be legally compliant.
This is not an easy process and the calibration engineers have a target, they have to get the vehicle certified for a certain emission class, which in turn is determined by the FTP-75 driving cycle. The news keeps talking about "defeat devices" because the law says that they are not allowed, but reality is much messier. There is nobody sitting there thinking about how to make crazy obfuscated defeat devices, rather there are a bunch of people trying to figure out how to pass the extremely demanding emissions requirement tests without any defeat devices.
They didn't write a defeat device to "fake" emissions, they wrote legitimate code aiming to make the car truly compliant with the requirements.
Then someone made this code not the default and, except when the car is under testing, loaded an alternate mapping of some key parameters which is more tuned for performance and mileage, at the expense of emissions. This is the criminal part, not the first.
E.g. detecting you are on a test stand, you might
* deactivate plausibility checks which would normally limit engine power
* deactivate air bags and other inherently dangerous systems
* ensure correct system behaviour even if sensors give conflicting information
* suppress error logs for certain failed plausibility checks
The only illegal thing is switching to a "massaged" parameter map under these conditions. And presumably a single person controlled this switch.
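To make the distinction concrete, here is a deliberately simplified, hypothetical sketch of what such a parameter-map switch could look like. Every name, value, and heuristic below is invented for illustration (real ECU firmware is written in C and is vastly more complex); the point is only that the illegal part can be a small conditional sitting on top of otherwise legitimate calibration work:

```python
# Hypothetical illustration only: a parameter-map switch keyed on a
# crude test-stand heuristic. All names and values are invented.

# Two calibration maps: one tuned for compliance, one for performance.
COMPLIANT_MAP = {"egr_rate": 0.35, "injection_timing_deg": -2.0}
PERFORMANCE_MAP = {"egr_rate": 0.10, "injection_timing_deg": 3.0}

def looks_like_test_cycle(steering_angle_deg, speed_kmh):
    """Crude heuristic: wheels turning but steering never moves, as on
    a dynamometer. (The FTP-75 cycle is also a fixed speed-vs-time
    trace, so the speed profile itself could be matched against it.)"""
    return speed_kmh > 0 and abs(steering_angle_deg) < 0.5

def select_map(steering_angle_deg, speed_kmh):
    if looks_like_test_cycle(steering_angle_deg, speed_kmh):
        return COMPLIANT_MAP    # emissions-legal calibration
    return PERFORMANCE_MAP      # better power/mileage, worse NOx
```

Nothing about the compliant map is fraudulent on its own; the fraud is entirely in the conditional that decides when it applies.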
Support Ticket 57:
Assigned to: John
Hey, can you change this limit field here, currently it is a 5, can you make it a 9.
Maybe if it were more important I would have fought harder. But I honestly don't know.
I would imagine that Robert Bosch GmbH and co. are quietly building a war chest of cash right now. They appear to have been caught writing deliberately naughty code to order.
Many different engine types will use the exact same firmware; the manufacturer supplies a huge array of configuration values to make the ECU run the engine correctly.
All the firmware knows is that under some manufacturer-configurable situation, the engine will switch from one manufacturer-configurable mode of operation to another.
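A sketch of that architecture (all names and values invented for illustration): the firmware logic can be completely generic, with the switching condition and both operating modes supplied as manufacturer configuration, so the code itself never mentions emissions tests at all:

```python
# Illustrative only: generic firmware logic driven by a
# manufacturer-supplied configuration table. The firmware has no idea
# what the modes mean; it just follows the configured condition.

def make_firmware(config):
    def active_mode(sensors):
        if config["switch_condition"](sensors):
            return config["mode_b"]
        return config["mode_a"]
    return active_mode

# A hypothetical manufacturer configuration: switch modes when ambient
# temperature drops below 14 (echoing the threshold reported for the
# real firmware, but otherwise made up).
example_config = {
    "switch_condition": lambda s: s["ambient_temp_c"] < 14,
    "mode_a": {"egr_rate": 0.35},   # emissions-oriented calibration
    "mode_b": {"egr_rate": 0.10},   # performance-oriented calibration
}
```

Under this split, the intent lives entirely in the configuration data, not in the code the firmware engineers shipped.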
This presentation from 32C3 explains it in much more detail: https://media.ccc.de/v/32c3-7331-the_exhaust_emissions_scand...
I prefer to think about it this way: management gets what it wants; you can always find someone to carry it out. I think we should be very careful about moving to blame engineers first.
If we had a good way to blow the whistle without ending up in jail, we would see less unethical code being written and fewer shitstorms from greedy companies.
An older car on the road puts off more emissions than the modified VWs, yet older cars are still allowed on the roads.
Laws should be followed, so I am not excusing the behavior, but making it sound like the safety of the public was materially compromised is a bit of a stretch.
The sad truth is that many programmers have no ethics at all. If we all did, there would be no spam, no malware, no Windows 10. (And no systemd!)
I respectfully disagree. Engineers have an obligation to do whatever is asked of them by management. If that means building nuke-hand-grenades so be it.
Being an engineer (be it software or mechanical) should be a morally neutral thing. Do the job and keep your morals out of it. This is the only way we can have some semblance of sanity in the field.
I really really really strongly disagree with this. As the ones who are most knowledgable about the systems they're building, I definitely think engineers have a moral responsibility to make sure they're at the very least following applicable laws and regulations - I would argue that we have a moral responsibility to act ethically even in cases where it's not covered by laws or regulations, but I'll admit that's a bit more controversial.
Also, this is why we have engineering codes of ethics, at least in Canada and the US (and while I'm not familiar with elsewhere in the world, I would assume similar things hold in most first-world countries). We don't necessarily have to agree on everything, but there is a baseline for what we consider ethical, and engineers are expected to uphold that baseline, otherwise they are not permitted to practice engineering. Unfortunately the line between 'engineers' and other practitioners isn't as well-defined for software engineering as it is for most engineering fields - but that doesn't mean we should ignore it completely.
I don't understand this from a regulator's point of view: as a regulator, all you have to do is test for symptoms. You don't have to explain root causes. You drive the vehicle in conditions as close as possible to real ones, measure emissions, and decide whether or not they're above the norms.
Why would regulators do this in a lab? It's like health inspections that would ask restaurants to send food to be tested, instead of showing up anytime, unannounced.
Regulators should pick up real cars from real owners and test them on the road, at regular intervals.
Or, modern technology should make it possible to test a car all the time and report emissions, fuel efficiency, etc. during its lifetime.
People cheat, and if cheating is easy they cheat more. The one thing a regulator cannot do is trust the industry.
When tires are tested, apparently the manufacturers are asked to send in the tires to be used in the tests. Why on earth would a tire testing entity do that, unless they are receiving bribes from the tire manufacturers? 
Because the law on how regulators work was written by the auto lobby. (At least, that's how it is in the EU. Don't know about the US.)
No, that was a result of the Clean Air Act Amendments of 1990, for the reduction of acid rain. Though they were targeted towards industrial emissions of SO2 & NOx, stricter regulation for vehicles was an additional effect.
>specifically disadvantaging European diesel cars
That's a weird argument given that diesel passenger vehicles in the US are held to the same standard as gasoline ones, but to a separate standard from their petrol counterparts in the EU. I mean, one could argue the opposite, that an EU emissions policy favorable to diesels amounted to an equivalent 13-16% import tariff. 
Several domestic rather than just foreign diesel engine manufacturers were also penalized for using defeat devices in 1998.
> (which are much cleaner overall)
That's quite arguable, trading lower CO2 & CO for increased NOx & PM.
Surely car manufacturers didn't have a say, which is why US and EU emission standards look like this https://longtailpipe.com/wp-content/uploads/2015/10/us-europ... .
>same standard as gasoline ones
Well duh, let's keep diesel cars to petrol standards so that their benefits don't matter and their disadvantages are prohibitive!
As the source of the image says, "On the other hand, American regulators are focused on smog and health impacts of air pollution." Which the graphic you provided indicates well.
Look, California was probably the first governmental entity to regulate tailpipe emissions. So much so that it's written into the Clean Air Act by name, allowed to run its own regulatory scheme to enact stricter regulation (with federal waivers, but that's another issue). The reason being that LA's unique geography makes smog worse. Heck, in the 1940s, they had an episode severe enough they thought they were under chemical attack by the Japanese.
As such, CARB's emission standards were focused on reducing the more directly harmful pollutants like hydrocarbons, ozone, NOx & PM.
So, given California's influence on the original 1970 Clean Air Act and the 1988 California Clean Air Act's influence on the subsequent amendment in 1990, I don't see how that graphic would support your argument.
I mean, had they such hypothetical power, they could have also blocked the banning of leaded gasoline that was in the same amendment.
>Well duh, let's keep diesel cars to petrol standards so that their benefits don't matter and their disadvantages are prohibitive!
Emissions vs fuel economy. You're being facetious, but if that argument were true, why bother importing diesel passenger vehicles into the States?
They didn't even start reintroducing diesels in America until they thought they could harmonize emissions from Euro 5 with Tier II Bin 5.
And there could be an appeals process whereas when a car fails it can be tested again, and the tests monitored by a 3rd party, etc.
This can become a privacy issue though. Suddenly, you're able to associate licenses with a set of persons with some probability. And you're able to collect side channel information about the vehicle being in motion, and the speed of the vehicle. And who knows what more you can read from the sensors of a modern vehicle. I know of a couple of vehicles with GPS sensors.
Overall, this discussion is very interesting to read, because this is the discussion about unit testing to a dot. We are unit-testing cars with a mock road - and unit testing fails with malicious and/or stupid workers building the unit. Every set of unit tests can be satisfied by a lookup table - this is happening right now in cars.
So now, the question is: which of the bigger system control tools do we use? Integration tests - taking cars on a race track? Property-based tests - randomize the length of the test, define ranges of acceptable pollution? Live system monitoring? Maybe a pipeline of tests?
A piece of code whose purpose is specifically defeating the testing cycle, on the other hand, runs directly afoul of the regulations - no arguing against a multibillion-dollar company needed.
Both the US & EU would have wanted this information from day one, and this whole fiasco cost VW billions in fines.
There would have been any number of engineers at VW and Bosch that knew exactly how this worked, but there was nothing in it for them to come clear about it. They were never going to get charged for writing that code, and they would have likely been out of a job or destroyed their career at those companies if they volunteered to authorities how this worked.
So why don't investigators just offer huge cash prizes to engineers at those companies who can provide details about exactly how this worked, along with immunity as long as they're forthcoming with information about who instructed them to implement this?
You'd have an army of engineers overnight willing to spill the beans, and you'd save millions in investigative costs, and quickly get to the real root cause of the corruption.
Instead some independent team of investigators is left digging through old firmware images posted on forums to reverse engineer how the defeat device worked.
Doing this would just guarantee that, if they were eligible for lenient handling or immunity, they'd get a handsome cash payout by ratting on their management or giving investigators details relevant to the case which they didn't think to look for.
In the development model used here, it's not like engineers just independently decide to add a feature. An autogenerated two-line procedure is going to have two pages of documentation.
The USA already has a very workable solution. It's called the "prisoner's dilemma".
> They were never going to get charged for writing that code
Wrong. That's exactly what you threaten them with. Engineers aren't hardened criminals. When faced with the possibility of jail, 99% will instantly sing like canaries.
The problem is of jurisdiction. The engineers are in Germany and no fucking way will Germany extradite them. So the usual threats don't work.
Well, we did catch one moron who decided to travel to the USA for vacation even though he was a big player in the mess.
Edit: I'm presenting the current general situation for the USA. Your idea of bounties is not bad. It has worked spectacularly well in rewarding employees of Swiss Banks. Somewhat ironically, it's something that Germany has done: http://www.spiegel.de/international/germany/german-city-find...
This is crazy. And, as we move toward self-driving cars, dangerous.
Can someone point me to a well-regarded source in the industry that makes a cogent and convincing argument for why all the software running in a car (save maybe for the entertainment panel) should not be required to be open source?
To be clear: I'm not asking for speculation, or even an explanation from an expert based on years of experience. I'm asking for a publicly accessible reference based on research data that explains the security benefits of not releasing the source code for life-critical software in the car.
Cryptography. When trustworthiness is paramount, as in crypto (and IMO in safety-critical applications like this), being able to inspect the code helps a lot.
Let's say that the software is open source, and you can buy your own microcontroller and turn it into an ECU.
There will be people who make their own modifications to it to improve performance. Some of those modifications make the car fail an emissions test, so they make it so that the car acts like it's at stock settings during those tests.
An ECU is not actually that complicated. It reads a few tens of inputs (temperatures, air mass flow, lambda sensors, crank position, etc.), and controls a few tens of actuators (throttle body, fuel injectors, spark plugs, maybe variable valve timing), with time resolution measured in tens of microseconds. An Arduino doesn't break a sweat - see the Speeduino project.
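To illustrate how simple the core math is, here is a rough speed-density fuel calculation (constants and names invented, ballpark values only; a real ECU such as Speeduino implements this in C, largely via table lookups rather than closed-form formulas):

```python
# Simplified, illustrative fuel calculation. All constants are
# invented, ballpark values, not taken from any real engine.

def injector_pulse_width_ms(map_kpa, intake_temp_k, target_afr=14.7):
    """Estimate air mass per intake stroke from manifold pressure and
    temperature (ideal gas law), derive fuel mass from the target
    air-fuel ratio, then injector open time from its flow rate."""
    DISPLACEMENT_M3_PER_CYL = 0.0005  # 0.5 L cylinder
    R_AIR = 287.0                     # J/(kg*K), gas constant of air
    VE = 0.85                         # assumed volumetric efficiency
    INJECTOR_FLOW_G_PER_MS = 0.0025   # assumed injector flow rate

    air_density = (map_kpa * 1000.0) / (R_AIR * intake_temp_k)  # kg/m^3
    air_mass_g = air_density * DISPLACEMENT_M3_PER_CYL * VE * 1000.0
    fuel_mass_g = air_mass_g / target_afr
    return fuel_mass_g / INJECTOR_FLOW_G_PER_MS
```

The real engineering effort is not in this arithmetic but in filling the VE and correction tables accurately across the whole operating range, and doing it all within microsecond-scale timing constraints.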
Does anyone else feel our current government is not prepared to handle the changes pervasive software will create in the future?
I do wonder though if our way of achieving civilization will require significant change in the near future, since regulating code and encrypted code is going to be damned near impossible.
Alternatively, combine the lab tests with imprecise real-world tests as a sanity check, or keep the exact nature of the lab tests a secret and vary them over time. Really, the first of these seems like a good idea regardless of what else you do.
That seems like a good enough requirement to me. Define "a typical driver" as "out of a hundred random test drivers, no more than 20 exceed the thresholds, no more than 5 exceed the thresholds by more than a factor two." That forces the car manufacturers to have a sufficient safety margin in their emissions.
I don't know how difficult the actual measurement is, but maybe you could pay a couple thousand people a reasonable amount of money to have some devices attached to their cars for a month or two and collect data. Or make the car makers pay for the procedure.
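The acceptance rule proposed above is easy to state precisely. Here is a small sketch (the fractions are taken from the comment above; the function name and data are invented):

```python
# Sketch of the proposed fleet-level acceptance rule: out of N random
# test drivers, at most 20% may exceed the emissions limit, and at
# most 5% may exceed it by more than a factor of two.

def fleet_passes(measured_g_per_km, limit_g_per_km,
                 max_exceed_frac=0.20, max_double_frac=0.05):
    n = len(measured_g_per_km)
    exceed = sum(1 for m in measured_g_per_km if m > limit_g_per_km)
    double = sum(1 for m in measured_g_per_km if m > 2 * limit_g_per_km)
    return exceed <= max_exceed_frac * n and double <= max_double_frac * n
```

A rule like this forces a safety margin: a manufacturer can no longer optimize for one fixed driving trace, because the passing criterion is defined over a random sample of real drivers.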
I agree that a degree of randomness is likely a good idea to avoid defeat devices, but one also has to consider that it could have two unintended consequences: 1) more expensive cars, due to more stringent QA procedures 2) relaxing of standards to ensure that companies can still practically make cars that conform to 'standards.'
As ever, it's important to consider that a layman's "seems reasonable to me" is another experts "that's not how things work."
I'm sure they can find some way to address the cheating, but it's likely not the case that they are just missing some obvious simple solution.
(At this point, every one of us is saying, yeah, I have. It just took a long time, a lot of change requests, and many round trips with a customer. Which is the point.)
This means that they can detect emissions levels during real world driving tests. What's wrong with just making those tests the actual regulatory ones? So whatever ingenuity automakers can use will be put to minimizing emissions in the exact same scenarios that will be used in real life.
Nothing. But Germany has a strong position in the EU and to a large degree does what VW/BMW/Mercedes want, and they wanted to avoid stronger or better tests. This topic is often in the news here, and it's clear there is no political will to establish real-world tests.
There are also a lot of other cheats in this firmware:
- Below 14°C? Just blast the emissions out. It's not like these cars are driven in the winter.
- Autobahn? Go blast out the emissions!
- and so on...
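The kind of conditional logic described in that list is easy to picture. A hypothetical sketch, with the thresholds taken from the comment above and every name and value otherwise invented for illustration:

```python
def select_emissions_mode(ambient_temp_c, speed_kmh, on_test_bench):
    """Pick an engine calibration based on driving conditions.

    Hypothetical illustration of a defeat device: full exhaust
    treatment ("clean") only under test-bench-like conditions,
    reduced treatment ("dirty") otherwise.
    """
    if on_test_bench:
        return "clean"   # regulator is watching: run full treatment
    if ambient_temp_c < 14.0:
        return "dirty"   # below 14 °C: cut back the treatment
    if speed_kmh > 120.0:
        return "dirty"   # Autobahn-like speeds: cut back the treatment
    return "clean"
```

The point of the branching is that the lab test occupies a narrow, predictable slice of the input space, so the firmware can behave well there and differently everywhere else.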
This should be a far bigger scandal than it is now.
It doesn't take winter for temperatures to drop below 14 °C (57.2 °F). The morning commute takes place between 6 and 9 AM. Even in the summer, temperatures are probably below 14 °C more often than above it at these times.
This article has a mention and photo of "A portable emissions measurement system at the Engine and Emissions Research Laboratory at West Virginia University."
"Mr. Carder and his team drew on their experience testing trucks when they got the contract to test cars in 2013. One challenge was to fit what amounts to a mobile laboratory in the car. At the time, the equipment available for such emissions testing had enough battery power only for short trips.
To make long hauls possible, the West Virginia University researchers bought portable gasoline generators at regular hardware stores and bolted them to the rear ends of the test cars. The generators made a terrible racket and frequently broke down because they were not designed to be bumped around."
(From "Researchers Who Exposed VW Gain Little Reward From Success.")
One issue would be all the equipment and manpower every test facility would need (as DonHopkins points out below). But also, to gain initial approval you couldn't rely on just one or even a few random drives: for the numbers to make any sense, you'd need a large sample size, with many different drivers driving the cars under different conditions, until you had enough data points to come up with a meaningful figure.
Backup real-world tests are a very good idea, but the legal limits need to be defined under standardized laboratory conditions.
It's easier for automakers to throw resources at lobbying.
- Car users will be offered insurance against breaking emission standards
- Insurance companies will hire specialists to lower the rates of that insurance
- Consumers are disincentivized to modify their ECU in a non-compliant way
- Consumers are incentivized to buy cars offering more transparency in firmware (and car manufacturers will offer more transparency)
I'm a firm believer that if you buy it, you're responsible for ascertaining its safety. If you can't do that yourself, hire someone to do it for you.
People elect government to take care of this for them. We can't all be experts in everything.
I firmly believe that when we work together on solving problems, we are better off.
Putting the onus on the individual is exactly what big businesses would like. It gives them even more of a free pass towards short term profits. Get caught doing something wrong? Close up shop and reopen another, earning profit while fooling the public again under another name. No thank you!!!
This seems like one of those plans that goes
1. Deregulate everything
3. All irresponsible companies go out of business when intelligent consumers patronize responsible ones instead
Skipping over the "decades of death/disruption" in step 2, and being awfully optimistic about consumer intelligence in step 3.
Would you rather buy a locked-down laptop that is factory-protected against a known set of viruses, with no way to modify or even understand the protection system? Or opt for a regulatory system that punishes those who help spread viruses because no scanner was installed?
The latter teaches the users more and keeps them closer to the product. It creates a market where more parties can enter and provide security advice.
Lastly, the 'stepping over death and destruction' is obviously a straw man. We have seen many deregulations that were in dire need and worked out well. It all depends on transparency and communication.
Beyond that, you would immediately get into jargon and confusing packaging for insurance and vehicle emissions. I was just buying dental insurance, and the 4 different plans from a single provider each had 30+ fields, each of which was subtly different. How am I, as a standard consumer, supposed to make sense of all that, much less compare it with the 3 other networks? Then add another factor (the type of car I drive) into the same bag and it becomes just absurd.
I don't have a solution to the cheating but I feel like mandating open source software is a good step.
The point is, regulation here is failing because it is not community-led. We need to take matters into our own hands to understand the difficulties, simplify them, and feed the result back to the government if needed.
No one will offer insurance against this liability at a reasonable cost.
Instead, car manufacturers will offer a guarantee that the car will not break emission standards when they sell the car to the consumer.
This guarantee will be void the second you do anything the manufacturer doesn't approve of, such as missing a scheduled servicing appointment, which of course must take place at a manufacturer-certified servicing center.
Car manufacturers will take the opportunity to lock down the firmware even further and block 3rd-party servicing centers.
- Reduce emissions standards to levels that can be tested easily in nonstandard conditions.