
Some questions for the researchers, or anyone else who thinks this was okay:

1) Were public roadways and speeds of 70mph absolutely necessary to demo this?

2) What was the plan if the trucker approaching at 70mph hadn't seen the stalled Jeep early enough and had to swerve or panic stop, possibly crashing and injuring themselves or others?

3) Anyone notify the Missouri State Highway Patrol about this? They may be contacting the researchers with questions about this demo if they weren't consulted in advance.

4) What's the plan if they trigger a bug in the car software of the people they tested this on earlier? The article mentions them tracking people remotely as they attempt to learn more about the exploit.

I could go on but why bother? In case any of you think this was cool or even remotely (no pun intended) ethical, I'd like to know if you have a problem with letting these two test this on a loved one's car. How about they remotely poke around your husband or wife's car and explore, as long as they promise not to intentionally trigger anything?

If I ever learned this had been tested on a vehicle I was in, I'd make sure this cost the researchers dearly.

EDIT: I've just phoned 'Troop C' of the Highway Patrol at their main number, +1-636-300-2800 and they seemed pretty keen to follow up. The fact that the vehicle was disabled where there was no shoulder, was impeding traffic, and the demo not cleared with them in advance has them concerned. I'm all for testing exploits and security research, but this isn't the right way to do it. And to film it and post it to a high traffic site is nuts.




Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior. A much less aggressive (and thoughtful) move would be to contact the researchers directly. Wow.

Back to the article, I think that this type of exploit will become more and more common as vehicles become more connected and automated. We need to know that we can trust the software and firmware running on the devices that literally have the power of life and death over us. Unfortunately, this is a VERY complicated issue, and no one has a solution yet AFAIK.

I watched a talk by Cory Doctorow last year where he suggested validation at the hardware level (a la trusted platform modules), but unlike the typical TPMs that only allow vendor software to be authenticated, these TPMs would allow the user to directly authenticate the firmware. If you know the firmware is good, then each layer can validate the next layer up all the way to the OS.

I have yet to hear of a system that allows the user to directly authenticate software/firmware at the hardware level. Is anybody working on research of this nature? Or are there insurmountable problems with this approach?
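
To make the layering concrete, here is a minimal sketch of that chain-of-trust idea in Python. Everything in it is made up for illustration (the stage names, the manifest layout, a user-held root hash); real verified-boot schemes use signatures and anti-rollback counters rather than bare hashes, but the shape is the same: one root value the user can check directly, and each stage vouching for the next before handing off.

  import hashlib

  def digest(blob: bytes) -> str:
      # SHA-256 fingerprint of a firmware/software image.
      return hashlib.sha256(blob).hexdigest()

  # Hypothetical stage images (placeholders, not real firmware).
  images = {
      "bootloader": b"...bootloader image...",
      "kernel":     b"...kernel image...",
      "userland":   b"...userland image...",
  }
  boot_rom_image = b"...boot ROM image..."

  # Each stage carries a manifest naming the next stage and its expected hash,
  # so trust is handed up one layer at a time.
  manifests = {
      "boot_rom":   ("bootloader", digest(images["bootloader"])),
      "bootloader": ("kernel",     digest(images["kernel"])),
      "kernel":     ("userland",   digest(images["userland"])),
  }

  # The single value the user checks directly, out of band (printed, shown on
  # a hardware display, compared against an auditor's published value).
  user_verified_root_hash = digest(boot_rom_image)

  def verified_boot():
      if digest(boot_rom_image) != user_verified_root_hash:
          raise RuntimeError("root of trust does not match what the user verified")
      stage = "boot_rom"
      while stage in manifests:
          next_stage, expected = manifests[stage]
          if digest(images[next_stage]) != expected:
              raise RuntimeError(f"{next_stage}: hash mismatch, refusing to boot")
          print(f"{stage} -> {next_stage}: verified")
          stage = next_stage

  verified_boot()

In this toy the manifests are generated from the same images, so it always passes; tamper with any image and the chain refuses to boot. The hard part is exactly the question above: how does the user verify that one root value in practice?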


Too late to edit my original comment again so I'll post a reply here as a general reply to those who reacted negatively to my decision to phone the police.

While I strongly support free speech and believe security researchers should be given some extra latitude when appropriate, what I saw was not at all appropriate. I saw two well-respected security researchers sitting in a room like Beavis and Butthead, laughing and remotely disabling a vehicle on a multi-lane interstate highway like it was a big joke. The reporter in the Jeep literally says "This is dangerous" and asks urgently for help. This was all filmed and posted to Wired for the world to see, like they are proud of it.

Before working with computers I drove tractor-trailers for a while and was lucky to achieve a million-mile safe driving award. I have a pretty good idea of the dangers here and I know that stretch of road well, I've crossed it many times. I know from experience that a car stopped in the middle of a multi-lane interstate is one of the most dangerous situations you can be in. I've had people hit me who didn't see my huge trailer with flashers on and warning triangles out on a sunny day - it happens quite often. I've seen dozens of people killed in situations exactly like this. You see it coming and a random driver just plows into the stopped vehicle.

I exercised my judgement and decided to phone the local Highway Patrol office. I've read the negative comments and I disagree, I still think it was the correct thing to do. If you are a researcher and you do something this dangerous, and are foolish enough to then post it on a high-traffic site like Wired, I think you forfeit any right to a discreet warning and you deserve to have the police show up demanding answers to some tough questions.


I appreciate your call to the cops and your reasoning. I also have driven a significant number of miles for work and have seen a number of people killed in traffic accidents. This "test" was extremely irresponsible. I know I will be downvoted for saying this, but I think you made the correct decision.


Agreed. I missed the video the first time and didn't quite believe the text describing the shutdown; the video shows the stupidity of doing this at all, let alone releasing a recording of it. I expect the video will come down soon.

Important research, but a very poorly conducted test. Wired's and Chrysler's (was the research funded by Chrysler?) legal teams would not like the contents of this video.

edit: wired's link to video, jump to 2:00: http://dp8hsntg6do36.cloudfront.net/55ad80d461646d4db7000005...


Reporter: "Seriously, this is fucking dangerous. I need to move."

And that was while the security researchers caused the radio to blare so loud that he couldn't hear them on the other end of the phone. The more I see, the more I think they were really negligent in how they planned this out, and I was already firmly in that camp.


So watching the video, I don't see a vehicle stalled on the highway.

What I see is a vehicle slowed considerably, but at least nominally over the legal minimum speed of 40 MPH on highways, and without the driver being able to accelerate on his own. He's travelling in the rightmost lane, explicitly with his hazard lights on. This is not an unusual occurrence on highways. He's then told that to regain control he needs to stop and restart the car, which he does while remaining in motion.

I was surprised, since this is quite different from the way it's being talked about here, as if he was stopped in the middle of the freeway. See GGP comment about "a car stopped in the middle of a multi-lane interstate."

That's not what happened here.


Here's my attempt at a partial transcript starting from shortly after they disable the accelerator:

  Driver: "It says 43 miles an hour, but it's not really that fast."
  [voiceover omitted]
  Driver: "Guys, I'm stuck on the highway."
  Researcher A: "I think he's panicking."
  Researcher A: "He's not going to be able to hear us with that radio.  So loud."
  Driver: "Guys, I need the accelerator to work again."
  Researcher A: "The accelerator..."
  Researcher B: "It won't work!  You're doomed!"
  Driver:  "Seriously [beep] dangerous, I need to move."
  Researcher A: "You gotta turn the car off!"
Many cars can be seen passing them on the left in the video during the test.


Right, but the video never shows the car stalled on the highway. It's moving in every highway shot. It's in the righthand lane, not in the center. The driver is somewhat panicked. We can see how fast he's moving relative to the background.

This discussion has been distorted and sensationalized, and it has not been based on observable recorded facts.


A car stalling does not necessarily mean it is stopped. "Stalled" can mean the vehicle is stationary, or it can mean the engine has quit while the car is still rolling. (Airplanes stall too while obviously still moving, though in their case it's an aerodynamic loss of lift rather than an engine failure.) It's unclear whether the motor here actually stopped, but it's not without precedent to use "stall" to mean no power is available for propulsion.

I don't think this discussion has been distorted. It's based on the information they provided. They put a vehicle on a public highway traveling at the faster end of what's legal in the US, and then removed a large portion of the driver's ability to control it. It's unclear whether this affected the steering or brakes, which in a modern vehicle are both power assisted, generally through the vehicle's vacuum system. The vacuum is provided by the engine, so if the engine was actually off (which is unknown; I think it's more likely they just forced the car into neutral), then they removed a large portion of his ability to control the car.

The bottom line is that they put a driver in a situation not only unsafe for himself (which they could have gotten consent for), but unsafe for the other drivers on the road. They did not have consent from the other people on the road to do this (indeed, it's not possible they could have), and if what they purport to happen in the article and video did happen, then they endangered those people. I've seen accidents from stopped cars being hit by others. If the highway is busy enough, the initial accident isn't even necessarily where the largest damage occurs; it pushes vehicles into even more obstructing positions and causes follow-on accidents.

https://www.google.com/search?q=stalled+car+accident&tbm=isc...
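
To put rough numbers on the vacuum-assist point, here's a back-of-the-envelope sketch. Every figure is a generic assumption (pedal force, pedal ratio, booster size), not a measurement of this Jeep, and it only applies in the case where the engine really was off and the booster's reserve had been exhausted:

  # Illustrative only: how much braking force remains once engine vacuum
  # (and the booster's small reserve) is gone.  All numbers are assumptions.
  pedal_force_n   = 200      # a firm but ordinary pedal push, in newtons
  pedal_ratio     = 4.0      # typical brake-pedal lever ratio (assumed)
  diaphragm_m2    = 0.03     # booster diaphragm area, roughly 8 inches (assumed)
  vacuum_delta_pa = 65_000   # usable pressure differential, ~0.65 bar (assumed)

  unassisted = pedal_force_n * pedal_ratio
  assisted   = unassisted + diaphragm_m2 * vacuum_delta_pa

  print(f"engine running: ~{assisted:.0f} N at the master cylinder")
  print(f"no vacuum:      ~{unassisted:.0f} N at the master cylinder")
  print(f"driver keeps roughly {unassisted / assisted:.0%} of the assisted force")

The pedal still works either way; the point is only that an unassisted driver has to push several times harder for the same braking, which is why the engine actually shutting off would be meaningfully worse than the car simply being dropped into neutral.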


I can agree that the car is not shown at a full stall in the video; however, the driver does report being unable to control the vehicle during the test. I can't agree that the distinction matters to whether this was "[beep] dangerous", as the driver put it, because that much is supported by the driver's own statements as well as the observable facts.


They've risked people's lives to produce realistic-looking footage documenting a life-threatening event.

Without such an event in the footage, car manufacturers can just say "Meh - no big deal" and continue recklessly risking lives by manufacturing unsafe cars with no air gap between the CAN bus and the Internet.

Remember, it's the car manufacturers that are the bad guys here, not the white hats... And just think how hard this decision was. It's a choice between risking lives and having footage that doesn't catch attention and thus allows car manufacturers to continue making unsafe cars with horrible security vulnerabilities. Amazing.


So demo it at a race track. The essential point here is that the uninvolved public were placed at real risk of maiming or death.

Your argument is ludicrous, because you're attempting to cast the actors as either good or bad. IMHO they are guys with a good idea and motivation who did a bad thing.


We are a very visual culture, unfortunately. Unless there's a video of your average Joe driving on a regular highway and a regular car going wild, everyone would just dismiss the problem as limited to "race track" and would not connect the vulnerability to his/her own car.

edit: as per the article "researchers already did test these exploits in controlled environments and presented these tests to auto manufacturers. Said tests were dismissed by said manufacturers.".


>We are a very visual culture, unfortunately. Unless there's a video of your average Joe driving on a regular highway and a regular car going wild, everyone would just dismiss the problem as limited to "race track" and would not connect the vulnerability to his/her own car.

If optics is your justification for this, then perhaps having these two irresponsible researchers arrested would bring even more attention to this.

>edit: as per the article "researchers already did test these exploits in controlled environments and presented these tests to auto manufacturers. Said tests were dismissed by said manufacturers.".

Where do you see that in the article? Only thing I read was manufacturers downplaying a wired-in attack they demoed.


> "researchers arrested would bring even more attention to this."

Yep.

> Where do you see that in the article? Only thing I read was manufacturers downplaying a wired-in attack they demoed.

No "air gap" between "CAN bus and Internet" equals vulnerable.

We know that. Auto manufacturers know that.

Yet they dismiss the possibility of a hack and continue producing unsafe vehicles. And the trend is toward more vulnerabilities.

I was too lazy to search for a direct quote, but here it is now: "Miller and Valasek represent the second act in a good-cop/bad-cop routine. Carmakers who failed to heed polite warnings in 2011 now face the possibility of a public dump of their vehicles’ security flaws."


That is very much NOT a quote from this article; if you are quoting another article by mistake, please link it. This article does not even use the word "presented".

This article mentions that Chrysler is working with them and has developed a patch, indicating that they did not dismiss the previously done tests. So it's basically saying the opposite of what I take your point to be.


You don't get to say that it's fine to put me and my family in danger because hey, in the end it'll make someone somewhere pay attention.


Yeah, you and your family. Well, you are lucky. These researchers and this reporter have already risked their reputations, lives and livelihoods, so you, now, don't have to. And maybe you'll even be able to benefit from all their hard work, because there will be fewer vulnerable cars around. Although you would probably never know that.


No. They absolutely did not have to produce a life-threatening event. They could have done it at 5 MPH and car manufacturers would still take notice, because it would still spread like wildfire on the Internet. What they did was supremely irresponsible and the cops should have been called.


They already did do it at slower speeds in parking lots. Manufacturers didn't care. They probably still won't care, which means that it's a matter of time before someone even less morally-bound decides to wreak havoc on traffic.


> Without such event present in the footage, car manufacturers can just say "Meh - no big deal". And continue recklessly risking lives by manufacturing unsafe cars without air gap between CAN bus and Internet.

Oh really? Can you point to the responsible tests that were done in the past and proved inconsequential, necessitating this reckless alternative? Or are you just inventing the idea that the car manufacturers would ignore this and somehow the story would just go away?


Haha, in fact, from the article, Chrysler already fixed one of the issues.


That's bordering on saying that crash test dummies are useless because they're not realistic enough for car manufacturers to take seriously.


The actions of auto manufacturers - according to the article - in response to prior, more-controlled tests are exactly equivalent to that. The manufacturers basically said "hey, thanks for showing us this crash-test footage that shows our vehicles are literal fucking coffins on wheels; we don't really care", leaving the researchers with no results after taking the more "sane" measures.

Researchers perform controlled experiments. Controlled experiments are ignored. Researchers opt for more damning (though less controlled) experiments to further prove their point, and now they're suddenly the bad guys here.


> Researchers opt for more damning (though less controlled) experiments to further prove their point, and now they're suddenly the bad guys here.

Much of the commentary here focuses on the recklessness of the highway test and doesn't weigh in too heavily on who the bad guys are.

I think people mostly find the idea of remotely exploitable and controllable cars so terrible that there isn't anything to discuss about that aspect of it, it's nearly universally considered unacceptable (hence the epic thread about the side issue).

Maybe try reading the comments without imputing a side that the writer is taking.


What they should have done was involve the police from step #1. If the video had been conducted on a closed section of roadway with ambulances standing by, police escorts, and lots of badges and sirens, it would have been even harder for the automakers to blow off.

It wouldn't have been difficult to do this right. Cops love drama and publicity. It wouldn't have taken much convincing to get them on board, and the video would have gained a lot of credibility.


I agree completely; there were a lot of formalities that were neglected - and had they not been neglected, there would be less backlash against the researchers.

However, this doesn't change the fact that vulnerabilities were demonstrated, nor does it change the implication that auto manufacturers are excessively sluggish about security patches on things that can and do kill people on a regular basis. Even an imperfectly-conducted demonstration like this particular case is preferable to such a demonstration not occurring at all.


"I have to act bad because of the nature of my enemies." <-- says everyone

> And just think how hard was this decision

Given that they did the easy thing, it wasn't very hard at all.


Blocking visibility through the windscreen and then shutting off the transmission of a car that is driving on an interstate overpass in traffic is not white hat by any stretch of the imagination.


Perhaps not, but it's necessary to get the attention of auto makers so that they stop building such trivially-compromisable systems. This was a couple of security researchers on one car for a proof-of-concept; better to demonstrate these flaws early and with a more limited sample than to watch the pileup of epic proportions that would happen should someone even less scrupulous acquire such control over vehicles on the road.

I don't exactly condone the ethics (or lack thereof) of the researchers, either, but if that's the only way to get proper attention (after previous, more polite and reasoned attempts were simply dismissed by manufacturers), then so be it.


Had that Jeep run into you, or had you run into it, as a result of this experiment, you might have found that you have a profoundly different threshold for what is "necessary to get the attention of auto makers".

Just because automakers are seemingly keen on ignoring security vulnerabilities does not justify putting people's lives at risk. And let's face it – a multi-ton vehicle that is not entirely in its driver's control puts lives at risk in just about any situation. The reason you and others argue that the demo's methodology is effective is precisely because of the risks involved, not in spite of them.

It is the responsibility of researchers to demonstrate risks without exercising the extent of those risks. Imagine if virologists regularly demonstrated communicability risk by injecting humans with disease outside of the lab.


> Just because automakers are seemingly keen on ignoring security vulnerabilities does not justify putting people's lives at risk.

So condemn the auto manufacturers for putting hundreds of thousands - if not millions - of lives at risk instead of yammering about a couple of nerds who put at most 2 vehicles in probably-nonfatal danger in a worst-case scenario.


Why can't we condemn both?

And as busy as that highway was in the video, it was far more than just 2 vehicles, especially if one of those vehicles was the 18 wheeler.

At the very least they could have done this on a less busy stretch of highway that had a wide shoulder and with control vehicles in front and behind with paramedics at the ready (just like a movie production that is shooting on public streets). Instead the researchers and the journalist chose to be reckless.


> Why can't we condemn both?

Nobody's saying you can't. I certainly do (I strongly disagree with the researchers' obstruction of communication between themselves and their test subject).

My only point is that there's a massive difference in scale between a couple dented fenders and hundreds of thousands of dead/maimed innocents.


Difference of scale? Ok, I agree with you there, but characterizing the risk as "a couple dented fenders" is intellectually dishonest. A high speed accident on an interstate could easily involve serious, even fatal injuries.


It could in some situations, yes. This was not one of those situations.

We're talking about someone coasting uphill with absolutely no braking whatsoever. There's plenty of reaction time in such situations (as I happen to know firsthand from when my SUV ran out of gas and I had to coast a quarter-mile over a hill to the next offramp, merging from the fast lane to the far right at 70 MPH). Even for semis, the reporter's car wouldn't mean having to slam on the brakes. Not to mention that the uphill helps with stopping.

The story would be different if the researchers slammed the car's brakes. If that were the case, then yes, death would be possible. That wasn't the case.

No intellectual dishonesty here. Just thorough examination of the situation as described by the author of the article.
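
For what it's worth, the "how much reaction time" question mostly comes down to following distance and closing speed. A rough sketch, using the 40 mph figure the driver reports in the video and otherwise purely illustrative assumptions:

  # Time budget for a following driver when the car ahead slows from ~70 to ~40 mph.
  # The 40 mph figure is from the driver's own statement; everything else is assumed.
  MPH_TO_MS = 0.44704

  lead_speed    = 40 * MPH_TO_MS              # the slowed Jeep, still rolling
  follow_speed  = 70 * MPH_TO_MS              # traffic behind, before reacting
  closing_speed = follow_speed - lead_speed   # ~13.4 m/s

  for headway_s in (1.0, 2.0, 3.0):           # following distance, in seconds of headway
      gap_m = follow_speed * headway_s
      time_to_contact = gap_m / closing_speed  # if the follower never slows at all
      print(f"{headway_s:.0f} s headway: ~{gap_m:.0f} m gap, "
            f"~{time_to_contact:.1f} s before contact with zero reaction")

At a two- or three-second headway that's several seconds of margin, which is the point above; at a one-second tailgating headway, or with a driver who isn't looking, it's on the order of two seconds, which is the scenario the other side of this thread is worried about.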


Because of scale. One is very limited in scope, i.e. on one day, in one city, on one road, for a few minutes, one car caused a few other vehicles to make otherwise unnecessary lane changes. Compare that with the vulnerabilities exposed, which affect tens or hundreds of thousands of vehicles in every city, every day, on almost every road, at almost any time.

Agreed, the researchers deserve some criticism, but let's not lose sight of the forest for these two goofball trees.


> it's necessary to get the attention of auto makers

That's mere conjecture. And it's an assertion you could easily test by first doing the remote hack in a controlled environment (e.g. a racetrack) and seeing if automakers respond before trying this on an actual freeway!


If you read the article, you'd know full well that the researchers already did test these exploits in controlled environments and presented these tests to auto manufacturers. Said tests were dismissed by said manufacturers.


I've read the article. Where does it mention controlled environments? The only mention of exploits being dismissed by manufacturers was in regard to a wired exploit, not a remote one.


The paragraphs after the photo of Charlie Miller describe the process of identifying and isolating wireless exploits, including remote-activation of windshield wipers on a vehicle in one of the researchers' driveways. This did admittedly escalate quickly to passive "tagging" of vulnerable vehicles by VIN, but that's a far cry from the experiment in question.

The findings before physical tests (identifying cars with a lack of airgapping or other basic security measures) were also reported to Cadillac (as one example among others); said findings were basically dismissed with a "well we've already released a newer Escalade model with some more security features, so whatever".

This isn't to mention that the wired exploits should've been enough to at least spark some level of concern.


You're reaching.

First, there's no indication in the article that the researchers or Wired presented the remote windshield wiper hack to the car's manufacturer and that they subsequently ignored it.

Second, there is plenty of indication that the exact opposite is true. The remote windshield wiper hack occurred this June, whereas the article states that they've been working with Chrysler on this for nearly nine months and that Chrysler released a patch prior to the publication of this article.

Third, the Cadillac anecdote isn't really relevant here. For starters, it looks like they were contacted by Wired, not the researchers, so it's unclear whether they were contacted before the dangerous freeway demonstration took place. And while the mention of the newer model is a bit odd, the statement also mentions devoting more resources and hiring a new cyber-security officer, making it unfair to characterize it as a "whatever" response.

Sure, it'd be nice if Cadillac was a little more proactive here, but keep in mind that the researchers hacked a Jeep (made by Chrysler), NOT a Cadillac (made by GM). The researchers think the Cadillac is also vulnerable based on its feature set, but absent a specific flaw to patch and given the short amount of time since the initial demonstration (less than two months), it's unclear what GM is supposed to do here.


My point wasn't about Chrysler specifically. My point was about auto manufacturers in general (and I've made this clear from the beginning). By pinning it to Chrysler alone, you're also reaching, I'd reckon.

Also, it's worth noting that the root flaw here - a hole in UConnect - is not limited to Chrysler. The article mentions tracking and surveilling GM vehicles, too (particularly Dodge), which makes sense, seeing as a lot of recent Dodge vehicles have UConnect as well (per http://www.driveuconnect.com/features/uconnect_access/packag...).

> For starters, it looks like they [Cadillac] were contacted by Wired, not the researchers, so it's unclear whether they were contacted before the dangerous freeway demonstration took place.

The article doesn't actually say that. Infiniti was contacted by Wired according to the article, but the initiator of Cadillac's response isn't specified (as far as I can tell).

If they were contacted in the same manner as Infiniti, then it's implied that said contact happened after the wireless hack, since the Infiniti contact involves a notification that the researchers' predictions were "borne out" in at least one of the three of them (in this case, Chrysler).


Did you miss the link in the article to the webpage where you can already download a fix? It's not the manufacturers they were trying to convince.


If you want to get their attention, you demonstrate it on a test track, for a court, as part of a lawsuit against them, for introducing such dangerous features into their vehicles.


I've seen Mr. Miller present at Black Hat and have talked with him about my own vulnerability reports to automotive vendors (I worked with one about two years ago to fix a rather embarrassing remotely exploitable flaw). However, I do not support testing or demonstrating any of these flaws on open public roads. Unforeseen things can and do happen. If the reporter had been rear-ended, this wouldn't have gone well for either researcher, and that's enough right there to justify not doing this on public roads. The term "keep it to the track", which is often applied to the automotive racing scene, is more than applicable here. The general public already tends to have a negative view of security researchers, and performing "research" like this just reinforces that perception.


You did the right thing. This was completely irresponsible. I'm shocked that neither Wired, the author, nor either of the researchers has yet posted a "we screwed up, sorry" statement.

It's a shame because this is an incredible story and the work they did was great, but what a completely reckless stunt they pulled. Totally unnecessary too, the story would have been just as effective if the demo happened on a test track or empty parking lot.


No, he did not do the right thing. Reporting them is completely wrong. When we report the people who protect us... well, this sounds like the plot of a movie. PS: in movies, a lot of people usually suffer before the resolution.


You are missing the point by a mile.

These people did the exact opposite. They put others in potentially mortal danger.

They could have killed someone's daughter, son, mom or dad.

Stop and think about that for 10 minutes before you continue posting with this unreasonable point of view. Would your mom's, dad's or sibling's life be worth this test? Imagine they collided with this car and died. Close your eyes and imagine that for a moment. Imagine receiving that call. Going to the hospital. Seeing them, all torn up and suffering, before they die of their injuries.

And then you find out it was due to two fuckers who thought it'd be funny/interesting/whatever to disable a car remotely.

Imagine that.


To be fair, a good proportion of the blame -- and a very good proportion of my subsequent lawsuit -- would be directed at the car company whose negligent engineering made the wreck possible in the first place.

Although I do agree with you, I modded you down and the GP up in this case because appeals to emotion aren't the answer. Your post is a form of the "If it saves just one child" thought-ending pattern.


Who protects me from the people who think they are doing the right thing by endangering me?


I support your decision. Disabling a vehicle in uncontrolled conditions on a freeway is reckless, plain and simple.


[flagged]


It's not about the fact that the Jeep was hackable; it's about the fact that the demonstration was done on a crowded highway with civilians around.


It's also about the fact that more "reasonable" tests by these researchers were ignored by manufacturers.

Hundreds of thousands of potential deaths at the hands of vulnerable vehicles versus maybe a dented bumper or two. I'll take the latter, please.


You're repeatedly posting on this thread that "the manufacturers" have ignored the researchers' previous tests.

That doesn't seem to be true; Chrysler (the singular manufacturer involved) have, after being alerted to the hack prior to this report, already issued a patch. The researchers are practicing responsible disclosure, have the co-operation of the manufacturer, and are going public with the details at Blackhat next month.

This Wired piece is NOT part of that responsible vulnerability disclosure, it's a teaser to hype up their blackhat talk. It was not necessary to get this piece in Wired to save "hundreds of thousands" of lives. I guess you could argue that the vividness of this imagery will encourage people to follow through on getting their Jeeps patched, so there's that, I suppose.

Since you've repeatedly made this claim, do you have a link to back up the assertion that there are manufacturers who were ignoring this research who will now pay attention because of the crazy stunt Wired pulled?


My remarks were strictly based on the claims of the article. Nothing more, nothing less. The article claims that the researchers performed prior tests, and that said tests were dismissed by auto manufacturers. If we're going to take one component at face value (the idea of the reporter putting others in danger), it would be unfair to not extend the same courtesy to the rest of the article.


"My remarks were strictly based on the claims of the article. Nothing more, nothing less."

Perhaps you should educate yourself before saying stupid, reckless things. The claims of the article are no defense.


Responding to a Hacker News discussion about the article based in information from the article seems quite reasonable. Perhaps it does not deserve phrases like "saying stupid, reckless things". Could you step back for a moment and consider how YOU want others to perceive YOUR postings? I, for one, am a big fan of civil discourse on HN.


Given this is in the context of people loudly condemning a poster here for actually being concerned about other human beings' well-being, I think your claimed concern about "civility" in this one instance is dubious at best.

If someone prefers people making stupid and reckless arguments to other people civilly pointing out that those arguments are stupid and reckless, I'm not concerned about their perception of me.


In that case, the commenter who started this whole discussion of whether or not the researchers' behavior was in the wrong should've also educated him/herself before making phone calls to law enforcement agencies based on the claims of a WiReD article.

Or is basing one's statements on the subject matter alone only valid when you happen to agree with it?


"Before working with computers I drove tractor-trailers for a while and was lucky to achieve a million-mile safe driving award. I have a pretty good idea of the dangers here and I know that stretch of road well, I've crossed it many times."

Game, set, match.


I agree with your move. Not sure who else was supporting you, so I figured I'd offer my support. This was stupid as hell, and I'm sure it was set up/suggested by Wired as a shock film.


I can't say that I completely disagree with you but I do lack your faith in the authorities' ability to respond appropriately. I wouldn't mind if Beavis and Butthead recalibrated their ideas about how to conduct a demonstration. I also hope that they've edited the video for maximum effect, and that the reality was a little less exciting; but the fact is that a stalled automobile is an everyday occurrence that is about as mundane as mundane gets. Minor events such as this cannot be prevented in all cases, and therefore drivers absolutely must watch and be prepared for such an event. It wasn't the safest thing to do, but it isn't outside the normal range of "dangerous" events that one will experience on their commute daily, often more than once daily. IMO it doesn't increase the danger nearly as much as traffic patrol conducting a routine traffic stop on the freeway. If we're prepared to accept traffic patrol on busy freeways, then I don't think it's justified to treat a rare, even if foolish demonstration such as this one as anything more than a nuisance.


There's calling the police, and then there's publishing their phone number in the hopes of directing an angry mob.

Angry mobs are dangerous and volatile and can push prosecutors to overreact. And prosecutors and politicians love to overreact when it comes to hacking.


He published the number of the local law enforcement agency. That is not directing an angry mob, that's helping people voice their opinion to the people responsible for enforcing laws.


Bogging down the local police dispatch isn't a responsible way to voice your opinion.

Imagine: you're a local and you're trying to call the police. But, you can't, because the number is busy. Or you wait forever on hold, because people on the internet are angry about a reckless driving incident that happened weeks ago and that the police already know about.

OP called the police, that's enough. They know about it now. If you want to express your opinion, write the editor of Wired or, if you're really angry, the local district attorney.


I think the level of exposure comes into play here. I'm not sure there's enough people that will make that call from the crowd here to cause an actual problem, but if it's an emergency response number better to not risk it, so I do agree with your reasoning to a point. Perhaps directing people to call a number that is not expected to handle emergency requests would have been more appropriate (and I see OP has amended his comment to remove the number).


> I've just phoned 'Troop C' of the Highway Patrol at their main number, +1-636-300-2800 and they seemed pretty keen to follow up.

The fact you have included their phone number seems to me like you are instigating some sort of lynch mob.

If you actually thought it was an issue you would have privately contacted them without telling the world.

Very childish.

This is a big story that the appropriate authorities will be looking at anyway.


You've seriously seen "dozens" of people killed?


Is it really inconceivable that someone who has literally driven tractor trailers more than a million miles has seen dozens of people killed in accidents?


No, not at all. I'm just a regular middle aged driver with perhaps 300k miles under my belt and I've passed numerous fatal accidents on the roadway.

I just sat back and tried to estimate (since of course, I don't have a CB in my car and can't follow traffic details like a trucker) but I'd guess I've driven past at least 5 fatalities -- and that's just fatalities I've noticed (body on stretcher, read about it on the news later, etc)


Many car drivers forget a basic concept of physics called inertia, which is the reason why heavy vehicles (e.g. trucks) take *far too long* to brake.

Emphasis on "too long", because this is the reason why road signs announce dangerous sections of a road even miles in advance.
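
A rough illustration of that point, with purely generic figures (a typical passenger car versus a fully loaded tractor-trailer at 70 mph; none of these numbers describe any particular vehicle):

  # Why heavy vehicles take so long to brake: the kinetic energy the brakes
  # must shed grows with mass, and a big rig can't decelerate as hard as a car.
  MPH_TO_MS = 0.44704
  G = 9.81

  v = 70 * MPH_TO_MS                    # ~31.3 m/s

  vehicles = {
      #                 mass (kg)   peak braking, as a fraction of g (assumed)
      "passenger car": (1_500,      0.70),
      "loaded semi":   (36_000,     0.40),
  }

  for name, (mass_kg, decel_g) in vehicles.items():
      energy_mj  = 0.5 * mass_kg * v**2 / 1e6   # energy the brakes must absorb
      distance_m = v**2 / (2 * decel_g * G)     # braking distance, no reaction time
      print(f"{name:14s}: ~{energy_mj:4.1f} MJ to dissipate, "
            f"~{distance_m:3.0f} m to stop from 70 mph")

The energy the brakes have to shed scales directly with mass, and a loaded rig can't decelerate nearly as hard as a car, so its stopping distance is far longer even before reaction time is added.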


Whenever I'm hauling >10K lbs with my pickup truck, I get most nervous going downhill through mountains. The only things between me and hitting something are a thin set of brake pads on all four disc brakes and the extremely limited compression braking of a ~6L engine.


40,000 people a year in the US. I wouldn't be surprised at all if an experienced truck driver has seen dozens of dead people.


>I exercised my judgement and decided to phone the local Highway Patrol office.

That was rash. You called the police within an hour of reading this article? You didn't think it's possible the writer is embellishing or exaggerating the danger he was in here? As of right now, everything they've done has been done in good faith to try to point out the need for extra security.

Also, if they get arrested, even convicted of a crime, then what? You have two extremely angry researchers who know how to hack your car, and what, you're hoping some jail time might help them see the error of their ways and use more caution in the future? You can't see any potential problems if one of them feels vindictive about being jailed over your phone call when they weren't trying to do anything wrong in the first place?


While you may question the morality of notifying the Highway Patrol, your second paragraph doesn't address that issue at all. In fact, if you think that's how the security researchers think and behave, then I'd argue notifying the Highway Patrol was definitely the right choice.

EDIT: Formatting.


I suggest you view the video[1] that bengali3 linked.

1: http://dp8hsntg6do36.cloudfront.net/55ad80d461646d4db7000005...


The journalist is lying. Look carefully at the tachometer when he says that he can't accelerate; it's running at normal speed.


That doesn't really mean anything. They "cut the transmission", which could mean they forced the vehicle into neutral. The accelerator may be supplying gas to the engine, but if the engine is not supplying power to the wheels, the accelerator will have no bearing on whether you accelerate.

Making factual statements contrary to how a situation was reported by those involved is fraught with pitfalls you can't anticipate, from things you don't know about. Until there's careful investigation, or the people involved recant or make factually impossible statements, you should be careful about making assumptions.

In any case, their reported actions are what people are upset about. If someone makes false statements about illegal activity and it results in the police showing up, they have only themselves to blame.


The part before that showed the tach dropping to zero. The cutaway to the journalist talking about how he didn't have control then shows the tach at normal. It doesn't take a rocket scientist to tell that it's staged.


That's easily explained. If the car is indeed in neutral, taking your foot off the accelerator would result in an idle-speed (say 600-900 RPM) tach reading. Pressing the accelerator would result in a higher reading. This of course has no effect on the power to the wheels, exactly the same as if you are stopped, put your car into neutral and do the same.

I understand it's tempting to see a single bit of evidence and want to use it to invalidate an entire narrative, but is it so hard to accept that while editing could have made a situation less dangerous seem more dangerous, the inverse could be true as well? We really have no authoritative source of what happened other than the story put forth. The story may indeed be fabricated or subject to hyperbole in parts, but unless you have a source beyond what's here, you are not qualified to make that assessment from the little evidence presented.


"It says 43MPH but I'm not going that fast"does make me wonder if the display remains accurate after the engine is cut.


> Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior.

But putting many lives in danger is considered acceptable behavior by security researchers? Does it actually matter that it was security researchers? Do security researchers working on banking software need to steal a million dollars in order to prove that they've found an issue? Would calling the police be acceptable in that situation?

Actual outrage is acceptable when there's actual danger. Calling the cops on a loud neighbor might not be acceptable, but calling the cops on a neighbor firing a gun in the general direction of your house certainly would be.


> But putting many lives in danger is considered acceptable behavior by security researchers?

We don't know, for sure, what happened. There might be some creative license in the journalism. There might be some omission of them talking to authorities (even if someone called the highway patrol and they said "oh, we don't know about this," all it proves is that there's bureaucracy at the highway patrol), etc.

Calling the cops is invoking a powerful and hard-to-control force. Unless you think that the cops and the legal system are capable of talking to people fairly and understanding security research and making sense of tech journalism and reaching a reliably just outcome, there are better ways to exert your outrage.

For instance, the researchers are very well known in the security community. They're not rogues or mobsters or fugitives. If they recklessly endangered lives, we can pressure them to turn themselves in to the legal system.

I happen to have more faith in the ability of the hacker community to fairly figure out what happened than the legal system to fairly figure out what happened. If I shouldn't, then we have a serious problem as a community.


Your post basically boils down to:

1. You don't trust the police/legal system.

2. You trust our own community more.

Not saying I agree or disagree, but there are a LOT of people who would really disagree with this, and a lot more who would think that someone saying "why involve the actual people the public has chosen to deal with this, we can deal with it ourselves" would be very wrong.


It's hard to imagine how anyone familiar with any part of the police/legal system's pattern of clueless, ham-handed, hierarchy-ridden interaction with technology over the last 30 years could find justification to continue extending them trust.


So what are we supposed to do instead? Rely on self-policing by the individuals or the industry? If the police or judicial system is off track, you attempt to correct them, not ignore them and route around them. That may unfortunately end up with injustice for some while the correction is ongoing, but that's the normal state: there's always correction that needs to happen, and there's always injustice; the point is to try to minimize the injustice as much as possible. Ignoring the built-in mechanisms for steering the government in the right direction doesn't yield a better situation in the end.


"So what are we supposed to do instead" and "Do I trust the police right now" are different questions.

I don't like the fact that I trust self-policing by any community (even my own) more than the institution that is supposed to be doing policing. But whether I like it doesn't affect things.

I agree that it is important for us, as a free society, to fix policing. In the meantime, the best way to minimize injustice is not to invoke the police when there is no immediate threat that can't be otherwise solved.


This is exactly why I don't think self-policing is suitable. If someone broke the law and endangered other people, you or I as community members should not get to decide that no police action is warranted when others are the people who were actually endangered. Do the people on the road during this situation not have a voice? We are not incentivized correctly to handle this situation suitably.

The police and judicial system sometimes have conflicting incentives as well, but at least they are aligned more with the public good than ours are. There are laws and they are, for the most part, rewarded for enforcing them.


>Do the people on the road during this situation not have a voice?

Of course they do. They can call the highway patrol or 911 and report a disabled vehicle. Let's imagine how that call would go.

  911: 911, What's your emergency?
  Driver: Hi, there was a vehicle travelling slowly on the freeway with its emergency flashers on, I had to switch lanes.
  Pause...
  Driver: Hello?
  911: Sorry, I was waiting for you to finish. Was there any other information? Did the driver or occupants appear to be in distress?
  Driver: I don't think so, he was alone and appeared to be talking to someone, perhaps on hands-free, or maybe On-Star?
  911: 911 is for emergencies only, in the future please report events of this type to local authorities' non-emergency number accessible via 411. Goodbye.


Exactly. In this situation, the people on the road have less information than we do after the fact. People deserve to know if they were put in danger and why. Since this was done on a public highway, anyone who uses public highways has a right to feel upset about the behavior.

I would like to think that a call to 911 with more information (which of course a fellow driver wouldn't have) would be handled differently:

  911: 911, What's your emergency?
  Driver: Hi, someone on the freeway has purposefully disabled their vehicle in a location without shoulders, and is slowing while driving, impeding traffic.  I'm not sure if the power brakes or steering are functioning, but the driver is definitely not in full control of the vehicle.
  911: We've dispatched an officer to your location.  Has there been an accident yet?  Has the driver recovered control of the vehicle?


With other recent news reports about internal affairs insiders being fired for findings against the police, I believe we must assume that the police are self-policing in their own right.


If an angry bear is terrorizing your campground, then yes, call fish & wildlife so they can shoot it with a tranq dart and haul it off somewhere safe. In the meantime, though, do you go about your life as though nothing is wrong? Hell no, you get the fuck away from the angry bear!

And tossing chocolate bars into your neighbor's campsite in hopes that the angry bear will wreck their stuff is just not cool.


I agree, but I fail to see how that relates to the current context.

Unless the unruly bear is these security researchers, the fleeing campers are other security researchers in the same field, and their fleeing is them correctly assessing that some LEA is going to be taking down any bears nearby that even twitch wrong after this.

Bad actors ruin it for everyone.


It is true that bad actors ruin it for everyone, and that's why it does not make sense to interact with cops if you have any way of avoiding them. You have no way of knowing which ones are the bad actors until it's too late to do anything about it, and you have no recourse once they have decided to mess with you. Furthermore, they have effectively unlimited resources when it comes to making your life difficult.

You seem to think that because the police are theoretically under democratic oversight, one can safely interact with cops as though the nominal rules of engagement will restrict them, but even if - in the long run - it is possible to rein them in, the law enforcement system we actually have right now is unpredictable, unjust, and unsafe.


If you don't want to interact with the police, you shouldn't do illegal things, or present your actions as possibly illegal.

If you distrust the police to the degree that you think even if your actions weren't illegal you will still have negative consequences from interacting with them, definitely don't do the above.

When someone's actions extend to endangering the public to the degree we see here (which I think is obvious once you've watched the video), they are past any good will I would have extended them in not contacting the police for fear of an overreaction. Their clear disregard for public safety is reason enough for me.

Additionally, on the chance that it was entirely intentional and they are counting on the media and possibly even the law enforcement response to help make this an issue, then they definitely don't need our restraint, nor do they want it.


Dude, have you not been paying any attention to the War on Drugs, or the Ferguson thing or really any of the Black Lives Matter stuff? The cops will fuck with you if they want to fuck with you, and they will write up whatever paperwork they need to write up to justify it afterward. The courts will believe their testimony by default. The only way to get around this is to release video afterward showing that the cop lied on the stand, and even then the best you can hope for is an overturned conviction; the officer is extremely unlikely to face any consequences. This is the system we have.

It does not matter that we theoretically have democratic oversight. In practice, what we have is a system where cops can do whatever they think fit and expect to get away with it. They are armed and dangerous; it is not safe to interact with them. It is not a good idea to call them, or to talk with them if someone else calls them, because they - the cops - have a clear disregard for public safety when it is counter to their own interests.


You see, the thing is some of us still believe that the police are staffed by people, not some faceless conglomeration of drones that all follow the same horrible behavior, and that while there are some, probably many, bad police officers and many systemic problems, they still serve a purpose, and that life without any form of law enforcement would be a big step back in many, many ways. The amount the media reports on something often has no bearing on how common it is.

If I was robbed, I would call the police. If I saw someone waving a gun around in public, I would call the police. If I saw someone using a car as a weapon, I would call the police. If I see a situation where people are endangering the public and someone might get hurt, I would call the police. Not doing so when I clearly knew I should would make me feel somewhat responsible for any negative outcomes.

I'm not really interested in continuing a discussion where the other side's position seems to be "the police are racist scumbags and they will ruin your life with the slightest contact, so don't call them on criminals." You might find that characterization unfair, but then again, you're the one over-generalizing using large media events as evidence instead of statistics.

Edit: Removed reference to ad-hominem, which wasn't factually correct.


> you're the one pulling an ad-hominem on the police

While I agree with much of the rest of what you write in that comment, this is not accurate: overgeneralizing a negative stereotype of someone other than the other party in a debate isn't "pulling an ad hominem".


That kind of argument is generally considered to be 'poisoning the well' -

https://en.wikipedia.org/wiki/Poisoning_the_well


You're right, so I'll update it to reflect your wording, which I think is clearer, and actually correct.


Have you seen the damage being caused by "hackers" lately? I don't trust the police, but I don't trust my fellow hackers any more, especially when you consider their clueless, ham-handed interaction with the general public over the last 30 years.


Hackers can steal your money. You may be able to get reimbursed, depending on how they did it. The novel thing about this Jeep story is precisely the fact that hackers are demonstrating an ability to do something worse than stealing money.

The cops can wreck your house, break your stuff, take your money, shoot your dog, deprive you of your liberty, and - if they think they can get away with describing you as a threat - shoot you dead. Even if you spend the time and money it would take to prove in court that all of this activity was illegal and unjustified, most of the time you'll fail, and even if you succeed you'll never get anything back.

I don't trust hackers or cops, but I can sure as hell see which one is the bigger threat.


It's not really novel because hackers are having a physical impact; it's novel because hackers are attacking cars. Even then, it's really only novel because hackers are remotely controlling cars. They've been able to unlock your car and drive away with nothing but their Android phone for years.

Meanwhile hackers can break into the control systems for the power grid and shut down electricity, causing major damage. They can open or close hydroelectric dams, causing flooding and death. They can control hospital systems and kill or injure patients. They can control the airplane you're riding on while sitting in their seat. And all of these have been demoed at security conferences I've been to. I've seen these all in person.

There's a lot of physical harm that hackers can do. Sure, with a cop it's more personal since they're standing there in front of you pulling the trigger and a hacker doesn't even have to see your face.

Love them or hate them, both hackers and cops exist for a reason and are not going anywhere any time soon. One of the reason hackers exist is to point out dangerous security flaws like this. One of the reasons cops exist is because sometimes hackers are just as dangerous as the actions they're trying to draw attention to.


If there's been journalistic licence employed then they should be able to demonstrate that to the authorities since the entire footage was recorded. So there isn't an issue.

It also should be noted (since nobody else has raised this point) that some people within the police force likely read Wired anyway. So complaining about the police involvement for a piece posted to a popular news site is like moaning that your boss reads your Twitter feed.


If only there was some agency or agencies responsible for investigating possible illegal activity, we could have contacted them instead, and then they could investigate the matter with the input of the judicial system, interview the people involved, and come to the appropriate response.


After Alice Goffman, a publication with this kind of journalist involvement has to be seen with a certain suspicion.


Slowing down and eventually driving off on to a grass shoulder wouldn't even crack the bottom 1% of crazy shit I've seen people do on highways, on purpose.

IMO, the danger to the public caused by the researchers' somewhat-controlled exploit is utterly dwarfed by the danger Jeep/uConnect is causing by directly connecting its cars to the internet. If the researchers are successful in getting car manufacturers to remove features like this entirely, the net result will ultimately save lives.


> Slowing down and eventually driving off on to a grass shoulder wouldn't even crack the bottom 1% of crazy shit I've seen people do on highways, on purpose.

The article claims that the transmission was cut on a section of the freeway with no shoulder, so I'm curious how being stuck in the middle of the freeway translates to "slowing down and eventually driving off onto a grass shoulder." (And just because something "isn't the craziest thing I've seen" doesn't mean it isn't dangerous)


>'Slowing down on the freeway'...

With or without shoulder, a stalled automobile is an everyday occurrence that drivers must absolutely watch and be prepared for. It wasn't the safest thing to do, but it isn't outside the normal range of "dangerous" events that one will experience on their commute daily, often more than once daily. IMO it doesn't increase the danger nearly as much as traffic patrol conducting a routine traffic stop on the freeway. If we're prepared to accept traffic patrol on busy freeways, then I don't think it's justified to treat a rare, even if foolish demonstration such as this one as anything more than a nuisance.


Grass shoulder: http://www.wired.com/wp-content/uploads/2015/07/IMG_0724-102...

I agree that it probably presented some level of danger to the public, but I maintain that (i) the added danger was small relative to the normal everyday danger of driving with humans; and (ii) the media exposure they've achieved by doing this on a public road has the potential to pressure Chrysler to remove tens of thousands of hazards (read: vulnerable Jeep Cherokees) from the road, which could ultimately reduce danger and save lives. It's not clear-cut when you're dealing with a vendor who chooses to ignore and/or litigate upon an initial disclosure instead of fixing their product.


I think the point is that you, and the researchers, have no right to make that decision for other people. It's illegal to purposefully stop in the middle of the highway. It's illegal to cause someone else to do so as well.

People who routinely test things that have the capability for real harm learn to take precautions for as many of the things you don't think of as you can. For example, the Mythbusters are routinely testing things with cars and other bits of machinery that can cause physical harm if something goes wrong. They also routinely retire to abandoned air force bases, remote locations, and the salt flats to test things.


Point taken. Maybe I'm just being horribly jaded and/or emotionally invested from too much driving, but after a while it gets hard to discern malice from incompetence, and you start (wisely or unwisely) worrying more about actual outcomes than intent. If I'm hit by a security researcher or a drunk driver or a million miler who had one black swan of a bad day, the result is the same to me: I'm hurt or dead.

But perhaps a little more objectively, there's certainly a moral hazard here; if every security researcher did this on public roads it'd likely be chaos. An appropriate response, IMO, would be for the police to make a phone call and tell them not to do it again.


I agree, I don't necessarily want them to go to jail, but I would be happy if they were suitably scared shitless for a while as the enormity of how badly they fucked up (if the facts are as they seem) hits them. Part of the benefit of the extreme reactions from the people here is that future security researchers working on interactions between software and hardware appliances that may pose a physical threat will have an example of exactly why you should show your proof in a controlled environment.

It's actually not that different than pure software security research. You don't show a POC for your new DNS exploit by doing it against Comcast or AT&T public DNS servers without expecting some blowback. You set up a test environment.


> the danger was small relative to the normal everyday danger of driving with humans

The danger of a car having its transmission crap out on the freeway is non-zero, yes. On the other hand, they explicitly cut a vehicle's transmission on the freeway. The danger for the people in the immediate area of this vehicle was increased because the situation (a cut transmission) went from "maybe it will happen" to "it is definitely happening." At that point, whether or not an accident happened depended entirely on the skill and attention of the drivers around this vehicle, something completely out of the control of the researchers.


That photo is next to the parking lot where he "found an empty lot where [he] could safely continue the experiment" and then "they cut the Jeep’s brakes, leaving [him] frantically pumping the pedal as the 2-ton SUV slid uncontrollably into a ditch"

Not the highway.


FYI, here's a link to a video[1] of the situation (which bengali3 posted up-thread). The grass shoulder was later, cutting the power was actually done on a fairly busy stretch of highway with no shoulder. The reporter's words during that incident are particularly telling.

1: http://dp8hsntg6do36.cloudfront.net/55ad80d461646d4db7000005...


The hackers' behaviour was utterly reckless.

Demoing it on a test track with no other vehicles and a volunteer driver with helmet and roll cage -- that'd be acceptable, maybe, with suitable safeguards.

But doing it on the open highway with unaware third parties driving past, merely telling the test guinea pig "not to lose control" while being blasted with cold air and loud noise, having the controls disabled, and visibility impaired? That's gross recklessness with public safety. (Here's a clue: you could have put me in that car and given me all the warning in the world and I could not guarantee maintaining control or not causing a potentially fatal accident at 70mph under those conditions.)

You shouldn't run experiments on big powerful machines in public places where you can't keep by-standers out. Gross ethical breach. I just hope the journalist is exaggerating or making things up.


To me it seems like it is gross recklessness with public safety from car manufacturers. The car manufacturers are risking lives of all these people by not keeping the air gap between CAN and Internet...


It can be both, security researchers don't get a free pass just because they are exposing a wrong.

Had someone died you might (in countries which have it) get corporate manslaughter on a company that ignored security warnings. You absolutely would on the researchers and the journalist for their reckless disregard for the lives of others.


Yes, researchers don't get a free pass. Nothing is free. They've risked lives and their reputations to save lives. It has happened before in history. And hopefully it will happen again. Sometimes it is worth it.

(*) without risking lives there wouldn't have been a video documenting these life-threatening vulnerabilities in the cars.


Rubbish - there would be a video of them on a test track. Much like all the videos proving quite how fast the cars can go. You don't drive a McLaren F1 at 240mph on the Interstate and post it to Wired expecting to get away with it...


They did not only risk their own lives. They put other people around them at an increased risk when it was unnecessary. That is the argument: that it was not necessary. This same demonstration could have been done on a track or other controlled environment where the public was not in danger.


I don't know where the threshold is, but calling yourself a "security researcher" is not a blank check to do whatever you want.

I think it's 100% OK to test on a private car on a private track.


Or test on an official research highway such as Virginia Tech's Smart Road: https://en.wikipedia.org/wiki/Virginia_Smart_Road


Had my car stall on the highway once. Pretty scary because you lose power-brakes and power-steering as you're trying to pullover.

Was it a hacker? Nope, just a dumb mechanic that got trash deep into the air intake during a routine oil change.

How many (dumb mechanics)*(routine oil changes) are there in this country? Five-Six orders of magnitude more than auto hackers, which is why I don't see any harm in one more (where the driver knew ahead of time it was going to happen).


Cars aren't toys. Just because there are many stalls doesn't mean adding one more becomes acceptable.

Here's the good test: since humans were involved, how did they present this to their ethical review board?

I'm pretty confident the answer would be "what's an ethical review board?".


DARPA don't need no ethics board.


The difference is your 'dumb mechanic' made an unintentional mistake.

These people, on the other hand, knowingly and deliberately disabled a car on a highway. Yes, they had a plan, but they are still running their little experiment on other unwitting drivers on the highway. I don't consider the manner in which they ran their test to be ethical; it should have been performed on a closed track.


I will point out that the car didn't stall. Power brakes and power steering weren't affected. The transmission was forced into neutral, which did affect his ability to accelerate. Still reckless and stupid. And probably worthy of a call to the police -- though debatable. I don't understand why they didn't just test this in the driveway, or on jack stands, or at a track, or on a dyno. Driving the car on a public street was not needed.


Your recklessness is judged by the extra harm risked.

You can't just wave it away because other risks are more dangerous across the entire nation. Under that standard, one little murder is a rounding error compared to the 2.5 million people who die each year.


The fact that the driver knew ahead of time does mitigate the danger but does not excuse the remaining danger. The fact that there are already lots of dangerous stalls seems completely beside the point to me.


Bad calculation. One auto hacker can shut down all vulnerable cars simultaneously.


They did test in controlled environments previously, according to the article. Said tests were ignored by the auto manufacturers.


> Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior. A much less aggressive (and thoughtful) move would be to contact the researchers directly. Wow.

What would contacting the researchers achieve? They arbitrarily did the experiment in a public highway "to make the headline more shocking".

As much as I support any kind of security research, and as much as I support getting the attention of people to raise awareness, contacting the authorities was the correct move; a public highway is not a research lab without the proper permission from authorities.


>Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior.

There is an even bigger problem. These researchers, even if they were negligent, are far more at risk of legal punishment for creating a small risk for the sake of raising safety standards overall than the people who chose to cut security funding and put orders of magnitude more people at risk for the sake of making more money.

Isn't it odd that we have a legal system where the ones attempting to expose and fix the problem for the sake of safety face far greater legal trouble than those who knowingly allowed the problem to occur in the first place in order to increase profits?


I think researchers should have complete 100% legal cover if they test on private vehicles and private roads.

But as someone who says the car manufacturer ought to face legal consequences for failing to fix a remotely exploitable stall-out in a timely manner (even without demonstration of anyone being harmed), I also say that people who fuck with moving cars on the road are a menace as well.


Depends. For example, in some states it's still illegal for an intoxicated person to drive a private vehicle on private roads.

http://www.lawyerinlongbeach.com/Torrance-DUI-Attorney.html


We could easily reverse that reasoning:

Security researchers should only be liable for potential risks if we manage to hold those companies liable for the potential risks they create.

If we fail to do the latter it's quite unfair to let individuals bear the brunt of legal enforcement.


Unfair but also of great benefit to the corporations creating products with security holes. It reminds me of 'identity theft' where the average citizen bears the risk due to the poor verification practices of loaning institutions instead of the institutions who create the risk as a byproduct of profiting off making the loans. If anyone thinks this is accidental, you should contact me as I have a great deal on this new bridge that is about to hit the market.


A small risk? Disabling a car on a busy highway is not a small risk.

What about this "experiment" could not be done in a controlled environment on a track… or a country road… or an empty parking lot?

I suppose we could just have infectious disease researchers set up shop on a street corner by this logic. Whatever! It's just a small risk! They're doing it for the sake of increasing safety standards!


Small compared to the risk of under-funding security research as a cost-cutting measure, knowing that weakened security will allow exploits like these to occur.


Using violent methods (such as intentionally sabotaging a car on a busy freeway with someone in it) to get media attention in order to further a political goal sounds a lot like the definition of terrorism.


Only if your sense of scale has stopped functioning. It is a dangerous journalistic prank that probably does deserve a telling off from traffic cops, to much the same level as someone who is drunk driving. But I think trying to classify it as terrorism is not helpful or particularly sane.


Seeing as how drunk driving kills a very large number of people every year and is now punishable by imprisonment and extremely steep fines, you might be onto something here.


If people were screwing with cars like this as often as drunks were driving, I think you would end up with mortality figures that were at least in the same ballpark.


At what scale would you consider it to be terrorism?


To my mind, it would have to be some form of attack: if untargeted, it would need to hit at least hundreds of cars; if small, it would have to be targeted and strongly political. Dangerous stupidity in a single instance, done for the purposes of having a good press story, doesn't qualify as either causing terror or intending to, notwithstanding the broad legal definition that has been adopted over the past 15 years.


OK, just making sure I follow: They should exploit security holes and put people at risk to ensure that security research is not underfunded, which could lead to someone exploiting security holes, which would put people at risk.


Your argument would make sense if all exploits were equal. Think of it more like infecting people with weakened/dead forms of potentially deadly diseases so they will be better protected against that disease. The weakened form, while it may not be risk free, is not equal to the harm of a full-blown infection.


> Think of it more like infecting people with weakened/dead forms of potentially deadly diseases so they will be better protected against that disease.

If these guys want to be regarded as researchers, they need to act like them and be accountable like them. No ethics committee would ever approve a test like this.


The IRB as it currently stands is too strict with its regulations. Also, why should the researchers be regulated when the ones producing the things that put people into danger in the first place are not regulated (or are regulated by bureaucrats who couldn't tell you the difference between a buffer overflow and a SQL injection)?


> The IRB as it currently stands is too strict with its regulations.

WTF are you talking about? There is no the IRB.


>The IRBs as they currently stand are too strict with their regulations.

Better?


So they should get away with it to ensure that their funding isn't cut? I'm intrigued what you'd consider a "big enough" risk that they should face punitive measures.


It doesn't strike me as so odd, and your framing of the situations doesn't strike me as particularly conducive to honest discussion.

The people who "choose to cut security funding" (I'm assuming you mean congress?) acted within the bounds of the law and within their power as elected officials. They broke no laws and your disagreement with the results does not make them criminals.

I don't know if these researchers broke the law. All that's happening now is they are being investigated. If they did break the law, then it seems to me perfectly logical that they would be in "far greater legal trouble" than someone who didn't.

TL;DR it's not odd that someone who broke the law is in more legal trouble than someone who didn't


"A much less aggressive (and thoughtful) move would be to contact the researchers directly."

Not conducting this demonstration on a public highway would also have been a much less aggressive and thoughtful move, not to mention less dangerous.


Finally some IT guys get how the press works, and now you want to change their story to something like "how I hacked a car in my backyard".


I understand why they did it this way, and I'm glad that this issue getting more publicity. But that does not mean that the means they chose are justified by the end of greater publicity. They simply did not have the right to endanger the other people on that road without their consent.


It's particularly because getting big headlines is so enticing and intoxicating that research labs have review boards.


Test pilots work from very careful plans in order to gradually test the envelope of huge powerful machines filled with explosive fuels.

Their test-engineers don't say "hey, we're going to do some stuff, but try not to kill anyone".

Gross negligence.

Perhaps you are simply unaware of how dangerous the situation was? Several experts (ex-truckers) have described how they have seen people killed in circumstances like this. If you're being intellectually honest, that should inform your responses.


What would be more thoughtful is for the researchers to plan this better. I think calling the police was the right action in this case. Why do researchers get a free pass? If it were some pranksters publishing the exact same thing on YouTube, would you still consider calling the police unacceptable behaviour? No, what I find unacceptable is your hypocrisy in trying to shame the guy who did the right thing.


> Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior. A much less aggressive (and thoughtful) move would be to contact the researchers directly. Wow.

Reminds me of people who will call the police on a loud neighbor instead of just, you know, talking to them first.


This isn't just a loud neighbor. This was a drunk loud neighbor waving a loaded gun around.

The driver was clearly distressed and they were just laughing it up.


No, these are knowledgable security researchers doing serious work who are probably amenable to discussing their research methods with concerned party via email or phone instead of the concerned party immediately phoning the police.


I don't agree. They breached their own covenant, which doesn't afford them the benefit of the doubt you might assign to 'knowledgeable researchers doing serious work': "We won't do anything life threatening." Then they did precisely that: they disabled the transmission on an uphill freeway ramp with no shoulder, with a very large truck bearing down on the Jeep. Trucks cannot stop quickly. Substantial danger for the driver, the trucker, and everyone around them.


Doesn't matter who they are. They have a loaded gun in their hands. Use it somewhere private or not at all. Anything else is unacceptable and extremely dangerous.


Well, let's just have these guys arrested for marauding around the freeways with a "loaded gun" and we'll let the Russian hackers figure out the Cherokee's vulnerabilities.


You honestly don't think these "security researchers" did anything wrong? They could have easily demonstrated the same vulnerability on a private course and made the same point, no need to recklessly endanger everyday people.


You have not met our neighbours! Drug dealers, they run a vehicle repair 'service' from their garden despite local council enforcement notices. They regularly have fights in the street, my wife has been verbally abused and followed on numerous occasions, and there have been two police 'drugs raids' that have resulted in absolutely nothing useful happening. I could go on, but it's not relevant.

This has no bearing on the original issue ('Calling police on security researchers'), but I'm just saying that you can't debate nicely with everyone.


You're certainly right, you can't debate nicely with everyone. Is there evidence here that the researchers in question aren't open to critique or aren't willing to discuss the safety issues involved with their research methods?


> Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior. A much less aggressive (and thoughtful) move would be to contact the researchers directly. Wow.

So, you don't care about the fact that this experiment on public roads could have killed people? Just because it's for security research it's ok to recklessly endanger lives? What Wow's me is your cavalier attitude, I'm glad he informed the police and I hope they face repercussions. What they did needlessly endangered people's lives and public safety to add a sensational bit to a story, I find that way more "aggressive" than informing the proper authorities of those actions.


This is like testing the new trigger safety on a gun by firing into a crowd. It's incredibly negligent and unethical. That section of I-64 is very busy and the police should get involved.


> Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior

> they cut the transmission. [...] Immediately my accelerator stopped working. As I frantically pressed the pedal and watched the RPMs climb, the Jeep lost half its speed

In what world do we not call the police on this kind of behavior?


>unlike the typical TPMs that only allow vendor software to be authenticated, these TPMs would allow the user to directly authenticate the firmware. If you know the firmware is good, then each layer can validate the next layer up all the way to the OS.

Nothing novel there in terms of needing some "new" TPM; OEMs just choose to lock down their boot chain. Most secure-boot implementations are probably minimal, supporting only the use case of secure/trusted boot (device/chip/OEM key) xor untrusted boot (no key).

If both are supported, whatever functionality relies on OEM firmware or the chain of trust would be disabled in an untrusted-boot situation (like fastboot oem unlock on some Android devices).

It may be tricky to enable certain desirable/required features if the user wants to run their own firmware.
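To make the "user authenticates the root, each layer validates the next" idea concrete, here's a minimal Python sketch. Everything in it is made up for illustration (the image names, the digests, the idea of doing this in a script rather than in hardware); the point is only the shape of the check: the user pins the digest of the first stage, and each verified stage carries the expected digest of the stage it hands off to.

  import hashlib
  from typing import List, Optional, Tuple

  def sha256_hex(blob: bytes) -> str:
      return hashlib.sha256(blob).hexdigest()

  def verify_chain(user_pinned_digest: str,
                   stages: List[Tuple[bytes, Optional[str]]]) -> bool:
      """stages: (image_bytes, digest_of_next_stage) in boot order."""
      expected = user_pinned_digest
      for image, next_digest in stages:
          if sha256_hex(image) != expected:
              return False              # stage was modified: refuse to continue booting
          expected = next_digest        # the verified stage vouches for the next one
      return True

  # Toy stand-ins for bootloader -> kernel -> rootfs.
  bootloader, kernel, rootfs = b"bootloader image", b"kernel image", b"rootfs image"

  chain = [
      (bootloader, sha256_hex(kernel)),
      (kernel, sha256_hex(rootfs)),
      (rootfs, None),
  ]

  # The only value the user has to verify by hand is the root digest.
  print(verify_chain(sha256_hex(bootloader), chain))  # True
  print(verify_chain("0" * 64, chain))                # False: wrong root of trust

The appeal of that scheme is that the hardware only needs to expose one user-settable measurement; everything above it can be checked in software.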

>I have yet to hear of a system that allows the user to directly authenticate software/firmware at the hardware level. Is anybody working on research of this nature? Or are there insurmountable problems with this approach?

I think chromebooks/chromeOS folks have been looking at this. Not sure of the current state of things.

p.s. TPMs kind of suck if they are not able to be updated OTA.


The security researches behaved irresponsibly by performing this demo on a public highway, especially one with no shoulder. They could have driven around a private track just as easily. Performing the test on a public road endangers the journalist, other drivers, emergency assistance personnel, etc.


I'm not for agitating the authorities on this one. Sometimes security research like this requires a little performance art to get the message across. Guys like Elon Musk just need a proof of concept to modify their designs; incumbents like Fiat Chrysler need exactly this. Remember Toyota?


> Calling the police on security researchers...I honestly cannot believe this is considered acceptable behavior.

Who cares what their job title is? They deliberately blocked visibility and then cut the transmission of a vehicle being driven on a public road in traffic. That is well into the territory of criminal negligence.


Same here. Posts like these make me take the effort to log in just to downvote. Sad to see such a mindset gain so much traction.


Calling the police was completely inappropriate, but downvoting the comment as a way to signal your disapproval with his action in the real world isn't helpful. The comment itself is well written, on topic, and leading to good discussion.

I agree with what others have already said: Since nobody was actually hurt he should have contacted the researchers to make his point.


Right, yesterday my neighbor shot a gun at me several times but since he missed and no one was hurt, it would have been utterly inappropriate for me to contact the authorities.


Being shot at (intentionally or otherwise) is entirely different than a car in front of you slowing to a stop.


One problem with this reasoning is that the researchers really didn't know what they were doing with 100% certainty. Their code could have accidentally affected the stability control subsystem that most cars have nowadays -- the one that's designed to apply full braking to a single wheel to recover from a skid. In fact, just corrupting the data from the steering-wheel angle sensor could have had that effect (which I personally find rather terrifying in itself.) Good job, guys, now you've caused a 70 MPH rollover in traffic.

The right way to do this would have been for the researchers to call the police up front and arrange a demonstration on a closed road with police escort. That would have lent the video more credibility and shielded the researchers from liability, while addressing any concerns about safety or ethics.


Firing a gun at someone is dangerous. A car slowing down and/or with reduced visibility on a busy highway is dangerous. When either of these situations happens by accident, we understand that there's not a lot to do about them because there was no intentional behavior that needs correction. When they are purposefully done, that's endangering people, and is unacceptable behavior that needs correcting. In this respect, they are no different.


It is a never-ending discussion around here, but my takeaway is that votes express opinion as well as the quality of the posting.

We are not an intellectual debate club here where people take pro and contra sides and points are distributed based on the rigor of the argument; discussions around here are about real-world problems.

And somebody calling the cops on security researchers just because he read an article on the internet is, in my view, highly questionable behavior for somebody familiar with the tech community.


I was thinking about how dangerous it was while I was reading it too, but I came away far less concerned than you I guess. The deceleration on the highway was the most worrisome, but it's not even in the ballpark of common driving hazards like distracted folks on cellphones or flying debris. A crash from such a thing is unlikely and the inconvenience is pretty minimal.

Even you, the busybody who called the cops because you read an article, said "What was the plan if the trucker approaching at 70mph hadn't seen the Jeep stalled early..." which implies that the trucker would have been following too closely or not paying attention (or both).

It's worth pointing out that the driver was aware of the situation and they didn't do anything dramatic like lock the brakes or throw the car in reverse. They chose a gentle deceleration in a stretch of road that had no shoulder to make it feel dangerous, but, on the spectrum of hazards that most drivers face every time they take the car out of the garage, this is pretty tame.

The fact is, had something happened, it wouldn't have been the disabled car that was at fault.

I think the researchers are in the clear, and for you to have read the article and been bothered enough to call the cops (and post the number for, presumably, the convenience of other hyper-sensitive folk who might otherwise just go back to staring at the neighbor kids from their bedroom window with their phones in their hands and 911 on their speed dial) is nuts.


> "What was the plan if the trucker approaching at 70mph hadn't seen the Jeep stalled early..." which implies that the trucker would have been following too closely or not paying attention (or both).

Say there was a person working at a grown-up lab that deals with traffic safety. Like the University of Michigan Transportation Research Institute http://www.umtri.umich.edu/ .

The person wants to know what happens when someone slams on their brakes on a 70mph road. He says "don't worry, if anyone hits me, it will be their fault, because they were following too close."

What do you suppose the ERB says in response?


I thought the article said he gradually decreased speed; slamming the brakes is a completely different story.


They intentionally disabled a car in an area with no shoulder. They intentionally introduced a large hazard on the highway. If something happened, they would have been partly responsible. You do not just stop on the highway just because you feel like it.


They intentionally put people at risk when there were better, legal alternatives. Responsible researchers do not do tests that subject the general public to risk.

I do think that educating them with regards to better choices would be helpful, but they appear to have committed an offense and documented it on camera in the news. I think they were going to end up in trouble one way or another here.


When it comes to ethics and moral responsibility, intent and agency are everything! For ethical purposes, it is similar to injecting a person with a flu virus to test if their acai berry diet has improved their immunity. Yes, they could, even without your intervention, have caught flu and also spread it to others, but as an agent, you have increased the probability of flu occurring and spreading in the community to close to 100% when it could have been very close to zero.

Similarly, the researchers have increased the probability of a crash from the near-zero probabilities that are typical of actuarial tables to close to 100%.


Wow, they were really lucky an accident didn't occur since there was a "close to 100%" chance of an accident occurring!


The researchers created an entirely unnecessary risk on a public road. There's nothing more to it than that.


It doesn't appear like the researchers have access to the car's firmware, so how can they guarantee their code will not have other unpredictable effects? Automotive parts and firmware are put through endless testing before being allowed onto public roads. Why should I have to be at an unnecessarily increased risk of an accident when this could've been done on a track.


They did NOT "slam on their brakes" and it was NOT a large hazard. I understand that, if something had happened, they would have FELT partly responsible. But it's the kind of responsible that people with a lot of bumper stickers experience when someone wrecks because someone was paying too much attention to the stickers and not enough attention to the road.

Yes, they created conditions that might have made it possible for a lousy driver to wreck a car, but, no, they did not do anything inherently dangerous. A driver--ANY driver--is expected to be able to handle gently decelerating cars on the highway. They should also be able to pay attention despite big billboards, confusing traffic signs, and attractive people gallivanting on the sidewalks.

The average traffic jam is much more likely to cause an accident, but it typically doesn't and, when it does, we blame the driver that rear ends someone, not the masses of people who have actually stopped on the highway, often NOT gently.


Wow, I'm shocked at how contentious this comment is. Count me in the "thanks for being a responsible citizen" column.

I feel like there's a lot of cargo cult thinking going on here. The situation is _almost_, but not quite, like a lot of other ones where the security researcher is unreasonably blamed. For example, I could easily see some people being up in arms about announcing this exploit at Black Hat.

But that's not the case here. I have a healthy fear and respect of a ton of metal flying down the road at 70mph. And this stunt, done just to generate headlines, was needlessly reckless. It could have just as easily been demoed in a private lot or something.


> It could have just as easily been demoed in a private lot or something.

It was previously demoed in parking lots and other controlled environments by these researchers, according to the article. Said demonstrations were ignored by the auto manufacturers, with some manufacturers - like Toyota - trying to claim that their systems were still "secure".

The public and the manufacturers need a proper wakeup call. My fear is that even a "reckless" test like this one isn't enough of a wakeup call.


Life is hard. Sometimes people don't pay attention. Pulling irresponsible stunts isn't an appropriate response "to make people pay the proper amount of attention."

If someone had died from this stunt, the total number of deaths from remote hacking of cars would be 1.

NB: I highly favor a bounty system where someone who can demonstrate the ability to take over a car without touching it gets paid lots of money, and if the company fails to fix it they get fined even more money. But "someone else is doing something bad, too" is never a good justification.


> If someone had died from this stunt, the total number of deaths from remote hacking of cars would be 1.

If this stunt had never happened, we'd be in a position where some less-scrupulous actor would demonstrate such exploits on a much bigger scale. I can guarantee you that the total number of deaths from remote hacking of cars would be far greater than 1.

If we're going to play the "OH NO THINK OF THE CHILDREN^H^H^H^H^H^H^H^HHYPOTHETICAL DEATHS" game, then let's put this into some goddamn perspective, eh? 1 v. hundreds of thousands (if not millions) that are currently vulnerable to remote hacking right this very instant.

In all actuality, of course, that "1" death was highly unlikely; at most, we'd probably see a few dented bumpers and a couple grand in car repairs. Maybe somebody with whiplash.


I'd like to see the assessment they provided to their review board comparing the risk of their actions with the risk of not doing their actions.


But you're ignoring the fact that this exploit could have been demonstrated in a safe manner on a racetrack or similar with just as much effectiveness.


It could have been demonstrated, yes. It's the effectiveness that's in question, seeing as similar demonstrations weren't particularly effective.

And yes, they could've easily done this demonstration with better safety constraints (particularly regarding communication between the researchers and the driver; said communication was seriously impaired), but the implication is that the researchers believed a "live" test to be necessary to actually get that attention. The point is less "this is what happens to your car" than "this is the sort of danger your car poses to the general public".

My fear, of course, is that even this won't be effective. Hopefully proper basic security measures (like, say, not connecting the transmission, brakes, and steering to the bloody Internet) will be taken seriously before some multi-fatality catastrophe happens because of such security flaws.


Presumably you are leaning on this paragraph when you say that their earlier attacks were ignored?

When they demonstrated a wired-in attack on those vehicles at the DefCon hacker conference in 2013, though, Toyota, Ford, and others in the automotive industry downplayed the significance of their work, pointing out that the hack had required physical access to the vehicles. Toyota, in particular, argued that its systems were “robust and secure” against wireless attacks. “We didn’t have the impact with the manufacturers that we wanted,” Miller says. To get their attention, they’d need to find a way to hack a vehicle remotely.

But you are apparently ignoring this paragraph, which discusses Chrysler responding to the hack, as I read it, prior to the events in the article:

Second, Miller and Valasek have been sharing their research with Chrysler for nearly nine months, enabling the company to quietly release a patch ahead of the Black Hat conference. On July 16, owners of vehicles with the Uconnect feature were notified of the patch in a post on Chrysler’s website that didn’t offer any details or acknowledge Miller and Valasek’s research. “[Fiat Chrysler Automobiles] has a program in place to continuously test vehicles systems to identify vulnerabilities and develop solutions,” reads a statement a Chrysler spokesperson sent to WIRED. “FCA is committed to providing customers with the latest software updates to secure vehicles against any potential vulnerability.”

The way I put the information in those two paragraphs together, it's the fact that the attack can be done without physical access to the car that got the attention of Chrysler, not the publication of a stunt in some web rag.


Even Chrysler is ignoring the root problem that was demonstrated even with wired access: that should the outermost layer of security be compromised in a modern car, the whole car is likely compromised due to a lack of separation between the car's inner workings and the numerous attack surfaces. That's why the paragraph about Ford and Toyota is very relevant here; once that wireless exploit is found (and believe me, it will be found; this is a question of when, not if), drivers of Toyotas and Fords are hosed. Being anywhere on that list of "hackable" cars [0] should be recognized as a significant problem, but manufacturers are continuing to blow off the core problem and only react to specific breaches.

Basically, folks like Chrysler, Ford, and Toyota (and other mentioned manufacturers, too, like Cadillac) are relying on white hats and grey hats to be the ones finding the zero-day exploits in their wireless systems. And even when those exploits are found, they're being "addressed" with half-assed solutions like requiring an upgrade via USB (never mind that if a remote attacker can hijack the brakes and transmission, of all things, an OTA upgrade should at least be possible).

In other words, I'm not ignoring Chrysler's "response" at all. Rather, I'm noting that their response isn't actually indicative of the attitude shift that's actually necessary to prevent death and maiming of drivers.

[0]: http://www.wired.com/wp-content/uploads/2014/08/Screen-Shot-...


You called the cops on two security researchers and a journalist, because you disagreed with their methods and weren't sure what their plans were and what authorities they'd talked to? (And not just any cops, the cops in St. Louis, for bonus points.)

Are we still on Hacker News, or is the transformation to Enablers of Traditional American Power Structure News complete?


Are you not supposed to report dangerous, and possibly criminal, situations to the authorities? A witness to such events cannot know if it has already been reported or is known about; are they supposed to just go on with their day? Yep, just drive by that car accident with possible injuries without calling it in, because I'm sure someone else has already taken care of it. What's the worst that could happen? The authorities tell you they are already aware of the situation, thank you? Are we now people who should no longer care what's going on around us, or are we careless souls too worried about ourselves to care about our fellow human beings?


Sorry, but the cops lost that trust from me when they started sending SWAT teams and abusing power way too much. Since then, calling the cops has become a last resort for me. I don't trust ANY of them because of the few aholes that are abusing their power, mainly because of their policies of shutting up and protecting each other. Until they fix this, I will not trust ANY cop again.


Until you're a victim of a crime I presume


In my life I've experienced 2 burglaries, 3 vehicle vandalisms, and one time I was seriously assaulted in a beach town by 10 guys with a gun. In every one of those experiences, the police were unprofessional and completely useless.

I will still call the police in the future, but just so they can fill out a police report for insurance claims or potential lawsuits. Other than that, I don't expect the police to do anything unless they were on the scene and saw somebody break the law.

And even then, one of the times when my car was vandalized, the cops were there and they filed a police report and told me to try to figure out who will pay between the two dudes that jumped on my car and if that doesn't work, give the officer a call and he will help me with the next steps. Obviously those guys didn't wanna pay so I tried to get in touch with that cop and he was avoiding my calls. I called over 10 times over the course of a couple weeks and he was never there and never returned my calls.

Another time my neighbor was throwing eggs at my motorcycle for 3 nights in a row and I got him on video, call the cops and they come by about 12 hours later and just laughed at the video, and then all of a sudden got a call to something more important and bounced. It may seem funny, but me having to pay someone $300 to clean egg out of all of the fairings and tubing is not funny. At the very least, do your job and file a fucking police report.


I'm not sure how this would make calgoo change his mind.

* Cops are under no obligation to go into harm's way to protect the public.

* Cops primarily exist to collect evidence for prosecution after the fact.

* Just because criminals and crimes exist doesn't take away from cops' bad behavior.

* If calgoo becomes the victim of a crime, cops are unlikely to be able to make him whole again - for bodily injury, prosecution of the perpetrator can't restore his body or life - for property damage or theft, police usually can't be bothered with the small stuff.


There are problems with power and corruption.... But

Have you seen what happens to communities when police withdraw? They become overrun by gangs and other less accountable organizations.

Even in favelas where cops act paramilitarily, say in Caracas, people still want cops, because no cops is usually worse, unless you get a private version of cops, which is essentially cops by another name.


Not to disagree with you, but to add a data point (or rather an anecdote) - I heard a few people from Moscow claiming that they're more afraid of cops than criminals.

(Of course, even if it's not an exaggeration and even if it reflects the actual probabilities of getting hurt by cops and criminals there, it does not follow that things wouldn't be even worse without a police force.)


I've never had the police even bother to show up in those cases.


I fail to see how that bears any relevance to anything I stated. Not that I necessarily fully disagree with your sentiment, but you're presenting a related yet different topic.


There are plenty of safe ways to accomplish this kind of demonstration. The fact that they chose to do it in a way that endangered the public is in fact criminal.

Being a security researcher or journalist doesn't give you a license to put the public in physical danger.


In addition, being a security researcher doesn't give you a license to put the public in any form of danger! That's why we have private disclosures.


The researchers did disclose it privately. As always, the believable pressure of public disclosure makes the process work.

https://twitter.com/0xcharlie/status/623492229714313216


I'm sure the police are even more ill-equipped to understand the ramifications of this demonstration and will over-react and start jailing anyone with a laptop and suspicious intent.

I can't wait for the pathetic outrage when "racial profiling" now means harassing white kids with laptops that fit the profile of hacker.

This is a matter for a company like Google to take on politically, not some beat cop in St. Louis of all places.


This has nothing to do with the type of test they ran and everything to do with where and how they ran it.


I totally agree that there's a problem here, but I strongly disagree with the method of action.

Why not engage the FBI? This is not an issue specific to St. Louis. Throwing some researchers in jail solves nothing. This is a way bigger deal than some local offense.

You need an agency with the ability to see the bigger picture.


How is deliberately disabling a vehicle on a highway the FBI's jurisdiction?


How is it not given they could have disabled any vehicle with that vulnerability?

This basically suggests thousands of cars could be driven off the road and deliberately crashed right now. I'd say that's a threat that they need to deal with at the national level on an immediate basis.

Would you rather wait for a malicious actor like North Korea to get involved before the FBI makes a move?

What I'm trying to say is I'd rather the FBI gets involved and works with these researchers to develop an expedient fix for this problem than some beat cop in St. Louis to bust them and throw them in jail where they help nobody and the threat remains extremely grave.


As much as it seems over the top, those researchers could have hurt people.

Calling the police will not have them go to jail or have their data deleted. It might (rightfully) get them a fine. It will however ensure that their next experiments are done in a safer, more legal way.

Calling the police isn't all about emergencies. You can call them to talk about issues that worry you, such as this one. They will take care of bringing the issue to the right entities; it's their job.


  > Calling the police will not have them go to jail or have their data deleted. 
Do we read the same Internet news? Having seen the way the law enforcement + prosecution machine works in cases like Aaron Swartz's, I would be surprised if these researchers did not spend time in jail, and didn't at least face charges of some Serious Nature.

If there's a case to be made, the police will build it. If they build it, the DA will prosecute it, and there could be (is going to be?) things like charges under the CFAA, since they could certainly try the perspective that the access needed to be authorized by the auto manufacturer, rather than the owner of the car.

I could totally see how invoking the power of the police on these researchers could, through the kind of progression we've seen many times before, destroy their lives. I really hope that's not the case -- I'd much rather they got some kind of warning like, "Don't you ever do this on a road with other people on it again". Even so, I agree with many others that their actions were pretty reckless, more so than I realized when I first read the article. This is the kind of thing that should have been done on a private test track, and doing it around others was reckless and negligent.


> Having seen the way the law enforcement + prosecution machine works in cases like Aaron Swartz's, I would be surprised if these researchers did not spend time in jail, and didn't at least face charges of some Serious Nature.

Having considered how dangerous their little stunt was, I'd almost expect them to be sentenced to some gaol time. What they did was pretty darn Serious!


Hah.

If prior HN articles are anything to go by, it's a matter of time before SWAT kicks down their doors, beats them up a bit, and maybe even a few officers "fearing for their own lives" (yeah right) take a couple of shots in "self defense" against unarmed nerds.

You're delusional if you trust in a law enforcement agency to take reasoned and measured action in any situation.


> Calling the police will not have them go to jail or have their data deleted. It might (rightfully) get them a fine. It will however ensure that their next experiments are done in a safer, more legal way.

Or not done at all.

I may not fully agree with their methodology but I'm thankful this work is being done by people with good intentions instead of having these issues come to light when people with malicious intent find the vulnerability and kill or maim countless people.


Calling the police is just going to discourage more researchers from even attempting "safe" experiments. Calling the Highway patrol is like trying to open an egg with a sledgehammer.


If I were to do an experiment on a public or private road (or property) that I do not own, the first thing I would do is call up the owner.

You cannot simply test a car like that on the highway. There are private roads, abandoned airports, and big parking lots that are better suited.


HN is chock full of self-righteous hall monitors. They usually don't progress to the point of calling the cops.


Can you expand a little on how calling the police about something is more or less self righteous than doing the (at least somewhat dangerous) experiment on a public highway?


Never talk to the police. At least in places like St-Louis and other cities with historically bad police departments (most of the US).


It's interesting how you view this issue. If the researchers are more self-righteous than the person calling in, it justifies the latter?


No, it just makes directing that line of criticism at the caller pretty inane.


Hall monitors are the best way I could imagine these people being described as. I completely agree.


It's definitely elementary school stuff to snitch and play the informant on your colleagues. The worst thing is that this came from a stuck-up old fart who thinks they can police everything and everyone around them and act more royal than the king himself.

Old habits die hard.


> Because you disagreed with their methods

This isn't really engaging with his action. Specifically, he called the cops because he believed that their methods put people in danger of physical harm. This objection isn't coherent without an argument either that:

1) He was unreasonable in his belief that they'd put people in harm's way.

or

2) It is not appropriate to contact law enforcement as a result of observing one person put another in harm's way.

I'm guessing you're arguing both, correct?

aside: He contacted the state highway patrol, not the local St. Louis police. aside2: Hi Geofft! How are things going?


If he was actually concerned, and not just outraged, he would have called the police and reported Chrysler for endangering thousands of people's lives.


Hi! :)

My argument is roughly that we don't know for sure that people were put in harm's way, and we have good reason, as hackers, not to trust the legal system to reliably figure these sorts of things out (and they generally fail in the direction of being worse for both individual researchers and society of a whole). If we did empirically find the legal system reliable and fair, I'd be more convinced that the threshold for objecting should be "unreasonable".

The action is also over right now, and I see no indication that they intend to do so again. So this is either about punishment-as-retribution, or about dissuading future researchers from doing similar things. I don't think retribution is particularly justifiable, and I think there are better ways to dissuade future researchers, like having a conversation about it without the police involved. So I guess I'm arguing the specific sub-case of 2: that if they don't intend to put someone in harm's way again, law enforcement isn't necessarily right.

(You're right about the highway patrol thing, btw. I think I had it confused in my head with some other recent public/police conflict where the highway patrol was worse than the local police, but it looks like the opposite was true of the Ferguson protests.)


"Hacking" is not what's portrayed in movies.

The researchers could have achieved the exact same results (albeit with fewer clicks) by conducting this experiment in a remote parking lot or a private road. Heck, if the writer had contacted the cops, they could have given him an escort to make sure nothing bad happened.

If you ask me, it is this kind of behavior that makes the work of real researchers harder, as the media is quick to paint all security researchers as clueless nerds who will put people at risk.


> The researchers could have achieved the exact same results (albeit with fewer clicks) by conducting this experiment in a remote parking lot or a private road.

According to the article, the researchers already did as early as 2013. Auto manufacturers ignored the reports while continuing to pretend that their vehicles are secure.


According to the article, that experiment required physical access to the car; it was NOT a remote experiment like this one.


It still demonstrated the same root problem: that the computerized systems on cars today have very little in way of basic safeguards. And there was indeed quite a bit of cracking UConnect and wirelessly spying on Dodges and Chryslers throughout the country before the experiment.

If I were an auto manufacturer, I wouldn't wait until someone finds a wireless exploit (at which point it's too late to do anything about it before people die or are maimed unless I'm lucky enough for the zero-day to be found by a white-hat or grey-hat). I'd see those earlier reports, say "holy shit if we have one wireless bug, the whole car could be pwned", and start working on a better isolation of critical systems from internet-connected systems immediately.


You don't think there's a difference between exploiting physical access, and remote network exploits? Given physical access to a computer, you can break into it almost trivially; but you don't see people sweating about that.


> Given physical access to a computer, you can break into it almost trivially; but you don't see people sweating about that.

Sure you do. This is why large businesses (smart ones, anyway) require employees' smartphones to be locked with a password or PIN. This is why standards like HIPAA require secure data to be encrypted at rest. This is why laptops being stolen from government agencies leads to things like millions of confidential records disclosed (true story).

And you're still missing my point: that the likes of Toyota and Ford are relying on their wireless systems being secure. That's reckless, since now their wireless systems are the single point of security failure. The lack of even basic safeguards, access levels, etc. should a breach occur is the point of this article, more so than the specific UConnect breach. Having only one layer between "secure" and "pwned" is by no measure a good idea.
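To make that concrete, here's a purely illustrative sketch of the kind of whitelist a gateway between the infotainment bus and the drivetrain bus could enforce, so that a compromised head unit still can't issue transmission or brake commands. The CAN IDs are invented, nothing here is Chrysler-specific, and a real gateway would live in firmware rather than Python; this is just the shape of the idea.

  # Illustrative only: whitelist gateway between infotainment and drivetrain buses.
  # All CAN IDs below are made up; real IDs are manufacturer-specific.
  ALLOWED_FROM_INFOTAINMENT = {
      0x3E8,  # hypothetical "request vehicle speed for display"
      0x3E9,  # hypothetical "request fuel level for display"
  }

  def forward_to_drivetrain(arbitration_id: int, data: bytes) -> bool:
      """Return True only if the frame may cross from infotainment to drivetrain."""
      if arbitration_id not in ALLOWED_FROM_INFOTAINMENT:
          return False                  # brake/steering/transmission commands are dropped
      if len(data) > 8:
          return False                  # malformed classic CAN frame
      return True

  # A compromised head unit trying a (hypothetical) transmission command is blocked:
  print(forward_to_drivetrain(0x1F0, b"\x00"))  # False
  print(forward_to_drivetrain(0x3E8, b""))      # True

That's not a substitute for a real air gap, but even a filter like this would mean a wireless compromise of the head unit doesn't automatically become control of the drivetrain.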


You've said that a dozen times in this thread with no direct quotes or proof


The fact of the matter is that this is Startup News not Hacker news. Hardly anybody on this website is a hacker. Most are people that code html and php in their day job and go home and do normal shit. These are people that complain about how the industry "pressures" them into coding in their free time.


look mom I'm a hacker too :^)


I'm glad to see he did it. Not all "hackers" should be irresponsible kids who think of the police as enemies.


> You called the cops on two security researchers and a journalist, because you disagreed with their methods...

Calling it a disagreement over methods glosses over the real issue, which is that it was a dangerous exercise and its perpetrators apparently don't have sufficiently good judgement to be left to their own devices.


This was my first thought. To repeat what others have said: why on earth did they do this on open roads and high speeds? I can only assume it was for additional 'shock impact' of the story.

Reckless in many, many ways, no matter how interesting the story actually is. In fact it's so reckless that it actually devalues the interesting and important core of the story itself.


> To repeat what others have said: why on earth did they do this on open roads and high speeds?

Because - according to the article, at least - they'd already demonstrated similar exploits in more controlled environments, and said demonstrations were handwaved and dismissed by the auto manufacturers.


The risk still doesn't justify the supposed reward. Why not a lower speed in a quiet street if you absolutely feel you have to do this on open roads?


I don't disagree with you; the researchers could have taken better safety measures (most notably, better communication between themselves and the reporter would have eliminated most of the risk by allowing the reporter to cut the experiment early), and had they done so, they would have been more clearly in the right.

However, there's some usefulness to the higher speed, since it indicates that the car can be isolated among highway traffic even at high speed. The researchers were also smart to not slam brakes (which would have turned the minimal danger from unpowered coasting into the maximal danger of sudden stops).


The security industry has unanimously voiced its concern that remotely controllable cars and kill switches are one of the worst ideas possible, and will be exploited and cost human lives. Nothing has yet come of those warnings.

So what should researchers do? Do nothing and keep their hands clean while waiting for the train wreck to happen? Continue in a fruitless effort to warn people in papers only other researchers read, knowing from historical evidence that no change will come of it? Ask the government for permission to do a live test, knowing that it would never be granted? Do a demo test on a demo road, knowing that neither government, industry, nor the public would care?

I don't like this either, but it seems to me as a society we only really give researchers one option and that is to do nothing and wait for the bodies to pile up.


> So what should researchers do? Do nothing

There is a big world between "do nothing" and "put third-parties at risk by stalling a car on a three-lane highway with concrete barriers."

Doing the right thing is often boring and takes lots of work. That's why it's called "doing the right thing" and not "doing the splashy thing" or "doing the easy thing."

They already had the attention of the media. Keep on working with the media to get more and more attention. Is it hard? Then do it some more.


Thank you. This was very irresponsible. Ignore the vigilantes saying this doesn't cause harm. My family had a close friend killed on the freeway when she ran into a stalled tractor trailer and it decapitated her.

We've had 2 MAJOR accidents just recently by my house (I-85 near Atlanta) due to foreign objects and/or stalled vehicles.

Anyone who thinks this isn't unsafe is absolutely delusional. Any unexpected failure is hazardous on the interstate, especially failures of the drivetrain, suspension, steering, or braking system. Beyond that, I hope everyone can agree that spraying the windshield with washer fluid and thereby completely obscuring the driver's vision while he was traveling at 70mph is absolutely a hazard (what if the car in front had stopped for some reason?).


I agree: it was very poor judgment to demonstrate this on a public highway.

But when I read that you actually called the authorities and encouraged others to do the same by posting the number, a certain somewhat Tao-istic scene in The Big Lebowski [1] came to mind.

1. https://www.youtube.com/watch?v=uQl5aYhkF3E


Performing this test on an open highway is incredibly irresponsible behavior on all parties. This is why they have test tracks (or even sandlots). Cutting the transmission to a vehicle going 70 miles an hour on an open highway is reckless endangerment -- even if the person behind the wheel knows it is going to happen.


You're not gonna make the news unless the media can spin up a headline that scares people

People won't pay attention until they're scared

People won't demand action if they're not paying attention

Nothing will happen if people don't demand action.

If nothing happens, the status quo (vulnerable systems) will remain. Until some bad actor (I'm sure several nation states would love that capability) gets into OnStar and turns every connected vehicle (every GM made in the last 8 years or so) into a brick at an inconvenient time (rush hour on a Monday).

>I've just phoned 'Troop C' of the Highway Patrol at their main number, +1-636-300-2800 and they seemed pretty keen to follow up. The fact that the vehicle was disabled where there was no shoulder, was impeding traffic, and the demo not cleared with them in advance has them concerned. I'm all for testing exploits and security research, but this isn't the right way to do it. And to film it and post it to a high traffic site is nuts.

I'm not sure if you're actually this dense or just trolling. What good can involving the police, after the fact, in a situation where nobody was harmed do?

To clarify: If a story involving events of questionable legality, no matter how small, were to hit the news, the police would be obligated to investigate on some level. Think about the kind of message that "we saw it on the news but we don't think it's worth investigating" would send. By informing them before it hits the general news, one enables the "swat teams and more" knee-jerk response that the police love (if I had cool toys I'd want to play with them too) but without any media scrutiny. For example, law enforcement was plenty eager to screw the guy that "hacked an airplane" (through similar means, I might add) until the story became more widespread and they had to use their discretion to act in a manner that would not reflect poorly on them.

By alerting the State Police in advance, they're free to mount exactly that kind of response before any media scrutiny exists.

I don't expect this to hit the news. IIRC the University of Michigan (something with an M) was doing similar things at closer range (Bluetooth) on a test track back in '09(?) and nobody cared.

And for all the people saying they were "reckless and dangerous, etc, etc," sure, yeah, to a small extent. If they wanted to be reckless they'd have made the car go instead of stop, swapped left and right on the electronic power steering, disabled the brakes on one side or end of the car, etc, etc.


>I'm not sure if you're actually this dense or just trolling. What good can involving the police, after the fact, in a situation where nobody was harmed do?

I don't know; maybe if they get in trouble, the next researcher who wants to do a test by disabling a car doing 70mph on a public road will alert a few people first and make sure that it would be impossible for someone innocent to die during their testing.

I was with your comment until you called the GP dense or a troll. Because to follow your logic, to get action, they should've just actually killed a random person. Then you'd be right, we would get some changes, pretty quick.

Who do you think should be the random person to get killed for change?


> Who do you think should be the random person to get killed for change?

If we decide now, then it wouldn't be a random person, now would it? :)


I agree they might not have made the news if they had done this in a safe manner.

However, the goal of people researching security shouldn't be to make news. And these people, while admittedly working with Chrysler to see it fixed, seem to be forgetting that. Especially since they plan to release their code, despite the fact that Chrysler has to get people to manually update their cars.

"The two researchers say that even if their code makes it easier for malicious hackers to attack unpatched Jeeps, the release is nonetheless warranted because it allows their work to be proven through peer review."

As someone who works in peer-reviewed industries, I find their justification for releasing their code weak, and they are clearly prioritizing attention over security at this point.


I saw a presentation at a departmental colloquium 3 years ago which demonstrated similar capabilities. The point is, car companies are not responding well to this threat even though it is well known to them. In such situations it is in the public's best interest that information about the vulnerabilities be widely disseminated in order to keep the general public safe. Those with the know-how can already exploit these flaws and likely have been for years. The car companies need to act to secure their customers' systems.


> In such situations it is in the public's best interest that information about the vulnerabilities be widely disseminated

This assumes many facts not in evidence.

It may, in fact, be the best thing. But security people, as a rule, are strongly biased to love things that increase the social standing of security researchers, and chaos does that.

There are other ways of pressuring the car companies. I'd like to see companies failing to fix disclosed security holes in safety critical applications in a certain period of time face monetary damages, even without need to show harm was caused.

But lobbying is boring and getting on the top of HN is fun.


>> The point is, car companies are not responding well to this threat even though it is well known to them.

I think the problem is related to core competencies (sorry to throw in the MBA speak).

The old-school car companies are good at making cars, and not secure computer systems.

You can likely say the same about the skill sets of the decision-makers running these companies. Many of them just can't wrap their head around security implications, because they don't fully understand them.


Car companies, possibly more than anyone else in the world, are the home to people who understand how mechanical failure affects lives.

The car companies' failure to patch defects ought to have them facing severe fines. In fact, I would support a bounty system of millions of dollars for researchers who can demonstrate 1) finding a flaw, 2) telling the company, and 3) the company not fixing it in X months. All of this financed by fines on the car companies.

The above facts don't mean that what these guys did was okay.


>> Car companies, possibly more than anyone else in the world, are the home to people who understand how mechanical failure affects lives.

You're completely right, but the key phrase in your sentence is "mechanical failure".

I've worked on analytics projects in the automotive industry for analyzing defects before they get into the "campaign" (aka recall) stage. They are incredibly good at that type of analysis. Most mechanical parts "make sense", since they're designed for only a few functions.

An Internet connected computer and software, on the other hand, doesn't always make sense to auto execs because they are significantly more complex.

As it relates to the article, I wouldn't be surprised if the car's computer system was perceived more as just a part having a particular set of features by Chrysler's top executives than as a computer system requiring the same types of security controls as, say, an ATM would.


They could still have made the news if they had taken a few extra precautions to reduce the risk of an accident.

However, they do need to make the news. Making the news makes it easier and more likely that politicians will prioritize spending the political capital to work on solving this over the lobbyists from the automotive industry.

If Chrysler and other car manufacturers were taking this sufficiently seriously, the release might not be necessary. They gave Chrysler plenty of warning; Chrysler could have issued a recall (and still can); the consequences are on Chrysler, not on the security researchers.


You do realize a recall doesn't make all the cars come back on their own to get fixed, right? Hell, many consumers don't even realize there was a recall until their product fails for the reason it was recalled.

Chrysler seems to be taking this seriously enough that releasing the code will do more harm than good. Could they take it more seriously? Well, everything can always be taken more seriously, and someone will always claim it should. So I will say that's a matter of opinion.

EDIT: If their plan to 'release their code' is nothing more than a bluff to raise awareness I would consider that a much more appropriate course of action.


>> I agree they may not make news if they did this in a safe manner.

Maybe, maybe not. All they need to get eyeballs is a linkbaity FUD headline with a few extra scary sentences thrown in.

It's not as if the TV news doesn't already do this with their teasers for "Is eating too much XYZ going to kill you? Find out after the commercial break" only for you to find out that the story about XYZ is overblown and poorly vetted.


They didn't have to do this on a public highway. A large parking lot would have been sufficient. Why should other motorists be subject to harm because a couple of hackers and a reporter want to make a story? They could've easily gone on one of the 24hr news channels to scare the masses.


Or heck, even a quiet public road. There are tons of interstate roads that have single-digit numbers of cars an hour. Just drive out into the desert and play around.

PS - Plus due to the flatness of these roads, with large de-facto shoulders you can pull off onto, they're much safer even ignoring traffic levels.


Reckless endangerment/criminal negligence is a crime even if nobody is harmed.

The hackers may have crossed the line if they disabled the engine on a narrow stretch of a busy highway. It should be investigated.


Yes, yes. We wouldn't be having this big thread about the safety of the experiment, and consideration for other motorists, if they hadn't done it this way.

If they had done this in a parking lot at 25 MPH with a couple cops present, the way Mythbusters does things, they would have ONLY had a story about hacking a Jeep to shut it down. And if they played their cards right, they might even be able to start some LEO contracts for car-disabling equipment.


> I'm not sure if you're actually this dense or just trolling. What good can involving the police, after the fact, in a situation where nobody was harmed do?

People who do one reckless thing such as this demo are likely to do others. Calling the police about this incident means that they'll have a record of the people doing this, and if it becomes a pattern, handle it considerably harsher than an isolated incident.


So what you're saying is this was small potatoes and they should have caused a pile-up.


Absolutely the right thing to do. I don't care how technically gifted these people are. They are morons who deserve whatever legal consequences this might bring on.

This isn't about security researchers. There's a HUGE GAP between security research and setting up a situation that could kill someone's daughter, son, mom or dad. That's incredibly stupid at the least and criminal at worst.

There are levels of this in tech all over. I don't know if it is about social isolation or something else. It ranges from the kinds of privacy decisions made by people coding social networks to the totalitarian and inhumane approach seen in dealing with various large web players. It's almost like you are dealing with a non-human race (the Borg?) that is almost completely devoid of human feeling, emotion, consideration, respect, and a sense of community, and incapable of making decisions that are humane rather than cold and mechanistic.

The other one is morons flying multicopters above people, neighborhoods and around firefighting aircraft. How does a human being go there mentally? I don't know.

I applaud your actions.

What's worse is that it is likely this was not the first time they did this.


You better have the Highway Patrol investigate every single person who doesn't maintain their car properly and takes it on the highway because they're causing far more risk than this demo came close to creating, IMHO.

Was it a stunt? Yes. Was it life threatening? Hardly. The real risk is the early 90s Civic with a torn up clutch and bald tires swerving between lanes.


> Was it life threatening? Hardly.

Uhh, what? It seems you cannot go a week without reading about a pile-up on a freeway. Just last week a big rig lost a wheel, which rolled into the oncoming lane, and drivers swerving and braking to avoid it actually caused a pile-up. Stopping even on the shoulder of a freeway is considered "risky" by most police officers, and many (like triple digits) have been killed while stopped on the shoulder due to vehicles drifting, failing to pay attention, or otherwise being distracted.

I cannot remotely begin to fathom how anyone can think a car dropping to 0-10 MPH on a freeway ISN'T dangerous. And it is absolutely life threatening. If a car behind didn't notice the change in speed and panicked, hitting either you or the concrete barrier(s), that could very easily cost them their life. Or leave them with life-long disabilities. Bigger things like trucks and those "road-trains" are even bigger liabilities.

Honestly I'll defend security research strongly in almost all contexts, but when you put people's actual lives in danger you clearly cross a line. There's no shades of gray there, endangering people's lives and health to effectively show off is absolutely immoral and should be illegal (and likely is).

Saying "well nobody got hurt" completely misses the point. It is the intent that is wrong, not the result. The result could have multiplied the wrongness of the intent and resulting in tens of years of jail time, but luckily for them their only "crime" this time was the intent of their dangerous actions.

And let's be frank here: luck is the only reason nobody got hurt, and the only reason these two won't be in jail for many years.


The driver was aware of their activities, so he is probably the only one with any legal culpability.

Impeding traffic is a misdemeanor in Missouri, probably rates a maximum 1 year jail sentence (note 6: http://www.nhtsa.gov/people/injury/enforce/stspdlaw/mospeed.... )


According to the article, they didn't actually tell him ahead of time what they were going to do to the car.


Accomplice liability would make the researcher exactly as guilty as the driver.


Poorly maintained vehicles that break down while driving surprise the driver. This happens daily on public roads. Should we fine them for failing to maintain their vehicle to your standards?

There are autonomous vehicles being tested on our roads with a failure mode of "coast to a stop". They may not even have a human inside to react to things around them. Do the operators deserve to be jailed?

People modify their cars with various after-market upgrades and take them onto the highway. If the car fails, do they deserve to be imprisoned?

What a slippery slope!

Driving is a risk. The most deadly risk you will take each day. Drive defensively, don't be a statistic.


> Should we fine them for failing to maintain their vehicle to your standards?

You might want to review existing laws. See, e.g.:

Georgia: http://law.justia.com/codes/georgia/2010/title-40/chapter-8/...

"O.C.G.A. 40-8-7 (2010) 40-8-7. Driving unsafe or improperly equipped vehicle; punishment for violations of chapter generally; vehicle inspection by law enforcement officer without warrant"

Ohio: http://codes.ohio.gov/orc/4513.02

"(A) No person shall drive or move, or cause or knowingly permit to be driven or moved, on any highway any vehicle or combination of vehicles which is in such unsafe condition as to endanger any person."

California: http://www.leginfo.ca.gov/cgi-bin/displaycode?section=veh&gr...

"24002. (a) It is unlawful to operate any vehicle or combination of vehicles which is in an unsafe condition, or which is not safely loaded, and which presents an immediate safety hazard."

This research appears to have happened in Missouri, where it's harder to find the actual laws on the subject. That said, I did find this: https://www.mshp.dps.missouri.gov/MSHPWeb/PatrolDivisions/MV... which tends to imply that there are laws to this effect that I cannot easily locate via internet searches.


Failure to maintain your vehicle such that it puts other people at risk is against the law.

The people testing self-driving cars had IRBs that go over their test cases. Do these guys even know what IRB stands for?


and it's rarely prosecuted.


ya well all they did was stall the engine, they didn't tell the car to apply the brakes.


With what certainty did they know that was going to happen?


Because they had previously tested it and knew what each function was doing. This wasn't the Hackers movie with them flying around a computer and poking and prodding random things.


How did they know they would only affect the one car they were targeting? What if another similar car also stalled out?


They decelerated a car. The brakes weren't even applied. This happens all the time on highways. It is unfortunate that it happened where there was no shoulder on the road, but if an accident did happen then I'm not so sure the researchers or journalist would be at fault.

Here's a scenario:

Let's say a person is driving a car, when their car engine fails. There's no shoulder for them to drive onto, so they are just slowly decelerating when they are rear-ended by a vehicle behind them. Would you say that the car that had a mechanical failure is at fault, or the person behind them who wasn't paying attention is at fault?


> but if an accident did happen then I'm not so sure the researchers or journalist would be at fault.

So the people that purposely tried to cause the accident wouldn't be at fault for the accident if it occurred..?

I find it highly amusing that in your scenario you're using an unpredictable failure as equivalent to an intentional act.

A better scenario would be:

I open your car bonnet while you go to the bathroom. I half-cut some cables, knowing that they will fail once you knock them a few more times. You come out, get in your car, and drive down the freeway. A few miles later your car stops suddenly in the fast lane, a big rig doing 70 MPH crashes into you while you sit there stopped, and you die. According to you, I am not at all responsible for your death.

Or better yet still:

You just stop on the freeway, for fun or to see what would happen. Someone drives into the back of you at 70 MPH and THEY die. According to you, you aren't at all responsible for that.


There's a huge difference between stopping on the highway and decelerating due to lack of engine power. The driver knew what was happening, turned on his hazard lights, and didn't apply the brakes. Slowing down on the highway, although annoying, shouldn't be an unfamiliar or unsafe scenario (ex: construction, traffic backup, etc.)

This would be a completely different story if the researchers applied full force to the brakes or accelerator since those are unexpected (to other drivers), sudden, and difficult to react to behaviours.


Everything else aside, slowing without good reason is likely to be a traffic infraction (in Missouri, a misdemeanor punishable by 1 year in jail!).

It isn't that convoluted to hold the driver responsible for the vehicle; he knew, prior to driving into an area with a minimum speed, that there was some intent to tamper with it.


They decelerated a car enough for other drivers to honk. It was slowed to a crawl. States have had minimum highway speeds for 50 years for a reason.

What if their proof-of-concept didn't work as predicted and did slam the brakes? This is just a reverse-engineered hack that was unleashed on a highway while the radio was blasting too loud to hear each other on the call.


> but if an accident did happen then I'm not so sure the researchers or journalist would be at fault.

Legally, you are required to do what you can to avoid accidents.

Ethically, it's all kinds of fucked up when you rationalize with "well, if something goes wrong I can blame someone else."


Just because similar events occur under different circumstances doesn't mean this is ok.

This test could have easily been done on a closed test track or heck, even a large parking lot.


> Was it life threatening? Hardly.

You really really don't know this.

If this was done by an actual research lab staffed by adults, it would never have gotten past the ERB.


> Was it life threatening? Hardly.

Danger Checklist:

✔ 70mph

✔ Public highway

✔ Driver not in control


This is also the checklist for most of the traffic that's currently driving on US interstate highways.


There are a lot of bad drivers out there that make a lot of bad decisions, but saying "most" of them are essentially not in control of their vehicles is frankly ridiculous.


It was a joke. Or at least a slight exaggeration.

I have a 26 mile daily drive, all high-traffic interstate freeways, and I can usually count at least two occurrences per day where I have to take evasive action to avoid a collision – people illegally on their (hand-held) phones, people blowing across three lanes at 75mph and not even bothering to check their mirrors, drivers leaning over in to the back seat on the freeway, people who drift into the wrong lane in a concurrent two-lane left turn, people who tailgate leaving mere inches between them and the car in front of them despite you having nowhere to go, people braking as if their car weighed half of what it actually weighs, et cetera.

I'm honestly not sure what the solution is, but it's (i) legitimately terrifying every single day, and (ii) hard to believe that any other kind of transportation modality would accept the kind of outcomes that humans driving on the US interstate highway system produces.


It'll also be the checklist for even more of the traffic that drives on US interstate highways once self-driving cars become all the rage.


A self-driving car is still being driven, by a computer that has control, situational awareness, and the ability to recognize and avoid dangerous situations. This demonstration was specifically about removing those three factors.


So what happens when (not if, but when) said computer encounters a fatal error? What happens when future security researchers like the ones in this article manage to break into said computers and manipulate them?

If we're going to condemn researchers for potential danger, then we might as well extend the same courtesy to car-driving AI and the makers thereof.


I had the same thoughts. Testing the exploits on an open highway, at full speed, strikes me as needlessly reckless.

There is no excuse for this when there are plenty of lower speed locations available. They should have used a large parking lot or similar.


> There is no excuse for this when there are plenty of lower speed locations available. They should have used a large parking lot or similar.

At some point you probably want to test it at highway speeds. I agree that a closed course would be the only responsible option, however.


They are not testing it, they are showing it.

Reckless, yes, and still probably not enough...

I believe people will need to be killed, or have their cars destroyed, before the rest of the population takes enough of a stance against "neglecting" security.


Nowhere near as reckless as missing oncoming traffic by mere feet at speed differentials exceeding 100MPH. Happens billions of times a day without anyone expressing the slightest concern. People are regularly killed and cars destroyed; the rest of the population doesn't care.


You're officially wrong per Federal Certifications (some courtesy Jeep, some the dealer,) so that's the Safe way for researchers to have approached it; mountains and no easement would've brought it down to rules for scratch journalists (please try to recover my GoPro...) State (etc.) laws are 80% hate speech against cyclists. Not even tagged Florida; my car's entertainment system made me climb a tree and launch t-shirts at traffic, blister, etc. If they'd done it as a vetted demo in a lot that could have been a Federal lot...insert stdSecLetter, stdClearance, stdDeclarationOfInterest...meh.


I was wondering why they weren't in constant communication (didn't he say he had to grab his phone and ask them to stop?). Why the f*#@ would you test this at speed?

I agree with you; I hope the author was lying to make his story more interesting (how's that for a bad wish).

I completely agree with you; they seem to have a total disregard for anyone else's safety.


Let's not forget Wired's responsibility for this either. I wonder what editor Scott Dadich and owners Condé Nast have to say. OTOH, we don't know for certain that the tale of what really happened on the public highway didn't grow in the telling.

EDIT: Holy moly https://twitter.com/CondeNast/status/623533074865893376 .


That looks like a staged shot. Highways don't have corners like that.


It's not staged; it's just a parking lot or something. It's mentioned directly in the article, which it seems most people have not even read:

"They demonstrated as much on the same day as my traumatic experience on I-64; After narrowly averting death by semi-trailer, I managed to roll the lame Jeep down an exit ramp, re-engaged the transmission by turning the ignition off and on, and found an empty lot where I could safely continue the experiment.

Miller and Valasek’s full arsenal includes functions that at lower speeds fully kill the engine, abruptly engage the brakes, or disable them altogether. The most disturbing maneuver came when they cut the Jeep’s brakes, leaving me frantically pumping the pedal as the 2-ton SUV slid uncontrollably into a ditch."


It certainly doesn't seem to be from the main transmission-shutdown incident, at least. I'm much less interested in the photo than in the fact that Condé Nast corporate thought it was a good idea to proudly Tweet this article to the world.


Did you watch the video? It's not staged. They killed the brakes (only possible below a certain speed).


While I agree that what they did was dangerous, the fact that they did it that way will draw much more attention to the root cause of this problem: connected cars allow remote control of the car's most basic mechanical features, which they shouldn't. Hopefully, this will result in better safety measures in car systems in the long run.

Edit: They could've done it in a parking lot, and the article would have been put in the "some geeks are doing some geeky stuff" pile and forgotten. 70 MPH on a public highway is like a billboard with ten foot letters saying "PAY ATTENTION" in your face.


Murdering a family of 4 using the hack would garner even more attention. The ends don't justify the means.


Those particular means are unjustified. What actually happened wasn't nearly as extreme as you're indicating, and given the previous behavior of auto manufacturers to security hole demonstrations in their cars, this sort of demonstration was viewed by the researchers as the next logical step.

I don't entirely agree with the methodology, but nobody was hurt, unlike what would likely be the case should even less ethically-grounded "researchers" demonstrate similar capability - probably on a larger and more dangerous scale, mind you.


Nobody was hurt because they rolled the dice and got lucky. There was a non-zero probability of injury or death that was completely unjustified.


It was a gradual slowdown. That "non-zero" has enough zeroes after the decimal point for Japan to send the number to Hawaii and have another go at Pearl Harbor.

Worst-case scenario, somebody might've been rear-ended. Maybe a bit of whiplash. That's not great, either, but seeing as more-controlled tests by these researchers were outright ignored by auto manufacturers, your priorities have to be incredibly out of whack to vilify the researchers over the auto manufacturers - who are willfully endangering hundreds of thousands, if not millions, of Americans every day - in this scenario.


Your estimates for both the "non-zero" probability of injury and the worst-case scenario are very far off from mine and from those of the thread-starter, who appears to have some expertise in traffic considerations, and the dangers of semi trucks in particular. I wonder if your opinions about this would be different if you believed this was as dangerous as many of us believe it was, rather than merely an extremely low-probability danger of a harmless fender-bender.


My estimates come from some personal and professional experience (including being a former employee of a state highway patrol, mostly tasked with - among other things - processing traffic collision reports and dealing with phone calls from those involved; not a fun job, that was). Admittedly, probably not as much as a semi truck driver, but contrary to popular belief I'm not entirely inexperienced here :)

The reporter mentions that this was uphill. Semis generally have a hard time going uphill at an appreciable speed (as I know full well being stuck behind them regularly on the mountain pass highways that connect my town to the rest of the world; lines and lines of trucks at less than 45 MPH with their flashers on); more weight leads to a harder time fighting against gravity. The uphill slope should make it easier for the truck to slow down.

If the reporter had made an abrupt stop (i.e. if the researchers slammed his brakes or something), then yeah, I'd be more concerned. That wasn't the case, though. Rather, it was a gradual deceleration according to the article. Cars can actually coast quite a distance, even uphill, when they start at 70MPH; I know this firsthand from my own SUV running out of gas once on a busy interstate, and on an uphill no less. Even with the uphill, there was enough momentum for me to put on my flashers, merge right from the fast lane, and eventually coast into the next offramp a quarter-mile away. No shoulder, either.

Now, this isn't to say that it couldn't have been safer, nor do I disagree that more safety precautions should've been implemented. For one, the researchers could've - at the very least - told the reporter "hey, if our attack comes at a really bad time and you feel like you're about to die, turn the car off and on again and you'll regain control". However, even with the described scenario as-is, the risk to life is quite slim. We're not talking about a driver slamming his brakes and going from 70 to 0 in seconds; we're talking about the equivalent of an engine stall, and thus a rather gradual slowdown - gradual enough for even semis, let alone smaller vehicles, to react to.

> I wonder if your opinions about this would be different

They probably would, yes. Slightly, though; ultimately, one injurious pileup is a drop in the bucket compared to the hundreds of thousands that might actually be prevented by demonstrating precisely why proper security measures on Internet-connected heavy machinery are worth taking seriously. Not that I think the possibility of the former should be dismissed (indeed, I agree that the researchers could've done things more safely while still getting the attention of auto makers), but said possibility needs to be weighed against the possibility of the latter, with the recognition that any demonstration - ideally a totally safe one, but even one with some degree of risk - is necessary to push auto manufacturers toward taking security seriously.


I feel like we might be in an Internet fight here but the line about Pearl Harbor is the best thing I've read all day, thanks for that.


No problem. :)

And I wouldn't call this a fight. Just an ethical debate. One that'll probably be a bit heated, of course, given the circumstances, but it's one that needs to be had.


They did it on a public highway to sell ads on Wired.


Feynman had a nice story where he figured out a way to crack many of the safes in Los Alamos, then dutifully reported his method to some bigshot general. The general said "hmm interesting, thank you very much", and banned Feynman from entering rooms with safes or something. The safes stayed as unsafe as ever.

You remind me of that general. You should be hanging out on Catch The Hacker News, not Hacker News.


Testing on uninformed humans is unethical.

Wasn't Hacker News just all up in arms about the US military spreading germs to test bioweapons? Isn't this the exact same thing?


There can be times when it is okay to test on uninformed humans.

For example, I have relatives who do fire safety. How people do (or don't!) evacuate from buildings when fire alarms go off is a big area of research.

The ideal way to test this is to set off the fire alarm in a building where people do not know it is happening, along with some smoke and pyrotechnics.

HOWEVER, there are ethical concerns, and a review board would ask questions like:

1. Has anyone else done this study before? If not, why not? How sure are you that no one has done it before?

2. What does the previous research with similar protocols say? What key question are we trying to answer?

3. What is the harm that will be present to people? Are we doing everything we can do to reduce that harm?

4. What more could we do to reduce harm but that might impact the reliability of the research?

5. Quantify how much of a benefit this research would be so we can compare to the risk you are presenting.

6. Demonstrate that you have done all the preliminary work that is necessary to achieve good results, so that we can make sure that the research is used. It would be foolish to put humans at risk and then be unable to use the research because we forgot something we could have taken care of upfront.

These researchers would bomb most of these questions.

The reason for an INDEPENDENT review board is that researchers tend to follow this flow chart:

Have idea. ----> Wait, should I do this? ----> Yes, of course!


There's a pretty significant difference in scale between two nerds putting maybe 2 or 3 vehicles in danger of a dented bumper vs. the world's largest military conducting live-fire bioweapon testing.


Yep, totally agree with you. Good job for phoning the police. People need to know that's not ok!


[deleted]


So either the journalist/researchers did something highly dangerous to the reporter and others on the highway as a stunt, or the journalist has no qualms about making up details for shock value? Sounds ethical.


There was a "documentary" of these guys when they were testing via a hardwire to the car's computers. They were also in the car with the driver as well as in a parking lot and on some non-busy country road.

I wonder if the reporter just added in those details about the highway to make it seem like more of a real threat or if they actually did test on a busy public roadway.

edit: Found the video - https://www.youtube.com/watch?v=oqe6S6m73Zw


The Wired article has a video of the test on the busy public roadway.


Ahh, I skipped right over it to the text. Woopsies. :D


I agree that they definitely took it too far and weren't being very safe, but calling the cops seems to be taking it a little too far, also. What laws were broken?

Edit: Actually I've thought it about it, and they could probably be charged with reckless endangerment.


Now there are two parties who are making poor choices about how to handle things.


I would put this more on the reporter. He knew full well what the plan was, and chose to put himself in the situation voluntarily.

I suspect there's also a bit of embellishment going on.


Well done. I agree... there is no way they should have jeopardized the safety of people who were on the road.

They could have easily demoed it in a million other ways.

Kudos for the hack but shame for the demo.


Oh man, I'd hate to be your neighbor.


Jesus, just google the similar research from 2011, which was done in a "safe environment" without naming manufacturers, etc., and here in 2015, four years later, we see that the manufacturers said "meh, thank you but no, we ain't gonna do shit about it."

How about that - "Fiat Chrysler now says that 10 vehicles from its 2013, 2014 and 2015 model years are vulnerable to hacking, including five 2013-2014 Ram truck models, the 2014 Jeep Cherokee and Grand Cherokee, the 2014 Dodge Durango and 2014 Dodge Viper, and some 2015 Chrysler 200s."

and that - "Miller and his associate, Chris Valasek, director of vehicle security research at the consultancy IOActive, estimates that hundreds of thousands of Fiat Chrysler vehicles on the road today could be vulnerable. That’s unsettling." just read people! 2013 models. And now, this a hole calling police because this guys opened your eyes. Wouldn't it be better if they made "safe" test again, manufacturers ignore it AGAIN & then some sick bastard simply crashed thousand of those?

And yes, the main thing I like is - "Customers can either download and install this particular update themselves or, if preferred, their dealer can complete this one-time update at no cost to customers." DOWNLOAD & INSTALL THEMSELVES? What? But yeah, right, blame the researchers, of course.

"In case any of you think this was cool or even remotely (no pun intended) ethical, I'd like to know if you have a problem with letting these two test this on a loved one's car. How about they remotely poke around your husband or wife's car and explore, as long as they promise not to intentionally trigger anything?"

I would certainly let these guys check my car and my wife's car, just to find out whether they can be hacked; if they can, I'd better get rid of that crap and sue the a-holes who let me drive a car that can be controlled remotely. Because I would rather trust an ex-NSA researcher and the current director of vehicle security research at the consultancy IOActive than accept even a 0.00001% chance that some unknown hacker crew sipping coffee in a Starbucks can end my life.


I would like to give some other perspective. FCA (parent co. of Jeep) has been slow about a number of safety recalls and is under increased scrutiny by NHTSA:

http://www.detroitnews.com/story/business/autos/chrysler/201...

Here is a choice quote about the culture relating to safety at Fiat - Sergio Marchionne is CEO: >> Marchionne said in January that the auto industry may have “overreacted” to some safety issues, especially the massive air bag recalls, which may have been “overkill,” he said. << This is about the Takata recall in the news where the detonators can produce deadly shrapnel.

So yes, the demonstration described in this article was somewhat reckless, but the fact that FCA has not notified owners beyond an online posting about a firmware update (who checks that?), has tacitly condemned the security researchers' decision to publish some details in their communications with Wired, and has all the while stonewalled recalls - for example, for Jeep vehicles that catch fire in rear-end collisions, killing occupants - upsets me much more.

In my opinion, when a company is notified of a safety or security issue, they should do all that can be done as quickly as possible. Here instead FCA has once again done the minimum, plus has the gall to respond in writing, "We appreciate the contributions of cybersecurity advocates to augment the industry’s understanding of potential vulnerabilities. However, we caution advocates that in the pursuit of improved public safety they not, in fact, compromise public safety." I guess it boils down to the fact that I embrace the hacker spirit more than anything else I considered here.

So I would have written to the NHTSA trying to make this yet another recall if I thought it would have done any good, but in that culture of 21%-compliance-is-acceptable, I don't think it would do a lick of good, so I won't bother.

Also, I accidentally clicked on "flag" above when I wanted to click on "parent." I am sorry, that was not my intention, I just wanted to refer back to the Wired article as I was responding, and they are small and right next to each other. Ah, I notice when I refresh there is an unflag option, I have just taken that action, again sorry.


>Also, I accidentally clicked on "flag" above when I wanted to click on "parent." I am sorry, that was not my intention, I just wanted to refer back to the Wired article as I was responding, and they are small and right next to each other.

There should be an "unflag" where "flag" used to be.


Thank you, I noticed when I refreshed and clicked unflag.


Agreed, and now the headline reads more like "Hackers endanger people on public highways" instead of the more interesting (in its consequences) "Jeep cars can be taken over almost completely over an Internet connection while they are running". I'm sure this generates tons of traffic for Wired but this does not bring the necessary focus on the security issue.


Do people need to be reminded of what happens when these kind of issues are not disclosed out-of-the-blue?

Well, here you go (search for 'Volkswagen'): http://attrition.org/errata/legal_threats/


A real-world scenario was used to gain more publicity and to get the attention of mainstream media sources. People are attracted to catchy titles and can relate the incident to themselves because it happened on a highway. Nobody would bat an eye if the test had been done in a parking lot.


> If I ever learned this had been tested on a vehicle I was in, I'd make sure this cost the researchers dearly.

That's not how civil court works, as I'm sure you're referring to filing a suit against them, for... some nebulous thing? You have to prove damages to be awarded anything in a civil court.

Theoretically let's say that they tested some remote tracking on your vehicle without your consent. What then? If you can prove there was some damage to your vehicle, great, you'll be reimbursed for it. Otherwise?

Content aside, the self-satisfaction and smug attitude of this comment is disgusting.


What on earth do you think you are going to accomplish that will have any positive effect whatsoever by calling the cops of all people?


I can't believe that you snitched on these guys and you're proud to share it with us! WoW!

The guy consented to their experiment and he voluntarily engaged with them. It is not like they set him up for this.

Maybe you could argue that they could have jeopardized the lives of people on the highway with their reckless behavior, especially the engine shutdown stunt, and I believe that they didn't exercise wise judgement in doing so, but didn't they instruct the driver to switch the car off and back on to regain control of his vehicle and move ahead?

You also claimed that they're boasting of their act by publishing this video, when it was Wired that produced the whole report and experiment, not them. The reporter himself, the subject of this experiment, didn't file any report with the authorities, so you come along and act more royal than the king!

What a mess!

What was that snitching for? This is completely uncalled for.

This is a knee-jerk reaction from you and a testament to your true character.

You should be ashamed of yourself for snitching on your colleagues like this, and your phony outrage at this act is not fooling anyone.

Grow up you are not in elementary school anymore!


..."Posted from my iPhone in rush hour traffic!"


Wow. You need to cry a little harder.


Sure it was not a smart move but you're really overreacting.


It's not that your points are irrelevant, it's that they needlessly draw attention from the issue at hand: that the car manufacturers are being criminally negligent.


There is more than one party here displaying extreme negligence.


As opposed to the alternative of someone leveraging this specifically with the intent to do real harm? They could have gone about this in a safer manner, but the person you should be upset with is the manufacturer, not the person pointing out a large, gaping problem.


This is the kind of completely uninteresting side discussion that people on internet forums love to get into. "Somebody did something dangerous on a road" is not a relevant topic on HN.


This is behavior that makes adults think that people calling themselves "security researchers" are bonkers and need to be legislatively controlled.

Mature research labs have review boards to govern "researchers" who want to just see what happens when LSD is put in the water supply.


Blame the messenger, eh? How about calling the cops on the company that actually put that crap tech in your car instead? To the OP: You don't have to scare me twice. I'm sporting pre-9/11 PC hardware and now I'll be driving a pre-9/11 car. If you tell me that someone can remotely control my underwear then maybe I'll have to draw the line there, because I'm not buying pre-9/11 undies, dammit... not yet anyway.


> I've just phoned 'Troop C' of the Highway Patrol at their main number, +1-636-300-2800 and they seemed pretty keen to follow up. The fact that the vehicle was disabled where there was no shoulder, was impeding traffic, and the demo not cleared with them in advance has them concerned. I'm all for testing exploits and security research, but this isn't the right way to do it. And to film it and post it to a high traffic site is nuts.

WTF is wrong with you?


Can you clarify exactly where the line is between what you've done here today, and an outright swatting?

I'm not trying to equate the two, but it would seem they both exist somewhere on the same continuum of personal information and police involvement.

How much do you think your decision to make this call was influenced by your perception of what law enforcement does in Europe vs. what law enforcement does here in the United States?


Outright swatting: "911, my name is <researcher name>, I live at <researcher home address>, and I'm currently holding my girlfriend hostage with a shotgun and plan on killing us both in 30 minutes."

What tombrossman did: "911, I saw a video of researchers shutting down a car on a freeway in the middle of the day with little concern to public safety, can you guys investigate and make sure that nobody's life was in danger for this experiment?"

I don't think there's even a tangential comparison between the two.


Are the police not capable of reading Wired on their own? The article is quite public. I can't imagine this could be published without the relevant law enforcement agencies becoming aware of it on their own.

In general, I would feel uncomfortable alerting a local law enforcement agency in _another country_ about something I saw on the internet, both because the premise is silly, and because not all cops are loyal public servants dedicated to protecting people. Many just like the power trip they get from having a badge and a gun to wave at us plebs.



