While planes have become so, so, so much safer because of all this automation, pilots' uncertainty about what the autopilot is doing is a major concern nowadays, and has been a factor in several accidents.
There are very interesting HCI challenges around properly communicating to the pilot/driver "what the heck it is doing" and clearly communicating just how much control the human has or doesn't have at any given point.
This "announcement" certainly doesn't inspire any confidence that they have really thought this through deeply enough (I think they probably have, but it should be communicated that way). As a huge Tesla fan, I can't help feeling like I need to hold my breath now and hope nothing terrible happens because of this, leading to even more regs that set us back on the path to full car automation.
It sounds like it's time to introduce the dog?
Example: Air France 447, where (presumably) the airspeed sensors were blocked by ice, which led the autopilot to disengage. There is also the theory that the pilots then made some of their mistakes based on the belief that the avionics would stop them from bringing the plane into an unsafe state, and didn't realize that the system wasn't able to do so with missing information. (I hope I'm presenting this correctly, but that's what I remember reading afterwards.)
There is also the training aspect: if a system takes care of something 99.9% of the time, the pilot is less experienced in the 0.1% of cases where it doesn't. There is a reason the safest airline pilots often fly much smaller aircraft as a hobby and build some instinct for manual flying there.
I'm not an airliner pilot, but I have the impression almost everyone is grossly underestimating how often a human pilot is still required to safely fly a commercial airliner. Yes, the autopilot may be able to handle 99% of the flights without intervention, or even 99.9% of the flights, but simple math shows that this still means hundreds if not thousands of flights every day where the pilot needs to take action, and that any individual pilot will probably get into a situation the autopilot can't handle once or twice a year. Likely a lot more often.
Large parts of the world have weather conditions, diversions, or airports with spotty or unreliable ground systems; in those areas the need for a human pilot will be several times higher than once every hundred flights. There are ample examples of plane crashes caused by automated systems rather than pilot error (I don't remember the exact location and date, but e.g. not very long ago a plane crashed because the ILS beacon at the airport was malfunctioning). In other cases the level of automation enabled lousy pilots with bad training to fly the airplane (case in point: the Asiana crash at SFO), which IMO is not 'pilot error' because these people should not have been flying the plane in the first place.
All this makes me think the solution to improve airliner safety even further is not more automation, but better pilot training.
As for autonomous driving, besides AI assisted highway cruising, I don't believe in the concept at all, and my bet is that in 25 years we've realized that we've been wasting most of the time trying to build fully autonomous vehicles. Limiting the possibilities of driver error seems like a much better investment (e.g. automatic cruise control, automatic emergency braking, etc).
There's a lot to be said for that. The pilots didn't know their altitude and airspeed either. They thought they were overspeeding when they were stalling.
Some military aircraft, mostly the stealth ones with terrible aerodynamic stability, need that sensor data to stay in the air at all. If they lose the sensor data or the stability control system, the only option is to eject. So they tend to go in for more sensor redundancy. There's certainly no reason that large transport aircraft can't have more sources of basic attitude/altitude/airspeed info.
They needed to descend to gain airspeed and lift. But they were pulling up and literally falling out of the sky without realizing it.
That's what you're doing here. Saying "the copilots crashed it" teaches us nothing; we need to know why they crashed it, what cues they misunderstood and what skills they lacked, so we can keep it from happening again.
I think the design language will evolve slowly with the users at a speed roughly related to the adoption curve.
It may tell users everything that the car is deciding to do now, and as confidence in the system increases with adoption, it will do less and less.
That is why more advanced self-driving car researchers are working on the harder problems of getting abstract, intelligent interaction with a car working. The real market problem is telling your car "take me to the store", not just getting it to drive down a straight lane.
If the bad guys had only hacked Bond in this scene, it could have been much different!
If the percentage of accidents caused by pilots interfering with a working system is greater than the percentage caused by a malfunctioning system the pilot ignored, people will still be against taking away the pilots' ability to interfere. Even when it causes more accidents...
There's something about humans trusting humans more than machines that I don't fully understand. Systems can make mistakes, but the number of human mistakes is often so much greater that it's absurd humans are even trusted at all, and yet people will side with the human over the machine.
Humans will always want human oversight - even when that oversight does more harm than good once automation reaches a certain threshold...
Special note: I'm not aware of avionics and the data on pilot interference w/ the system vs failure of pilot to intervene. So maybe this example doesn't hold very well for avionics...
Or when the road is slick and you are doing 5 under, and you see someone pass you doing 5 over with a car following it too closely... you have a better understanding that there's likely to be an accident ahead (I've seen this happen several times). A computer most likely wouldn't predict that.
Unless ALL cars are automated, it is a must that a driver be able to take over quickly.
> A human can make judgement calls in unexpected situations.
A properly programmed machine can behave smarter and faster, and it also knows its limitations, so it can account for them. "Judgement calls" aren't needed because the machine always keeps good judgement.
> Objectively, a human is more likely to see falling debris and react to it than a self-driving car (of course depending on the programming and sensors).
Very much depending on programming and sensors, but from some point on, I'll always be betting on programming and sensors.
> Or when the road is slick and you are doing 5 under, and you see someone pass you doing 5 over with a car following it too closely... you have a better understanding that there's likely to be an accident ahead (I've seen this happen several times). A computer most likely wouldn't predict that.
The computer will track all vehicles, their velocities, and their past observed movements, perceive road conditions that you can't due to the limitations of Mark I Eyeballs, and push all that data through an inference system capable of staying rational all the time, without being affected by "stress" or "surprise". Solutions will be computed literally faster than the blink of an eye. The machine will see the phase space of the road-cars system and be able to navigate it to safety.
> Unless ALL cars are automated, it is a must that a driver be able to take over quickly.
The driver needs to never take over. A dog should be placed, trained to bite the driver if he tries to touch the controls. Better yet, remove the controls.
The machine knows its limitations. The right way to avoid an accident is not to perform stunts based on intuition, it's to keep the entire system (of cars and the road) in a stable and known state, and navigate that state to safety.
People seem to get machines really wrong. Machines today are limited in their creativity. But they are orders of magnitude better than humans in getting things right. On the road, we need more of the latter than the former.
You are correct that in the vast majority of situations a computer will outperform a human. I've avoided accidents by luck more than judgement, guessing that the car would be able to do what I was asking (whereas the autopilot would know at all times and react in milliseconds).
All the world needs to accept is that at some point someone will die because of an autopilot error, and that's OK because the net number of deaths is lower than with the same number of humans driving.
That's where I see the real sticking point for automation. Driving isn't about getting from A to B, nor is safety the top priority. If it were, there wouldn't be such a thing as a V-8 (or whatever the electric equivalent is). I find it very ironic that a performance car like the Tesla might promote "the path to full car automation". You think the gun debate is tricky? Try telling people they cannot do 51 in a 50 anymore.
Yeah, this is a huge problem with drivers today. 99% of driving is absolutely about getting from A to B in as safe a way as possible; anything else is reckless endangerment of the lives of others. But people grow up dreaming about fast cars and freedom and adrenaline, then they get their licenses and can't confront the boring reality. That's the very reason we need self-driving cars on public roads. Driving for fun needs to be separated out and put somewhere else.
How about sightseeing buses? Or bicyclists riding purely to stay in shape? Or people learning how to drive? Or cops patrolling neighborhoods? Or ice cream vans? There are lots of perfectly reasonable uses of roads that have little to do with A-to-B. The elimination of "fun" in the name of safety is a very uphill battle.
> So my friends and I riding motocycles through the mountains on a sunny afternoon are reckless endangerment?
If you're breaking the law then yes, you're reckless, period. If not, then enjoy it all you want.
> How about sightseeing buses?
They're driving A-B-C-D in a sightseeing-optimal way within the limits of law and safety.
> Or bicyclists riding purely to stay in shape?
They aren't doing it next to cars, are they? There are dedicated areas for riding bikes for sports.
> Or cops patrolling neighborhoods?
A-to-B driving, with lots of As and Bs. Also, there are two of them, one is doing the driving, the other is doing the patrolling.
> Or ice cream vans?
A-to-B, with lots of As and Bs.
Public roads are not the place for having fun. You can enjoy driving on them however much you want as long as it doesn't affect the safety of others.
I myself spend more hours on public roads on a bicycle than driving a car.
But pure sports? There are parks for that.
Don't get me wrong, I'm all for reclaiming cities for pedestrian and bike travel and creating a maximally energy-efficient public transport system (ironically, the best idea would probably be self-driving, publicly-owned electric cars forming a PRT network). But in the current situation one has to stay pragmatic.
We should be promoting cycling, not making it even more difficult. Self-driving cars can make the roads significantly safer for cyclists and everyone else. The most dangerous thing on the roads right now is human-controlled cars.
I love to go on long bike rides, like 4 or 10 hours long. Parks are not an option for that, while backroads are awesome. Once out of town, I usually meet a car only once in a while. And while in town and suburbia, I guess I qualify as an A-to-B commuter. Even out of town, sometimes it's sort of an A-to-B commute, because I want to get to some sightseeing spot or a fancy cafe in the woods or something. I just happen to cycle instead of drive. Or sometimes I dispatch my family in a car and go by bicycle myself. It's still A-to-B, isn't it?
Saying what is and what is not "proper road use" is a very slippery slope. Is it OK to drive to a grocery store? Is it OK to drive to some spot in the middle of nowhere for sightseeing? Is it OK to drive for a few hours to bbq somewhere you've never been? Is it OK to drive to go fishing in a nicer lake instead of the one nearby? Is it OK to drive to see friends if you can video call them instead?
Generally such laws include things like:
1) May use the turn lanes when turning.
2) Must ride to a specific side of the road unless you can keep up with the speed of traffic, there is not enough safe distance for a car to pass you, or it is not safe to ride on the side of the road. (These rules are generally true of any vehicle that drives slower than general traffic.)
3) Protections to the bicyclist. Either in the form of "vulnerable traffic" laws, or specific bike protection laws.
4) There are even places with laws preventing bicyclists from using the sidewalk.
5) About the only place a bicyclist isn't allowed to ride is on high speed traffic areas like a freeway/highway/etc.
6) Protections for the bicyclist against negligent drivers such as mandating safe driving distances, when to pass, and protecting against specific activities that may endanger the bicyclist.
I do agree that a bike path is certainly much safer (and perhaps more scenic) for fitness, but many people are interested in traveling to a specific location. (Ex, the beach or a park. Riding along the coastline. Etc.)
The main problem with biking on the road isn't that you are doing something wrong, but that drivers do not respect you: following too closely, not giving you the right of way when you have it, etc.
And I wonder why you assume that a bunch of motorcycles riding through the mountains would be breaking any laws. I know a group of lesbian Harley riders who never go faster than 50, and who would take issue with any assumption that riding a bike suggests illegality. For them it is about being seen and showing pride. I did a ride for burn victims a few years back. The kid riding on my bike was all smiles even though I doubt we broke 30 the entire way. He just liked the wind, a bit like my dog when she hangs her head out the window. Riding for fun does not mean riding for speed.
What I mean is - going on a motorcycle trip on a scenic route? Sure, it's fine (a self-driving motorbike would probably be even safer though :)). The problem really starts when some people put fun in front of safety and e.g. start speeding.
TL;DR: have as much fun as you want, but not at the expense of safety. Want to have additional fun coming from doing dangerous things? Go do it somewhere where you endanger only yourself (and those who consciously decided to participate in such activities). Racetrack, private roads, whatever.
Any sudden, unexpected movement by an autodrive motorcycle would probably see the rider thrown free, or at least cause a weight shift great enough to down the entire package.
Because all those "ballet-like" things are basic feedback-control problems, the kind you learn about in Control Theory 101 in college. You can pretty much convert the problem of steering a self-driving motorcycle into the problem of a self-driving car by adding a module that accepts car-like inputs and translates them into dynamic balance control. We've solved the basics with segways, which are smart-high-schooler-level electronics projects.
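A toy sketch of that feedback-control idea: stabilizing an inverted pendulum (a crude stand-in for a bike's lean dynamics) with a PD controller. All gains and constants here are invented for the illustration; a real balance controller is far more involved.

```python
# Toy illustration of the feedback-control idea: a PD controller
# stabilizing an inverted pendulum. All constants are made up.
import math

def simulate(theta0=0.3, kp=40.0, kd=8.0, dt=0.01, steps=500):
    """Return the final lean angle after PD control is applied."""
    g, length = 9.81, 1.0       # gravity, pendulum length
    theta, omega = theta0, 0.0  # lean angle (rad) and angular velocity
    for _ in range(steps):
        torque = -kp * theta - kd * omega              # PD control law
        alpha = (g / length) * math.sin(theta) + torque  # unstable plant + control
        omega += alpha * dt                            # simple Euler integration
        theta += omega * dt
    return theta

print(abs(simulate()) < 1e-3)  # the controller drives the lean angle to ~0
```

The point of the sketch is only that an inherently unstable plant becomes stable with a fast enough feedback loop, which is exactly what a segway (or a hypothetical self-balancing motorcycle) relies on.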
Also, generally, whatever you can do on your vehicle a machine can do the same using the same control inputs, only better. Manual gear shift included.
This all comes to a head at the transition point of countersteering, where the controls reverse around 20-ish mph. The rider's balance controls whether turning the handlebars right means going right or left. If the rider isn't in sync with the autodrive, everyone ends up in the ditch.
(That experience taught me to always tell people to lean into the curve if they've never sat on a motorcycle before)
Given the close interaction between the rider and the bike, I'm not sure how comfortable such a vehicle would be. It may turn out to be nausea-inducing.
The speed limit (like various other laws) is very much all or nothing. Riddle me this: which of the following is safer on the same road with a speed limit of 50?
a) Driving a brand-new Porsche with brand-new tyres on a sunny day, doing 55.
b) Driving a car from the 70s with tyres that are almost worn out (but still in the legal range), at night, in heavy rain, doing 48.
Now consider that a) is illegal and b) is legal.
Breaking a law should not be a binary decision; making it a 0/1 choice makes the world worse for everyone, because to be completely safe the restriction has to account for the worst case, which means the vast majority will be artificially limited and feel like the law is needlessly overreaching. This is not a good situation to be in.
That doesn't mesh with areas where robots have already taken over. The robots moving freight around shipping terminals don't move any faster than humans driving trucks. The robots I see delivering drugs in hospitals don't move faster than the nurses.
Speed limits aren't just about the cars. Noise and pedestrian safety are big factors. There is no technology yet that lets a robot see around blind corners. They may have a 1/4-second faster reaction time, but that doesn't mean they could be trusted to move much faster towards a crosswalk where a bike might cross at any second.
I guess it boils down to perspective. The hospital robot is in no hurry. His life is no better or worse if he is faster or slower. But a human inside a robot-driven car does care. He doesn't see the overall traffic efficiencies, and even if he did he wouldn't take much notice. All he cares about is getting to the destination asap. So it is a different speed decision than the robot.
That's one reason. Then there's the money printing aspect of it and the environmental aspect of it.
But they're inconvenient for getting from point A to point B. Which is why they've been replaced by cars.
I suspect the same will happen with cars. I love my fast cars, but apart from that once a month weekend drive, I really don't get to flex its muscles.
I'd much rather relax in my car on my way to work every day. Then I could buy a more powerful car which I could take out to the race track on weekends.
Particularly when you can work in your car. Or watch a movie in your car. Or have sex.
But joking aside, that's where these systems will eventually take us.
And compared to that, how many accidents were prevented by the use of autopilot? How many times did the autopilot make a better decision than a human would have? You can't compare to an absolutely perfect state (zero accidents). You have to compare to the current situation.
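A sketch of that baseline comparison, with purely made-up rates (both fatality figures and the mileage are illustrative assumptions, not data):

```python
# Purely illustrative numbers -- both rates and the mileage are
# assumptions, not measured data.
human_rate = 1.2      # assumed fatalities per 100M miles, human drivers
autopilot_rate = 0.4  # assumed fatalities per 100M miles, automated system
miles_driven = 3e12   # assumed annual vehicle miles

human_deaths = human_rate * miles_driven / 1e8
auto_deaths = autopilot_rate * miles_driven / 1e8
print(f"net lives saved vs the human baseline: {human_deaths - auto_deaths:.0f}")
```

The only point is the framing: the automated system doesn't have to be perfect (zero deaths), it just has to beat the current human baseline for the net to come out positive.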
I was only remarking on the possibility that some of the issues this automation has now created in aircraft might also be seen in cars, especially with things like still having to keep your hands on the steering wheel even though the car is in control.
And, unlike your rational analysis, I don't trust politicians or the media to think about it like that. Instead, I'm sure the first accident that could even remotely be attributed to an automated system would immediately ignite a firestorm of bullshit articles and possibly damaging regulation.