While a human driver can easily see that a rider doing a track stand isn’t going anywhere...
Ok, I'm just going to come out and say it: I'm a licensed human driver and a bicyclist, and I find track stands confusing too. Like a skateboarder in the bike lane, there's a lot of potential for unexpected motion, and that needs to be accounted for.
The bicyclist is wobbling in place, but could easily transition to going forward or backward without warning. The bicyclist is intent on the stand, so drivers won't be able to make eye contact.
I think the cloud of uncertainty around bicyclists is also why cars wrongly yield to my bike so often at stop signs. Some bicyclists do unpredictable things, and drivers incorporate that into their mental models.
People learn, and I'm confident that self-driving cars will too. Eventually.
I hope their cars keep a good log of exactly what logic led them to every decision they make, so when an accident happens, the car's log can clearly indicate something like: "The bicyclist to my left appeared to be in a 'track stand' position, indicating he was yielding to me, so I went forward through the intersection."
Although I suppose keeping a rolling video of everything would suffice as well, for diagnostic purposes.
Videos of events are actually a terrible idea, for the defendant. There's way too much tendency for 'backseat drivers' to second-guess a decision made in a moment. Videos give them a 'sound bite' of a complex event. Juries make terrible decisions that way.
Somebody looking over your shoulder from the calm surroundings of a courtroom is very likely to second-guess everything you do while watching a video of it. That's why we have a jury, an advocacy system, and we question witnesses.
Yes, when I'm at a stop sign anywhere near a bike, I'll yield if I'm in a car. Bicyclists are just too unpredictable - sometimes they'll roll the stop sign even if it seems unwise (or maybe they just didn't notice me). So now I just assume all bicyclists I see on the road will ignore all stop signs in their way; it sometimes leads to awkward wave/stops, but better that than running somebody over.
As a cyclist, I find it frustrating when cars behave unpredictably. If you get to a 4-way stop before I do, I'm basically planning on timing my crossing to go after you do - and maybe use the delay for a chance to catch my breath.
But if a driver suddenly decides to throw the 4-way stop etiquette out the window and starts directing traffic from behind a windshield I often can't see through clearly because of tinting/my sunglasses/glare-from-the-sun, I'm at a bit of a loss... are they waving me through? Talking to a kid in the back seat? Messing with their phone?
If I go, I'm risking my neck if they were in fact distracted and not waving me through. If I resist going out of turn, that's awkward, because my feeling is some drivers get frustrated that I'm ignoring their attempt to help me out.
As it happens, I learned my own behavior because I once struck a bicyclist who wasn't as predictable as you. I went to a 4-way, stopped, saw a biker approaching the intersection who hadn't stopped yet, and pulled into the intersection.
The bicyclist didn't stop, was struck by my car, fell off, went to hospital. She was okay in the long run, but sued over the incident; the police officer who wrote up the event determined that I was at fault (despite eyewitnesses to the contrary) based on the claim that I should have stopped for the bicyclist.
At this point, given that bikes are known to do this, I really don't have any choice in the matter. I'm glad I didn't kill anyone - head trauma can be really tricky. I'm concerned that I was the victim of insurance fraud (she really did run out in front of me). My insurance company just settled the complaint - they didn't want to get deep into litigation over a matter of hearsay and he-said-she-said.
So, what should I do? If it's a car approaching a 4-way, I assume they will follow traffic laws. If it's a bike, I'm not so certain. My experience isn't just that one time in Berkeley when I hit someone; now that I live in SF, there are plenty of times when I've observed bicyclists taking their lives into their own hands and dodging traffic without obeying stop signs and red lights.
Keep in mind that once you "wave", you assume responsibility for any accident which might ensue due to your waving people on, because others may not be aware of your waving and interrupting expected traffic patterns.
Yeah, in general I hate wavers. The problem is that I've experienced bicyclists who just don't follow traffic rules regarding stops, yields, and red lights. Given that I know there's a nonzero probability that a bicycle will intersect with my car if I pull out, I don't go from my stop until I'm pretty sure they're going to stop at the 4-way.
As I mentioned in a cousin reply, I once struck a bicyclist who wasn't obeying traffic rules (she blew a stop sign at a 4-way) and was found liable for the accident at least by the officer who wrote the accident report. We're all lucky her injuries weren't major, but there's no way I'm going to risk hitting a biker who is barreling down a hill toward a 4-way by assuming they're going to follow the traffic laws.
Yes, my traffic school instructor told me to never wave. Not because of responsibility, but because you might have missed something and the person (especially children) might interpret your wave not as "I'll stop until you have passed" but as "road clear, go ahead".
I hate wavers. They include among their number those who are planning to run over bicyclists. I can't distinguish between the two sets, and a false negative means I die. So no, I'm not going to go first even if you wave at me.
I ride a unicycle and commonly perform track stands... It really doesn't require that much concentration; you're balancing the bike the same way a Segway balances. Would someone concentrating that hard be able to recount the car's behavior so clearly?
I truly believe that general purpose self driving cars are much farther away than the latest press releases would suggest. Apparently they don't work well at night, in the rain, when the road lines aren't perfect, etc[1]. They basically only work in the perfect weather and roads of Mountain View during the day.
In Boston, the roads are shit, the drivers are crazy (including me), and everything is permanently under construction. Once people learn how to spot these cars, the tricks to getting past them are going to be learned very quickly. Not to mention the jokers using laser pointers and whatnot to mess with them. If you think this won't happen, you are sorely naive.
I think self-driving cars will work for specialized cases, like long-haul trucking on highways, but for the commuter, it's going to be a long way off.
I think it's more amazing that we went from 0.02 miles one way to a 477,800-mile round trip to the moon in just 66 years.
I vaguely recall the google director saying their autonomous cars already drive better than his parents.
Even if it's only daytime, clear-weather driving, super cruise control would be insanely useful for commuters, truck drivers, and city buses.
It's easy to fall into the trap of perfect or nothing. Even partial solutions create a lot of value. Complete autonomous driving will be here real soon, but there might be a few intermediate steps along the way.
And people often use that; the NYC subway is a great example. But it's not so effective in lower-density areas or less established rail systems. You'll often see east/west travel relatively accessible but north/south hard, for example. Rail only really serves those along the line. I'm sure Phoenix could build a ring-and-spoke system to connect up all of those low-density developments.
The thing that sucks, though, is that it's a big upfront cost. Super cruise control is an incremental cost that takes advantage of existing infrastructure.
Rail isn't so great at getting food to grocery stores. (It can get it to cities, sure, but not to the actual transportation endpoints.) Such a system could be built, but trucks are so obviously effective, and they work with the existing infrastructure.
Rail is cool but a big upfront cost for something that's not obviously superior to the current system seems wasteful.
Maybe fill train stations with self driving taxis to cover those last few miles to home, or work, or wherever.
Yes, the thing is just that the interstate system was in the end more expensive than similar rail infrastructure would have been (especially in maintenance).
And, well, yes, cars and trucks will still be necessary, but not nearly in the same amount.
I'd love to believe this quote, but light rail per mile is an order of magnitude more expensive than highway per mile. Do you have any citations I can look into?
$35 million/mile for rail, $25 million/mile for a single lane of highway. Rail maintenance is cheaper, and ongoing costs are generally lower - you don't need police patrolling railroads.
Railroads are cool, and they have a lot of advantages, but we've got this huge investment in our existing system. And unfortunately, unlike self-driving cars, there isn't a smooth transition. It's like switching from PC to Mac, but all of your software is homemade and must be rewritten. You have a huge upfront investment just to get started, and you have to maintain the other system as well.
- Much shorter stopping distances. This lets roads be "general access" in a way that rails have never been, even if you were to have a railway running past your house.
The cost differences essentially come down to (a) how many lanes your highway has, (b) how many complex interchanges you build (interchanges with many bridges tend to lead to a cost explosion), and (c) how much ongoing maintenance is needed: rail tracks can usually be used for decades without major maintenance, while highways usually need to be patched and fixed multiple times a year.
I only have a specific study for my city, which compared the cost of fixing the streets damaged by the buses running over them against the cost of building light rail instead; in that study it turned out that light rail, once you also consider things like maintenance and accidents, would be cheaper.
Safer, perhaps. Cheaper? Not so much. Rail infrastructure is expensive, both to build and to maintain. Rail is also much less tolerant of slopes and tight corners than even the most unwieldy of road vehicles (which contributes substantially to the expense; lots of terrain engineering is involved in order to build a railroad that's actually safe to travel on).
That said, rail does have superior cargo and passenger capacity, so these costs can be offset by the increase in potential revenue.
Rail is, compared to the huge amount of highways and interstates built, quite cheap. Especially compared to expanding highway capacity to keep up with population increase.
The sheer capacity of rail makes it sustainable even on a small scale; on a state scale it’s even more effective (especially on the East Coast).
And with cars having such low efficiency (transporting 2 tons of metal for a single passenger using gasoline at 90km/h seems a waste when you can have a single train transporting 400 people at 320km/h using electricity instead), driving is just not really effective on a large scale.
New York City is a good example in the US, but cities in Europe and Japan show how even small cities and whole countries can benefit from rail infrastructure.
By "Japan" you mean major cities in Japan. An hour or two outside Tokyo, public transportation is 1 bus every 30 minutes. A friend and I were in Yamanashi-ken last year, and we were complaining to a random shop owner about the bus schedule. She said, "twice an hour is amazing! this isn't Tokyo, you know!"
Most people out there drive if they have to go somewhere.
Twice an hour is actually not that bad – I live in a suburb of a German city with the bus to my uni only coming every 30 minutes, and it’s not a problem – almost everyone uses the bus or bike. You just have to get used to the thought that not everything happens when you need it, but instead you have to adapt to when things happen.
It doesn’t have to. In a properly built city, the rail will end up less than 500m from your destination, and you can walk the rest.
Example cities with okay or good rail infrastructure: New York, London, Paris, Vienna.
You also save a LOT of time, because you are never stuck in traffic, and you can use the time in the train to check up on your emails, etc – and you never risk getting into an accident because you are tired after work.
> It doesn’t have to. In a properly built city, the rail will end up less than 500m from your destination, and you can walk the rest.
Part of the problem in 2015 is that unless there was planning for rail from the start, the cost to build out rail in established areas is enormous - typically because you need to tunnel - in addition to the cost of community consultation and other access-related issues.
I otherwise agree that well-planned rail into key areas would have benefited a lot of cities, especially those feeling the pain from road congestion now.
I commuted by tube in London for 3 years; my most recent house was a 1-minute walk from the station and my office was also a 1-minute walk from a very busy central station. So basically door to door. Even still, commuting was not a pleasant experience.
The human congestion in the central tube stations at rush hour is appalling, as is the air quality down there, as is the lack of personal space on the train itself. It's a very stressful, undignified experience for everyone.
I'd much rather be spending a bit of extra time in traffic, sat comfortably in a self-driving taxi, than take the tube. That said, I think there's good reason to believe that as self-driving cars become more prevalent, traffic flows will actually improve due to better flow management, less need for traffic lights, and fewer cars parked.
Rail does the job, but it does not do it elegantly. It's very unpleasant to take mass transit railways in major world cities. Nobody who's done it for any length of time would reasonably contest this. Self-driving personal vehicles could be a major improvement on the commuting experience, if indeed the future involves commuting at all - a vast increase in remote work may simply sidestep the issue completely by the time self-driving cars are a mature tech.
A single person in the tube takes up, technically, zero surface space, being in a tunnel, and even on an overground railway a single person takes up less than a square meter.
With a car, you waste far more space.
And, by the way, I’m taking public transport every day, all year round, even in situations like http://i.imgur.com/KyWkUzJ.jpg and I still find it far more pleasant to wait 20 minutes in a crowd than to be stuck for 40 minutes on the highway.
> there’s not physically enough space for self-driving taxis for everyone
Do you have any sources on this? I agree that space on the roads of major cities is an issue now, but several changes in the future could mitigate these issues significantly:
1) Self driving cars will probably end parking in city centres. Suddenly, a lot of congested roads have ~twice the surface area.
2) Self driving cars may eventually be able to drive within centimetres of one another.
3) No need for a huge bonnet (small electric engine), boot/trunk (rooftop rack storage), pedals, dashboard, etc in an electric automated taxi. A 2-4 person vehicle is now rather compact indeed.
4) Remote/part-time-remote work may mean commutes are avoided, or staggered throughout the day, for the majority of people. There's a lot less peak capacity required, compared to today.
Well, your car will still have only one person in it – and you won’t get it smaller than the Messerschmitt Kabinenroller I linked above. And even that is more than twice the space a person in an overground train takes up; underground trains take essentially no surface space at all.
Additionally, in many cities there is already almost no downtown parking.
It’s not like you’re going to overturn the laws of physics.
I live in a European city founded around 1200 – so, while we destroyed our tram system and our public transport is shit, I can still get everywhere by public transport as fast as by driving, often faster.
I’d like to have even better infrastructure, and 24/7 service, but I don’t have an issue. I have 3 bus stops within 300 meters, and a train station 800m from my current home.
Clearly you've never ridden the subway in NYC during rush hour. Most of the major lines will have some delays, you'll have to wait for multiple trains to pass before you can get on, or once you're on, the train will stop because "we are waiting for train traffic ahead".
The point of the parent was that for commuters, for long-distance cargo transport, and for buses, some "super lane guide" would be useful.
Does your trip into the mountains involve "driving in the same lane as thousands of others from the same place to the same place every day"?
Because that’s what I was suggesting – nothing more, nothing less: Replacing mass-scale commuting with rail.
Yes, if you drive somewhere where no one else wants to go, cars are still useful – but if you’re going the same path that another million people are also taking, maybe rail might be better.
Rail and Road are not exclusive, they can coexist.
Sure, they're limited, can't drive in snow. Snow shut down Atlanta because humans couldn't hack it. Autonomous cars can't compete with humans at their best, but they're consistent and know their limits.
It's tough to see in tech, but some older people's driving fades fast. They arrange their lives to avoid driving on the freeway or at night.
Snow shut down Atlanta because the city did no preparation on the streets and didn't salt them, so the whole city was covered in an inch of ice. I don't think any car could drive on that as long as it used tires.
There were also (according to local legend) three snowplows for the whole city in the 2000s.
Is your analogy useful? It's been several decades since the first driverless cars, so if the "Wright Brothers to St. Petersburg-Tampa Airboat Line" comparison were predictive, we should have had plenty of those cars on the road by now.
There are also many other analogies you could have made with different timelines. Virgin Galactic, which bills itself as the first commercial spaceline, is already half-a-decade behind its original launch date. Even Tito's 2001 flight was four decades after Gagarin.
Why look to commercial flight and not commercial space travel as the relevant comparison?
It's harder to be aggressive when dealing with human-driven cars because people can react. The commuter dozing in his self-driving car is not aggressively driving, and with his car legally mandated to be in super-safe mode, it's going to take twice as long for it to make it through rush hour traffic.
Rush hour traffic on a highway is a good example of where aggressive driving doesn't actually work very well. I've seen people try to drive aggressively (switching lanes and tail-gating), and they're still in sight for a very long time. So they're maybe seconds ahead of me? Sometimes I even end up passing them because they chose the wrong lane, or they ended up stopped at a light.
You do get idiots slaloming around in somewhat heavy traffic, but that doesn't actually slow others down much while they're putting everyone at risk.
> The commuter dozing in his self-driving car is not aggressively driving, and with his car legally mandated to be in super-safe mode, it's going to take twice as long for it to make it through rush hour traffic.
That's a common idea based on not-very-careful observation and just isn't factual. There's plenty of data showing that aggressive driving has minimal effect on travel times. In fact, you actually get a bigger speed-up by strategically avoiding left turns: companies with delivery fleets plan their routes with this in mind.
Rush hour traffic is going so slow that non-aggressive driving will only cost you a low single digit percentage of travel time, which will all be made up at the first red light.
I routinely drive at 90km/h on the main 100km/h stretch of my 28km commute. After 23km at 90km/h, I pull up next to or immediately behind the person that passed me a few minutes ago: the difference in arrival time between a car doing 120 and another doing 90 on that stretch is 4 minutes, assuming they hit 120km/h as soon as they leave the lights at one end of the road. If they have to overtake slower traffic like buses, cranes and me, they get less of a head start.
Eleven minutes versus fifteen on the only stretch where aggression makes a difference. There is no way in the world that this counts as "double" the time. The rest of the trip is the same average speed of about 25km/h, with the more aggressive driver getting to use their brakes more than me.
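If anyone wants to check that arithmetic, here's a quick sketch (it assumes nothing beyond the 23km stretch and the two speeds from my commute above):

    # quick check of the commute arithmetic above
    distance_km = 23.0

    def minutes_at(speed_kmh):
        return distance_km / speed_kmh * 60

    steady = minutes_at(90)       # ~15.3 minutes
    aggressive = minutes_at(120)  # ~11.5 minutes
    print(round(steady, 1), round(aggressive, 1), round(steady - aggressive, 1))
    # 15.3 11.5 3.8 -> roughly the 4-minute difference quoted above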
Red lights are the great equaliser.
Aggressive driving buys you one or two cycles of the traffic lights, along with a much higher fuel and maintenance bill.
I'm skeptical of that. I think on average drive time goes down. A few steady drivers drastically reduce stop and go driving, and can eliminate those weird traffic aftershocks where everyone slows down for nothing, just because someone slowed down at that point an hour ago. [1]
Furthermore, the total time might be less on a good day with aggressive driving, but the more aggressive drivers there are, the higher the likelihood of an accident, which costs substantially more time than slow driving ever would.
t1 = aggro drive time
t2 = slow drive time
t3 = accident drive time
In small cities, say less than 500k, it's probably not worth it, because p(accident) is pretty low and freeway backups are relatively rare, say in the neighborhood of 0.1.
In bigger cities, though, traffic gets messed up all the time. t2 wins.
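Roughly the comparison I have in mind, as a back-of-the-envelope sketch (every number here is a made-up placeholder, not data):

    # expected commute time with and without aggressive driving (hypothetical numbers)
    t1 = 28.0   # aggro drive time, minutes, if nothing goes wrong
    t2 = 32.0   # slow drive time, minutes, if nothing goes wrong
    t3 = 75.0   # drive time when an accident backs up the freeway

    def expected(base_time, p_accident):
        return (1 - p_accident) * base_time + p_accident * t3

    # small city: accidents rare either way -> aggressive driving comes out ahead
    print(expected(t1, 0.03), expected(t2, 0.01))   # ~29.4 vs ~32.4
    # big city: aggressive driving raises p(accident) a lot -> t2 wins
    print(expected(t1, 0.15), expected(t2, 0.02))   # ~35.1 vs ~32.9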
The robot equivalent of a "work-to-rule" strike could be a possible response to this kind of behaviour: the driverless cars all around an aggressive driver could go into a slower, more defensive mode and box the aggressive driver in—to ensure the safety of neighbouring traffic, of course.
Even in your made-up thought experiment, which accounts for << 1% of all aggressive driving miles, I still think this is the correct approach to take. Speeding to the hospital is only superficially different from speeding and weaving through traffic on your morning commute. The biggest difference is probably that you are flustered and less likely to make good decisions while en route. The worst accident I have ever seen was someone rushing to the hospital [1]. The threat to those around you, and the chance of an accident slowing you down, outweigh the small potential time saved. It would be far better for you to learn first aid and respond appropriately than for you to drive aggressively to the hospital.
[1] So there I was[2], driving through Iraq, when an Iraqi army convoy approached rapidly from behind us. While swerving past my trucks, which had pulled over for them to pass, one of the vehicles left the road and rolled over multiple times. 2 people died in that accident, I applied tourniquets to 2 additional people who lost limbs, and no, they didn't get the gunshot victim to the hospital any quicker that way.
[2] Fun fact. Do you know the difference between a fairy-tale and a war story? War stories start with "So there I was".
Emergency vehicles rushing around may or may not be good overall. They have a few advantages over your average crazy driver, though, such as lights and sirens to get other vehicles' attention so they can move out of the way.
Police often have additional driver's training. I happen to think that police often drive faster than they should, but this is less the case for emergency vehicles. I rarely see an ambulance or fire truck moving at high speed. They are pretty aware of their job to get somewhere safely, and I am typically only passed by an ambulance if I am stuck in traffic myself.
> it's going to take twice as long for it to make it through rush hour traffic.
So read a book or work on something on your laptop or go on Amazon and buy a present for your partner with the health insurance money you'll eventually save by registering your commute as autonomous.
You're using a first approximation to a self-driving car ("follow all laws to the letter") to extrapolate.
There are a number of reasons that won't work, which you'd notice the very first time you took such a car out for a drive (you give a great one: "it's going to take twice as long for it to make it through rush hour traffic"). With no eye contact or hand gestures available it's a harder problem, but aggression and counter-aggression are going to have to be a part of any self-driving model. Even if all cars were self-driven, you'd still have pedestrians willing to cross the street at any sign of hesitation.
Whose problem is this? If I am in a self-driving car, I don't expect it to use "aggression" to try to compete with other drivers. I just want it to get there safely.
Traffic police will be looking to make up lost revenue from being unable to ticket autonomous car riders, so those aggressive drivers will be a better target. (Not that I approve of using tickets as revenue, just saying that it will happen.) And those autonomous car passengers now have their hands free to take video of the unsafe human drivers nearby, for later public judgement ;)
I've been thinking about this and cringing in wait for the lobbying against self-driving cars from police unions.
But then I thought, that couldn't happen, because it's too clear that any such lobbying puts police unions squarely at odds with citizen safety. Now, shoot-first cops, civil forfeiture, and highly militarized police forces already point at this, but there's plausible deniability. Opposing self-driving cars could be too direct of a swipe at public safety from public safety officers.
I agree. Law enforcement will definitely have to change, though. Right now, traffic infractions are a way for the police to look for other criminal activity. Pulled over for speeding? Cool - let's run your license to see if you have any warrants. Let's see if I can find something else that you're doing.
Self-driving cars will completely remove this avenue of investigation because there will be far fewer infractions.
I think that this is a good thing and a bad thing. On the one hand, you won't get police harassing random innocent people on the off-chance that they're criminals. On the other hand, the police will start doing other things because they don't have to pull people over anymore, and those things might be much more invasive.
As well as those points, it also seems that the on-board sensors for the car are not good enough, as Google has spent a great deal of time scanning the roadways and constructing high resolution models of the environment. The cars apparently use this pre-prepared data to help them drive well.
This strongly suggests that the cars, in their current form, are not as autonomous and self-reliant as people assume. It also implies that the cars are not going to cope well if they encounter sudden changes in the environment.
Does it matter if they rely on some pre-processed data, as long as they have some form of fallback that allows for manoeuvring when a change in environment is found?
Human drivers also face immense difficulty when there is a giant tree fallen across the road. Luckily, such changes in environment are relatively infrequent.
Google Maps’ data about my city is from 2005. Half of my district doesn’t even exist in it – and with Google Mapmaker not available in my region (northern Germany), I don’t think it will be added anytime soon.
While currently I see dozens of human drivers every day stopping in front of a retractable fence (and some trying to trick it), do I have to expect Google’s cars to soon just drive right through it, or to refuse to use alternative routes?
Especially because we have shit weather with lots of wind and rain and snow and fog, and visibility often below 10m during winter. How’s the Google car supposed to drive in that situation?
I'm assuming that self-driving cars will also perform data collection for the rest of Google's self-driving cars. I could be incorrect about that, but if they do, it won't matter that the current data is from 2005. Alternatively, if Google do need to collect manually, I would assume they will mobilize to do just that. They did it for Street View, a feature that isn't directly linked to revenue. A much more solid business case could be made for doing it for self-driving cars.
Humans rely almost solely on vision from a single source (counting both eyes as just one source) when driving. Autonomous vehicles have far greater potential when it comes to sensors. Google's current vehicles utilize LIDAR and GPS. Tesla's lane following uses cameras, but they get more than one perspective. If it added anything, infrared could be used. There's potential to have tonnes of cameras on the car to get a fuller view. They can triangulate position more effectively by listening out for Wi-Fi and cellular network signals. There are gyroscopes, too. CPUs can calculate exactly how much stopping distance is required in the particular road conditions and adjust speed accordingly. When you think about the sensors and processing capability compared with two human eyes and a brain, it seems to me as though humans are very poorly equipped for dealing with driving in hazardous weather.
You're probably right that the current cars struggle under poor weather conditions, but it hardly seems like an unsolvable problem. Once they do function in poor weather conditions, they'll probably do a far better job of it than a human driver would.
Well, Google doesn’t even have StreetView imagery of most of Germany after they intentionally broke the law to get better images¹, so it’s not to be expected they’ll do much better anyway. I talked today with a bunch of friends, and we discovered that for all practical purposes HERE Maps works better in Germany and Austria (except in Hamburg, which actually works well in Google Maps; Vienna's satellite view in Google Maps is unusable).
1: To take images from just above 2.50m height you need permission from every house owner, because most house owners build their hedges and fences expecting a person with an eye height of 1.90m or less. Google actually intentionally increased the camera height after their first test runs because they were annoyed with hedges blocking the view, according to the news back then.
---
To your point: Well, the issue is, if Google is not willing to provide or collect up-to-date data – even if they get it for free – then data-based driving is quite risky.
And in the narrow, crowded streets of European cities I’d rather not have an automated car driving on sight in a winter storm.
The issue is that the reason the fence is retractable is the buses that speed through there every 7 minutes. If you drive there slowly, you might cause a crash. Cars shouldn’t drive there at all, but Google Maps still suggests driving right through it as the fastest route. And it’s just one example; we have hundreds of them all over the city.
If they can’t even provide reliable maps, how can we expect reliable self-driving cars?
> Once people learn how to spot these cars, very quickly the tricks to passing them are going to be learned.
You say that as if arriving at your destination one car-length in front of a self-driving car somehow renders autonomous vehicles worthless.
Who cares if they stop or slow down when you cut them off? It's only idiot humans who think that the reward pathway signal of "woohoo I'm in front" means that you've won something. You haven't.
Reminds me of that Mythbusters episode. The junior team split in two: one drove aggressively, overtaking and switching lanes constantly, while the other stayed in a single lane, didn't overtake, and just arrived as traffic allowed.
They arrived within a minute of one another, and the aggressive driver (who was Grant, I believe) reported that it totally wasn't worth all the stress to arrive so close.
People who stress, get upset, and try to shave their arrival time down to the second are only hurting themselves.
Dangerous driving aside, I try to stay as calm as possible behind the wheel, and like to just get a safe distance (4-second rule) behind a semi on the freeway and sit on cruise control until I arrive. Even if the space I give isn't massive, a lot of people utilise the space between me and the semi to transition from the fast lane[s] down to the slow lanes/exit, which I think is a legit public service; nothing is worse than getting stuck on the freeway because you cannot transition.
I use "driver assist" in my car every day, for about 50 miles. It keeps my car in the a lane, follows traffic, and lets me concentrate on my podcast.
It's not perfect, I sometimes have to step in; but it works at night, it works in all but the heaviest rain, it works in snow... and this isn't some crazy future product, it's a thing I already own.
Self-Driving cars are coming faster than the naysayers expect.
I feel very much the same way. A significant portion of drivers also deal with winter/icy conditions for at least part of the year, and I have a really hard time believing that self-driving cars will be able to handle those conditions well, at least not for a long time. You have snow falling, snow covering many/most of the lines on the road, varying degrees of traction - added to the uncertainties of pedestrians, construction, other crazy drivers, etc. Heck, the sensors alone are going to have an inch of snow and ice on them after being outdoors for a couple of hours. Winter driving is hard enough for highly-focused human drivers, making decisions that are often based on factors that don't easily translate into algorithms and computer models.
Google speaks of a need to leapfrog the driver assistance systems and instead focus on fully-automated driving systems because the driver assistance technologies won't incrementally improve to the level we need for full driver-less cars[1]. I think that's a fair point, but I'd go one step further: self-driving cars thrown into a mix of human-operated vehicles and roads that aren't specifically designed for self-driving cars is (in my opinion) a similar mistake. I believe we'll reach a point very soon where great technical advances in the vehicle alone will have diminishing returns in terms of safety on the road.
If we are going to solve this problem once and for all, I'd be much more enthusiastic about a self-driving system that operates only on distinct roads/lanes/tracks that are designed only for self-driving vehicles. Perhaps that's an unreasonable expectation, but trying to design systems that anticipate and make decisions based on the (often irrational) behaviors of other human drivers seems no different than designing a full human-level artificial intelligence - and even then, it'll still be prone to mistakes because other humans are involved.
Either we get to the point where all vehicles on public roads are networked and cooperating through software to self-drive, or we're just making guesses about other people's behaviors. As Chris Urmson puts it, "just making the cars [and I'll add, roads] incrementally smarter, we're probably not going to see the wins we really need."
Remember though that we're coming from a position where humans sometimes don't drive very well while sober and in daylight, not to mention night, rain, while drunk, tired, etc. We're often selfish or distracted. An automated or augmented car with cameras can see virtually instantly in all directions.
When I'm driving, my rear view is somewhat obstructed by a toddler's car seat in the middle of the back row, and my blind spot is further obstructed by a sun-blocking mesh on the infant's side. When I check my mirrors or especially the blind spot, I am taking my eyes off the road in front of me.
I think I'd rather a superior robotic driver that stalls in tricky situations than all of us crazy humans making panicked decisions when facing pressure.
Google doesn't need to fix all those problems in order for self-driving cars to be in widespread use. Even a self-driving car that works only during daytime would be very successful.
Please, Google engineers: if your car knows where bicyclists are anyway, can you please add an additional safety feature to stop people from opening the driver-side door, even in a parked car, if a bicyclist is actively passing the car?
That is probably the biggest danger to me as a bicyclist in the city: people opening doors on parked cars for me to slam into (and potentially throwing me into a lane of car traffic). Your cars could easily outperform most humans in preventing this dangerous situation.
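To make the request concrete, here's the kind of check I'm imagining - purely hypothetical, the names below are invented for illustration and don't correspond to any real self-driving-car API, and whether the car should hold the door or just chime is a separate question:

    # Hypothetical door-interlock sketch; class and function names are made up.
    from dataclasses import dataclass

    @dataclass
    class Cyclist:
        side: str               # "driver" or "passenger"
        time_to_pass_s: float   # estimated seconds until they reach the door

    def door_may_open(nearby_cyclists, side="driver", window_s=3.0):
        """True if no cyclist will pass the given door within window_s seconds."""
        return not any(c.side == side and c.time_to_pass_s < window_s
                       for c in nearby_cyclists)

    # Example: a cyclist 1.5 seconds away on the driver side -> hold the door (or chime)
    print(door_may_open([Cyclist("driver", 1.5)]))   # False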
A warning or alarm would be good, but would you feel comfortable being prevented from manually exiting your car? Sounds like a great avenue for edge-case hazards.
Bike lanes often run adjacent to parking spaces. The bike has right of way and is not being a moron. That said, agree with others that a warning is the best option.
I hate cyclist-vs-driver online arguments, bbbuuuttt...
You probably also hate it when a cyclist is riding in the middle of the road to avoid car doors, which makes you unable to pass him.
Cyclists should ride near the middle of the lane and be treated as taking up the whole lane. I'm not gonna try to squeak by somebody within a single lane just because he's riding toward the right side of the lane.
> “The odd thing is,” wrote the cyclist, “I felt safer dealing with a self-driving car than a human-operated one.”
In the not so far future this will be a concept impossible to explain to kids. That we all used to drive around these multi-thousand pound machines and rely on the expertise of often distracted strangers to not be maimed or killed (and dying by the millions by the way)... It already is starting to sound ridiculous. The thought of no machine in control will be terrifying before long.
In the not so far past, there were people employed to operate an elevator. (Elevator operators still exist, but then again, so do steam trains - neither are in widespread use.)
As a kid, this was hard for me to grasp. "Why did they hire someone to push a button?"
> That said, I am very frightened about the closed-source software that will be running in these cars.
That is perfectly valid, but currently we have humans who vary widely in skills and reaction times who are controlling cars. I'll take closed-source software over a random person any day.
I think you're misunderstanding why I'm concerned about closed-source software in this case.
I'm not worried by bugs that will be included unintentionally in the accident-prevention aspects of the code: even if there are bugs they can be fixed and in the long run the software will be better.
My concern is the things that will be put into closed source software intentionally: things that monitor where you go, things that can be used to control your car remotely. It adds a very fundamental attack vector in the war governments and corporations are waging on personal freedom and privacy.
I would rather take responsibility for the relatively known risk of getting in a car accident than have all of society give up the ability to travel without the permission of their government. As Franklin said, "Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."
We also have humans who vary widely in skills who are writing software. The same people who allow acceleration and braking to be linked to the entertainment system, and by extension, controllable over the internet. You seem to be assuming the worst possible case for autonomous car software will necessarily be better than the best human driver, but that's not an assumption backed up by reality. More likely (to me) is that, while both humans and autonomous cars might be safe in most situations, the fail states for autonomous cars are likely to be less predictable.
How should the car have responded to the cyclist? In ambiguous stops like this I find non-verbal social cues like nods, facial expressions, and hand waves are the only way to break the impasse.
This reminds me of the driving I saw when I traveled to India many years ago. On relatively-major "two-lane" roads, trash and draft-animal traffic would force trucks to drive basically in the center of the road. When two trucks met, they would each swerve slightly to the right so as to barely avoid crashing. Before they met, however, they would each turn on their left signal light. Our driver said it was supposed to intimidate oncoming traffic into leaving more room.
My thought is that the existing set of signals on a car are solutions the existing problems of human drivers, and maybe a self driving car needs additional signals. A flashing green light could mean "I am yielding to you, and get some new bar tape." ;-)
Yes, communicating / negotiating the decision with the other person. That's how humans do it.
Now try driving in dense NYC traffic, and you'll see humans communicating everywhere. Good luck to self driving cars in that scenario - unless humans are 100% off the roads.
Even a minority of self-driving cars will change traffic, probably for the better in the vast majority of cases. Human drivers following a self-driving car will have to adopt the self-driving car's caution, moderate acceleration, and deference to pedestrians and cyclists.
> If self-driving vehicles almost never crash, roads will become immensely more safe and inviting to cyclists.
Not unless the more aggressive cyclists also start following the rules of the road as they pertain to bicycles. I can't count the times I've seen near accidents caused by cyclists who make abrupt, last-second dashes into oncoming traffic because they feel they have 100% of the right-of-way 100% of the time. And I say that as a bicyclist myself.
As a driver who recently had a jackass cyclist cut across opposing traffic across Mass Ave in Central Square in front of me, I'd rather have the self-driving car I'll soon be robot chauffeured in stand on the brakes than hit the cyclist, had he been more of a jackass than he was. People inconvenienced for a few seconds by cyclists need to get over themselves and probably have the wrong attitude to have a driver's license.
> As a driver who recently had a jackass cyclist cut across opposing traffic across Mass Ave in Central Square in front of me, I'd rather have the self-driving car I'll soon be robot chauffeured in stand on the brakes than hit the cyclist, had he been more of a jackass than he was.
I agree, however:
> People inconvenienced for a few seconds by cyclists need to get over themselves and probably have the wrong attitude to have a driver's license.
How does observing someone else doing something completely out of my control, that almost causes an accident with me, make me have the "wrong attitude" to have a driver's license? That doesn't begin to make sense. You're also contradicting your own point of view between sentences (in your first sentence, the cyclist is the jackass; in your second, the driver who dares to question the cyclist's bad habits is the jackass unworthy of a driver's license).
"Inconvenienced" is not the right word. "Frightened witless" is more apropos.
Example: The cyclist cuts in front of me, violating traffic laws, and I hit him and he's injured or killed. Not only does this affect me emotionally, but suddenly I'm in the liability hotseat unless I can prove the cyclist was in the wrong. After all, I'm in a one ton car and he's on a 30lb bike. It is this situation that I'm hoping to avoid, and if a cyclist nearly causes it, I'm pissed.
Unless it'd been another car. Then the damage to the other vehicle is pretty obvious and, since we're on city streets, neither driver is injured.
The impression I get from the media and from cycling advocates is that auto drivers are very rarely ticketed for killing or injuring cyclists. But please, ignore that fact and continue to drive so as not to kill and injure cyclists.
Obviously it's better to not have an accident than to have an accident. But there's nothing at all wrong with being angry when some moron on a bicycle does something dangerous and illegal, almost getting himself killed by my car and through no fault of my own.
I wonder how they're going to handle event traffic. Football games, concerts and other events where the traffic is being heavily managed by traffic officers on the street. You're supposed to obey the traffic officer and ignore traffic lights, stop signs and even road directionality. If they direct you to go the wrong way down a one-way street, that's what you do because it is special event traffic.
Will the self-driving cars be able to handle this?
I don't know if they can handle it at the moment, but traffic management during events at Shoreline Amphitheater definitely provides a great opportunity for Google to test this in its backyard.
I think that solution would be far worse. Could you imagine if it got hacked? Some hacker sends all self-driving cars the wrong way down some one-way street.
And even if the Google side doesn’t get hacked – as we’ve seen before, the government side will contract this out to the lowest bidder, who’ll hire some guy in China to do it and give him full root access.
Hate to nitpick, but... the title is misleading. It's the act of doing a track stand that confuses the self-driving car, not the fact that the bike has a fixed gear. I have a freewheel single-speed bike that I can do track stands on...
I commute by bike, and on my route in Mountain View I see quite a few Google cars. They are very conservative, and so far I haven't seen one act on my arm turn signals. Either they don't recognize them, or they don't trust me ;) Overall I feel much more secure biking around autonomous Google cars than around human drivers. Just the number of people running red lights here in SV is incredible.
> We repeated this little dance for about two full minutes and the car never made it past the middle of the intersection. The two guys inside were laughing and punching stuff into a laptop.
Lol. This reminds me of commuting on my motorcycle in the rain (VFR800). Whenever I lay flat on the tank to avoid the rain a bit, every human-driven car thought I was about to rocket past them. I wonder if a similar thing could happen with a motorcycle. Someone on a sportbike can strike a much more speed-ready pose than someone on a bicycle. Perhaps a motorcycle looking like it was speeding into the intersection could convince the robot to panic brake?
There's a YouTube video linked in the article. Basically, stand off the seat, and balance while stationary by rocking forwards and backwards. Possible on a fixed gear bike because pedaling backwards moves the wheel backwards.
Also possible on a "normal" freewheeled bicycle. It just takes a bit of an incline and a bit more body English. Not super useful, but a neat trick, and good balance exercise.
Indeed. I can't do it -- well, I've never really tried -- but I saw a guy doing it this morning. His bike definitely had derailleurs. (Though hmm -- I suppose I don't know for an absolute fact that it had a freewheel. But I've never heard of a rear cluster without a freewheel.)
It's balancing in place on a bike. You turn the handlebars 45 degrees one way or the other, and find the balance point. Since the front wheel is turned, when you pedal forward you lean one way, and when you pedal backward you lean the other way.
It's sometimes employed in track cycling (where fixed gear bicycles are used), hence 'track stand'.
Standing still on a bicycle with both feet on the pedals.
The term comes from track racing, where jockeying for starting position by moving at a snail's pace and/or frequently stopping signals the start of the race.
Track stands are "almost moving" by design: every rock is supposed to be nearly the explosion that launches you ahead of the other guy for victory, fortune and glory. It's a bit like a sprinter's crouch - and as a driver, if I saw a sprinter's crouch by the roadside, I'd be cautious and confused.
Interesting. I didn't know the term "track stand" existed, but, yeah, as a cyclist myself, I know a human driver would have flipped me off after a few seconds and plowed through the intersection.
"Gartner placed autonomous vehicles at the peak of inflated expectations"
Sounds about right. I think that we all want to see self-driving cars, but I don't think we're ready for software with morality. If I'm driving a car filled with kids and I'm faced with the choice of running over the old lady or driving the kids off the cliff, I'm pretty confident in what choice I'm going to make. Is that the choice everyone would make? Is that the choice the programmer made? Do I have a morality switch in the car? Will it make value judgments about the occupants and the external world when forced to make a bad call?
It's going to be interesting times if these things end up on the road.
Yeah, there's always someone with a contrived answer like yours. If 1/10th the people die because the computer isn't perfect, aren't we much better off? Deaths are actually increasing this year: http://www.newsweek.com/us-traffic-deaths-injuries-and-relat...
"In 2010, there were an estimated 5,419,000 crashes (30,296 fatal crashes), killing 32,999 and injuring 2,239,000.[2] The 32,479 traffic fatalities in 2011 were the lowest in 62 years (1949). Records indicate that there has been a total of 3,551,332 motor vehicle deaths in the United States from 1899 to 2012."
We are much better off, of course, but that isn't enough to get the thing done. We also need to figure out how the "much better off" gets distributed, in a way that convinces everyone to actually do the thing.
If, for instance, Google is held liable for accidents while reducing the overall rate, self-driving cars aren't going to be a thing (unless you fix that particular problem). Or if the victims of self-driving car accidents have difficulty getting compensated.
Is my above scenario contrived? Yes. Is it demonstrative of the kind of ethical questions that are going to have to be answered by developers, rather than individuals? It absolutely is. Note that I'm not even dismissive of self-driving cars; I only point out that it's going to be INTERESTING if we ever see them.
The reality is that at some point, some where, some piece of software is going to have to make a very hard choice. The road will always have exposure to irrational actors (kids, drunks, a random person crashing their bike).
Are we at a point where we WANT software making that decision? If I have terminal cancer I might make a different choice than if I just had a baby and I was the human behind the wheel. Will we be able to inform the system that we're willing to commit self-sacrifice? Is there going to be weighting in the system based on the ages of those involved? What happens when someone decides to kill themselves by jumping in front of a self-driving car?
Let's forget about the legal battle that ensues the first time this happens. We will pretend that no lawyers will be allowed to be involved. The guy who wrote the code that said 4 kids > granny: do you tell him? Maybe you have to go back and tell him his code killed 4 kids, and NOT granny, and he has to fix it, because that was wrong? I saved 32,999 people in THEORY, but I know my code killed 4; they told me, it was a bug, I have to fix it now.
Let's look for a real-world example where we see some of this today. Train drivers kill people on a fairly regular basis. Sometimes it's because people choose to kill themselves, sometimes it really is an accident where there was no intent on the part of the person killed. Do you know what happens to those folks? Do you realize the impact these actions have on their lives? They certainly do; it's part of the job. They get training before they go out and drive, they get counseling after. For some of them IT DOESN'T HELP. Stats aren't people; they don't react the way we want or expect.
Shouldn't you get to pick the morality of the car you're in? Which will probably end up with everyone choosing to protect themselves. And if that causes too much externality for society (some kind of Prisoner's Dilemma where everyone trying to protect themselves is the worst outcome?), then we have regulation step in and regulate the morality of the systems.
It's not going to be easy, but I don't agree it's harder than the gazillion moral choices we already make as a society nowadays.
> harder than the gazillion moral choices we already make
I'm not so sure about that. The two closest things I can find that address something "remotely close" are the following, and they more or less drag the law into it (sadly), but the law is a sort of moral arbiter:
There is a quote in the Dudley and Stephens one that is rather interesting:
"We are often compelled to set up standards we cannot reach ourselves, and to lay down rules which we could not ourselves satisfy. But a man has no right to declare temptation to be an excuse, though he might himself have yielded to it, nor allow compassion for the criminal to change or weaken in any manner the legal definition of the crime."
That's for self driving car advocates to state, rather than simply raising the bogeyman of road deaths every time there is any commentary that detracts from their position.
The easiest answer to your argument is for self-driving car advocates to claim that it's the ONLY solution to reduce road deaths. Note that you are the one who claimed there is more than one solution.
Except that that is self-evidently false, and we're not talking about some meaningless argument in a vacuum, but one between humans who have experience of the world.
> Self driving cars are by no means the only solution to road deaths.
I asked you "what other solutions?" You're literally pointing at other, unnamed solutions as better alternatives for reducing road deaths; it is very reasonable to ask you to be more specific.
It is in no way up to driverless car advocates to complete an argument that you made.
It's fairly obvious that things like pedestrian detection systems with automatic braking, lane departure warnings, tiredness detection, etc. will make a significant difference as they become widespread. These exist in production today and similar technologies are in active development and deployment.
There is no reason to believe that human driven cars can't be made safe through technology without requiring self-driving. There may be other benefits to self-driving cars that are desirable, but as I said, the road deaths argument is clearly just fear mongering based on the false idea that other technological safety measures are not being developed.
As for costs and numbers - those certainly aren't provided by self driving car advocates, who can't even say when the technology will be available or how basic legal and economic problems it presents will be overcome.
Where do you live that little old ladies are jumping out in front of fast-moving traffic next to cliff edges that lack any safety barrier?!
The car is going to make its best effort to minimise impacts to itself; given there's a cliff for it to fall off, it's not going to swerve over it. Also, as said in a sister comment, while your car full of precious children goes over the cliff, you can find peace in the knowledge that on average fewer children have died because there are self-driving cars.
I agree that my scenario is contrived. It was meant to be illustrative of the actual ethical dilemmas we're going to face when this happens.
Does my car have ethics? If I'm terminal, can I tell my car that in a me-or-someone-else situation it should choose them? What other factors come into play if I'm not saying I'm willing? Is it age? Does my car have some sense of WHO is involved and make a decision based on some arbitrary stat? Does it flip a coin?
In theory 32k people live with self-driving cars. In theory 32k people live if we all stop driving. If you are given the choice of writing two programs, one that prevents cars from working at all or one that is going to kill four kids, which program are you going to write? There are a lot of things about this that aren't cut and dried at all. Like I said, it's going to be interesting if we ever get them.
It doesn't bother me actually. The point of the clearly contrived situation is to make one think about the ethics involved.
If I'm terminal with cancer, can I tell my car that when it comes down to me vs. them, I'm willing to sacrifice myself? What happens when the me-vs-them scenario is someone attempting to kill themselves by jumping in front of a car? How does the car resolve the conflict in a case where it has NO other parameters? Are we going to say "auto-sacrifice the car occupant"? Do we have the ability to make a guess based on the ages of those involved? How about the "value" of the people involved? Would you trust a self-driving car built by Bill Gates, in a scenario between you and Bill Gates, to choose you over Bill? Like I said, it's going to be interesting.
I'm having trouble imagining a scenario where I have enough time to consider who should live and who should die, make a decision, and act on it... but not enough time to just brake and avoid the accident.
Maybe autodrive cars, but I certainly do not want to see driverless cars. I want meat onboard. Licensed meat. Someone to get the tickets. Someone to sue when the robot does something stupid. Someone with a foot on the hydraulic braking system and mechanical steering for when the electrics fail.
I think you strongly overestimate how helpful having a human being in the car would be in any of those scenarios. If my mother is the driver and the power steering goes out, she might as well not be there. And we already have well established systems for giving out tickets to cars whose owners aren't physically in the car (license plates, registrations tied to VINs, etc.).
Really? How can a cop issue a Google-powered car a ticket for reckless driving? Are cops now to submit bug reports when they see cars making mistakes or misreading signs? They can churn out the tickets, but the likelihood of a ticket altering the machine's behavior is very slim. Now if a person is in the car, they can learn where the software falls down and adapt accordingly. If the robot doesn't spot the new speed limit sign, the human can.
I'm still waiting to see how these Google cars manage with hand gestures and people directing traffic. Just today I was waved through a red light by a volunteer firefighter not in any sort of uniform.
The actual problem will be that cops will have nothing to do and no ticket revenue to pay them with. We'd better figure out what to do about a 70% decline in cop workload before they come up with some bright ideas.
> Someone with a foot on the hydraulic braking system and mechanical steering for when the electrics fail
A huge number of accidents today are because of inattention or distraction. How are you going to get the bag of meat to pay attention in case of an anomalous situation if 99+% of the task is being taken care of for them?
Not sure if you missed the point of my question. Unexpected situations happen very fast while driving. If the human driver isn't sitting there ready to take over, they might as well not be there for that type of situation, and if they haven't had to do anything for the last half hour while driving, it's highly unlikely their eyes will consistently be on the road with hands ready to take over the controls.
Getting the human to take over from a failing autopilot is a fool's gambit. It takes the human thirty seconds to become as competent as a drunk driver, about a minute to get to normal competency levels.
The time it takes a human to regain competence suggests that the only way to safely hand over is for the robot to come to a complete halt before ceding control.
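To sketch what that could look like (purely hypothetical; the states and thresholds are my assumptions, not any real autopilot's design), a handover policy that never hands a disengaged human a moving car might be structured like this:

    # Hypothetical sketch: on autopilot failure the car first comes to a
    # controlled stop, and only cedes control once the human confirms they
    # are ready. States and thresholds are assumptions for illustration.

    from enum import Enum, auto

    class Mode(Enum):
        AUTONOMOUS = auto()
        STOPPING = auto()             # fault detected: decelerate to a halt
        STOPPED_AWAIT_HUMAN = auto()  # at rest, waiting for the driver
        MANUAL = auto()

    def next_mode(mode: Mode, autopilot_ok: bool, speed_mps: float,
                  human_ready: bool) -> Mode:
        if mode is Mode.AUTONOMOUS and not autopilot_ok:
            return Mode.STOPPING
        if mode is Mode.STOPPING and speed_mps <= 0.1:
            return Mode.STOPPED_AWAIT_HUMAN
        if mode is Mode.STOPPED_AWAIT_HUMAN and human_ready:
            return Mode.MANUAL        # control changes hands only at rest
        return mode

    # Example: a fault at 25 m/s puts the car into STOPPING, not MANUAL.
    print(next_mode(Mode.AUTONOMOUS, autopilot_ok=False,
                    speed_mps=25.0, human_ready=False))

The cost, of course, is that stopping in a live lane can itself be dangerous, which is part of why the handover problem is hard.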
Where is the meat operating Predator/Reaper/Global Hawk drones? Thousands of miles away, in an air-conditioned cargo container on the outskirts of Las Vegas.
Ok, is there a difference between "writing software for Global Hawk" and "writing software for self-driving cars"?
As humans, killing others is something we do, and based on history we're pretty good at it. We have the ability to justify our actions in some fairly ugly ways. There is a great quote from The Fog of War (check it out if you haven't seen it) on the firebombing of Japan in WWII:
"LeMay said, "If we'd lost the war, we'd all have been prosecuted as war criminals." And I think he's right. He, and I'd say I, were behaving as war criminals. LeMay recognized that what he was doing would be thought immoral if his side had lost. But what makes it immoral if you lose and not immoral if you win?"
So that's war, that's the folks who build the engines of war and send people out to die and kill. I would say that in writing a missile guidance system, you're writing it as "fit for purpose" and should KNOW and COMPREHEND what is going to happen when it is used.
Is that going to be the same mindset for someone building a self-driving car? Would I, or you, feel any better knowing that I had saved lives 'in theory' or 'statistically', only to know that I had caused actual deaths with my programming? I don't know how I would feel; I'm somewhat thankful that I'm not facing that dilemma in my current job.
I think you missed my point. Ignore for the moment what drones are used for (a pilot still makes the final decision to deploy ordnance).
My point was, software has advanced far enough that unmanned vehicles can fly on their own and navigate without human intervention. The X-47B (the Navy's "Salty Dog" UAV) can take off from an aircraft carrier, navigate to waypoints, refuel mid-air from a tanker, and land back on a carrier deck in the ocean without any human intervention.
It's not a stretch to think that self-driving cars are of similar complexity, and the problems around them will be solved in due time as well. There is no reason to have "meat" in the driver's seat anymore.
A drone isn't the same thing as a car with a person in it.
Look at the decision tree for the drone: if it comes to a choice between crashing into the big, expensive carrier or dumping itself in the sea, what's it going to choose? You can base this decision on dollars alone and still make the right call.
The self-driving car is going to face a genuine ethical dilemma at some point (or its coding will have to deal with one, as it isn't "self-aware"). For the sake of simplicity let's make that "occupant or pedestrian". Are the lives of equal value? How does it choose? Am I allowed to influence that choice as the "passenger" in the car? By getting in the car, do I have to agree to some TOS that says the car will sacrifice me to save someone else when it is unavoidable?
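Purely as a thought experiment, here is what it might look like if that "TOS" preference were ever reduced to data. Every field and rule below is hypothetical; I'm not claiming any real vehicle works this way, only showing how quickly the parameters run out.

    # Thought experiment only: an occupant's "sacrifice preference" as data.
    # Every field and rule here is hypothetical, not any real car's behavior.

    from dataclasses import dataclass

    @dataclass
    class OccupantPolicy:
        willing_to_self_sacrifice: bool = False  # e.g. the terminal-illness case
        # Deliberately not modeled: age, identity, or "value" of the other
        # party. Encoding those is exactly the ethical quagmire in question.

    def unavoidable_collision_choice(policy: OccupantPolicy) -> str:
        """Pick whom to protect when harming someone is unavoidable."""
        if policy.willing_to_self_sacrifice:
            return "protect_pedestrian"
        # With no other parameters, there is no principled tie-breaker left,
        # which is the point: someone has to decide, even if it's a coin flip.
        return "undefined_needs_a_policy_decision"

    print(unavoidable_collision_choice(OccupantPolicy(willing_to_self_sacrifice=True)))

Even in this toy version, someone had to decide what the default is and who gets to change it, which is the whole dilemma.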
I don't think the pilot makes that decision. They push the button according to what is essentially a decision flowchart. Their decision is based on evidence gathered by others, wanted lists drawn up by politicians, standards written by committees, and orders from commanders. They do not see the guy's face in the truck and match it to a wanted poster. (Even in that scenario the decision is partially made by the person who created the poster.) The pilot/operator may decide the exact moment the ordnance is released, but he is not the person making the decision to kill. That is a very shared responsibility.
Motorcycles are already 45 times more dangerous than cars. I have a hard time (even as an ex-motorcycle owner/rider) feeling bad for motorcycle riders who are worried about self-driving cars (which are already better drivers than humans).
Except that in many ways a motorcycle is no different from a bicycle, wheelchair, pedestrian, moped, horse and rider, or anyone else not wrapped in steel. Cars must share the road with non-cars.
Right. Google and several other tech companies are going to keep pouring vast amounts of money into self-driving tech (to ensure its ability to "share the road with non-cars"), and naysayers are going to yell angrily at clouds.
In this case, I appreciate the momentum behind self-driving vehicles. The sooner people are replaced, the better.
I see the momentum but question its direction. Why not start with driverless freight trains? That would seem a far more controlled environment, and some cities (Vancouver, Disney World) have been doing it with passengers for years. Then self-driving planes - they already have autopilots. When I see people lining up to board a pilotless 747, then I'll start trusting driverless cars not to flatten me from behind as I sit on my bike, helpless, at a red light.
I have yet to see a production car that doesn't have manual brakes, either hydraulic in the case of master-slave cylinders or mechanical in the form of parking/emergency brakes, that can be engaged without electricity. There might actually be a rule about that, else parked cars could start drifting away if their batteries run down.
But is there a model that, if you disconnect the battery, has no brakes? The few electronic parking brakes I've seen fail into the on position, much like truck trailers and train cars: they lock if they lose voltage/pressure from the controlling system. And even those cars had master cylinders under the pedal.
There are no brake-by-wire production cars in the USA. Infiniti sells the only steer-by-wire car. It uses clutches that engage a direct mechanical link if the electronics fail.