
Although I have upvoted you, it is well known that humans are better prepared to face new dangers than software is. A quadcopter flying over a city is exposed to a multitude of unknown hazards that its software may not anticipate, so it's plausible that in the same dangerous situation a human would find better solutions than a quadcopter.



You are right that humans are better when it comes to adapting to new situations. But that doesn't matter. What matters is risk vs reward. For automobiles, we have about 40,000 deaths each year. But as a society we accept that cost because the utility of cars is worth even more. The same thing will eventually happen with UAVs. The utility will offset the cost.

-----


I'm not saying that I don't want to see UAVs flying over my head. I'm just saying that maybe a bit of over-prudence is justified here.

Should we start testing UAVs for commercial applications? Yes. Should we allow 13-year-old kids to write code for them and build their cool new pizza-delivery service? Not yet, I think. Just my $0.02.

-----


What would you like to see in terms of regulations? (this is a sincere question)

At the moment, a 13-year-old can legally fly a radio-controlled F-15, so long as it's for recreation and not for commercial use.

-----


Currently, the deaths associated with a great many UAVs appear to be a bug, not a feature.

I'm talking, of course, about the 'killer' app: assassination!

e.g. https://en.wikipedia.org/wiki/Predator_drone

-----


The solution to that is good regulation, not complete banning. While software has the disadvantage of being unable to adapt on the fly, it has the advantage of never making mistakes - if it's written correctly, it will not randomly fail. That's decidedly not the case for humans, as we see every day with automobile accidents.

-----


Software is only a part of it. I deal every day with software that doesn't randomly fail -- we write damn good code -- but the hardware it controls has failure modes that can't always be predicted, or are so unlikely that no one has thought of them.

Realize here that we're not talking about software running on a server in a nice temperature-controlled room. This is software controlling hardware that is under constant vibration and will get sticky, bend, break, or ice up under variable conditions: hot, dry, rainy, with wind gusts as it goes from behind a building into an intersection. There is a mind-boggling number of things that can go wrong when you're controlling a device in the 'real world.'

Even if the software is perfect, there are still a large number of variables to account for, and most of them can't be controlled. There's a case for UAVs, and I would love to get involved with them, but building a reliable UAV and properly maintaining it would almost certainly cost too much to have it deliver tacos, unless you're willing to spend $250 to avoid walking a few blocks.

-----


If it's written correctly [...]

No one argues that this is untrue; it's the premise that's unlikely to hold.

-----


Well, virtually all modern cars have software running critical systems. How is that regulated? I don't think it would be too difficult to adapt those regulations to flying "tacocopters".

-----


One of the most basic safety measures cars take is to reboot critical systems several times an hour and to have mechanical backups, so the brakes both work and can overpower the engine. You can't exactly do that with a drone.

-----


I am not aware of a single automotive subsystem controller (or any other embedded system, for that matter) that reboots as a preventive measure. For handling an unrecoverable error, yes, that's standard practice. But rebooting in an attempt to prevent errors? That screams bad design.

Can you offer more details?

-----



