In Prof. Tegmark’s recent presentation at the UN he mentioned the possibility of extremely cheap drones that approach the victim's face very quickly and pierce a bolt into their brain through one of their eyes. Such a drone wouldn't require high-precision projectiles, which would make it cheap to build.
> Of course the reason governments don't do this is because almost nobody sees the risks of AI "acting on its own"…
It is nearly impossible to enforce something like this globally and forever. At best, it would be a near-term solution, especially since there is a huge military and economic interest in technology and AI. Quite possibly, the only long-term solution is solving the control problem.
That's super impractical for many reasons. Drones don't move fast enough, nor can they change their velocity quickly enough to do that. People will almost certainly react if they see the drone coming for them, and cover their face, or swat it down, etc.
But even if it did work, it's not a serious threat. For some reason people spend a ton of time thinking about all the ways new technologies could be abused by terrorists, yet never consider that tons of existing technologies can be abused too.
Many people have come up with really effective terrorist ideas that would kill lots of people, or do lots of damage. The reason the world is still here is because terrorists are really rare, and really incompetent.
> Short range slingshot mechanisms are several orders of magnitude cheaper to build than firearms.
Not necessarily. It's actually not that difficult to make a firearm from simple tools and parts from a hardware store. And it will be way more deadly and accurate than a sling. Not to mention simple pipe bombs and stuff.
I think what's new about this kind of weapon is that it can be controlled remotely, or can even operate semi-autonomously. Deadly pipe bombs are certainly heavier than a crossbow, and ignition mechanisms aren't trivial to build.
It's not much more of a threat to society than a handgun. A WMD it is not, unless you make a lot of these and launch them at the same time, which is probably less effective than an H-bomb. (The one major difference between such a drone and a handgun is you might be able to target politicians and other people who're hard to shoot; a somewhat higher mortality rate among politicians is hardly a huge threat to society though.)
> there is a huge military and economic interest in technology and AI
There's also a huge interest in nuclear energy, and it doesn't follow that a consumer should be, or is able to, own a nuclear reactor. If anyone took the dangers seriously, it wouldn't be that hard to at least limit what consumer devices can do. Right now a single botnet made from infected consumer PCs has more raw computing power than the biggest tech company's server farm, which is ridiculous if you think of rogue AI as an existential threat. Actually it's ridiculous even if you think of "properly controlled" AI in the wrong hands as an existential threat; the botnet could run an AI serving the interests of an evil programmer. Nobody cares, because nobody has ever seen an AI that doesn't fail the Turing test in its first couple of minutes, much less come up with anything that would be an existential threat to any society.
These drones could be programmed to target specific groups of people, for example of a certain ethnicity, and attack them almost autonomously. Short-range slingshot mechanisms are several orders of magnitude cheaper to build than firearms. Moreover, the inhibition threshold is much lower if you are not involved in first-hand violence. There is also a much lower risk of getting busted, and no need for intricate escape planning.
You got me thinking, googling, then frowning. Imagine this https://www.youtube.com/watch?v=crzXD6NjBAE mil-specced.
Are these drones also self-replicating and fully independent?