Are armed drones a violation of Asimov's Three Laws of Robotics?
13 Answers
The Three Laws of Robotics, as laid out by Asimov, are fiction. They have no bearing on reality.
Good question. I don’t think they violate those laws, because the drones don’t do any thinking of their own. No artificial intelligence is involved; they are controlled by a human who directs their path.
If a drone is ever controlled by any type of intelligent software (AI), then the Three Laws, IMHO, should apply. They probably wouldn’t, though, for the reasons @canidmajor cites above.
Well, the Three Laws apply to robots that are more or less completely autonomous, or, in many cases in Asimov’s work, sentient. Drones, however, are basically remote-control planes.
Drones don’t kill people; humans piloting drones kill people.
@canidmajor Asimov’s rules may have been fictional, but reality is catching up fast.
@flutherother: My post states that the Three Laws are fiction, not the advances in robotics. There has been a lot of talk about concerns over the development of AI and how it would relate to humans, but to my knowledge (which is not absolute, I get that) no one has been programming Three Laws restrictions into AI attempts. And wouldn’t that essentially negate AI parameters?
As far as I know, which isn’t very far, no one has been programming AI to ensure humans can’t be hurt, but it’s something that will have to be borne in mind as AI develops. Asimov’s fiction is becoming our reality.
I think there is a fundamental disconnect between true AI and programming of specific values.
If you’re interested, a realistic take on this concept was presented in the film Ex Machina. It’s worth the watch no matter how you feel about this topic.
It was one of my favourite films of the year.
PS: AI has just beaten the world’s best player at Go. There are four games still to play.
Asimov’s “Laws” are a fine idea, but it’s not as if any society ever agreed on them. Given that a lot of AI research is funded by various militaries, there will be a lot of AI that harms.
You can call the Laws of Robotics rules, but who says anyone is playing by the rules?
I loved that film too, creepy ending.
AI has come a lot farther than many people realize, but it’s still way short of what most people consider “AI.”
No. For one thing, armed drones are not robots in any sense of the word.
Although unmanned, a drone is still piloted electronically by army personnel from a military base. There is still human command and control, and the decision to fire (and kill) is still a human one.
A robot, in contrast, is self-controlled and works according to its own thought processes.
The key fallacy with your question is to think that drones are robots. They are not.