Security IT

How Dangerous Could a Hacked Robot Possibly Be?

Posted by CmdrTaco
from the i-for-one-welcome-DELETED dept.
alphadogg writes "Researchers at the University of Washington think it's finally time to start paying serious attention to the question of robot security. Not because they think robots are about to go all Terminator on us, but because robots can already be used to spy on us and vandalize our homes. In a paper published Thursday, the researchers took a close look at three test robots: the Erector Spykee and WowWee's RoboSapien and Rovio. They found that security is pretty much an afterthought in the current crop of robotic devices. 'We were shocked at how easy it was to actually compromise some of these robots,' said Tadayoshi Kohno, a University of Washington assistant professor who co-authored the paper."
  • by IDtheTarget (1055608) on Thursday October 08, 2009 @10:07AM (#29681393)
    I'm not at liberty to get into details, but suffice it to say that Predators and Reapers utilize NSA-provided security features that were incorporated into the design and are effective. While nothing is impossible, IMHO it is vanishingly unlikely that control of either of these devices could be wrested away from the appointed controller. Jammed, yes. Hacked, no.
  • Re:Beware of robots (Score:3, Informative)

    by The Evil Couch (621105) on Thursday October 08, 2009 @10:45AM (#29681865) Homepage
    That is incorrect. Shoving is the answer. Shoving will protect you from the terrible secret of space.
  • by _Sprocket_ (42527) on Thursday October 08, 2009 @11:03AM (#29682111)

    This meme has to stop. No, his stories weren't about how to subvert the 3 laws. The stories were about how robots were used by humans, who manipulated the robots into performing malicious acts without breaking those laws. There is a subtle difference. And thanks to the diligence of Elijah Bailey or Wendell Urth, the humans responsible were *always* caught, because the 3 laws defined the behaviour of the robots in such a dependable manner.

    Not all the issues with the three laws were about manipulation. There were times when the robots fell into undesired behavior due to the 3 laws entirely of their own accord. Two examples come to mind.

    The first is when Powell and Donovan are assigned to revitalize a mining operation on Mercury (Runaround). One of their robots is given a simple instruction. However, they soon find it behaving in an erratic manner, and thus the mystery is set. It turns out the robot set out to follow the initial order (second law: a robot must obey any orders given to it by human beings) but then found that fulfilling that order would violate another law (third law: a robot must protect its own existence). The robot's behavior came from following the 2nd law until the 3rd law came into conflict, at which point it would retreat until the 2nd law took over again. The humans had to invoke the 1st law (a robot may not injure a human being or, through inaction, allow a human being to come to harm) to finally break it out of its cycle.

    A second example is Dr. Calvin's analysis of a telepathic robot (Liar!). The telepathic ability is an unexplained anomaly, but the humans interacting with the robot soon find it advantageous, as the robot can tell them all manner of information about the people around them. Unfortunately for Calvin's social situation, the robot is also able to determine what people want to hear. The robot concludes that telling a human the lie they want to hear avoids the harm of telling them a distressing truth. This behavior ends up putting Calvin in an uncomfortable social situation until she gets her revenge by pointing out to the robot that its attempts to avoid hurting a human by lying had ended up hurting a human, causing a logical paradox and destroying the robot's mind.

  • by Kell Bengal (711123) on Thursday October 08, 2009 @11:39AM (#29682593)
    I'm going to pull out the Yes-I-make-robots-for-a-living-card here and tell you that both your points are quite untrue. Firstly, hacking robot code is not just a case of saying "Do Y, then do X" - I'm sorry, but it doesn't work that way, especially if you have something like cascading vision systems and sensor fusion.

    Software, and robot software in particular, is extremely brittle - you muck up one little bit and it doesn't go haywire, it just falls in a heap and does very little at all. The level of cognition required to, say, determine that it's unsafe vs safe is no different from that required to determine if it's safe vs unsafe. Maybe if you know the code well enough to slip a single '!' into the test, sure you could do something like what you suggest, but you've still got to be smart enough to know where in the code that is, /and/ be able to remotely modify that code in the first place. Actually, the best place to make malicious code changes would be in a UAV, where doing nothing at all is as good as sending the "Halt; scatter internals over a wide area" command.

    The market is most definitely open to anyone. I've been to robotics expos geared to military customers in particular (in fact, I was at one in Boston recently) where everyone from Raytheon to a backyard operator building recon bots from modified RC trucks was present. If you've got a better mousetrap you can definitely sell it - if not to the government, then at least to a huge corporation (who might then give you a job!).
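The single-'!' scenario in the comment above can be sketched in a few lines. This is a hypothetical toy safety gate, not code from any actual robot: the function names, the distance parameter, and the 0.5 m clearance threshold are all invented for illustration. The point is only that one dropped negation turns "halt when unsafe" into "halt when safe".

```python
def safe_to_move(obstacle_distance_m: float, min_clearance_m: float = 0.5) -> bool:
    """Toy safety check: True when the path ahead is clear."""
    return obstacle_distance_m >= min_clearance_m

def motion_command(obstacle_distance_m: float) -> str:
    # Intended logic: stop unless the safety check passes.
    if not safe_to_move(obstacle_distance_m):
        return "HALT"
    return "PROCEED"

def tampered_motion_command(obstacle_distance_m: float) -> str:
    # Same code with the single negation removed: the robot now
    # proceeds exactly when it is unsafe to do so.
    if safe_to_move(obstacle_distance_m):
        return "HALT"
    return "PROCEED"
```

As the commenter notes, the hard part for an attacker isn't the edit itself but knowing where in the code this test lives and having remote write access to change it.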
