Scientists tackle social effects of closer human-robot interactions
ERIC Horvitz illustrates the potential dilemmas of living with robots by telling the story of how he once got stuck in an elevator at Stanford Hospital with a droid the size of a washing machine.
"I remembered thinking, 'Whoa, this is scary,' as it whirled around, almost knocking me down," the Microsoft researcher recalls. "Then, I thought, 'What if I were a patient?' There could be big issues here."
We're still far from the sci-fi dream of having robots whirring about and catering to our every need. But little by little, we'll be sharing more of our space with robots in the next decade, as prices drop and new technology creates specialized machines that clean up spilled milk or even provide comfort for an elderly parent.
Now scientists and legal scholars are exploring the likely effects. What happens if a robot crushes your foot, chases your cat off a ledge or smacks your baby? While experts don't expect a band of Terminators to attack or a "2001: A Space Odyssey" computer that takes control, even simpler, benign robots will have legal, social and ethical consequences.
"As we rely more and more on automated systems, we have to think of the implications. It is part of being a responsible scientist," Horvitz says.
Horvitz assembled a team of scientists this year when he was president of the Association for the Advancement of Artificial Intelligence and asked them to explore the future of human-robot interactions. A report on their discussions is due next year.
For years, robots have been used outside the home. They detect bombs on the battlefield, build cars in factories, and deliver supplies and visit patients in hospitals.
But the past few years have seen the rise of home robots. Mainly they are used for tasks like vacuuming (think Roomba). There are also robotic lawn mowers, duct cleaners, surveillance systems and alarm clocks.
There are robotic toys for entertainment, such as Furby. Robotic companions, like Paro the harbor seal, comfort the elderly. By 2015, personal robot sales in the US will exceed US$5 billion, more than quadrupling what they are now, according to ABI Research, which analyzes technology trends.
"You won't see Rosie from 'The Jetsons,' but you're going to see more and more robots that help maintain your home. They'll pick up stuff off the floor, stock your fridge, carry stuff from the car," says Colin Angle, CEO of iRobot Corp, which makes the Roomba.
As such 'bots become more sophisticated, they could complicate questions about product liability. Ryan Calo, a fellow with Stanford's Center for Internet and Society, pointed out in a recent panel discussion at Stanford Law School that the original manufacturer might not always be liable if a robot went haywire.
"Robots are not just things the manufacturer builds and you go out and use them in a specific way. Robots can often be instructed, they can be programmed, you can have software that is built upon by others," he says.
There are no laws in the US specifically governing robots, and discussion of them usually leads to science fiction writer Isaac Asimov's Three Laws of Robotics, which debuted in his 1942 short story "Runaround."
The first of Asimov's laws is that robots should do no harm. It's also one of the biggest considerations for makers of the next generation of personal robots.
"If a robot becomes increasingly autonomous and can make its own decisions, what happens if the robot does not carry out the exact wishes of the person?" says George Bekey, a robotics researcher and professor emeritus at University of Southern California.
As robots interact more closely with people, the bonds some people form with the machines - even ones that do not look like humans - might need to be considered.
Shoppers personalize their Roombas, naming and decorating them, for example. Angle recalled an incident in which a soldier plucked a banged-up military robot nicknamed Scooby from an Iraqi battlefield and carried it to a depot to be fixed.
"It's doing you a service, you're going to get attached to it," Angle says.
Ronald Arkin teaches a course on robots and society at Georgia Tech and directs the school's Mobile Robot Laboratory. His most recent book is titled "Governing Lethal Behavior in Autonomous Robots."
"There needs to be ethics embedded in the systems," he says. "It's not just making a system that assists someone. It's making a system that interacts with someone in a way that respects their dignity."
Horvitz says his panel will recommend more research into the psychological reactions humans have to robotic systems. The group, he says, also suggests machines be designed with the ability to explain their reasoning to humans.
"I remembered thinking, 'Whoa, this is scary,' as it whirled around, almost knocking me down," the Microsoft researcher recalls. "Then, I thought, 'What if I were a patient?' There could be big issues here."
We're still far from the sci-fi dream of having robots whirring about and catering to our every need. But little by little, we'll be sharing more of our space with robots in the next decade, as prices drop and new technology creates specialized machines that clean up spilled milk or even provide comfort for an elderly parent.
Now scientists and legal scholars are exploring the likely effects. What happens if a robot crushes your foot, chases your cat off a ledge or smacks your baby? While experts don't expect a band of Terminators to attack or a "2001: A Space Odyssey" computer that takes control, even simpler, benign robots will have legal, social and ethical consequences.
"As we rely more and more on automated systems, we have to think of the implications. It is part of being a responsible scientist," Horvitz says.
Horvitz assembled a team of scientists this year when he was president of the Association for the Advancement of Artificial Intelligence and asked them to explore the future of human-robot interactions. A report on their discussions is due next year.
For years, robots have been used outside the home. They detect bombs on the battleground, build cars in factories and deliver supplies and visit patients in hospitals.
But the past few years have seen the rise of home robots. Mainly they are used for tasks like vacuuming (think Roomba). There are also robotic lawn mowers, duct cleaners, surveillance systems and alarm clocks.
There are robotic toys for entertainment, such as Furby. Robotic companions, like Paro the harbor seal, comfort the elderly. By 2015, personal robot sales in the US will exceed US$5 billion, more than quadrupling what they are now, according to ABI Research, which analyzes technology trends.
"You won't see Rosie from 'The Jetsons,' but you're going to see more and more robots that help maintain your home. They'll pick up stuff off the floor, stock your fridge, carry stuff from the car," says Colin Angle, CEO of iRobot Corp, which makes the Roomba.
As such 'bots become more sophisticated, they could complicate questions about product liability. Ryan Calo, a fellow with Stanford's Center for Internet and Society, pointed out in a recent panel discussion at Stanford Law School that the original manufacturer might not always be liable if a robot went haywire.
"Robots are not just things the manufacturer builds and you go out and use them in a specific way. Robots can often be instructed, they can be programmed, you can have software that is built upon by others," he says.
There are no laws in the US specifically governing robots, and discussion of them usually leads to science fiction writer Isaac Asimov's Three Laws of Robotics, which debuted in his 1942 short story "Runaround."
The first of Asimov's laws is that robots should do no harm. It's also one of the biggest considerations when manufacturing the next generation of personal robots.
"If a robot becomes increasingly autonomous and can make its own decisions, what happens if the robot does not carry out the exact wishes of the person?" says George Bekey, a robotics researcher and professor emeritus at University of Southern California.
As robots interact more closely with people, the bonds some people form with the machines - even ones that do not look like humans - might need to be considered.
Shoppers personalize their Roombas, naming and decorating them, for example. Angle recalled an incident when a soldier plucked a banged-up military robot nicknamed Scooby from an Iraqi battlefield and carried it to a depot to be fixed.
"It's doing you a service, you're going to get attached to it," Angle says.
Ronald Arkin teaches a course on robots and society at Georgia Tech and directs the school's Mobile Robot Laboratory. His most recent book is titled "Governing Lethal Behavior in Autonomous Robots."
"There needs to be ethics embedded in the systems," he says. "It's not just making a system that assists someone. It's making a system that interacts with someone in a way that respects their dignity."
Horvitz says his panel will recommend more research into the psychological reactions humans have to robotic systems. The group, he says, also suggests machines be designed with the ability to explain their reasoning to humans.