
Do humans have moral obligations to robots?

Published: Tuesday, Oct. 9, 2012 4:16 p.m. CDT

DeKALB – On the topic of computers, artificial intelligence and robots, Northern Illinois University Professor David Gunkel says science fiction is fast becoming “science fact.”

While robotic personifications are still the stuff of fiction, the issues they raise have never been more relevant than they are today, said Gunkel, an NIU Presidential Teaching Professor in the Department of Communication.

In his new book, “The Machine Question: Critical Perspectives on AI, Robots, and Ethics,” Gunkel ratchets up the debate over whether and to what extent intelligent and autonomous machines can have legitimate moral responsibilities and claims to moral treatment.

Ethics is traditionally understood as a matter of responsibility for and in the face of another, presumably another person. But Gunkel, who holds a Ph.D. in philosophy, notes this cornerstone of modern ethical thought has been significantly challenged, most visibly by animal rights activists but increasingly by those at the cutting edge of technology.

“If we admit the animal should have moral consideration, we need to think seriously about the machine,” Gunkel said. “It is really the next step in terms of looking at the nonhuman other.”

Gunkel points out that real decision-making machines are ensconced in daily life. Machines are trading stocks, deciding whether you’re creditworthy and conducting clandestine drone missions overseas.

“It’s getting more difficult to distinguish whether we’re talking to a human or to a machine,” Gunkel said.

“In fact, the majority of activity on the Internet is machine traffic – that is, machine to machine. Machines have taken over. It has happened.”

Some machines even have the ability to become smarter, raising questions over who is responsible for their actions. Programmers could be viewed as parents, Gunkel said, who are no longer responsible for the decisions and innovations the machine arrives at on its own.

Some governments are beginning to address such ethical dilemmas. South Korea created a code of ethics to prevent human abuse of robots and vice versa. Japan’s Ministry of Economy, Trade and Industry is purportedly working on a code of behavior for robots, especially those employed in the elder-care industry.

Gunkel said he was inspired to write “The Machine Question” because engineers and scientists are increasingly bumping up against important ethical questions related to machines.
