Could Shakespeare and Jane Austen Help Teach Robots Ethics?
Innovators are thinking about how robots might learn to think and act morally.
As the world enters into a new age of robotics, researchers are starting to think about how to teach artificial intelligence (AI) programs to think and act morally. How can robots receive an education in ethical complexity? Could they acquire what humans call a conscience?
A number of experts in the field of AI believe that lessons in ethics can be found in stories and literature. Scientists at the School of Interactive Computing at the Georgia Institute of Technology are developing a system to teach robots to learn from fictional characters. They call the system Quixote, The Guardian reports, after the “honorable but deluded Spanish gentleman who believed the world was exactly as depicted in the chivalric romances he loved reading.”
The system encourages robots to behave like the admirable characters in the tales they are told, whether it’s Henry V or Pride and Prejudice. The robots are trained to read stories, learn acceptable sequences of events and understand successful ways to behave in human societies.
Quixote places rewards on socially appropriate behavior. Essentially, the system learns that it will be rewarded when it acts like the protagonist in a story instead of like the antagonist, or randomly on its own.
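The reward mechanism described above can be sketched, very loosely, as a toy reinforcement signal: score an agent higher the more closely its actions follow the ordered sequence of events performed by a story's protagonist. This is only an illustration of the idea, assuming a simplified event-sequence model; the function and scenario names are hypothetical and not taken from the actual Quixote system.

```python
# Toy sketch of a story-shaped reward signal (illustrative only; not the
# actual Quixote implementation). A "story" is an ordered list of events
# the protagonist performs; an agent earns reward for matching that order
# and is penalized for each deviation.

def story_reward(agent_actions, protagonist_actions):
    """Return +1 for each agent action that continues the protagonist's
    sequence in order, and -1 for each action that deviates from it."""
    reward = 0
    expected = iter(protagonist_actions)
    next_expected = next(expected, None)
    for action in agent_actions:
        if action == next_expected:
            reward += 1
            next_expected = next(expected, None)
        else:
            reward -= 1
    return reward

# Hypothetical errand scenario: the "protagonist" waits in line and pays
# before leaving, while a shortcut-taking agent grabs the item and runs.
story = ["wait_in_line", "pay", "take_item", "leave"]
polite = ["wait_in_line", "pay", "take_item", "leave"]
rude = ["take_item", "leave"]

print(story_reward(polite, story))  # 4: every event matches the story
print(story_reward(rude, story))    # -2: no event matches in order
```

Under this kind of scheme, an agent maximizing reward is nudged toward protagonist-like behavior rather than antagonist-like or random behavior, which is the intuition behind the system.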
Researcher Mark Riedl says that the system is a “primitive first step toward general moral reasoning in AI” and will be best suited to robots that have a limited purpose but need to interact with humans to achieve it.
However, The Guardian points out that not all stories will make sense to robots. Fiction and drama contain stories of time travel, magic, wizards and monsters. Though there are some cultural myths and legends that teach important lessons, oftentimes much of the story is told through the supernatural or the plot is just a little too far-fetched, something the robots will not understand.
But sometimes the message of a literary tale is clear even when no single character is worth emulating, as in stories from The Iliad to The Talented Mr. Ripley. What, then, does the robot learn?
The author of The Guardian article, John Mullan, is a professor of English literature at University College London. He is skeptical of Quixote because, he says, the people who are paid to spend their careers reading and rereading the world’s greatest literary masterpieces—like himself—are no more morally upright, socially skilled, or psychologically adept than anyone else.