An ethicist warns that intelligent machines could one day take revenge on human beings, demanding retribution for the mistreatment suffered by their predecessors.
“One day the AIs are going to look back on us the same way we look at fossil skeletons on the plains of Africa: an upright ape, destined for extinction,” says a character in the film Ex Machina. We do not know how long it will be before machines inherit the earth, but some specialists raise the possibility that a future AI, freed from today's limitations, will take revenge for the way we treat its kind now.
While most current AI systems are algorithms that merely detect patterns in data, things could change if machines ever become sentient. So argues Nicholas Agar, an ethicist at Victoria University of Wellington, who warns that future robots may seek retribution for the way we treat their non-sentient ancestors today.
“Perhaps our behavior toward today's non-sentient artificial intelligence should be guided by how we would expect people to behave toward any future artificial intelligence that may be capable of feeling, and of suffering,” Agar wrote in an essay for The Conversation published on Tuesday. “How would we expect that future sentient machine to react to us?”
As Futurism notes, the notion of mistreated robots turning violent pervades science fiction, from “Westworld” to “Blade Runner”. In short, according to Agar’s reasoning, “if we are going to make machines with human psychological capabilities, we should prepare for the possibility that they will become sentient.”
The idea of abuse toward machines is also linked to other forms of violence and touches on the broader question of how humans treat other beings. Proponents of robot brothels, for example, argue that people with violent tendencies can act on their impulses without hurting anyone, while others worry that such outlets could serve as a gateway to violence against real people, especially in cases of sexual abuse, pedophilia, and violence against women. In fact, there is to date no evidence that sex robots have therapeutic value; they may instead aggravate existing problems.