After a year like 2020, it can feel like the world is transforming into a gigantic Black Mirror episode. That feeling may be closer to reality than you think, thanks to a recent Microsoft patent.
In December 2020, the U.S. Patent and Trademark Office granted Microsoft a patent that details a process for creating a chatbot based on a person’s social media data. According to the patent, even a deceased friend or family member could be resurrected as a chatbot.
Popular Mechanics noted the patent’s resemblance to “Be Right Back,” a Season 2 episode of Black Mirror.
In the episode, the protagonist’s husband dies in a car crash. Shortly after, she signs up for a service that creates a robot of her husband from his social media data.
Microsoft’s patent sounds eerily similar.
According to the patent, the chatbot would have a personality derived from “images, voice data, social media posts, electronic messages, and written letters.” The patent also describes an option to input a personality into the chatbot manually.
Microsoft’s chatbot raises a number of ethical questions. Anyone on the internet could gather enough social media data on a famous actress, for example, to impersonate her with a chatbot.
The ability to resurrect a deceased partner or family member as a chatbot is of special concern.
Business Insider writes that the patent could run into legal trouble, as 23 states recognize postmortem rights, which grant “a deceased person anywhere from 10 to 100 years of protection from unauthorized use of their identity.”
These rights could come into play for anyone trying to replicate a loved one, depending on the state in which the deceased lived.
Jordan Wales, a theology professor who has done research into artificial intelligence, has his reservations about Microsoft’s chatbot.
“It would prevent someone from going through the healthy process of grief. In terms of human flourishing and psychology, it’s pretty obvious that substitutes for grief don’t work,” Wales said.
A bot of a deceased loved one, Wales claimed, would simply be a “rebound robot.”
“There would also be a subtle instrumentalization of the other person. If the dead or absent person is unable to provide what I want, there’s a risk of erosion of the relationship. It is a mode of self-soothing that utilizes the apparent presence of the other that ends up reducing the person,” Wales said.
Wales argued that creating a chatbot based on a deceased loved one reflects a fundamental misunderstanding of grief.
“Not understanding grief prevents one from making the final gift of self to the loved one which is the acceptance of the loved one’s passing,” Wales said. “Everyone has to come to maturation. Having the chatbot for the person who died, one is not giving oneself to the person, one is clinging to the person.”