
After a year like 2020, it can feel like the world is transforming into a gigantic Black Mirror episode. Thanks to a recent Microsoft patent, that feeling could very well become reality.

In December 2020, the U.S. Patent and Trademark Office granted Microsoft a patent that details a process for creating a chatbot based on a person's social media data. Even a deceased friend or family member could be resurrected as a chatbot, the patent reads.

Popular Mechanics noted the patent's similarity to "Be Right Back," a Season 2 episode of Black Mirror.

In the episode, the protagonist’s husband dies in a car crash. Shortly after, she signs up for a service that creates a robot of her husband from his social media data.
Microsoft’s patent sounds eerily similar. 

According to the patent, the chatbot would have a personality derived from "images, voice data, social media posts, electronic messages, and written letters." There is also an option to manually input the chatbot's personality yourself.

Microsoft's chatbot opens up a number of ethical questions and concerns. Anyone on the internet could gather enough social media data on a famous actress to impersonate her with a chatbot.

The ability to resurrect a deceased partner or family member as a chatbot is of special concern.

Business Insider writes that the patent would run into several issues, as 23 states recognize postmortem rights. These rights grant "a deceased person anywhere from 10 to 100 years of protection from unauthorized use of their identity."

These rights would come into question if someone tried to replicate a loved one, depending on the state in which that person lived.

Jordan Wales, a theology professor who has researched artificial intelligence, has reservations about Microsoft's chatbot.

"It would prevent someone from going through the healthy process of grief. In terms of human flourishing and psychology, it's pretty obvious that substitutes for grief don't work," Wales said.

A bot of a deceased loved one, Wales claimed, would simply be a “rebound robot.”

"There would also be a subtle instrumentalization of the other person. If the dead or absent person is unable to provide what I want, there's a risk of erosion of the relationship. It is a mode of self-soothing that utilizes the apparent presence of the other that ends up reducing the person," Wales said.

Wales argued that creating a chatbot based on a deceased loved one shows a fundamental misunderstanding of grief.

"Not understanding grief prevents one from making the final gift of self to the loved one, which is the acceptance of the loved one's passing," Wales said. "Everyone has to come to maturation. Having the chatbot for the person who died, one is not giving oneself to the person, one is clinging to the person."