In various countries around the world, so-called deadbots are gaining popularity: AI-powered chatbots that mimic the speech patterns and character traits of deceased people using the digital traces they leave behind. Experts warn, however, that such simulators can be used by malicious actors to manipulate users. Read the Izvestia article to learn who creates "deadbots" and why, and what risks such services carry.
Digital Resurrection
Experts in AI ethics from the Leverhulme Centre for the Future of Intelligence at the University of Cambridge (UK) recently decided to study how "deadbots" (also known as griefbots. - Ed.) are used. Some companies in Western countries and in China are already launching such chatbots, offering a fundamentally new kind of "post-mortem presence".
On the one hand, artificial intelligence allows users to conduct text and voice conversations with their deceased loved ones, thus keeping in touch with them. On the other hand, according to researchers, such chatbots can cause psychological harm to users.
In particular, interested companies and bad actors could use these simulators for covert product advertising or to manipulate users by telling them that their deceased relatives are still with them.
"People can develop strong emotional connections with such simulators, which makes them particularly vulnerable to manipulation," said co-author Dr. Tomasz Hollanek. — We recommend that you develop protocols that prevent the use of "dead bots" for disrespectful purposes, such as advertising or active presence in social networks.
In the wake of the coronavirus
As Olga Peskova, a professor at the HSE School of Communications, told Izvestia, there are now a number of platforms that, for a reasonable fee, let you animate images of deceased people. One example is Project December, the world's first system of this kind, which uses patented neural network technology to simulate both images of the deceased and text dialogues with them.
"Thanks to Project December, Canadian writer Joshua Barbeau "revived" his fiancee, who died at the age of 23, "says the expert." To do this, he painted the role of a bot and uploaded recordings to recreate the character and communication style of the departed girlfriend with a virtual interlocutor. According to Project December, the more materials you send to the deadbot, the more accurate the image will be.
According to Olga Peskova, the genealogy site MyHeritage introduced a similar service in 2021 under the name Deep Nostalgia. It creates animated videos from photos of deceased relatives. The service immediately went viral, but many users found the experience creepy and immoral.
In 2017, Yevgenia Kuida, a Russian who had moved to Silicon Valley (USA), introduced the Replika chatbot, originally conceived as a virtual assistant. Over time it also began to help people communicate with deceased relatives. Yevgenia herself was the first to "resurrect" someone: after the sudden death of her best friend Roman Mazurenko, she created a kind of "digital monument" to him.
"During the COVID-19 pandemic, the number of Replika users exceeded 10 million worldwide, helping people survive isolation and grief," notes Olga Peskova. — At first, "deadbots" looked like a kind of entertainment, like services for creating deepfakes, but over time they changed their positioning. They have advertised themselves as therapeutic services that help people survive loss, and help children communicate with their deceased parents.
Russian perspectives
The trend of creating "deadbots" has not yet reached Russia. As Olga Peskova explains, other experiments are being conducted in the country for now: for example, the famous actor Yuri Nikulin has been "revived" with the help of artificial intelligence and will appear as a digital avatar in a new family comedy. Still, experts do not rule out the development of full-fledged chatbots.
"At the moment, such services are not widely used in Russia, but given the global interest in AI and digital technologies, we can assume that they may eventually appear on the Russian market," says Alexandra Vereteno, senior lecturer at the Faculty of Management of St. Petersburg State University of Economics (SPbSEU), an expert in the field of neural networks..
Dmitry Ovchinnikov, chief specialist of the integrated information security systems department at Gazinformservis, agrees that such bots will eventually appear in Russia. However, he doubts such services will become popular in the country, since Russians on the whole take the death of loved ones more soberly and do not try to "resurrect" the dead by replaying voice recordings. According to Ovchinnikov, "deadbots" are therefore unlikely to catch on with the general population; rather, they will become a "toy" for the wealthy.
At the same time, as such services spread, questions about the safety of using them are becoming more pressing, which is exactly what the University of Cambridge experts warned about. One of these questions is ethical and moral. According to Olga Peskova, children, who may grow accustomed to talking with virtual dead people, need to be protected from such bots.
"Adults also form a psychological attachment," the Izvestia interlocutor notes. "In that case, after giving up the 'deadbot' they experience an aftershock, a 'second' death. It is worth recalling the many emotional breakdowns among adolescents over dead Tamagotchi pets, which functionally resemble the interlocutor created by a neural network."
Risk factors
Emotional pain and, in a pessimistic scenario, even psychological disorders caused by unhealthy attachment or by the shutdown of a "deadbot" are the main risks of using such services, Olga Peskova believes. For this reason, Project December explicitly warns users that the chatbot's "life" is limited.
"Canadian writer Joshua Barbeau, who used the Project December platform, admitted that the chatbot recreated the manner of speech of his late fiancee Jessica 'with frightening accuracy,'" the expert notes. "As a result, he spent many hours with 'Jessica,' which took a heavy toll on his emotional state."
Because of all these factors, the prospects for "resurrection" bots are ambiguous, says Alexandra Vereteno. On the one hand, they offer innovative approaches to psychological support and to preserving the memory of loved ones; on the other, they raise serious ethical and psychological questions. At the same time, from a legislative standpoint, such services are for now shielded from restrictions.
As Olga Peskova explains, this is because users upload the data themselves, of their own free will, giving one-click consent to its processing, so the decision about "resurrection" remains their prerogative. If the neural network receives permission from the "donor" to use their personal data after death, this can be treated as the last will of the deceased.
"The prospects for such applications are still very vague, since so far they have looked more like experiments and publicity stunts," the Izvestia interlocutor concludes. "That said, it seems the digital industry would not be hurt by a 'bill of user rights' that applies not only during life but also after death."