Cambridge researchers are raising concerns about the psychological impact of ‘deadbots,’ AI programs designed to mimic deceased individuals, and are advocating for ethical guidelines and consent protocols to prevent misuse and ensure respectful interaction.
According to experts at the University of Cambridge, AI chatbots that simulate the language patterns and personality traits of the deceased using their digital footprints could cause psychological harm and even digitally "haunt" those left behind if not properly regulated.
These "deadbots" or "griefbots," already under development at some companies, offer a novel form of "postmortem presence" by recreating lost loved ones through text and voice conversations.
The Cambridge researchers outline various design scenarios for platforms in the emerging “digital afterlife industry,” warning of potential consequences if not carefully designed.
They caution against several forms of misuse: deadbots could be exploited for advertising, distress relatives by insisting that a deceased loved one is still present, or overwhelm users with constant interactions that become emotionally burdensome.
The study emphasizes the importance of safeguarding the dignity of the deceased and ensuring that financial motives do not override ethical considerations. It calls for consent from both data donors and users interacting with AI afterlife services.
Platforms offering AI recreation services, such as 'Project December' and 'HereAfter,' are already operational, raising concerns about privacy, consent, and emotional manipulation.
The researchers recommend age restrictions for deadbots and advocate for transparency in AI interactions to ensure users are aware they are engaging with artificial intelligence.
They also highlight the need for opt-out protocols that allow users to terminate their relationships with deadbots in ways that provide emotional closure.
Overall, the study underscores the importance of considering the social and psychological risks associated with digital immortality and calls for proactive measures to address these challenges.