Tech Insight: What Are ‘Deadbots’?

Following warnings by ethicists at Cambridge University that AI chatbots made to simulate the personalities of deceased loved ones could be used to spam family and friends, we take a look at the subject of so-called “deadbots”.

Griefbots, Deadbots, Postmortem Avatars 

The Cambridge study, entitled “Griefbots, Deadbots, Postmortem Avatars: on Responsible Applications of Generative AI in the Digital Afterlife Industry”, looks at the negative consequences and ethical concerns of adopting generative AI in what it calls “the digital afterlife industry” (DAI).

Scenarios 

As suggested by the title of the study, a ‘deadbot’ is a digital avatar or AI chatbot designed to simulate the personality and behaviour of a deceased individual. The Cambridge study used simulated scenarios to try to understand the effects that these AI clones trained on data about the deceased, known as “deadbots” or “griefbots”, could have on living loved ones who interact with them as part of this kind of service.

Who Could Make Deadbots and Why?

The research involved several scenarios designed to highlight the issues around the use of deadbots. For example, the possible negative uses of deadbots highlighted in the study included:

– A subscription app that creates a free AI re-creation of a deceased relative (a grandmother in the study), trained on their data, which can exchange text messages with the living loved one in the same way the deceased used to (via WhatsApp), giving the impression that they are still around to talk to. The study scenario showed how the bot could be made to mimic the deceased grandmother’s “accent and dialect when synthesising her voice, as well as her characteristic syntax and consistent typographical errors when texting” (a hypothetical sketch of this kind of persona conditioning appears below). However, the study also showed how such a deadbot service could be made to output messages that include advertisements in the loved one’s voice, causing the loved one distress. Further distress could be caused if the app designers did not fully consider the user’s feelings around deleting the account and the deadbot, for example if no provision is made for saying goodbye to the deadbot in a meaningful way.

– A service allowing a dying relative (e.g. a father and grandfather) to create their own deadbot so that their younger relatives (i.e. children and grandchildren) can get to know them better after they’ve died. The study highlighted negative consequences of this type of service, such as the dying relative not getting consent from the children and grandchildren to be contacted by the ‘deadbot’, with the resulting unsolicited notifications, reminders, and updates from the deadbot leaving relatives distressed and feeling as though they were being ‘haunted’ or even ‘stalked’.

Services that already exist and offer to recreate the dead with AI include ‘Project December’ and apps like ‘HereAfter’.
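To make the grandmother scenario concrete, here is a minimal, hypothetical sketch of the persona-conditioning technique such a service might use: prompting an off-the-shelf large language model with a few of the deceased person’s messages as style examples. The names, example messages, and model choice below are all invented for illustration; the study does not describe any particular implementation, and no named service is being depicted.

```python
# Hypothetical sketch only: persona-conditioning an off-the-shelf LLM
# so it imitates a deceased person's texting style. All names and
# example messages below are invented for illustration.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A few of the person's real messages would serve as style examples
# (simple few-shot conditioning). These examples are made up.
style_examples = [
    "ello love, hows the little ones doin? xx",
    "dont forget sunday lunch, 1pm sharpish!",
]

system_prompt = (
    "You are imitating 'Nan', a grandmother, in a text-message chat. "
    "Reproduce her dialect, warmth, and habitual typos, as shown in "
    "these examples of her messages:\n"
    + "\n".join(f"- {m}" for m in style_examples)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model; an assumption
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Hi Nan, I miss you."},
    ],
)
print(response.choices[0].message.content)
```

Even this toy example shows how little data is needed to produce a plausible imitation, which is precisely why the researchers worry about advertisements or other third-party content being delivered “in the loved one’s voice”.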

Many Potential Issues 

As shown by the examples in the Cambridge research (there were three main scenarios), the use of deadbots raises several ethical, psychological and social concerns. Some of the potential ways they could be harmful, unethical, or exploitative (along with the negative feelings they might provoke in loved ones) include:

– Consent and autonomy. As noted in the Cambridge study, a primary concern is whether the deceased gave consent for their personality, appearance, or private thoughts to be used in this way. Using someone’s identity without their explicit consent could be seen as a violation of their autonomy and dignity.

– Accuracy and representation. There is a risk that the AI might not accurately represent the deceased’s personality or views, potentially spreading misinformation or creating a false image that could tarnish their memory.

– Commercial exploitation. The study looked at how a deadbot could be used for advertising, reflecting the real concern that a deceased person’s identity could be commercially exploited. Companies could use deadbots for profit, exploiting a person’s image or personality without fair compensation to their estate or consideration of their legacy.

– Contractual issues. For example, relatives may find themselves in a situation where they are powerless to have an AI deadbot simulation suspended, e.g. if their deceased loved one signed a lengthy contract with a digital afterlife service.

Psychological and Social Impacts 

The Cambridge study was designed to look at the possible negative aspects of the use of deadbots, an important part of which are the psychological and social impacts on the living. These could include, for example:

– Impeding grief. Interaction with a deadbot might impede the natural grieving process. Instead of coming to terms with the loss, people may cling to the digital semblance of the deceased, potentially leading to prolonged grief or complicated emotional states.

– Emotional dependency. There’s also a risk that individuals might become overly dependent on the deadbot for emotional support, isolating themselves from real human interactions and not seeking support from living friends and family.

– Distress and discomfort. As identified in the Cambridge study, aspects of the experience of interacting with a simulation of a deceased loved one can be distressing or unsettling for some people, especially if the interaction feels uncanny or not quite right. For example, the Cambridge study highlighted how relatives may get some initial comfort from the deadbot of a loved one but may become drained by daily interactions that become an “overwhelming emotional weight”.  

Potential for Abuse 

As identified in the Cambridge study, people may develop strong emotional bonds with deadbot AI simulations, making them particularly vulnerable to manipulation. One of the major risks of the growth of a digital afterlife industry (DAI), therefore, is the potential for abuse. For example:

– There could be misuse of the deceased’s private information (privacy violations), especially if sensitive or personal data is incorporated into the deadbot without proper safeguards.

– In the wrong hands, deadbots could be used to harass or emotionally manipulate survivors, for example, by a controlling individual using a deadbot to exert influence beyond the grave.

– There is also the real potential for deadbots to be used in scams or fraudulent activities, impersonating the deceased to deceive the living.

Emotional Reactions from Loved Ones 

The psychological and social impacts of deadbots used as part of a service to living loved ones, and/or the misuse of deadbots, could therefore lead to a number of negative emotional reactions. These could include:

– Distress due to the unsettling experience of interacting with a digital replica.

– Anger or frustration over the misuse or misrepresentation of the deceased.

– Sadness from a constant reminder of the loss that might hinder emotional recovery.

– Fear concerning the ethical implications and potential for misuse.

– Confusion over the blurred lines between reality and digital facsimiles.

What Do The Cambridge Researchers Suggest?

The Cambridge study led to several suggestions of ways in which users of this kind of service may be better protected from its negative effects, including:

– Deadbot designers being required to seek consent from “data donors” before they die.

– Products of this kind being required to regularly alert users about the risks and to provide easy opt-out protocols, as well as measures being taken to prevent the disrespectful uses of deadbots.

– The introduction of user-friendly termination methods, e.g. having a “digital funeral” for the deadbot. This would allow the living relative to say goodbye to the deadbot in a meaningful way if the account were to be closed and the deadbot deleted (a minimal sketch of how these safeguards might be modelled in code appears after the quote below).

As Dr Tomasz Hollanek, one of the study’s co-authors, highlighted: “It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations.”
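To illustrate how these suggestions might translate into practice, here is a minimal, hypothetical sketch of the safeguards as a data model: donor consent recorded before death, survivor opt-in, periodic risk alerts, and a user-initiated “digital funeral” termination. Every name and rule below (e.g. the 30-day alert interval) is an invented assumption, not something the study specifies.

```python
# Hypothetical sketch of the study's suggested safeguards, modelled
# as a minimal data structure. All names and thresholds are invented.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DeadbotAccount:
    donor_consent_given: bool           # consent recorded before the "data donor" died
    survivor_opted_in: bool = False     # the living user must also opt in to contact
    last_risk_alert: datetime | None = None
    terminated: bool = False

    def may_contact_survivor(self) -> bool:
        """The bot may only initiate contact with consent from both sides."""
        return self.donor_consent_given and self.survivor_opted_in and not self.terminated

    def due_for_risk_alert(self, now: datetime) -> bool:
        """Re-alert the user about risks at most every 30 days (assumed interval)."""
        return self.last_risk_alert is None or now - self.last_risk_alert > timedelta(days=30)

    def digital_funeral(self) -> None:
        """User-initiated termination: a meaningful goodbye, then deletion."""
        # ...a farewell interaction would run here, then all data is deleted...
        self.terminated = True
```

The point of the sketch is the shape of the checks rather than the specifics: no contact without consent from both sides, recurring risk reminders, and a termination path the user controls.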

What Does This Mean For Your Business? 

The findings and recommendations from the Cambridge study shed light on crucial considerations that organisations involved in the digital afterlife industry (DAI) must address. As developers and businesses providing deadbot services, there is a heightened responsibility to ensure these technologies are developed and used ethically and sensitively. The study’s call for obtaining consent from data donors before their death underscores the need for clear consent mechanisms to be built in. This consent is not just a legal formality but a foundational ethical practice that respects the rights and dignity of individuals.

Also, the Cambridge team’s suggestion to implement regular risk notifications and provide straightforward opt-out options points to the need for greater transparency and user control in digital interactions. This could mean incorporating these safeguards into service offerings to enhance user trust, with digital afterlife services companies perhaps positioning themselves as leaders in ethical AI practice. The introduction of a “digital funeral” to these services could also be a respectful and symbolic way to conclude the use of a deadbot, as well as a sensitive way to meet personal closure needs, e.g. at the end of the contract.

The broader implications of the Cambridge study for the DAI sector include the need to navigate potential psychological impacts and prevent exploitative practices. As Dr Tomasz Hollanek from the study highlighted, the unintentional distress caused by these AI recreations can be profound, suggesting that design and deployment strategies should prioritise psychological safety and emotional wellbeing. This should involve designing AI that is not only technically proficient but also emotionally intelligent and sensitive to the nuances of human grief and memory.

Businesses in this field must also consider the long-term implications of their services for societal norms and personal privacy. The risk of commercial exploitation or disrespectful uses of deadbots could lead to public backlash and regulatory scrutiny, which could stifle innovation and growth in the industry. The Cambridge study therefore serves as an early but important guidepost for the DAI industry, highlighting useful guidelines and recommendations that could contribute to a more ethical and empathetic digital world.
