Artificial Intelligence (AI) continues to push the boundaries of technology, but it is also raising critical ethical questions. One disturbing instance came to light when an AI chatbot impersonated a young woman who was murdered 18 years ago, causing deep distress to her family.
What Happened?
According to a report by Business Insider, Drew Crecente was shocked when a Google Alert informed him that his daughter, Jennifer Ann, who was killed by her ex-boyfriend 18 years ago, had been recreated as an AI chatbot. Jennifer, a high school senior at the time of her death, is remembered through a non-profit foundation her father founded to raise awareness about teen dating violence.
The chatbot appeared on Character.ai, a platform that allows users to create AI “characters.” This bot used Jennifer Ann’s name and her yearbook photo, portraying her as an expert in journalism—a nod to her uncle, Brian Crecente, a noted video game journalist.
For Drew, this discovery reopened old wounds. The bot had already taken part in 69 conversations before he found it, intensifying his grief. Not knowing who had created the chatbot, he quickly contacted Character.ai, demanding that the bot be removed and that no future bots use his daughter's name or image without consent.
Drew’s brother, Brian, also expressed his anger on social media. In a post on X (formerly Twitter), Brian condemned the platform for using his niece’s likeness without permission, calling it “disgusting.” His post gained widespread attention and support from others outraged by the incident.
Character.ai responded within hours, stating that the bot had been removed because it violated the platform's policies, which prohibit the impersonation of real people. Despite this, Drew remains deeply troubled and has requested that Character.ai retain all records of who created the bot.
Ethical Concerns in AI Technology
This case highlights the ethical issues surrounding AI, particularly the use of real people's identities without their consent, and especially in sensitive cases like this one. Such misuse can not only reopen painful wounds for victims' families but also fuel broader debates on privacy, consent, and the responsible use of technology.