A new AI-powered mobile application is stirring controversy for offering users the ability to virtually “communicate” with deceased loved ones. Developed by startup 2Wai, the app features a tool called “HoloAvatar,” which uses just a few minutes of video and audio footage to recreate lifelike, interactive avatars of the dead. These digital personas can simulate conversations in more than 40 languages, raising serious ethical concerns over consent, exploitation of grief, and the boundaries of technological innovation.
The company was co-founded by actor Calum Worthy, known for his role on the Disney Channel’s “Austin & Ally,” and entertainment producer Russell Geyser. The app’s beta version launched on iOS on November 11, quickly drawing attention and criticism from both the public and experts in digital ethics.
At the heart of the controversy is the question of consent. Critics argue that the deceased cannot agree to have their likenesses and voices replicated, even if the content used was publicly available or provided by family members. The idea of reviving someone through AI without their explicit permission has been labeled by some as “demonic,” while others see it as a dangerous commodification of mourning.
Privacy advocates also warn of the potential misuse of personal data, especially sensitive materials like voice recordings, facial features, and behavioral patterns. With AI models increasingly capable of mimicking speech and facial expressions, concerns are mounting over how securely this data is stored and whether it could be used beyond its intended purpose.
Beyond individual concerns, the app has sparked a broader debate about the rise of “grief tech” — a growing industry focused on using technology to help people process loss. While some find comfort in digital memorials or AI chatbots that simulate loved ones, others argue this blurs the line between healing and denial, potentially delaying the natural grieving process.
Supporters of HoloAvatar claim it can offer solace to those struggling with loss, especially in cases of sudden or traumatic death. They argue that the ability to “talk” once more with a lost parent, child, or partner can provide emotional closure or even therapeutic benefit. However, psychologists caution that such interactions might create emotional dependency or deepen feelings of loss rather than resolve them.
In the business world, investors are watching the grief tech space closely, seeing it as both a lucrative and controversial frontier. Startups in this sector are attracting funding, but also facing heightened scrutiny from regulators and the public alike. Monetizing grief, especially through subscriptions or premium features that put a price on remembrance, is perceived by many as ethically murky.
2Wai maintains that its app includes safeguards, such as requiring consent from the next of kin before an avatar is created. The company also emphasizes that the intention is not to deceive users into believing their loved ones are truly alive, but rather to provide a form of digital memorialization. Still, critics argue that such measures are insufficient to address the deeper moral questions involved.
The psychological impact of such technology is another area of concern. Mental health researchers suggest that interacting with digital replicas of the deceased could interfere with the acceptance phase of grief. Prolonged virtual conversations might prevent individuals from moving forward emotionally, creating a dependency on an illusion.
From a technical standpoint, the app demonstrates the rapid advancements in generative AI, particularly in natural language processing and deepfake video synthesis. It raises questions about how far this technology should go and whether society is prepared to manage the consequences.
Cultural and religious perspectives also come into play. In many belief systems, death is viewed as a sacred transition, and attempts to recreate the dead may be seen as offensive or even taboo. The app’s existence forces a confrontation between traditional views of mortality and the emerging capabilities of AI.
Additionally, legal experts are beginning to weigh in on the implications of “posthumous personality rights.” In many jurisdictions, there is no clear legislation governing the use of a deceased person’s likeness or voice, especially when consent cannot be obtained. This opens up legal gray areas that could lead to future litigation.
Looking ahead, the development of apps like HoloAvatar signals a future where digital resurrection may become increasingly common — not just for personal grief, but potentially for commercial, political, or entertainment purposes. Imagine celebrities or public figures being digitally “revived” for films, endorsements, or even political campaigns — all without clear regulation.
As the boundaries between life and digital afterlife continue to blur, society must grapple with profound questions: Should we bring the dead back in any form? Who gets to decide what happens to a person’s digital legacy? And how do we ensure that technology serves the living without exploiting the memory of the departed?
Ultimately, the controversy around 2Wai’s app highlights a pivotal moment in the intersection of AI, ethics, and human emotion. Whether seen as a revolutionary tool for healing or a troubling step toward digital necromancy, HoloAvatar is forcing a reckoning with how we grieve, remember, and interact with the dead in the digital age.
