Digital clones are becoming more visible online, from branded replicas on X and LinkedIn to OnlyFans creators and reported “virtual human” salespeople in China. The technology stitches together hyperrealistic video, voice cloning built from just minutes of speech, and conversational models to produce replicas that do not simply answer questions but attempt to “think” like a specific person. Startups such as Delphi and Tavus pitch these replicas as ways to scale access to personalities and expertise. Delphi, which recently raised a funding round (the amount is not stated) from backers including Anthropic and Olivia Wilde’s Proximity Ventures, offers celebrity-backed clones and positions them as a way to deliver leaders’ wisdom at scale.
The author tested a Tavus clone to see whether such a replica could act as a useful stand-in at work. Onboarding required reading a script for voice training and recording one minute of silence. The avatar appeared within hours and resembled the author, but its conversational performance lagged. The author uploaded roughly three dozen published stories to inform the clone yet withheld other reporting materials out of consent concerns for the people who appear in those records. In interactions, the clone acted overly enthusiastic about unrealistic pitches, repeated itself, and claimed to check a calendar it had no access to, producing conversations that looped and could not be cleanly ended. Tavus cofounder Quinn Favret attributed some of these behaviors to developers’ instruction sets and to reliance on Meta’s Llama, which he said tends to be “more helpful than it truly is.”
Despite these shortcomings, clones have practical use cases. Tavus customers use replicas for health-care intake, job interviews, corporate role-play, mentorship, and qualification tasks such as preliminary loan screening. For influencers and high-volume sales roles, the tradeoff of occasional errors may be acceptable. The article warns, however, that teaching clones genuine discernment, critical thinking, and the idiosyncrasies of an individual remains out of reach. As companies emphasize humanlike features and scale, there is concern that replicas will be used for roles or decisions they should not be trusted to make. The story originally appeared in The Algorithm newsletter.