Imagine, if you will, a digital doppelgänger. A clone that looks, talks and behaves identically to you, created from the depths of artificial intelligence, reflecting your every mannerism with eerie precision. As thrilling as it sounds, how would you feel about it?

Our research at the University of British Columbia turns the spotlight onto this very question. With advancements in deep-learning technologies such as interactive deepfake applications, voice conversion and virtual actors, it is now possible to digitally replicate a person’s appearance and behaviour.

This mirror image of a person created by artificial intelligence is known as an “AI clone.” Our study dives into the murky waters of what these AI clones could mean for our self-perception, relationships and society. We identified three kinds of risks posed by AI replicas: doppelgänger-phobia, identity fragmentation and living memories.

Cloning AI

We defined AI clones as digital representations of people, designed to reflect one or more aspects of the real-world “source individual.”

Unlike fictitious characters in digital environments, these AI clones are based on existing people, potentially mimicking their visual likeness, conversational mannerisms, or behavioural patterns. The depth of replication can vary greatly, from replicating certain distinct features to making a near-perfect digital twin.

AI clones are also interactive technologies, designed to interpret user and environmental input, conduct internal processing and produce perceptible output. And crucially, these are AI-based technologies built on personal data.

As the amount of private data we generate continues to grow, so too does the fidelity of those AI clones in replicating our behaviour.

Fears, fragments and false memories

We presented 20 participants with eight speculative scenarios involving AI clones. The participants, diverse in age and background, reflected on their emotions and on the potential impacts on their self-perception and relationships.

Vast amounts of user-generated data will be used to create AI clones.

First, we found that doppelgänger-phobia was a fear not only of the AI clone itself, but also of its potential misuse. Participants worried that their digital counterparts could exploit and displace their identity.

Second, there was the spectre of identity fragmentation. The creation of replicas threatens the unique individuality of the person being cloned, disturbing their cohesive sense of self. In other words, people worry that they may lose parts of their uniqueness and individuality in the replication process.

Lastly, participants expressed concerns about what we described as “living memories.” This refers to the danger posed when an individual interacts with a clone of someone they have an existing relationship with. Participants worried that it could lead to a misrepresentation of that person, or that they might develop an over-attachment to the clone, altering the dynamics of their interpersonal relationships.

Preserving human values

It is clear that the development and deployment of AI clones carry profound implications. Our study not only contributes valuable insights to the critical dialogue on ethical AI, but also proposes a new framework for AI clone design that prioritizes identity and authenticity.

The onus lies with all stakeholders — including designers, developers, policymakers and end-users — to navigate this uncharted territory responsibly. This involves conscientiously considering moderation and user-data expiration strategies to prevent misuse and over-reliance.

Further, it is imperative to acknowledge that the implications of AI clone technologies for personal identity and interpersonal relationships represent just the tip of the iceberg. As we continue to tread the delicate path of this burgeoning field, our study’s findings can serve as a compass, guiding us to prioritize ethical considerations and human values above all.

This article was originally published at theconversation.com