Earlier this month, Meta announced it’s working on a set of ethical guidelines for “virtual influencers” – animated, typically computer-generated, characters designed to draw attention on social media.

When Facebook renamed itself Meta late last year, it heralded a pivot towards the “metaverse” – where virtual influencers will presumably someday roam in their thousands.

Even Meta admits the metaverse doesn’t really exist yet. The building blocks of a persistent, immersive virtual reality for everything from business to play are yet to be fully assembled. But virtual influencers are already online, and are surprisingly convincing.

Mark Zuckerberg’s Metaverse announcement. 30 October 2021.

But given its recent history, is Meta (formerly Facebook) really the right company to be setting the ethical standards for virtual influencers and the metaverse more broadly?

Who (or what) are virtual influencers?

Meta’s announcement notes the “rising phenomenon” of synthetic media – an umbrella term for images, video, voice or text generated by computerised technology, typically using artificial intelligence (AI) or automation.

Many virtual influencers incorporate elements of synthetic media in their design, ranging from completely digitally rendered bodies to human models digitally masked with characters’ facial features.

A Topography of Virtual Influencers by Rachel Berryman, Crystal Abidin, and Tama Leaver (October 2021).

At both ends of the scale, this process still relies heavily on human labour and input, from art direction for photo shoots to writing captions for social media. Like Meta’s vision of the metaverse, influencers that are entirely generated and powered by AI are a largely futuristic fantasy.

But even in their current form, virtual influencers are of significant value to Meta, both as attractions for its existing platforms and as avatars of the metaverse.

Interest in virtual influencers has rapidly expanded over the past five years, attracting huge audiences on social media and partnerships with major brands, including Audi, Bose, Calvin Klein, Samsung, and Chinese e-commerce platform TMall.

A competitive industry specialising in the production, management and promotion of virtual influencers has already sprung up, though it remains largely unregulated.

So far, India is the only country to address virtual influencers in national advertising standards, requiring brands to “disclose to consumers that they are not interacting with a real human being” when posting sponsored content.

Ethical guidelines

There is an urgent need for ethical guidelines, both to help producers and their brand partners navigate this new terrain, and more importantly to help users understand the content they’re engaging with.

Meta has warned that “synthetic media has the potential for both good and harm”, listing “representation and cultural appropriation” as specific issues of concern.

Indeed, despite their short lifespan, virtual influencers already have a history of overt racialisation and misrepresentation, raising ethical questions for producers who create digital characters with demographic characteristics different from their own.

But it’s far from clear whether Meta’s proposed guidelines will adequately address these questions.

Becky Owen, head of creator innovation and solutions at Meta Creative Shop, said the planned ethical framework “will help our brand partners and VI creators explore what’s possible, likely and desirable, and what’s not”.

This seeming emphasis on technological possibilities and brand partners’ desires leaves the inevitable impression that Meta is once again conflating commercial potential with ethical practice.

By its own count, Meta’s platforms already host more than 200 virtual influencers. But virtual influencers exist elsewhere too: they do viral dance challenges on TikTok, upload vlogs to YouTube, and post life updates on Sina Weibo. They appear “offline” at malls in Beijing and Singapore, on 3D billboards in Tokyo, and star in television commercials.

Virtual influencer Rozy stars in a commercial for Shinhan Life Insurance.

Gamekeeper, or poacher?

This brings us back to the question of whether Meta is the right company to set the ground rules for this emerging space.

The company’s history is tarred by unethical behaviour, from Facebook’s questionable beginnings in Mark Zuckerberg’s Harvard dorm room (as depicted in The Social Network) to large-scale privacy failings demonstrated in the Cambridge Analytica scandal.

In February 2021 Facebook showed how far it was willing to go to defend its interests, when it briefly banned all news content on Facebook in Australia to force the federal government to water down the Australian News Media Bargaining Code.

Last year also saw former Facebook executive Frances Haugen very publicly turn whistleblower, sharing a trove of internal documents with journalists and politicians.

These so-called “Facebook Papers” raised numerous concerns about the company’s conduct and ethics, including the revelation that Facebook’s own internal research showed Instagram can harm young people’s mental health, even contributing to thoughts of suicide.

Today, Meta is fighting US antitrust litigation that aims to restrain the company’s monopoly by potentially compelling it to sell key acquisitions including Instagram and WhatsApp.

Meanwhile, Meta is scrambling to integrate its messaging services across all three apps, effectively making them different interfaces for a shared back end that Meta will doubtless argue cannot feasibly be separated, no matter the outcome of the current litigation.

Given this backstory, Meta seems far from the ideal choice as ethical guardian of the metaverse.

The already extensive distribution of virtual influencers across platforms and markets highlights the need for ethical guidelines that transcend the interests of one company – especially a company that stands to gain so much from the coming spectacle.

This article was originally published at theconversation.com