Feb 2, 2024 Article

It Takes a Village to Protect Children in the Metaverse

Over a decade ago, prior to the launch of social platforms like Horizon Worlds, some scholars questioned whether 3D virtual spaces could become “sufficiently realistic” for users to feel psychologically and emotionally immersed in them. Today, the United Nations Children's Fund (UNICEF) is pushing back against that outmoded thinking. The UN agency is urging the international community to take heed of children’s safety in virtual spaces and raising awareness that abuse “can feel more ‘real’ in immersive virtual environments.” In a study published in the journal Child and Adolescent Psychiatry and Mental Health, psychologists concluded that it is important to treat online sexual abuse as “a serious form of sexual abuse even if the victim and perpetrator have not met outside the Internet.” Children’s advocacy groups in the United States, as well as groups combating human trafficking, like Skull Games, are also calling for increased protections for minors, who may be sexually targeted and abused in virtual spaces.

To better understand the risks surrounding this technology, it is important to first define what the nascent metaverse is.

What is the Metaverse?

The metaverse is an immersive social communications and gaming platform that links virtual worlds together.

The term was coined by science-fiction novelist Neal Stephenson in his 1992 novel Snow Crash, which described this digital realm as an amalgamation of virtual reality (VR) and augmented reality (AR). From a systems design perspective, the metaverse is an integrated network of three-dimensional virtual worlds that together form a larger, interconnected virtual reality environment. With an estimated global market value of $344.7 billion by 2027, the metaverse stands to foster economic growth and innovation in new Web 3.0 technologies, alongside emerging technologies like nonfungible tokens (NFTs) and cryptocurrencies.

Gaming industry leaders anticipate that the metaverse will change how companies interact with customers and structure their business models. For instance, in 2023 The Wall Street Journal reported on a private internal memorandum from Meta (formerly Facebook) revealing that officials were seeking to boost user retention by marketing the company's virtual social platform, Horizon Worlds, to 13-to-17-year-olds. According to a 2022 business study by McKinsey & Company, which surveyed 3,400 consumers and executives, 59 percent of consumers reported being “excited about transitioning their everyday activities to the metaverse” for social interactions, entertainment, digital travel, and shopping. The metaverse’s expected market value may reach $5 trillion by 2030. Despite this flashy economic forecast, the metaverse also presents challenges for preventing online child sexual exploitation and abuse, as seen in the 2024 virtual rape case of a child in the United Kingdom.

United Kingdom

In the UK, police are investigating the first instance of a virtual sexual offense in that nation: an alleged rape of a child in the metaverse. The Times reported that this virtual attack was perpetrated by several adult men against a child user, a girl under the age of 16. The child was not physically injured but, according to the police, is experiencing significant psychological and emotional trauma following the “gang rape” in this immersive virtual reality environment.

In an interview with The Times, Ian Critchley, lead for Child Protection and Abuse Investigation at the UK’s National Police Chiefs’ Council, said that “the metaverse creates a gateway for predators to commit horrific crimes against children.” According to the London-based think tank Cityforum, the case is generating controversy over how police should best allocate limited resources to pursuing online offenses while forces and prosecutors are “inundated with a backlog” of rape cases in physical space. In addition to the UK, South Korea is also grappling with how to reduce the safety and security risks to children in the metaverse.

South Korea

In 2021, a 30-year-old South Korean man falsified his age in order to coerce minors in the metaverse to send inappropriate images in exchange for virtual gifts. South Korea’s Ministry of Gender Equality and Family also reported a case in which a 14-year-old girl had been “lured into taking off her metaverse avatar’s clothes and ordered to perform sexual acts with her avatar.” To better protect minors in this virtual space, in 2022 the Ministry began considering uploading the headshots of registered sex offenders to the metaverse. South Korean officials have also been partnering with metaverse companies, like Meta, to discuss how to protect minors online.

The South Korean Ministry of Science and ICT (MSIT) and the National Data Policy Committee also announced that they would develop metaverse-specific regulatory amendments, including measures to protect South Korean minors from online sexual harassment and assault. MSIT has since published the first draft of its metaverse ethical principles strategy, which focuses on protecting youth, privacy, and intellectual property in the metaverse. According to a translation by Metaverse Insider, the draft proposal also includes eight principles to guide the government’s approach to regulating the metaverse, prioritizing, among other things, data protection, autonomy, fairness, and respect for privacy.

China

China is also taking measures to protect minors in the metaverse from a wide array of online harms, issuing the Regulations on the Protection of Minors in Cyberspace, which took effect on January 1, 2024. These regulations require online service providers to conduct and report routine impact assessments on protecting minors online from content “that promotes obscenity, pornography, violence, cults, superstition, gambling, self-harm, suicide, terrorism, separatism, extremism, etc.,” according to international law firm Bird & Bird.

China’s Ministry of Industry and Information Technology (MIIT) also established a working group to develop domestic standards to regulate the metaverse industry. The 60-member group comprises government officials, industry leaders from companies like Tencent Holdings and Huawei Technologies, and academic experts from across China. According to MIIT’s fall 2023 proposal, translated by the South China Morning Post, the group will address “building and maintaining a system of metaverse industry standards,” cultivating workforce development, and “encourag[ing] local companies and institutions to deeply engage in international standard-setting activities.” Beijing aspires to become a leader in metaverse innovation and has unveiled a cross-sector joint plan to develop at least three metaverse companies “with global influence” by 2025.

Promoting Global Coordination

Beyond individual nations’ efforts to prevent online child sexual exploitation and abuse (OCSEA) crimes, international civil society groups are rallying around this issue to strengthen global coordination. For instance, the Council of Europe established the End Online Child Sexual Exploitation and Abuse @ Europe (EndOCSEA@Europe) initiative to support national and international efforts to combat OCSEA crimes.

The Council of Europe also developed the 2007 Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse (the “Lanzarote Convention”), which provides a framework for prosecuting perpetrators. The Convention states that the sexual exploitation and sexual abuse of children using information and communication technologies has “grown to worrying proportions at both national and international level” and that “preventing and combating such sexual exploitation and sexual abuse of children require international co-operation.” All 46 member states of the Council of Europe have signed and ratified the Lanzarote Convention.

In addition to OCSEA-related crimes in virtual space, UNICEF warns that information about child users’ non-verbal behavior in virtual environments could allow some companies to generate “hyper-personalized profiling, advertising and increased surveillance, impacting children’s privacy and other rights and freedoms.”

As the United Nations Ad Hoc Committee to Elaborate a Comprehensive International Convention on Countering the Use of Information and Communications Technologies for Criminal Purposes gathers in New York to discuss revisions to the UN Cybercrime Treaty, policymakers should discuss updating the language of Article 13 (Offences related to online child sexual abuse or child sexual exploitation material) and Article 14 (Solicitation or grooming for the purpose of committing a sexual offence against a child) to expressly include protecting children from sexual abuse and exploitation offenses committed in the metaverse. Just as it “takes a village to raise a child,” it similarly takes a global village to protect children in the metaverse.

Carnegie Council for Ethics in International Affairs is an independent and nonpartisan nonprofit. The views expressed within this article are those of the author and do not necessarily reflect the position of Carnegie Council.
