A metaverse-proof legal framework
A relatively well-developed legal framework already governs a large proportion of online and social interaction at both national and European level. This framework is in great part applicable to the Metaverse and to metaverses.

Fundamental rights

Certain higher standards apply to everyone, both online and offline, and form the basis of democracies: fundamental rights. They constitute the highest standards applicable to metaverses. At supranational level, several texts enshrine fundamental rights likely to be affected by the mass adoption of metaverses, including:

  • the 1948 Universal Declaration of Human Rights;
  • the European Convention for the Protection of Human Rights and Fundamental Freedoms, enforced by the European Court of Human Rights, which is based in Strasbourg;
  • the Charter of Fundamental Rights of the European Union, which has been legally binding on the Member States since 2009: any citizen can invoke it if their rights are not respected.

“With regard to case law, in particular that of the Court of Justice of the European Union, for example the Schrems I and II decisions, we invariably come back to the Charter of Fundamental Rights of the European Union. The fundamental right to privacy appears more than 40 times in the Schrems II decision. What we need is an interpretation of fundamental rights transposed to the Metaverse.”
Yaniv Benhamou
Associate Professor, University of Geneva

Non-binding instruments

The legal framework that applies to the Metaverse, which is made up of international, regional, and national texts, is supplemented by instruments that are not legally binding. At European level, these include the Code of Practice on disinformation and the Code of Conduct to combat illegal hate speech online, to which online service providers adhere on a voluntary basis, and which are monitored and reported on by the European Commission.

At the same time, some private players are developing their own codes of conduct, such as Meta with its "Code of conduct on virtual experiences". Additionally, some non-governmental organisations (NGOs), such as Respect Zone in France, have created their own charter for a trusted Metaverse, which metaverse operators or owners can sign up to.

Protection against criminal offences and harmful behaviour

In today's democracies, legal frameworks exist to protect individuals from criminal offences and harmful behaviour, both in the physical world and online. The answer to the question "Can a metaverse owner decide to allow practices in their virtual world that are against the law in the physical world?", asked on the third day of the Metaverse Dialogues, is, in principle, no. The EU Charter of Fundamental Rights and the French Penal Code, for example, apply to the Metaverse (as they do to the internet in general). In France, cyberbullying is an offence, in the same way as psychological or sexual harassment, and is punishable under Article 222-33-2-2 of the French Penal Code.

“There are two layers. On the one hand, you have the law, which applies to everyone, including in the Metaverse. We need to find a way of applying it to virtual spaces. And if the current framework proves too weak, we need to create new ones. On the other hand, there are the rules set by the various owners of virtual worlds and accepted by users, which apply in certain private spaces, particularly if they are gaming spaces, role-playing spaces, etc.”
Régis Chatellier
“Innovation and Foresight” Manager, CNIL (French Data Protection Authority) Innovation Laboratory (LINC)

Despite this framework, it can be difficult, in the age of the Metaverse and social networking platforms, to clearly define these offences and therefore to counter them. How can harassment be defined in an immersive virtual space? Can rape take place in the Metaverse? Should a distinction be made between private and public spaces? How can the perpetrators of criminal offences and harmful behaviour be held accountable? How can such behaviour be proven, and victims compensated?

There are major gaps in our legal systems, starting with the lack of definitions. It is still difficult, for example, to establish a legal definition of what is illegal online. At European level, there is no harmonised definition of "hate speech". Prior to 15 March 2017 and the entry into force of the Counter-Terrorism Directive, even terrorism did not have a common definition across Member States. Nor is there a harmonised penal code in Europe – each Member State has its own. What constitutes illegal content or behaviour therefore varies from country to country. Holocaust denial, for example, is not a punishable offence in Denmark, whereas it is in France and Germany. So how can we identify and combat illegal behaviour in the Metaverse? Illegal according to whom? Illegal where?

On a separate note, the criminalisation of virtual rape has been under discussion for over thirty years and is well documented [99]. In France, under Article 222-23 of the Penal Code, "[a]ny act of sexual penetration, whatever its nature, committed against another person by violence, coercion, threat, or surprise is rape". For rape to be considered a criminal offence, there must be physical contact. In other words, in France, rape is not currently recognised as such in cyberspace, insofar as it is deemed that there is no physical penetration. Could the development of metaverses change this by increasing immersion to the point where a person physically feels the effects of someone else's behaviour towards their avatar or immersed body?

The DSA, which came into force on 25 August 2023 and establishes harmonised rules for online service providers, does not address the problem of the lack of a definition. Article 3(h) defines "illegal content" as “any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law.” The DSA therefore refers to Union law, which is incomplete, and to the law of the Member States, which is not harmonised (see "Legal inconsistencies and implementation complexities”).

Defining and countering harmful content or behaviour is just as complex, if not more so. What is harmful is not necessarily illegal, which can leave room for different interpretations.

Protecting personal data and privacy

To maximise immersion, immersive devices, particularly virtual/augmented/mixed reality headsets, are multiplying the number of sensors and systems used to analyse people's behaviour (gestures, postures, etc.), expressions (particularly facial expressions and eye movements) and emotions. This makes it possible to detect the degree of dilation of an iris, the object on which the gaze is focused, or a sign of hesitation in speech. Beyond the commercial issues involved, this raises questions about the processing of biometric data ("personal data resulting from specific technical processing relating to the physical, physiological, or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data [fingerprints]" – Article 3(13) of Directive (EU) 2016/680 of 27 April 2016, Article 4(14) of Regulation (EU) 2016/679 of 27 April 2016, and Article 3(18) of Regulation (EU) 2018/1725 of 23 October 2018), behavioural data, and emotional data.

Personal data is any information relating to an identified or identifiable natural person. Like many current laws aimed at regulating the digital space, the GDPR applies to the Metaverse. As this regulation abides by the principle of technological neutrality, it applies to any technology, and any processing of personal data taking place via the Metaverse will have to comply with its rules. Two other European regulations, the Data Governance Act and the Data Act, establish horizontal rules for data sharing, and give users control over the data generated by their connected devices.

Video interview with Étienne Drouard, Partner at Hogan Lovells LLP, filmed during the third Metaverse Dialogue.

Emotional data, from which our emotions, emotional state, or state of mind can be inferred, does not necessarily make it possible to identify a person individually. So, does this data qualify as personal data, and is it therefore covered by the GDPR? Emotional data is essentially interpreted and deduced from captured personal data. In the real world, it can be inferred from an individual's image or writings. In metaverses, it is an avatar's behaviour and speech that could be used to collect emotional data. Consequently, if this data can be linked to an individual, it is personal data. However, this qualification has its limits: taken in isolation and totally detached from personal data, emotional data could escape this classification altogether.

Finally, in accordance with Article 7 of the Charter of Fundamental Rights of the European Union, "everyone has the right to respect for his or her private and family life, home and communications". The right to privacy is a fundamental right and is therefore as applicable online as it is in the physical world. On the internet in particular, it is largely governed at European level by the Directive of 12 July 2002 on the protection of privacy in the electronic communications sector. To update this directive, the European Commission published the proposed ePrivacy Regulation in 2017. At the time of this report's publication, it has still not been finalised. According to one participant on the third day of the Metaverse Dialogues, the main reason why the 2022 revision of the ePrivacy Directive has not yet been completed is that the Member States cannot agree on an adequate level of protection. As a result, there is no up-to-date governance mechanism for online privacy protection at European level (see "Legal inconsistencies and implementation complexities").

Intellectual and industrial property

Avatars, virtual objects and, more generally, creations in the Metaverse raise the question of ownership. Is it possible, for example, to take a handbag design from a major luxury brand and turn it into a virtual object worn by an avatar? If so, under what conditions? Should a distinction be made between commercial and other uses? Or should this possibility be reserved for the brands themselves? These questions are likely to arise for a whole range of everyday objects, from clothing and accessories to means of transport, personal and urban furniture, works of art, and so on. These concerns have grown considerably since the mass adoption of generative artificial intelligence solutions by the general public. Capable of imitating human creative output in a comprehensive and versatile way, generative AI, coupled with the Metaverse, heightens concerns about plagiarism, counterfeiting, and copyright infringement.

If virtual worlds are envisaged as spaces that leave plenty of room for creation (of content, "worlds", avatars, objects, etc.), the protection of intellectual property could prove a major challenge. However, the European legal framework applicable to the Metaverse in terms of intellectual and industrial property appears relatively comprehensive. The 2016 Directive on the protection of trade secrets, the EU Trade Mark Regulation, which came into force in 2017, and the 2019 Directive on Copyright in the Digital Single Market all apply to virtual worlds.

According to Alain Strowel, a lawyer at the Brussels Bar specialising in copyright law, this legal framework is well suited to the Metaverse. However, as with the protection of biometric data, the real difficulty lies in implementing existing rules. While substantive law is broadly appropriate, enforcement and dispute resolution systems are not (see "Legal inconsistencies and implementation complexities").

“We need to embrace a new approach that takes technological worlds into greater account – their speed, and their ubiquity – and focus on implementing alternative mechanisms for enforcing the law in these new worlds.”
Alain Strowel
Lawyer at the Brussels Bar, copyright law specialist

He also points out that, while authorities exist to protect personal data, there is no independent authority at European level or in the Member States responsible for enforcing intellectual property rights. Legal action can be taken in the courts, but the traditional judicial procedure is ill-suited to this task, particularly because of the courts' slowness and lack of specialisation. Moreover, intellectual property offices are not empowered to punish behaviour that infringes these rights.
