The concept of governance
According to the Council of Europe, internet governance refers to "the development and application by governments, the private sector, and civil society, in their respective roles, of shared principles, norms, rules, decision-making procedures, and programmes that shape the evolution and use of the Internet".
With regard to online content, the Content Policy & Society Lab (CPSL) at Stanford University identifies four levels of governance:
- content regulation, i.e. the set of rules adopted by a regulator or legislator, which serves as a framework for content moderation;
- content policy, i.e. the rules that represent a company's societal vision regarding acceptable content on its platform(s) (found in the Terms & Conditions or community rules of an online service, for example);
- content governance, i.e. a system of rules that governs the process of creating and implementing content policies, as well as the distribution of powers among the entities responsible for each task;
- content moderation, which can be carried out in different ways (algorithmic, manual, upstream or downstream of content publication) and can take a variety of forms (deletion, labelling or reducing the visibility of content).
To think ahead on these issues, we must first examine what already exists. Given the similarities between the issues facing the web today and those likely to emerge with the advent of the Metaverse, could and should today's governance models be replicated in the Metaverse and tomorrow's internet? Are current regulatory frameworks suited to the Metaverse, or do certain characteristics specific to it require them to be updated? Will the role and scope of the various stakeholders involved in internet governance change?
The articles that make up this chapter aim to answer these questions.