In technology, first-mover advantage is often significant. This is why BigTech and other online platforms are beginning to acquire software businesses to position themselves for the arrival of the Metaverse. They hope to be at the forefront of the profound changes that the Metaverse will bring to digital interactions between people, between businesses, and between businesses and people.
What is the Metaverse? The short answer is that it does not exist yet. At the moment it is a vision of a future in which personal and commercial life is conducted digitally in parallel with our lives in the physical world. Sounds too much like science fiction? For something that does not exist yet, the Metaverse is drawing a huge amount of attention and investment in the tech sector and beyond.
Here we look at what the Metaverse is, what its potential is for disruptive change, and some of the key legal and regulatory issues future stakeholders may need to consider.
What is the Metaverse?
The Metaverse is still just an idea. What is the idea and what are the basic building blocks for it? The Metaverse will be the outcome of the convergence of a range of nascent and extant digital and online technologies. It may start off as a focus for gaming, virtual reality, digital meeting spaces, digital assets (such as non-fungible tokens - see our briefing, Anatomy of an NFT), and perhaps even brain-to-machine interactions, but it will not end with that. Its scope and impact may expand when Artificial Intelligence is included, and when data from the physical world is brought in via the Internet of Things (integrating humans as well as businesses in ever more digital / physical co-existence).
The Metaverse’s true potential lies, however, not in ever more convergence for its own sake, but in the outcome of that. It may evolve to become a universal digital platform for personal and commercial interactions – the platform replacing the current technology stack of the world wide web operating on top of the Internet - and become the source of the most valuable data about consumers available to the business world.
What are the characteristics of the Metaverse?1
- Persistence: the Metaverse will exist continuously, regardless of time and place - it will not pause, reset or end.
- Synchronicity: participants of the Metaverse will be able to interact with one another and the digital world in real time, reacting to their virtual environment and each other just like they would in the physical world.
- Availability: everyone will be able to log on simultaneously and there will be no cap on the number of participants.
- Economy: participants - including businesses - will be able to supply goods and services in exchange for value recognised by others. That value may start off as (or include) the kind of value that video games players already use now (for example, fiat currency exchanged for virtual gold and in-game items). It may also include non-fungible tokens, cryptocurrency, and e-money, along with more traditional fiat currency. Such exchanges of value may depend upon technologies such as distributed ledger technologies and smart contracts, and technologies not even thought of yet.
- Interoperability: the Metaverse will allow a participant to use his or her virtual items across different experiences on the Metaverse. For instance, a user experience may include cross-platform capability, allowing, say, a vehicle unlocked in a racing game to be used in a different adventure game, or an item of clothing purchased on the Metaverse to be “worn” and used in games, concerts, and any other virtual environments available. As the Metaverse moves beyond gaming, businesses participating may need to move beyond existing proprietary methods of shoring up their market positioning – controls over formats for the exchange of data and over verification of ID, for example, will need to change.
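The cross-platform interoperability described above can be pictured in code. The following is a purely illustrative sketch (the type and function names are our assumptions, not part of any existing Metaverse standard) of a portable asset record that one platform might export and another might decide whether to import:

```typescript
// Hypothetical portable asset record exchanged between Metaverse platforms.
// All names and fields are illustrative assumptions only.
interface PortableAsset {
  assetId: string;                 // globally unique identifier, e.g. an NFT token reference
  ownerId: string;                 // platform-independent identity of the owner
  assetType: "vehicle" | "clothing" | "other";
  sourcePlatform: string;          // where the asset was originally unlocked or purchased
  attributes: Record<string, string | number>;
}

// A receiving platform decides whether it can use the asset at all.
function canImport(asset: PortableAsset, supportedTypes: string[]): boolean {
  return supportedTypes.includes(asset.assetType);
}

// A vehicle unlocked in a racing game, offered to an adventure game.
const car: PortableAsset = {
  assetId: "asset-001",
  ownerId: "user-42",
  assetType: "vehicle",
  sourcePlatform: "racing-game",
  attributes: { colour: "red", topSpeed: 220 },
};

console.log(canImport(car, ["vehicle", "clothing"])); // true
```

In practice, interoperability would also require agreement on shared data formats and identity verification between platforms that may have no commercial relationship, which is precisely the shift away from proprietary controls noted above.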
Why would the Metaverse be any different from the World Wide Web?
To achieve this vision, different infrastructure will be required, perhaps on a distributed or decentralised basis. While the Metaverse will be built on Internet infrastructure, there are already successful distributed/decentralised computing models (for example, distributed ledger technology and cloud computing) that might point to what its infrastructure could look like.
“The Metaverse will require countless new technologies, protocols, companies, innovations, and discoveries to work. And it won’t directly come into existence; there will be no clean ‘Before Metaverse’ and ‘After Metaverse’. Instead, it will slowly emerge over time as different products, services, and capabilities integrate and meld together.”2
What are the potential legal issues?
The revolutionary nature of the Metaverse is likely to give rise to a range of complex legal and regulatory issues. We consider some of the key ones below. As time goes by, naturally enough, new ones will emerge.
Participation in the Metaverse will involve the collection of unprecedented amounts and types of personal data. Today, smartphone apps and websites allow organisations to understand how individuals move around the web or navigate an app. Tomorrow, in the Metaverse, organisations will be able to collect information about individuals’ physiological responses, their movements and potentially even brainwave patterns, thereby gaining a much deeper understanding of their customers’ thought processes and behaviours.
Users participating in the Metaverse will also be “logged in” for extended amounts of time. This will mean that patterns of behaviour will be continually monitored, enabling the Metaverse and the businesses (vendors of goods and services) participating in it to understand how best to serve users in an incredibly targeted way.
The hungry Metaverse participant
How might actors in the Metaverse target persons participating in it? Let us assume one such participant, a woman, is hungry at the time. The Metaverse may observe her frequently glancing at café and restaurant windows and stopping to look at cakes in a bakery window, determine that she is hungry, and serve her food adverts accordingly.
Contrast this with current technology, where a website or app can generally only ascertain this type of information if the woman had actively searched for food outlets or similar on her device.
Therefore, in the Metaverse, a user will no longer need to proactively provide personal data by opening up their smartphone and accessing their webpage or app of choice. Instead, their data will be gathered in the background while they go about their virtual lives.
This type of opportunity comes with great data protection responsibilities. Businesses developing, or participating in, the Metaverse will need to comply with data protection legislation when processing personal data in this new environment. The nature of the Metaverse raises a number of issues around how that compliance will be achieved in practice.
Who is responsible for complying with applicable data protection law?
In many jurisdictions, data protection laws place different obligations on entities depending on whether an entity determines the purpose and means of processing personal data (referred to as a “controller” under the EU General Data Protection Regulation (GDPR)) or just processes personal data on behalf of others (referred to as a “processor” under the GDPR).
In the Metaverse, establishing which entity or entities have responsibility for determining how and why personal data will be processed, and who processes personal data on behalf of another, may not be easy. It will likely involve picking apart a tangled web of relationships, and there may be no obvious or clear answers – for example:
- Will there be one main administrator of the Metaverse who collects all personal data provided within it and determines how that personal data will be processed and shared?
- Or will multiple entities collect personal data through the Metaverse and each determine their own purposes for doing so?
Either way, many questions arise, including:
- How should the different entities each display their own privacy notice to users?
- Or should this be done jointly?
- How and when should users’ consent be collected?
- Who is responsible if users’ personal data is stolen or misused while they are in the Metaverse?
- What data sharing arrangements need to be put in place and how will these be implemented?
Virtual reality headsets and glasses will likely be commonplace in the Metaverse (unless they are replaced by something more sophisticated in the meantime, such as direct electronic/brain interfaces). Such devices have the potential to collect a wide range of sensitive data about the wearer (for example, eye and body movements, physiological responses and even brainwave patterns).
To the extent that this data is used by actors in the Metaverse to learn about the user or to make decisions about them, it is likely to be considered special category data under the GDPR.
This means that extra conditions would need to be satisfied. Most importantly, the user would most likely need to give their explicit consent for each purpose for which the data is used. Take the hungry woman example described above: for targeting her with food adverts using gaze analysis technology to be lawful, she would need to have given her express permission. A general marketing consent would not suffice. Quite how this consent would be sought and given goes to the issue of whether the Metaverse can operate on a decentralised/distributed model, discussed below (see Decentralised / distributed models).
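One way to picture the difference between a general marketing consent and per-purpose explicit consent is a simple consent register. This is a hypothetical sketch only - the record structure and purpose strings are our assumptions, not a format mandated by the GDPR:

```typescript
// Hypothetical per-purpose consent register. The GDPR does not prescribe
// this structure; it merely illustrates why a bundled marketing consent
// cannot stand in for explicit consent to a specific processing purpose.
interface ConsentRecord {
  userId: string;
  purpose: string;   // e.g. "gaze-based food advertising" (illustrative)
  givenAt: Date;
  explicit: boolean; // must reflect an affirmative, informed act by the user
}

function hasExplicitConsent(
  records: ConsentRecord[],
  userId: string,
  purpose: string
): boolean {
  // Consent only counts for the exact purpose it was given for.
  return records.some(
    (r) => r.userId === userId && r.purpose === purpose && r.explicit
  );
}

const consents: ConsentRecord[] = [
  {
    userId: "u1",
    purpose: "gaze-based food advertising",
    givenAt: new Date(),
    explicit: true,
  },
];

console.log(hasExplicitConsent(consents, "u1", "gaze-based food advertising")); // true
console.log(hasExplicitConsent(consents, "u1", "general marketing")); // false
```

The design point is that each purpose is checked independently: consent to one use of special category data confers nothing in respect of any other.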
Consent to marketing
A key driver in the development of the Metaverse is its potential to enable new forms of marketing which are seamlessly integrated into the fabric of the Metaverse. For example, an individual heading to a store in the Metaverse might be shown real-time deals on their favourite products as they browse the shelves, based on their previous behaviour.
This is likely to constitute direct marketing under many countries’ data protection laws, which could require the consent of the Metaverse users.
The precise nature of the obligations would likely depend on whether the brands themselves instigate the marketing and how the marketing is presented, including whether the presentation of marketing is more akin to online behavioural advertising or social media marketing (where a network of participants operate to present relevant advertising).
However, in all cases, thought needs to be given as to how and when any required consent would be collected and, in particular, whether “real world” consent can be relied on by brands in the Metaverse and vice versa.
It is one thing to process personal data of adults in the Metaverse, but it is quite another where children are concerned. Many countries’ data protection laws provide special protection for children’s personal data, and data protection authorities and other similar regulators often come down particularly hard on organisations that do not comply with the rules.
In many circumstances, parental consent is required if a child is to participate in an online service, and the GDPR explicitly states that specific protection is required where children’s personal data are used for marketing purposes or creating personality or user profiles.
Sophisticated age verification techniques, the enforcement of age restrictions and measures to deter children from providing their personal data will therefore be essential components of data protection compliance in the Metaverse.
To enable interoperability, data collected by one entity in the Metaverse may have to flow seamlessly between different operators and even platforms. As interoperability improves and consumers are able to move digital assets and avatars between platforms and across the Metaverse, software developers and brands will need to establish bilateral or multilateral data sharing agreements to improve the seamlessness of the consumer experience.
This is not altogether different from the current environment in which databases are bought/sold, but there are conditions which must be met first.
For example, one requirement under many data protection laws is that the receiving party’s privacy notice must be provided to an individual shortly after it receives the data to explain to the individual how their personal data will be processed. These conditions will become increasingly difficult to meet in the Metaverse, where data exchange is rapid and involves a multitude of participants.
One solution to this might be for a central administrator of the Metaverse to give users a clear description of how their data will be used and (if necessary) the opportunity to give consent for various uses. However, data protection regulators have expressed distaste for this type of “catch-all”, bundled approach.
These types of objections are likely to be more forceful in relation to the Metaverse, where the amount of data collected and complexity of data sharing networks is significantly greater in scale.
Data export and localisation
“Seamlessness” in the Metaverse demands that data crosses boundaries at speed and without friction. It will be challenging for organisations and/or central Metaverse administrators to manage this while the rules around data export and localisation are becoming increasingly strict.
Many countries are also beginning to roll out “data localisation” laws which can impose onerous restrictions on data leaving the country in which it was collected (see our publication, Free Flow of Data). It would not be surprising to see developers and/or brands getting together to try and agree large, overarching data sharing/export agreements, although how feasible such initiatives might be remains to be seen.
Responsibility for data breaches and cyber-attacks
As with any online platform, the Metaverse will face the usual challenges of fending off cybersecurity incidents and data breaches. However, in the Metaverse these types of attacks may also take more ‘sci-fi’ type forms through deep fakes and hacked avatars.
These types of incidents may therefore be harder to identify, verify and bring under control, and it may also be difficult to ascertain where responsibilities lie in respect of breach notification to users and data protection authorities, given the complex web of relationships that entangle the Metaverse.
Decentralised / distributed models
The discussion on data, above, underscores a number of tensions that will need to be resolved in the Metaverse:
- Participants will want a seamless experience in traversing the subsystems of the Metaverse.
- The platform technology itself may be decentralised. How will data sharing and a seamless user experience be possible in such circumstances if there is no central co-ordination by, say, an administrator? How will vendors who do not know each other and may have no commercial connection co-operate in relation to the exchange of data?
- Vendors will want to have customer “ownership”. To do that, they may want their own terms and conditions to which a participant subscribes. Will this mean that large areas of the Metaverse will be gated (greatly reducing the user experience)? At the moment, if we want access to the world wide web, we subscribe to an ISP’s terms and conditions for that access, but such terms and conditions do not prescribe the terms applicable to our access to particular websites on the world wide web. We are used to “partitioned” access to websites, governed by separate clickwrap or webwrap terms and conditions. As that approach does not lend itself to seamlessness, how will it be addressed in the Metaverse? Universal terms and conditions seem unlikely, so would technology provide the solution (for example, self-executing smart contracts)?
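To illustrate how technology such as a smart contract might substitute for universal terms and conditions, the following sketch (the names, types and logic are purely hypothetical assumptions, not an existing protocol) grants a participant access to a Metaverse subsystem only if they have already accepted that subsystem's machine-readable terms:

```typescript
// Hypothetical self-executing access check. In a real deployment this logic
// might live in a smart contract on a distributed ledger, executing
// automatically as a participant crosses subsystem boundaries instead of
// presenting repeated clickwrap prompts.
type TermsVersion = string;

interface Participant {
  id: string;
  acceptedTerms: Set<TermsVersion>; // machine-readable terms already accepted
}

function grantAccess(p: Participant, requiredTerms: TermsVersion): boolean {
  // Access to a gated subsystem follows mechanically from prior acceptance.
  return p.acceptedTerms.has(requiredTerms);
}

const alice: Participant = {
  id: "alice",
  acceptedTerms: new Set(["shop-v1"]),
};

console.log(grantAccess(alice, "shop-v1"));  // true  - terms already accepted
console.log(grantAccess(alice, "arena-v2")); // false - would trigger a prompt
```

Whether such mechanical acceptance could ever satisfy the consent and transparency requirements discussed earlier is, of course, exactly the open legal question.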
Competition law issues may arise as a consequence of both developer and participant conduct. Businesses developing Metaverse products and services on their own are unlikely to face antitrust concerns. However, the global and interoperable nature of the Metaverse will inevitably encourage multiple businesses to communicate and co-operate with each other in order to provide greater choice and a better experience to participants. Where they are competitors, communications or co-operation between Metaverse offerings could give rise to antitrust issues, which will need to be examined with caution.
For example, while co-operation among competing Metaverse businesses to facilitate interoperability will most likely be viewed as pro-competitive, any sharing of competitively sensitive information (especially pricing) or agreeing on separate areas of focus and development could constitute serious antitrust infringements and lead to high fines.
To mitigate this risk, Metaverse businesses will need to implement competition policies and training programmes, not only for their employees but, potentially, for certain Metaverse participants as well.
Similar to other online gaming platforms, participants in the Metaverse could engage in conduct that would contravene antitrust laws in the real world. Where online products and services hold real-world value, real-world antitrust laws (such as the prohibition on cartels and joint boycotts) will also apply, which could have both civil and criminal consequences for those participating.
Social Media Regulation
Will social media regulation impact upon Metaverse stakeholders? It is difficult to speculate, so far in advance, on what the legal position will be in relation to the Metaverse when social media itself is not yet heavily regulated globally.
BigTech, as incumbents, have a particular interest in the evolution of the Metaverse. Some commentators are calling for tougher regulation in order to make BigTech more accountable for content that appears on their platforms.
Social media regulation: BigTech case studies
- EU: EU regulators are considering introducing conditions that would require BigTech to pay for news services, and to require them to inform publishers about changes to how they rank news stories on their sites3. The EU’s proposed Digital Markets Act (discussed below) aims to prohibit unfair practices such as “self-preferencing” (for example, when BigTech prioritise their own goods / apps / services in product displays), and would introduce interoperability and data sharing obligations. See our blog, Shaken, not stirred: EU mixes big tech regulation and antitrust, for more detail.
- UK: the UK government plans to introduce an Online Safety Bill in 2021, which - reportedly - will create a new duty of care for a number of online service providers and will determine what steps must be taken to address illegal and harmful content. It is also reviewing whether the criminal law can more effectively address online abuse.
- Australia: Australia recently passed the News Media Bargaining Code with the aim of requiring certain BigTech companies to pay for news content on their platforms. The code also requires tech platforms to inform publishers of changes to the algorithms that determine which stories are presented. There were suggestions that BigTech might withdraw from Australia, with one such business introducing a temporary ban on news content. However, individual deals have now been agreed between BigTech and local media companies4.
- USA: in the United States:
- Section 230 of the 1996 Communications Decency Act (47 U.S.C. § 230) has once again been called into question. (Scrutiny of section 230 is not new, and the technology industry has long held that it is a vital protection.) Section 230 generally protects the owners of any “interactive computer service” from liability for content posted by third parties, and entitles internet platforms and social media sites to moderate (but not substantively change) content from third parties without being held liable for what they say.
- Because most social media sites are not run by government entities, the freedom of speech requirements of the First Amendment to the U.S. Constitution do not apply - social media sites are free to permit or to deny access to anyone as long as that decision does not violate laws applicable to private entities.
- In February 2021, the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act was introduced into the US Senate (S. 299). It aims to limit the scope of protection offered by section 230 and to enable social media businesses to be held accountable for enabling cyber-stalking, targeted harassment and discrimination on their platforms. As at the date of writing it is not yet clear what will come of the proposal within Congress.
To track further developments in social media law in North America, see our Social Media Law Bulletin.
There is no doubt that change is coming in many jurisdictions for BigTech and social media platforms. As the Metaverse emerges, key stakeholders will face the same kind of scrutiny in relation to the same kind of content.
Intellectual property rights
If you collaborate with others to generate intellectual property rights, who owns the created rights? Principles of joint authorship and co-ownership are complicated, and their application will become more so in complex virtual world scenarios where a community of stakeholders may have been involved.
It is for these types of reasons that the European Commission is considering legal reforms to clarify the position on “co-generated” data arising out of new technologies, as well as in relation to machine-generated data. Metaverse stakeholders will need to navigate these kinds of issues when participating in the Metaverse.
An IPR licence is a permission to do that which would otherwise be forbidden by intellectual property rights. The fast-moving world of the Metaverse may involve character “mash-ups” and the bringing together of intellectual property rights owned by separate stakeholders. Infringement caused by “use in combination” with other intellectual property rights is a typical carve-out in indemnities included in licences for intellectual property, but “use in combination” is precisely the kind of scenario that the Metaverse will bring about. Traditional risk allocations in IPR licences will need to be reviewed, as will scope of use provisions.
European Regulatory Initiatives
Some recent European legislative initiatives illustrate the kind of approach that regulators might take in dealing with the Metaverse (of course, the legislation might be in a very different form by then).
The Platform to Business Regulation
The e-commerce aspects of the Metaverse would be subject to laws that regulate e-commerce in Europe. The EU Regulation on Promoting Fairness and Transparency for Business Users of Online Intermediation Services (Regulation (EU) 2019/1150), known as the “Platform to Business Regulation” (P2B Regulation), deals with harmful trading practices by providers of online intermediary services (including online marketplaces, social media platforms and search engines) when dealing with their business users (that is, vendors). Its purpose is to establish a fair, predictable, sustainable and trusted online business environment by enhancing transparency and redress obligations for online platforms towards those business users. As there will be a multiplicity of vendors operating in the Metaverse, regulations like these will be important.
Among other things, the P2B Regulation imposes requirements on digital platforms in relation to vendors whose content they carry or host (or whose goods or services are sold on the platform) to:
- Provide a description of any differentiated treatment which the platform gives in relation to goods or services offered to consumers by itself, on the one hand, and by other vendors, on the other hand.
- Not terminate a vendor’s participation on a platform without clear reasons, and provide a right of appeal thereafter.
- Disclose the parameters that the platform uses to rank vendor goods and services in terms of search results.
Digital Services Act
The European Commission (EC) has proposed a new piece of legislation, the Digital Services Act (DSA), whose aim is to increase the transparency and safety for users in online environments and to allow innovative digital businesses to grow.
What does the Digital Services Act provide?
While still just a proposal at the moment, the DSA draft seeks to (among other things):
- Clarify definitions of illegal information and illegal activity.
- Harmonise the scope of covered online services by providing an up-to-date definition of “information society services”.
- Increase liability of intermediaries.
- Provide for responsibility for timely removal of illegal/harmful content.
- Incentivise the adoption of standard technical measures.
- Provide greater transparency around online advertising and smart contracts and other emerging issues and opportunities.
A key aspect of the DSA is to introduce liability and safety rules for digital platforms, services and products, and that raises the issue of how to strike a balance between:
- Ensuring digital intermediary service providers are held responsible for content moderation, data sharing and usage, and regulatory oversight.
- Avoiding imposing unreasonable penalties on service providers.
Vendors operating in the Metaverse should expect to navigate the same kinds of issues.
Digital Markets Act
The EU’s Digital Markets Act (DMA) will:
- Create a new framework to identify gatekeeper platforms.
- Require or proscribe certain gatekeeper practices.
- Give the EC new powers to conduct investigations and impose both behavioural and structural remedies, such as divestitures.
According to the EC, gatekeeper platforms benefit from extreme scale economies, very strong network effects, a significant degree of user dependence, lock-in effects, a lack of multi-homing for the same purpose by end users, vertical integration, and data-driven advantages.
These characteristics make users vulnerable to practices that can substantially undermine the contestability of “core platform services” and lead to unfair treatment of business and end users.
What are “core platform services” under the DMA?
Under the DMA (Article 3):
- The EC will designate providers of “core platform services” as gatekeepers if they have a significant impact on the internal market; their core platform service(s) serve as an important gateway for business users to reach end users; and they enjoy an entrenched and durable position.
- “Core platform services” include online intermediation services; online search engines; online social networking services; video-sharing platform services; “number-independent interpersonal communication services” (i.e., messaging services not linked to mobile phones); operating systems; cloud computing services; and online advertising services.
Clearly, key stakeholders in the Metaverse might run the risk of being regarded as providing core platform services within the meaning of the DMA.
The EC has identified the core platform services on the basis that these services are typically controlled by highly concentrated multi-sided platforms that act as gateways for business users (vendors, in the Metaverse context) to reach their customers and vice versa.
According to the EC, the gatekeeper power is often misused by means of unfair behaviour vis-à-vis economically dependent business users (vendors) and customers, leading to barriers to entry and weak contestability of the markets for these services.
Gatekeepers would be subject to a range of enforceable obligations in Article 5. Of particular relevance in relation to the Metaverse are that gatekeepers must:
- Refrain from combining personal data sourced from their core platform services with personal data from other services offered by the gatekeeper or from third-party services or from signing in end users to other services to combine personal data without consent. As noted in our discussion of data earlier, the sharing of data within the Metaverse would be a key way of creating seamlessness of a participant journey through it, so such a prohibition would have significant consequences in the absence of such consent.
- Allow business users to promote offers to and conclude contracts with end users, and allow end users to access and use content, subscriptions, features or other items from the business user, via the business user’s app on the gatekeeper’s platform, regardless of whether the end user uses the gatekeeper’s core platform services. As noted in our discussion on decentralised / distributed models earlier, gating subsystems in the Metaverse with separate terms and conditions could impact upon the seamlessness of a participant journey.
- Refrain from requiring business users to use, offer or interoperate with an identification service of the gatekeeper. In general, identity passporting across the Metaverse will be a key issue, and one that touches on the extent to which Metaverse stakeholders can own customer relationships.
See our blog, Shaken, not stirred: EU mixes big tech regulation and antitrust, for more detail about the DMA.
Proposed EU AI Regulation
The European Commission has published a proposal for an AI Regulation (see our blog, EU proposes new artificial intelligence regulation). Many human interactions within the Metaverse may be enabled through artificial intelligence. The regulation would prohibit certain AI practices and require providers and users (among others) to comply with various obligations in relation to high risk AI systems, and to comply with transparency obligations.
If much human/system interaction within the Metaverse is to be seamless and driven by AI (particularly any AI interpreting or manipulating human responses or simulating reality through “deep fakes”), relevant stakeholders should expect that they will need to comply with regulatory requirements of this type in the years ahead.
As aspects of the Metaverse materialise, so too will other legal issues we cannot yet foresee. We can, however, expect that Metaverse stakeholders will need to deal with the full range of other issues that would apply in any trading and digital context, such as anti-money laundering issues, sanctions and technology export restrictions, financial services regulation, infringement of intellectual property rights, etc.
The Universe is expanding. One day that might also be true of the Metaverse.