European Union Launches Revolutionary Proposals to Regulate “Big Tech”
On December 15, the European Commission (Commission) proposed drafts of two landmark digital legislative packages — the Digital Markets Act (DMA), which proposes new competition rules for so-called “gatekeeper” platforms to address alleged unfair practices and make them more contestable by competitors, and the Digital Services Act (DSA), which recommends revamping content moderation rules for “very large online platforms.”
The new rules, if they pass into law in their current form, would impose a stringent regulatory regime on Big Tech and give the Commission new enforcement powers. The draft regulations foresee severe fines for noncompliance — up to 10% of a company’s global revenues under the DMA and up to 6% under the DSA. The Commission would also be able to impose structural remedies, such as obliging a gatekeeper to sell all or part of a business, on companies that repeatedly engage in anticompetitive behavior prohibited by the DMA.
The proposals mark the beginning of a legislative process that is likely to be controversial and hotly contested, as there are marked differences of opinion on whether these proposals go too far, do not go far enough, or are necessary at all in light of preexisting competition powers.
I. The Digital Markets Act
The draft DMA proposes new restraints on the behavior of so-called gatekeeper platforms, with the power to issue fines (and structural separation in the most egregious cases) for noncompliance.
Under the DMA, companies would be presumed to be gatekeepers if they provide “core platform services”1 and meet the following quantitative criteria:
a) achieve annual turnover in the European Union (EU) in excess of €6.5 billion in each of the last three financial years or have an average market capitalization in excess of €65 billion, and provide a core platform service in at least three member states
b) control an important gateway between EU businesses and final consumers (presumed to be the case if the company operates a core platform service with more than 45 million active end users in the EU and more than 10,000 business users in the EU)
c) maintain an entrenched and durable position (presumed to exist if the other two criteria have been met in each of the last three financial years)
Companies would be obliged to proactively notify the Commission within three months of meeting the quantitative thresholds, and gatekeeper status would be presumed to apply unless service providers can demonstrate that certain narrow exceptions apply. The Commission would review gatekeeper status designations every two years or where there is a substantial change in the facts.
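Purely as an illustration, the cumulative quantitative presumption described above can be sketched as a simple check. All names and the data structure are hypothetical; only the numeric thresholds come from the draft text.

```python
from dataclasses import dataclass

@dataclass
class PlatformMetrics:
    # Hypothetical inputs; the thresholds in presumed_gatekeeper()
    # are taken from the draft DMA's quantitative criteria.
    eu_turnover_eur: float            # annual EU turnover in each of the last three years
    market_cap_eur: float             # average market capitalization
    member_states_served: int         # member states where a core platform service is offered
    active_end_users_eu: int          # active end users in the EU
    active_business_users_eu: int     # active business users in the EU
    years_criteria_met: int           # consecutive financial years criteria (a) and (b) held


def presumed_gatekeeper(m: PlatformMetrics) -> bool:
    """Sketch of the draft DMA's quantitative gatekeeper presumption."""
    # (a) size: turnover or market-cap threshold, plus presence in >= 3 member states
    size = (m.eu_turnover_eur > 6.5e9 or m.market_cap_eur > 65e9) \
        and m.member_states_served >= 3
    # (b) important gateway: end-user and business-user thresholds
    gateway = (m.active_end_users_eu > 45_000_000
               and m.active_business_users_eu > 10_000)
    # (c) entrenched and durable position: criteria met in each of the last three years
    entrenched = m.years_criteria_met >= 3
    return size and gateway and entrenched
```

All three limbs must hold for the presumption to apply; a company meeting the thresholds would then bear the burden of rebutting the designation.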
Obligations on gatekeeper platforms
The Commission proposes that gatekeeper platforms should “carry an extra responsibility to conduct themselves in a way that ensures an open online environment that is fair for businesses and consumers.” Within six months of being designated as a gatekeeper platform, companies would be obliged to comply with a number of obligations and restrictions. The Commission distinguished between (i) “self-executing” obligations and (ii) obligations that are “susceptible to specification,” meaning that the Commission would have the ability to further detail the precise measures a particular gatekeeper must take to comply with the overarching obligation in context.
Under the “self-executing” obligations, gatekeeper platforms would be required to
a) refrain from combining user data gathered from core platform services with user data gathered from other services offered by the gatekeeper or third parties without proactive user consent
b) allow business users of online intermediation services to offer goods and services on more favorable conditions than those offered by the gatekeeper platform
c) allow business users to promote offers and conclude contracts with end users, both through the core platform services of the gatekeeper and through alternative channels
d) refrain from inhibiting business users from raising concerns about unfair behavior by gatekeepers, for example through confidentiality clauses
e) refrain from requiring business users to make use of the identification systems of the gatekeeper over alternative identification systems as a condition to the provision of core platform services or products
f) refrain from requiring business users or end users to use any other core platform service as a condition of access to the gatekeeper’s core platform services
g) provide business users of the gatekeeper’s online advertising services with transparency about the prices paid for different advertising services
Obligations susceptible to specification
The obligations that are “susceptible to specification” require gatekeepers to
a) refrain from using data gathered from business users as a provider of core platform services to compete against the same business users
b) allow end users to uninstall preinstalled software applications on core platform services
c) allow the installation of third-party software applications and software application stores and allow such software to be accessed outside of the core platform services of the gatekeeper
d) refrain from ranking a gatekeeper’s products and services more favorably than those offered by a third party and apply fair and nondiscriminatory conditions to such ranking
e) refrain from technically restricting the ability of end users to switch between and subscribe to different software applications
f) allow business users access to and interoperability with operating systems, hardware, and software used by the gatekeeper
g) provide advertisers and publishers with access to the performance measuring tools of the gatekeeper and the information necessary to carry out their own independent verification of ad inventory
h) provide business users and end users portability of data generated through the use of the relevant platform
i) provide business users access to data generated in the context of the use of a core platform service
j) provide third-party providers of online search engines access to ranking, query, click, and view data generated by end users on the online search engines of the gatekeeper
k) apply fair and nondiscriminatory general conditions of access for business users on software application stores
Market investigation tool
The draft DMA would also empower the Commission to open and carry out market investigations for any one of the following purposes: (i) identifying or designating gatekeepers, (ii) investigating systematic noncompliance with the obligations laid down in the DMA, and (iii) examining new services and practices to determine whether they should be added to the list of core platform services. Member states would also be entitled to request the Commission to open an investigation.
Enhanced merger rules
Gatekeeper platforms would be required to inform, rather than fully notify, the Commission of any acquisition they make, regardless of its size and of whether it meets relevant merger filing thresholds.
Enforcement under the DMA
The Commission would be empowered to issue noncompliance decisions where it finds that a gatekeeper has not complied with the obligations set out in the DMA. Companies would have the opportunity to be heard on preliminary findings and proposed measures. Infringing gatekeeper platforms would be liable for fines of up to 10% of their global revenues. The Commission would also be entitled to (i) issue requests for information, (ii) conduct on-site inspections, (iii) impose interim measures on the basis of a prima facie finding of an infringement, and (iv) accept commitments from gatekeeper platforms.
In the case of systematic infringements (where three noncompliance or fining decisions have been issued to a gatekeeper platform within a period of five years), where necessary to achieve compliance, and where no alternative, equally effective measures are available, the Commission could impose additional remedies including structural remedies, such as obliging a gatekeeper to sell all or part of a business.
II. The Digital Services Act
The Digital Services Act would modernize and clarify the regulatory framework established by the 2000 e-Commerce Directive. It would apply to online intermediaries, which include services such as internet service providers, cloud services, messaging services, marketplaces, and social networks.
Obligations on intermediary service providers
All providers of intermediary services would have to
a) establish a single point of contact to facilitate direct communication with member states’ authorities (i.e., digital service coordinators), the European Board for Digital Services, and the Commission
b) designate a legal representative in the EU (if they are not established in the EU but offer their services in the EU)
c) set out in their terms and conditions any restrictions they may impose on the use of their services and act responsibly in applying and enforcing those restrictions
d) comply with transparency reporting obligations in relation to the removal and the disabling of information considered to be illegal content or contrary to the providers’ terms and conditions
Obligations on hosting services and online platforms
Specific due diligence obligations apply to hosting services and online platforms. In particular, all online platforms (including social networks, content-sharing platforms, app stores, online marketplaces, online travel, and accommodation platforms), except the smallest, would be required to
a) set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms
b) cooperate with trusted flaggers and take measures against misuse, such as abusive notices
c) inform competent enforcement authorities of any information giving rise to a suspicion of serious criminal offenses involving a threat to the life or safety of persons
d) vet the credentials of third-party suppliers
e) publish reports on the removals and the disabling of information considered to be illegal content or contrary to their terms and conditions
f) ensure greater advertisement transparency by letting users know “in a clear and unambiguous manner and in real time” that they are viewing an ad; consumers would also have to be told who is behind the ad and be given “meaningful information about the main parameters used to determine” why they were targeted
Obligations on “very large online platforms”
A subset of rules specified in the Digital Services Act focuses on “very large online platforms” — that is, those companies with at least 45 million users in the EU (i.e., representing 10% of the European population). In particular, in addition to the obligations set out above, the proposed draft DSA would require “very large online platforms” to
a) assess and mitigate the “systemic risks stemming from the function and use of their services.” The draft identifies three categories of systemic risk: (i) the spread of illegal content, such as child sexual abuse material or illegal hate speech, as well as the conduct of illegal activities; (ii) the impact of the service on the exercise of fundamental rights, including the freedom of expression and information, the right to nondiscrimination, and the right to private life; and (iii) the intentional and, oftentimes, coordinated manipulation of the platform’s service that can affect public health, civic discourse, electoral processes, public security, and protection of minors
b) share relevant data with authorities, independent auditors, and vetted researchers on how they comply with the rules
c) set out in their terms and conditions “in a clear, accessible and easily comprehensible manner” the main parameters used in their recommender systems (if they are using one)
d) compile and make publicly available a repository containing various information about the content, display, and recipients of ads
e) comply with strict transparency reporting obligations
“Very large online platforms” would also be required to appoint “one or more” compliance officers, who would be responsible for ensuring that the company abides by these obligations and who would cooperate with the European Board for Digital Services and the Commission.
New duties for EU member states
The draft DSA also sets out new duties for EU member states, which must name a digital service coordinator to act as a single contact point for the Commission and take part in a new advisory group — the European Board for Digital Services (Board). If a digital service coordinator finds that a “very large online platform” has failed to comply with the rules, such platform would be required to come up with an “action plan” to address the violations of the regulation.
Codes of conduct
The Board and the Commission would encourage and facilitate the drawing up of codes of conduct to facilitate the application of the rules. Where a “significant systemic risk” emerges and concerns several “very large online platforms,” the Commission may invite, in addition to the platform concerned, other online platforms, providers of intermediary services, civil society organizations, and other interested third parties to participate in the drawing up of codes of conduct.
Enforcement under the DSA
Failure to comply could lead to fines of up to 6% of the company’s total turnover in the previous financial year, depending on the severity, duration, and recurrence of violations. In addition to fines, the Commission would be able to enforce the regulation through (i) periodic penalty payments of up to 5% of the average daily turnover in the preceding financial year and (ii) interim measures requiring the company to immediately cease the violations.
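Purely as an illustration, the DSA’s monetary ceilings amount to simple percentage caps on turnover. The function names and the example figures are hypothetical; only the percentages come from the draft.

```python
def dsa_fine_cap(total_turnover_eur: float) -> float:
    """Maximum DSA fine: up to 6% of total turnover in the previous financial year."""
    # Integer percentage arithmetic avoids binary-float rounding of 0.06.
    return total_turnover_eur * 6 / 100


def dsa_periodic_payment_cap(avg_daily_turnover_eur: float) -> float:
    """Maximum periodic penalty payment: up to 5% of average daily turnover
    in the preceding financial year."""
    return avg_daily_turnover_eur * 5 / 100
```

For a hypothetical platform with €1 billion in annual turnover, the fine ceiling would be €60 million; the corresponding DMA ceiling of 10% of global revenues would be notably higher.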
The Commission would also be able to conduct its own investigations following complaints or upon a request from a digital service coordinator or advice from the Board. The Commission would be given “strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation.” The Commission’s enforcement powers would extend from conducting dawn raids and investigations to obliging companies to hand over “data-bases and algorithms of undertakings” and interviewing employees.
III. Next Steps
The European Parliament and EU member states will debate the draft DMA and DSA in the new year.
The EU proposals are in line with, and bolstered by, similar moves in other jurisdictions such as the UK, Germany,2 Japan, and Australia, which are also individually drawing up new tech-focused legislative proposals. Similar proposals are gaining momentum in the U.S. Earlier this month, the Commission published a new trans-Atlantic agenda for global change, which, among other things, calls for greater trans-Atlantic cooperation on antitrust enforcement in digital markets, standard-setting, and data governance.
The approval process for the new laws could take two or more years. With so many stakeholders likely to offer views, the proposals are unlikely to be adopted without amendment, and these dossiers are likely to remain among the most hotly debated facing the member states and the European Parliament for some time to come.
1 “Core platform service” is defined as (a) online intermediation services; (b) online search engines; (c) online social networking services; (d) video-sharing platform services; (e) number-independent interpersonal communication services; (f) operating systems; (g) cloud computing services; (h) advertising services, including any advertising networks, advertising exchanges, and any other advertising intermediation services, provided by a provider of any of the core platform services listed in (a) to (g).
2 Article 1(5) of the draft DMA states that “Member States shall not impose on gatekeepers further obligations by way of laws, regulations or administrative action.” However, the German government issued a statement earlier today saying that it will “work to ensure that national and European digital rules complement each other well.” The new German competition rules for online platforms should come into force in early 2021, before the DMA and DSA are finalized.