Two years after first announcing its intention to introduce a new regime to address illegal and harmful content online, the UK Government published the Online Safety Bill (“Bill”) on 12 May 2021. The Bill imposes duties of care on providers of digital services, social media platforms and other online services, making them responsible for content generated and shared by their users and requiring them to mitigate the risk of harm arising from illegal content (e.g., by minimising the spread of such content). The Bill also aims to ensure that users are able to express themselves freely online and requires platforms to consider the importance of freedom of expression when fulfilling their duties.
The Bill designates the Office of Communications (“OfCom”) to oversee and enforce the new regime and requires OfCom to prepare codes of practice outlining recommendations for businesses on complying with their duties. OfCom will have the power to fine companies up to £18 million, or ten (10) percent of qualifying revenue, if they fail in their new duty of care.
In terms of next steps, the Bill will now undergo pre-legislative scrutiny by a joint committee of MPs during the current parliamentary session, before a final version of the Bill is introduced to Parliament later this year.
Set out below are some key points to consider for in-scope service providers when preparing for the Bill:
- Scope of the Bill: The Bill applies to two types of services: (i) “user-to-user services” – meaning an internet service enabling an individual user of that service to generate, upload or share content on that service, such as Twitter; and (ii) “search services” – meaning search engine services such as Google. Notably, only services that allow for online interactions between users, such as direct private messaging, are in scope. Services provided both within and outside the UK are in scope where the service has some nexus to the UK (i.e., is capable of being used in the UK or where the UK forms a target market for the service).
A number of services are expressly carved out of the scope of the Bill, including email, SMS and MMS services, as well as internal business services.
- Illegal and harmful content: Under the Bill, illegal content includes content relating to terrorism and child sexual exploitation and abuse. Harmful content is not specifically defined in the Bill but includes content that the service provider should reasonably identify as having a material risk of an adverse physical or psychological impact on a child or adult of “ordinary sensibilities”, taking into account how many users may encounter such harmful content and how quickly such content may be shared through a service. However, the potential financial impact of any content is not to be considered in determining whether it is harmful, meaning online fraud and cyber-scams will not be caught by the harmful content provisions in the Bill.
- Tiered duties of care: The Bill divides service providers into four (4) categories when it comes to their duty of care obligations: (i) providers of regulated user-to-user services; (ii) providers of user-to-user services that are likely to be accessed by children; (iii) providers of “Category 1” services (i.e., providers with additional duties to protect certain types of speech); and (iv) search engine providers. Each category requires service providers to:
o carry out a risk assessment of illegal content;
o take steps to mitigate the risks identified and design and use proportionate systems and processes to minimise the presence of illegal content and remove content as necessary;
o consider users’ rights to freedom of expression and privacy when designing and implementing user safety policies and procedures;
o establish user reporting mechanisms which allow users to report content they consider to be illegal or harmful; and
o record risk assessments and regularly review compliance with relevant duties.
In addition to the duties common to all service providers, Category 1 service providers will have a further duty of care to consider the importance of “content of democratic importance” (e.g., content promoting or opposing a political party) when deciding whether to take such content down, and will need to ensure that diverse political opinions are treated equally in making that decision. Category 1 service providers will also have a statutory duty to safeguard users’ access to “journalistic content” shared on their platforms. Under the Bill, articles by recognised news publishers – a category that includes citizen journalists as well as professional journalists – shared on in-scope services should not be removed as part of a platform’s content moderation obligations.
- OfCom’s enforcement powers: Under the Bill, OfCom, the current regulator of electronic communications and broadcasting, will act as the online safety regulator for in-scope service providers. The Bill provides OfCom with a range of enforcement mechanisms, including (i) issuing financial penalties of up to the greater of £18 million or 10% of global turnover; (ii) taking “business disruption measures”, which include seeking court orders to disrupt the activities of non-compliant providers or prevent access to their service where it is deemed there is a risk of significant harm to individuals in the UK; and (iii) issuing directions and notices of non-compliance. The Bill also provides the Government with deferred powers to introduce a new criminal offence for senior managers where it decides that such action is necessary to promote compliance.
Companies within the scope of the new framework should begin preparing for the Bill’s passage into legislation, including by ensuring compliance with the interim codes of practice published to date and by preparing or revising existing policies, procedures and systems governing the use of their platforms.
This post is as of the posting date stated above. Sidley Austin LLP assumes no duty to update this post or post about any subsequent developments having a bearing on this post.