The UK’s Online Safety Bill (the “Bill”), once enacted, will impose duties of care on providers of digital services, social media platforms, and other online services, making them responsible for content generated and shared by their users and requiring them to mitigate the risk of harm arising from illegal content. Services likely to be accessed by children will also owe a duty to protect children from harm. As currently drafted, the Bill applies to any service or site that has users in the UK or targets the UK as a market, even if it is not based in the country. The Bill is currently at the Committee Stage of the legislative process. Although the Bill is expected to receive Royal Assent during 2023, it remains unclear when its provisions will come into force.
While the Bill passes through Parliament, its future regulator, the Office of Communications (“OfCom”), has announced plans to produce guidance and materials to assist and support in-scope services in carrying out the risk assessments of illegal content required under the new regime and OfCom’s Code of Practice. As part of this plan, OfCom will also publish a sector-wide register of risk, assessing the risks of harm arising from illegal content on user-to-user and search services, as well as risk profiles setting out the key risk factors services should consider when conducting risk assessments.
OfCom stresses the importance of regulated services “understand[ing] what good practice looks like for risk assessment and risk management” in order to improve online safety. To that end, OfCom’s proposed guidance will help services identify potentially illegal and harmful content (especially as it relates to children), explain how such content might appear on a service, and promote good practice around risk management.
Noting that there is no one-size-fits-all approach, OfCom anticipates its forthcoming guidance will outline a four-stage process for assessing risk (which is aligned with the regulatory requirements under the Bill):
- Establish the risk of harm in consultation with the risk profiles produced by OfCom.
- Review the risks identified on the platform and assess the likelihood of harmful content appearing, as well as the severity and impact of any resulting harm.
- Identify and implement risk-mitigating measures and record the outcomes of the risk assessment.
- Monitor and review the effectiveness of risk mitigation measures on an ongoing basis.
Once the Bill enters into force and OfCom’s powers as regulator commence, OfCom plans to launch a consultation on its approach to illegal content risk assessments to enable services and relevant stakeholders to provide feedback. A separate consultation on children’s risk assessments will follow. In-scope services should monitor this space, as once OfCom publishes its risk assessment guidance, services will have only three (3) months to carry out their first illegal content risk assessments.
OfCom will have significant enforcement powers and responsibilities under the future regulation. Among its broad powers is the ability to impose fines of up to the greater of £18 million or 10% of annual global qualifying revenue on companies breaching their duties of care. OfCom will be able to issue notices of contravention to service providers and individuals, which (among other things) may include a requirement to use proactive technology, and to seek court orders blocking non-compliant services. Notably, the UK Government has recently announced that it will put forward several amendments to the Bill in the House of Lords, including provision for criminal sanctions against senior managers who fail to ensure their company complies with OfCom’s information requests, or who deliberately destroy or withhold information.
This post is as of the posting date stated above. Sidley Austin LLP assumes no duty to update this post or post about any subsequent developments having a bearing on this post.