March 30, 2022

The Online Safety Bill: Our initial analysis

By Professor Lorna Woods, Professor of Internet Law, University of Essex, William Perrin, Trustee, Carnegie UK and Maeve Walsh, Carnegie Associate

This is our initial response to the Online Safety Bill. We have written extensively about our concerns on the draft Bill and are pleased to see that many of those concerns have been addressed in the Bill. As always, we acknowledge the immense effort that has gone into getting the proposed legislation to this stage – from countless DCMS and other Government officials, multiple Ministers, the members of the Joint Committee who scrutinised the draft so effectively, other Parliamentary Committees whose inquiries have shaped the final Bill, and an array of civil society campaigners whose insight, analysis and tenacity have led to some significant Government concessions.

The Bill is by no means perfect, however, and we still have many concerns that we will need to work through in the coming weeks and during the passage of the Bill through the House. For now, this initial response is structured around a number of tests that we had set for the Bill before its publication, which reflect many of the areas and topics we have written about in recent years. You can download the full analysis here, or click through the links below to read supporting detail on the issues we have highlighted. More will follow from us in due course. In the meantime, for those coming to this fresh, we have also published a short explainer on what the Online Safety Bill does, which can be read alongside this initial analysis.

Summary: areas of improvement and areas of concern

We note the following improvements:

  • Compared to the draft, the approach has become more clearly systems-based, which is welcome. But it is still some way short of a truly systems-based approach: the structure of the Bill, the addition of content-specific new sections, and its drafting still push services towards addressing harmful content rather than systems – including the business model and algorithms. Further amendments to consolidate its focus on systems would make the regime more effective and provide reassurance to those worried about its impact on free speech. Read more in our analysis here.
  • The protections for children have been significantly strengthened, including with the introduction of measures in relation to access to pornography, although children’s rights groups still have concerns about the Child Access Assessment.
  • The enforcement measures seem to be more effective and tied back more explicitly to companies’ risk assessments and the obligations that flow from them. Read more on this here.

There are still areas of concern including:

  • The Bill remains too complex. Some simplifications have been made, but the overall structure is still too difficult to navigate: the principle-based elements (such as the Online Safety Objectives, which appear in Schedule 4) and the definitions are buried at the back rather than placed upfront, and new concessions have, in the main, been bolted on. Complexity, which is the Government’s choice here, increases the regulatory burden. Read our detailed analysis here.
  • The decision to stick with different categories of services, which we and the Joint Committee had argued against in the draft Bill, does not fit with a proportionate or risk-based regime and will not catch harms on fast-growing platforms. This is a significant hole in a systems-based approach: very large size can in itself be an indicator of risk, and using it as a proxy brings administrative simplicity, but it is wrong to suppose that smaller size means lower risk. We continue to recommend that the categories of providers be removed and that risk assessment duties apply across the board.
  • Too many powers remain with the Secretary of State, with too much of the regime’s detail left to secondary legislation. We suggest that the Secretary of State’s powers to direct OFCOM on the detail of its work (such as codes) be removed. On national security, the Government should have carefully constrained powers, and OFCOM’s Board should be bolstered to oversee national security issues[1]. Read our analysis of the Secretary of State’s powers here.
  • The Bill remains weak in addressing harms to adults and wider societal harms. Given that the Government is listing harms in the Bill, it should list harms to children and to adults in new Schedules 7(a) and 7(b) respectively, to enable debate and give victims and companies more certainty. Three years on from the White Paper, the Government must know what it intends to include. Read our analysis on whether the Bill protects the most vulnerable here and on its protections for fundamental rights here.
  • Measures to tackle mis/disinformation are still largely absent from the Bill. The Government should reform the unaccountable system in which civil service teams nudge service providers to deal with issues, possibly on a piecemeal basis, and should bring disinformation into scope. It should also create a formal mechanism for dealing with state-actor disinformation, such as that from Russia, and with public health issues such as COVID. Read more on how to address this here.
  • The Bill envisages cross-organisation working between OFCOM and other regulators but does not create the powers that may be needed for domestic inter-regulator cooperation. The Government should give clear powers to OFCOM (similar to those in Clause 97) to ensure that case files can flow through and between regulatory systems in accordance with the law. Read more on this topic here.
  • Fraudulent advertising has the same impact on victims, whatever platform it occurs on. The new fraudulent advertising powers should apply with similar strength to all companies, not just Category 1 and Category 2A services.
  • Human trafficking offences are a serious omission from Schedule 7 that should be rectified. Advertising is an important route into modern indentured servitude and should clearly be in scope for that offence.
  • The regime will not start working properly until 2024. The Government should seek to bring some aspects forward; one area worth more study could be making companies’ terms and conditions enforceable under the regime from the moment of Royal Assent.

We work through these areas in more detail below.

  1. Is the Bill workable?
  2. Is it systemic and risk-based?
  3. Does the Bill protect fundamental rights?
  4. Does the Bill protect the most vulnerable?
  5. The role of the state
  6. Transparency
  7. Societal harm
  8. Effective enforcement
  9. Is the Bill future proofed?
  10. Working with other regulators

Conclusion

We will return to many of these issues – and more – in blogs and more detailed publications in the coming months and will continue to work in support of civil society partners, policymakers and Parliamentarians as the Online Safety Bill makes its way through Parliament. In the meantime, as always, we welcome feedback on our analysis – where we have got things right and, as importantly, where we may also have got things wrong. Do contact us at [email protected].


[1] This paragraph was amended 08/04/22 to clarify the recommendations on the Secretary of State’s powers.