We welcome the report of the Joint Committee on the Draft Online Safety Bill, published today, and the thorough, committed approach taken by the Committee members under the chairmanship of Damian Collins MP. It is no small task to undertake pre-legislative scrutiny of such a complex draft Bill. The Committee has reviewed over 230 pieces of written evidence, heard from over 50 witnesses over the course of 11 oral evidence sessions, held four roundtables, visited Brussels, and produced a detailed, measured report in little more than three months. This period has also seen increasing political pressure on the Government to “get on with” the Bill and a wholesale change in the Ministerial team at DCMS. In Parliament, parallel inquiries from multiple other Parliamentary committees have added to the demands for amendments to the Bill, while the global impact of Frances Haugen’s Facebook revelations, which broke midway through the Committee’s deliberations, raised the profile of this topic still further.
There is much to consider in the report which – commensurate with a Bill of this nature – is long and detailed. We will write more in the new year, once we have had time to digest it fully, on what it means for the Government’s next draft of the Bill.
For now, there are a few headline points to make.
- The Committee has taken on board the recommendations made by Carnegie UK and others regarding the complexity of the Bill, and has brought the focus much more firmly back to regulating companies’ systems and processes through measures that are proportionate, while ensuring rigorous regulatory oversight and accountability. We recently recommended the introduction of a “foundation” duty of care and are pleased to see that much of the detail in the Committee’s recommendations, such as the new suite of safety objectives and their link through to OFCOM’s codes of practice and powers, aligns with our intention.
- We welcome the fact that the Committee has proposed the removal of clause 11 – the contentious “legal but harmful” clause – and proposed instead that companies should have in place “proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities defined under the Bill”. This approach and language aligns with our recommendations. The Committee’s definitions reference “specific areas of law recognised in the offline world or specifically recognised as legitimate grounds for interference in freedom of expression”. We will do further work to identify how far the examples suggested by the Committee correspond to the revised definition of harm we proposed recently, but commend the Committee for its work here.
- The Committee has also responded positively to the arguments advanced by ourselves and others that the Secretary of State’s powers were too far-reaching, threatening the independence of Ofcom, and has proposed the removal of some of these powers. It is also welcome that, in proposing new Ofcom-drafted Codes of Practice to support the Committee’s revisions, the Committee has underlined the importance of Ofcom starting work on this vital architecture for the new regime quickly. We hope that DCMS will instruct it to do so at the earliest opportunity.
- In terms of scope, we are pleased to note that the Committee has addressed the following areas, which have been central to many civil society campaigns:
  - bringing fraud and paid-for adverts into the scope of the Bill;
  - addressing mis- and disinformation, specifically in relation to electoral integrity, as well as its wider societal impacts;
  - removing the unnecessary distinction between categories of companies, which would potentially have excluded many smaller companies from accountability for significant harm;
  - addressing the very specific risks to children and vulnerable groups arising from cross-platform activity.
The Committee has also made a number of proportionate recommendations to address, for example, the risks arising from online anonymity in relation to the targeting of hate and abuse; age assurance, specifically in relation to children’s access to pornography online; and the need for greater transparency from platforms as to the actions they are taking to mitigate the risks of harm, or to address it when it has been identified on their services.
What happens next?
By publishing the Bill in draft for pre-legislative scrutiny, the Government made it clear that there was room for improvement and invited open debate and discussion. It has been notable in the past three months that, for a regulatory proposal that is novel and – in many ways – contentious, there is growing consensus amongst parties on all sides of the debate as to where those improvements should be made. These include: simplifying the Bill and ensuring it focuses more on systems and processes, rather than individual items of content; protecting the independence of the regulator and reining in the powers of the Secretary of State; being clearer about what constitutes harm online; and addressing some of the challenges of online anonymity without removing the protections this offers to many.
The Committee has done an exemplary job in addressing these high-profile issues, suggesting practical solutions that balance competing pressures and inherent tensions, while being mindful of the need to avoid a suite of recommendations that could scupper the Bill entirely. We wrote previously that it would be akin to a “sunk cost” fallacy for the Government to stick too closely to the text of the existing draft Bill. It is welcome that the Secretary of State has indicated to the DCMS Committee that the version of the Bill the Department is currently working on is substantively different from the version she inherited. The Secretary of State now needs to deliver on her stated enthusiasm for teamwork with Parliament and review these recommendations in good faith. The final Bill will, in our view, be much better for them.