We welcome the publication of the Online Harms final response: it has been long delayed but there is now a real opportunity to deliver significant improvements to the safety of users of online services in the UK. There is much detail in the government’s publications today – not just the White Paper response, but also the two interim codes of practice and the transparency report – which we will review and analyse before publishing a fuller response.
However, we are encouraged to see that the language in the Government response closely mirrors our work on a statutory duty of care over the last three years: it is built on the need for robust and continuous risk assessment of the design and safety of the systems and processes online platforms use to deliver their services. The decision to enable codes of practice and priority harms to be determined through secondary legislation is a pragmatic way to build flexibility and future-proofing into the regulatory scope.
It is disappointing, therefore, that the Government has categorically ruled fraud and scams out of scope, and that action on misinformation and disinformation in respect of adults is limited to that which causes significant physical or psychological harm to individuals. The Government may well point to other regulators and other work (such as the Defending Democracy programme in relation to electoral harms), but these are underpowered and do not protect users and society from, for instance, damage to electoral processes. We will reserve judgement on the wider scope until we see the proposals for the “priority harms”: there has been a notable lack of focus in today’s debates on online abuse, the intimidation of adults and the intersectional impact of abuse on minority groups. A commitment to freedom of expression does not excuse inaction on these harms, not least because such abuse is often intended to prevent others from speaking freely.
We are also not convinced that the obligation to take action to prevent harms to adults should fall only on the biggest companies: it is often on the smallest platforms that the most damaging and abusive behaviour towards and between adults takes place.
But we do wholeheartedly welcome the final confirmation that Ofcom will be appointed as the independent regulator, and that the government now wants them to get on with it. They are well placed to do so and have already today published some promising statements on how they will approach their role, with a focus on systems and processes rather than regulating individual pieces of content. We also welcome the nod to our arguments for “regulatory interlock” to address the broadest range of harms and bring in specialist expertise, with a commitment that Ofcom should work with other regulators in a “co-designation” framework.
So, there is a good deal to be (cautiously) optimistic about and much more to chew over in the days and weeks ahead. Carnegie UK Trust is charged with improving the lives and wellbeing of the people of the United Kingdom and Ireland. We shall examine carefully how the government’s proposals work in respect of devolution in the UK and also the interface or overlap with the recent proposals from the Government of Ireland in this area.
We will return to this with a more detailed blog post shortly and we hope that the legislative timescale, including pre-legislative scrutiny and the introduction of the Bill, will now not be subject to further delays.
Whilst we shall continue to press for progress, we recognise the incredibly difficult political and national circumstances in which civil servants, regulatory staff and politicians have been working while producing this significant regulation. There are striking similarities to the Health and Safety at Work Act 1974, which was piloted through the House with cross-party support in the aftermath of a national crisis. With the right focus, the Online Safety Act could be just as effective and long-lasting.
The Government’s Online Harms response is here.