Social media has changed the landscape of communication for millions across the globe. The benefits that social networks can bring are plentiful and well documented, but the harm that many people have suffered through abusive or negative engagement with other users on these platforms, or through the impact of design choices that shape or influence the content they view, can be troubling.
Over the past two years, Carnegie UK Trust has shaped the UK debate on reducing online harm through a proposal to introduce a statutory duty of care. Our proposal is for social media companies to design and run safer systems, not for government to regulate individual pieces of content. Companies should take reasonable steps to prevent reasonably foreseeable harms arising from the operation of their services, with the duty enforced by a regulator.
The proposal has been developed by Professor Lorna Woods (Professor of Internet Law, University of Essex), William Perrin (Carnegie UK Trustee) and Maeve Walsh (Carnegie UK Trust Associate). It draws on well-established legal concepts to set out a statutory duty of care backed by an independent regulator, with measuring, reporting and transparency obligations on the companies. A focus on the outcome (harm) makes this approach future-proof and necessarily systemic. We propose that, as in health and safety regulation, companies should run their systems in a proportionate, risk-based manner to reduce reasonably foreseeable harm. Broadcast regulation demonstrates that a skilled regulator can assess harm in context and regulate it while maintaining freedom of speech. Proportionality in regulation allows for innovation and market entry by SMEs.
A draft Online Harm Reduction Bill
We have developed a draft Bill to demonstrate how a regulatory scheme involving a duty of care enforced by an independent regulator might operate. We hope that the Bill can focus public debate on vital elements of how the regime might work. It is short (just 60 clauses) and sets out a definition of the duty of care, who the duty applies to, and the risk management steps that a company should take. It also makes Ofcom the regulator and grants it powers, via amendments to the Communications Act 2003, to take on these new responsibilities. The draft Bill and explanatory notes are here, and we will be engaging with Parliamentarians, policymakers and civil society groups to discuss and scrutinise this approach and its adoption.
Paving Bill: Online Harms Reduction Regulator (Report) Bill
To accelerate progress, the Trust has also supported Lord McNally in preparing a short paving Bill, the Online Harms Reduction Regulator (Report) Bill, which would require Ofcom to prepare for the introduction of an Online Harms Reduction Regulator. It was introduced into the Lords on 14 January 2020; a date for the second reading debate is yet to be confirmed.
A Statutory Duty of Care and a Regulator: full report
The full report, published in April 2019, consolidates the body of work produced throughout the project. It sets out the existing legal context, provides a clear rationale for which platforms should fall within the new regulation, delineates which harms should be covered and explores comprehensively how the proposal can work in practice. It draws on the thinking set out in a series of blogposts previously published on this website, which can be accessed via our blogs page.
Response to the Online Harms White Paper June 2019
In its Online Harms White Paper (April 2019), the Government proposed a statutory duty of care to keep UK users safe online. This paper is the Carnegie UK Trust response to the White Paper and is accompanied by a summary blog, here, focussing on the issues for further consideration.
Further discussion on the differences between the Government’s proposal and the Carnegie UK Trust work, along with responses to some of the FAQs and objections on the White Paper’s proposals, is also set out in this blog from August 2019.
We have recently published detailed thinking on a system of “regulatory interlock” to enable Ofcom to work with other regulators to address online harms, as well as a blog on Covid-19 misinformation and public health harms.
Developing our proposal
We have developed our work in an open, collaborative and constructive way, sharing our thinking with academic and legal experts for critique and collaborating with policymakers, regulators, campaigners, NGOs and other groups representing those who may experience harm online. We have made numerous submissions to Parliamentary inquiries and other consultations (see the download section below), hosted workshops and other events, and published further blogs and articles on our developing thinking.
Please see downloads for additional content including full reports, briefing notes and consultation responses.
In addition to the submissions to consultations which are available below in the download section, our work has fed into a number of media and journal articles including:
- Me and My Trolls, File on 4 podcast with contribution from Professor Lorna Woods (2020)
- Perspectives on children navigating a digital world, Panel discussion with Professor Lorna Woods (2020)
- Why UK digital regulation makes good sense for UK digital businesses, Maeve Walsh (2020) Digital Agenda
- The Time has come for Action on Online Harms, Maeve Walsh (2020) RSA
- Will 2020 be the Year Regulation Catches up with Social Media? Maeve Walsh (2020) International Institute of Communications (IIC)
- Grand International Committee on Disinformation and Fake News (Transcript), Professor Lorna Woods (2019)
- Protecting Social Media Users: Arguing for a Duty of Care, a RightsCast podcast with Professor Lorna Woods
- Duty of Care, Lorna Woods (2019) International Institute of Communications
- Introducing A Duty Of Care For Social Media, Maeve Walsh (2018) Digital Leaders
- Detoxifying Social Media Would Be Easier Than You Might Think, Will Perrin (2018) The Guardian