March 24, 2021

Increased online safety for people involved in the democratic process in the UK

by William Perrin, Trustee, Carnegie UK Trust and Professor Lorna Woods, Professor of Internet Law, University of Essex

Pragmatic, simple steps by regulators and social media platforms can improve the safety of those involved in elections in the United Kingdom and support the electoral process itself.  Building on the excellent work of Parliamentary Committees in the Lords and Commons we describe here steps that would fit an amended UK online safety approach and support the Government’s ‘Defending Democracy’ programme.  This post, the first of two, is a continuation of themes from our 29 January blog that discussed how the insurrection in Washington DC on 6 January overtook the UK government’s online safety proposals of a few weeks earlier.

Online risks in the democratic process

Debate is essential to the healthy functioning of democracy, allowing ideas to be tested and those with power to be held to account. Within this frame some speakers play a particularly significant role. As we have noted, in its freedom of expression jurisprudence the European Court of Human Rights has observed that “while freedom of expression is important for everybody, it is especially so for an elected representative of the people. He represents his electorate, draws attention to their preoccupations and defends their interests” (Castells v Spain, para 42). The same court has emphasised that the media “affords the public one of the best means of discovering and forming an opinion of the ideas and attitudes of political leaders” (Lingens v Austria, para 42).

The United Kingdom government is considering a new law to prevent intimidation of candidates and campaigners in parliamentary elections as part of its ‘Defending Democracy’ programme. The government has also recently published an Action Plan on the Safety of Journalists, reflecting that ‘[t]he societal impact of their professional activities exposes media workers to significantly more risks stemming from actions seeking to prevent or limit their speech than does the average citizen’ (Draghici and Woods). In both cases online issues arise, yet the UK government has taken a policy decision to exclude electoral matters from its online harms proposals and to treat what we might term “public interest speakers”, who are central to public debate, just the same as everyone else, despite the high-risk nature of their roles.

As the Committee on Standards in Public Life noted in its 2017 report Intimidation in Public Life:

Some MPs and candidates have disengaged entirely from social media due to the intimidation they have received; others who may be interested in engaging in public life are being put off by the tone and intensity of political discussion online. (Page 31)

Elections, moreover, are particularly high-risk periods.

‘By their very nature, elections are competitive and adversarial, and political tensions run high during election campaigns. Social media provides a means by which citizens can engage with the political process during these times, but the darker side of such engagement is the intimidation that Parliamentary candidates, party campaigners, and others in public life experience.’ Committee on Standards in Public Life 2017[1]

Analysis of the Representative Audit of Britain (RAB) survey of 2019 candidates’ responses indicates that 49% of candidates reported that they suffered some form of abuse, harassment or intimidation while campaigning. This is an increase of 11 percentage points compared with 2017. University College London Constitution Unit[2]

While elections are particularly high risk, the problems do not stop there, and both election-related abuse and abuse outside the election period may have long-term effects.

Female MPs also suggested that the levels of online abuse and treatment from the media they experience would have stopped them going forward to selection if they had known the extent to which they would be affected by it at the time they first put themselves forward.[3]

This potential silencing of voices, which damages the marketplace of ideas by skewing it towards dominant and entrenched views, appears to have a particularly gendered aspect. This is especially worrying given endemic violence against women. Those in minoritised groups are also likely to be badly affected, and those at the intersection of such groups likely most of all.

There is, then, both a high-risk group of people, defined by their involvement in the democratic process as participants, reporters or those who enable elections, some of whom face continuously elevated risk, and a particularly high-risk period in the run-up to and immediately after elections. Both of these risks should be assessed and mitigated by significant platforms under regulatory supervision.

Moving to action

The Lords Committee on Democracy and Digital Technologies (chaired by Lord Puttnam) and the Commons DCMS Select Committee both made a series of pragmatic recommendations about social media and the functioning of democracy that the government has largely ignored.

The Puttnam Committee, as well as making recommendations about new powers for the Electoral Commission and in respect of online advertising, recommended that the scope of the online safety duty of care should extend to actions that undermine democracy:

‘The Online Harms work should make clear that platforms’ duty of care extends to actions which undermine democracy. This means that the duty of care extends to preventing generic harm to our democracy as well as against specific harm to an individual.’

The Commons Digital, Culture, Media and Sport Committee, chaired by Damian Collins MP, in its 2019 report ‘Disinformation and “fake news”’, recommended that:

‘The Government should carry out a comprehensive review of the current rules and regulations surrounding political work during elections and referenda… The Government should explore ways in which the Electoral Commission can be given more powers to carry out its work comprehensively, including… the legal right to compel organisations that they do not currently regulate, including social media companies, to provide information relevant to their inquiries;’

‘The Government needs to ensure that social media companies share information they have about foreign interference on their sites – including who has paid for political adverts, who has seen the adverts, and who has clicked on the adverts – with the threat of financial liability if such information is not forthcoming.’

In this post, we suggest that a systemic approach to regulation can introduce safeguards for public interest speakers, recognising the high-risk nature of their position, and can also address the specific risks of harm to those involved in elections. We also respond here to suggestions that the operation of online platforms can cause harm to the integrity and probity of the electoral process itself.

The Government’s position might be evolving, as suggested by the Prime Minister’s exchange with Darren Jones MP in the debate on the government’s Integrated Review, which examined global threats to the UK:

Darren Jones MP: I welcome the recognition in the integrated review of threats to our democracy and the role that technology, disinformation and other forms of hybrid warfare play in those threats. On that basis, can the Prime Minister confirm that the online safety Bill that will be presented to the House this year will contain sufficient powers to tackle collective online harms, including threats to our democracy?

Prime Minister: Yes, I can.[4]

Some harms persist year round, but we also use here the British concept of ‘the election period’, during which there is increased regulation and vigilance by public bodies and broadcasters. Online platforms sometimes also voluntarily bring in measures that recognise the sensitivities of that period. Our proposals build on current practice and increase public scrutiny and accountability according to local UK norms.

We set out a series of steps that are congruent with a risk-managed approach but that would require the government to change its policy as set out in the 15 December 2020 statement, the most up-to-date exposition of online safety policy.

Legal context

International human rights law gives considerable scope for parliament to make rules about speech to protect the electoral process and/or people from harm. We described the state of the law in our blog post of 29 January ‘Freedom of Expression, Speech Rights & Modern Regulation’.  The European Court of Human Rights has ruled:

‘… freedom of political debate is undoubtedly not absolute in nature (see Castells, cited above, § 46). The Court has already indicated that some regulation may be considered necessary in order to prevent forms of expression such as direct or indirect calls for violence (see Karácsony and Others[5]).’

The United Kingdom has a strict approach to political advertising on broadcast media (see the discussion of the issues in the Neill Committee report) and the Advertising Standards Authority has recently called for regulation of non-broadcast political advertising.

International context

We note that Canada and the Republic of Ireland have made rules or proposals to govern online political advertising – the LSE provides a good global overview of such rules in a 2020 study. The major platforms have brought in their own rules on elections to varying degrees around the world, as reported by Privacy International. A 2019 University of Amsterdam study for the Netherlands Interior Ministry surveys the regulation of political advertising and of disinformation separately. We also note that the European Commission’s proposed Digital Services Act would require the largest platforms to assess the risk of electoral impact as part of their overall risk assessment.

Proposed changes – in summary

The government should bring into the online safety regime the safety of candidates in their role as candidates, the safety of others involved in elections and threats to the probity and integrity of the election process, rather than apparently excluding these. The government (or OFCOM) should then:

  • Give the Electoral Commission new powers under the online safety regime, using the ‘co-designation’ power;
  • Ask the National Police Chiefs’ Council to ensure their guidance on candidates takes advantage of the online safety regime, working with the regulators.

Democratic actors comprise a high-risk group, and the election period is a high-risk time. Under the supervision of the Electoral Commission, and in co-ordination with OFCOM and the NPCC, significant platforms should:

  • appoint a senior person with appropriate skills to own publicly the risk of harm arising from operation of the platform during electoral periods and the mitigation of that risk (a Senior Risk Owner: Elections (“SRO: Elections”));
  • assess election specific risks of harm arising from the operation of their platforms on a rolling basis and publish that assessment;
  • seek views from potential victims of harm (democratic actors who are ‘public interest speakers’ such as candidates, campaigners and journalists, as well as officials who administer elections, etc.) to inform the risk assessment and demonstrate how the risks identified will be mitigated;
  • put in place a mitigation plan for the risks of harm, agreed with all actors and regulators, that reflects the risk profiles of different types of actor;
  • provide the platform SRO: Elections with resources to offer rapid support to all democratic actors in an enhanced support service;
  • provide enhanced transparency and responsiveness during the election period in particular;
  • regulate and provide real-time transparency for political and issue advertising during the election period, overseen by the SRO: Elections;
  • establish formal processes for detecting and combatting large scale disinformation campaigns that could harm the integrity and probity of an election in the manner of a cyber attack;
  • co-operate with a ‘skilled person audit’ for the regulators at regular intervals;
  • co-ordinate the above under regulatory supervision without fear of being accused of cartel-like behaviour.

The regulators in consultation with regulated platforms and democratic actors will determine performance levels and KPIs for each significant platform.

We address each of these in turn below, save for political advertising and disinformation which will be the subject of a subsequent blog post.

Regulatory structure and code of practice

OFCOM should ‘co-designate’ the Electoral Commission to lead on addressing harms arising from the operation of online platforms during the electoral period. The Government should ensure that the Electoral Commission has the powers and resources to act in this manner. Co-designation is the process that allows OFCOM to bring in specialist regulators to act within its regime; the precise process has not yet been set out. The National Police Chiefs’ Council[6] (which plays a formal role in electoral supervision and safety) should be asked to work with the Electoral Commission and OFCOM.

The regulators should produce a code of practice on online electoral harms in consultation with victims of harm, political parties and platforms amongst others. The code should be consulted upon and be a public document. The code should cover at least the issues raised below.

The application of the code should be proportionate to the risk of harm, as agreed with the regulators. By default the largest elections and referendums will be high risk. However, some smaller elections, such as by-elections for Parliament or local authorities, might still carry a high risk of harm.

Ownership and management of risk

Large platforms already plan for risks around major elections and some have nominated contacts for major political parties for instance on political advertising. We suggest formalising and building upon this. The regulators should consult upon which platforms are significant in electoral processes – for instance Facebook, YouTube, Twitter, TikTok.

Those significant platforms should be required to appoint a suitably skilled and politically neutral Senior Risk Owner: Elections. The SRO: Elections might, for instance, have experience of working on elections at a senior level in a UK broadcaster. The SRO: Elections should have the managerial position and sufficient resources to assess and manage risks around elections on that platform. The development of a more robust risk management culture as part of the wider online safety regime will support the SRO: Elections in their role. The SRO: Elections should be publicly identifiable.

This extends and formalises the ‘pop-up social media team for elections’ concept mentioned in the 2019 Cabinet Office publication ‘Protecting the Debate: Intimidation, Influence and Information – Government Response’.

Assessment of risk – forward look and victim perspective

The SRO: Elections should undertake an annual risk assessment of harm during elections, working with candidates, potential candidates, campaigners, political parties, campaign groups, election officials, office holders and journalists likely to cover an election – ‘Democratic Actors’. In assessing risk, the SRO: Elections should also work with the regulators and with their counterparts in other significant platforms. The regulator (or the government) should ensure that working across platforms to manage risk is not regarded as cartel-like behaviour.

The SRO: Elections should make particular efforts to seek the views of potential and past victims of harm, as well as those in groups likely to experience disproportionate risk of harm, and reflect such views in the risk assessment and mitigation plan. All candidates and potential candidates should be able to input their concerns to the risk assessment.

As an integral part of risk assessment, the SRO: Elections for each significant platform will publish annually an elections forward look for the UK (similar in spirit to the BBC forward look) and guidelines on how the platform manages risk. The risk assessment should be kept under continuous review.

The regulator will assess the sufficiency of the risk assessment and mitigation plan having consulted widely.

However, specific risks that are already apparent should have special focus in risk assessment and mitigation.

Harassment and intimidation

These risks, which have severe effects and a strong online component, already feature in joint guidance from the NPCC, the Crown Prosecution Service and the Electoral Commission, and in further guidance for Parliamentarians from the CPS. That guidance is passive in respect of social media companies. The government is drafting a new offence of intimidation of parliamentary candidates and their campaigners. The online safety regime already suggests that regulated platforms will have to reduce harm arising from harassment and intimidation of platform users and others, even before any electoral measures are put in place.

As democratic actors are a defined high-risk group, regulated platforms should take extra steps to reduce the risk of harm to democratic actors from harassment and intimidation. For some the risk will be continuous; for others it will arise only around an election period. The measures should be rooted in victim experience and overseen by the regulators. This will require regulated platforms to play a more active, responsive role than the current official guidance suggests.

Enhanced complaint and resolution service for democratic actors

Democratic actors are a recognisable group who suffer increased risk of harm and require rapid solutions to threats of harm that arise online. Accordingly, the SRO: Elections should have sufficient resources to provide an expedited rapid-response and dispute resolution service to democratic actors. Such a service will build upon the improved service standards that will in any case come about through the online safety regime. The enhanced service standards offered by the SRO: Elections should be agreed with the regulator and interested parties. The service would address the full range of harms arising, in a manner proportionate to the risk of harm. The service standards should be informed by the election forward look and increase when risk increases, such as around election periods.

Access to platforms

Significant platforms are currently open to all, and this should be formalised. Official candidates in elections (those who have had their papers accepted) should have fair, reasonable and non-discriminatory access to significant platforms. Candidates will be expected to follow platform rules. The online safety regime should improve the operation of enforcement and appeals processes in general – candidates unhappy with the application of platform rules will have access to the enhanced procedure via the SRO: Elections.

Co-ordination between regulated platforms

The regulators should work with significant platforms to co-ordinate their activity in respect of electoral harms through information exchange between the various SRO: Elections. The regulators and the CMA should reassure companies that such activity, within reason, does not amount to cartel behaviour.

Transparency and audit

Significant platforms should be more transparent during electoral periods, in particular about the activity of the SRO: Elections and their team. During an election period the SRO: Elections should publish daily data on their platform’s activities in respect of elections, the nature of that data to be agreed with the regulator. Published data should include information about complaints made and their path to resolution.
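Purely as an illustrative sketch (not part of the proposal itself), the daily data envisaged here might be structured along the following lines. The field names and categories below are our own assumptions; the actual content and format would be agreed between each platform and the regulator.

```typescript
// Hypothetical sketch of a daily election-period transparency record.
// Field names and categories are illustrative assumptions only; the real
// data set would be agreed between the platform and the regulator.

interface ComplaintSummary {
  category: "harassment" | "intimidation" | "impersonation" | "other";
  received: number;              // complaints received today in this category
  resolved: number;              // complaints closed today
  medianHoursToResolve: number;  // speed of the enhanced resolution service
  escalatedToRegulator: number;  // complaints passed to the regulators
}

interface DailyElectionReport {
  platform: string;
  reportDate: string;               // ISO date, e.g. "2021-05-06"
  electionPeriod: string;           // e.g. "May 2021 local elections"
  sroElectionsContact: string;      // public contact for the SRO: Elections team
  complaints: ComplaintSummary[];   // complaints made and their path to resolution
  contentActioned: number;          // items removed, labelled or demoted under election policies
  politicalAdvertsRunning: number;  // political and issue adverts live during the day
}

// Example of what a single day's published record might look like.
const example: DailyElectionReport = {
  platform: "ExamplePlatform",
  reportDate: "2021-05-06",
  electionPeriod: "May 2021 local elections",
  sroElectionsContact: "sro-elections@example.com",
  complaints: [
    { category: "harassment", received: 120, resolved: 95, medianHoursToResolve: 6, escalatedToRegulator: 2 },
  ],
  contentActioned: 340,
  politicalAdvertsRunning: 58,
};

console.log(JSON.stringify(example, null, 2));
```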

The regulator should request a skilled person audit of platform operations in respect of elections after each General Election or national referendum, and from time to time as required.

Report to Parliament

The regulators should report to Parliament after each major set of elections on the effectiveness of harm mitigation carried out under this regime.

The proposals we put forward above codify and enhance best practice to allow democratic actors to make the best use of significant online platforms, and should address many of the harmful issues that arise from the operation of online platforms.

Comments are welcome to [email protected]

[1] Intimidation in Public Life: A Review by the Committee on Standards in Public Life, December 2017, page 43. https://www.gov.uk/government/publications/intimidation-in-public-life-a-review-by-the-committee-on-standards-in-public-life

[2] Sofia Collignon, ‘The 2019 election campaign shows that abuse, harassment and intimidation of candidates is getting worse, especially for women’, UCL Constitution Unit. https://constitution-unit.com/2020/11/06/the-2019-election-campaign-shows-that-abuse-harassment-and-intimidation-of-candidates-is-getting-worse-especially-for-women/

[3] Fawcett Society, Strategies for Success: Women’s Experiences of Selection and Election in UK Parliament. https://www.fawcettsociety.org.uk/strategies-for-success

[4] HC Deb, 16 March 2021, c175 https://www.theyworkforyou.com/debates/?id=2021-03-16a.161.0#g175.0

[5] See the European Court of Human Rights (Grand Chamber) in Selahattin Demirtaş v. Turkey (No. 2).

[6] The National Police Chiefs’ Council ‘brings police forces in the UK together to help policing coordinate operations, reform, improve and provide value for money’. In the complex field of electoral law the police and the Crown Prosecution Service play a significant role. The NPCC produces joint guidance on electoral law with the Electoral Commission. Each police force has an ‘Election Single Point of Contact’.