December 5, 2022

Online Safety Bill – Government amendments for Committee stage

by Professor Lorna Woods, Professor of Internet Law, University of Essex; William Perrin, Trustee, Carnegie UK; Maeve Walsh, Carnegie Associate

NOTE: This is an updated version of the technical note published on 5 December, which was based on the Government's "indicative amendments" published via WMS on 30 November. This new version refers to the amendments formally tabled on 7 December by the Government for the OSB's recommittal to Committee.

In this technical note we discuss the Government's tabled amendments to the Online Safety Bill. The Online Safety Bill is a long-running saga and for expediency we do not rehearse all the background here; new readers might wish to start with our one-page guide to the OSB and read the Government's recent press releases and Written Ministerial Statements.

Following two recent changes of administration, the Government has, in two Written Ministerial Statements, promised changes to the Online Safety Bill. Some were passed at Report stage day 2 and an amended version of the Bill subsequently published; some will be brought forward in the Lords or in other legislation or policy processes. A programme motion passed at Report stage, with hearings due to take place on Tuesday 13 and Thursday 15 December. This note examines the 30 November WMS and the amendments subsequently tabled formally that will be scrutinised at Committee.

The tabled amendments do the following:

  • Remove the 'harmful to adults' risk assessment originally found in cl 12;
  • Remove the 'harmful to adults' transparency obligations originally found in cl 13;
  • Introduce a duty around enforcement of Terms of Service (ToS);
  • Remove the category of content harmful to adults (and consequently priority content harmful to adults) but introduce a list of content areas to which the user empowerment duties (cl 14) apply;
  • Amend user empowerment duties (cl 14);
  • Amend rules in relation to determination of Category 1 services;
  • Leave the illegal content duties as before, although the press release and WMS refer to these duties as one element of the 'triple shield' for adults (note that additional criminal offences will be introduced, but these are not part of the amendment package); and
  • Tighten drafting around age verification in relation to children’s safety duties.

We now consider some key elements in more detail. We are still working through the many consequential changes, but our early thoughts on the most significant amendments are as follows. We would welcome any corrections.

Terms and conditions

There are dozens of consequentials (see, e.g., amendments 18-75) following from two amendments introducing new duties that:

Ensure that the largest or most risky online service providers design systems and processes so that the service provider cannot take down or restrict content (so that a person cannot see it without further action by the user) or ban users unless it is in its own ToS or constitutes a breach of the law or the OSB regime. (NC3)

This duty is referred to as the ‘Duty not to act against users except in accordance with terms of service’.

Improve the effectiveness of terms of service so that:

If the provider's ToS say that certain types of content are to be taken down or restricted, or that certain behaviours will lead to banning, then providers must run systems and processes to make sure that these things happen;

People are able to report breaches easily and a complaints service delivers 'appropriate action' in response to complaints; and

The service allows certain types of complaint – including complaints about the service provider itself. (Mainly NC4)

This ‘effectiveness’ duty is referred to as ‘Further duties about terms of service’.

Both these duties are focussed narrowly on banning, take-down and restriction, rather than upstream design changes or softer tools that might be more effective and allow different types of users to coexist on the same service more easily. Nonetheless, the duties bear on the quality of the systems and processes that companies operate to achieve this effect in general, rather than imposing hard liability for one wrong decision. Notably, the systems and processes required by the new duties should be 'proportionate' (as is the case for the other safety duties). This can be taken to be proportionate to risk, but the Government here chooses to stress (NC6(7)) proportionality to the 'size and capacity' of the service provider, stating that these factors are "in particular, relevant". This is despite the fact that the obligations will apply only to Cat 1 services (in the other user-to-user safety duties (cls 9(9), 11(9)), riskiness is also a factor of proportionality).

Service providers have terms of service (defined in cl 203 as 'all documents (whatever they are called) comprising the contract for use of the service (or of part of it) by United Kingdom users') that may go beyond the criminal law and encompass material that they believe is harmful to their users, doesn't 'fit' the service or is unpopular with advertisers. The Government's list of harmful content provided for the 'Third Shield' (user empowerment tools, below) is an example.

In relation to large and risky services (Category 1), the Government – having dropped some of the ‘harmful but legal’ provisions – seems to expect that if those providers claim to be tackling such material they must deliver on that promise to the customer/user (NC4(3)).

This reflects a widespread view that companies pick and choose their application of ToS or implement them with little vigour. These failings lead to harm when people encounter things which they had thought would not be there when they signed up. The amendments also include provisions for making ToS more legible (NC4(7)), which reflects requirements found in the original transparency obligations (cl 13(6)). These obligations carry echoes of measures in financial services, which had similar problems and where the ToS complexity issue hasn't been solved.

Service providers which do not fall within Category 1 need not enforce their terms of service, or may do so erratically or discriminatorily – and that includes search engines no matter how large.

Impact on service provider behaviour

The new duties will make the largest and riskiest companies expend more effort on enforcing their ToS for UK users. The Government has not yet presented modelling or, say, game-theory-type work on what effect this will have on company ToS.

There are risks arising from the fact that there is no minimum content specified for the terms of service for adults – though of course all providers will have to comply with the illegal content duties (and these include specific lists of priority illegal content in Schedules 5-7). One former senior social media company executive felt it would make ToS much longer and more 'lawyered'. This might be especially so in circumstances in which 'restrictions' are in play – and this might militate against a company using non-take-down methods to control content.

Another view is that, faced with stringent application, companies might make their ToS shorter, cutting out harmful material that is hard to deal with, because they might now be liable if they don't deal with it. Or, if a service provider does deal with it, it may suffer competitively and reputationally if it runs into issues with OFCOM and ends up having to publish breach notices (following amendments being introduced in the Lords). By comparison, companies that choose to do nothing have an easier life – they might suffer reputational hits from content harms when those become public, for example because of whistleblower action or media reporting, but not from the regulator under the new duties.

The Government presents no evidence either way. It isn't clear what companies might say now, in seeking to influence policy thinking, about what they will do in future.

The fact that there is no minimum requirement in the regime means that companies have complete freedom to set ToS for adults – and these may not reflect the risks to adults on that service. Service providers do not even have to include ToS in relation to the list of harmful content proposed by the Government for the user empowerment duties (below). Moreover, the removal of both the risk assessment for harms to adults (cl 12) and the obligation to summarise and publish the results (cl 13(2)) means that users will lack vital information to make an informed choice as to whether they want to engage with the service or not.

It is possible that the way 'Terms of Service' is employed here means that platforms' advertising content policies will be outside the scope of this clause. The definition of the term in cl 203 refers only to the documents constituting the relationship between service provider and user. Concerns about targeted advertising will likewise not be assuaged by the regime.

OFCOM is required to produce guidance on the above.

User Empowerment – amendments 8-17

These duties constitute the so-called 'Third Shield': they allow a user to manage what harmful material they see by requiring the largest or most risky service providers to provide tools that allow a person to 'effectively reduce the likelihood of the user encountering OR effectively alert the user to' certain types of material. The Government proposes a list of such material in amendment 15, to go on the face of the Bill, as follows:

(8B) Content is within this subsection if it encourages, promotes or provides instructions for—

(a) suicide or an act of deliberate self-injury, or

(b) an eating disorder or behaviours associated with an eating disorder.

(8C) Content is within this subsection if it is abusive and the abuse targets any of the following characteristics—

(a) race,

(b) religion,

(c) sex,

(d) sexual orientation,

(e) disability, or

(f) gender reassignment.

(8D) Content is within this subsection if it incites hatred against people—

(a) of a particular race, religion, sex or sexual orientation,

(b) who have a disability, or

(c) who have the characteristic of gender reassignment.

There is no linkage of these terms to existing definitions in statute or Crown Prosecution Service guidance. Some terms ('eating disorder', 'abuse') might have no current definition in statute or common law. The above list is similar to that in the July WMS, in which the then Secretary of State, Nadine Dorries, set out a list of legal but harmful material – with the notable exception of harmful health misinformation (e.g. 'drink bleach to cure cancer').

Note that the provisions for user empowerment tools in relation to unverified users remain in the Bill.

Government amendment 9 requires the user empowerment tools to be 'effective' (they had not previously been required to be) and amendment 12 requires the tools to be 'easy to access'. OFCOM has to produce guidance on these duties. We assume that OFCOM will need to include them in its risk assessment processes and, possibly, risk profiles.

Some victim representation groups will want this Third Shield turned on by default, so that people have to opt in to see such material. This might particularly be the case where victims are suffering from poor mental health and might lack the faculty at a crucial time to turn on such filters, or be harmed by the process of engaging with content in order to block future similar content. In the work we did with Anna Turley MP in 2016, we proposed an abuse blocker that was on by default in the Malicious Communications (Social Media) Private Member's Bill 2016:

‘Operators of social media platforms ... must have in place reasonable means to prevent threatening content from being received by users of their service in the United Kingdom during normal use of the service when the users―

(a) access the platforms, and

(b) have not requested the operator to allow the user to use the service without filtering of threatening content.’

One question is how the user empowerment tools relate to the Terms of Service duties. If a tool gives rise to false positives, would that constitute a restriction for the purposes of the new duty? NC2(3)(a) seems to exclude effects arising from the user empowerment duty. It is unclear whether user tools that are usable beyond the context of the areas listed in the user empowerment duty would benefit from NC2(3)(a), or whether – when used outside the areas envisaged by cl 14 – they would constitute 'restrictions'. Of course, this could be covered in the ToS, but that might give rise to the risk of 'lawyering' and the disincentives noted above. The precise relationship between the two duties should be clarified.

Risk Assessment?

As noted, companies are not obliged to carry out risk assessments for either of these duties, making the duties very different from others in the Bill. In amendment 13, the Government proposes the probability of occurrence of a type of harm as a factor in assessing the proportionality of user protection. In another amendment, the Government says that measures have to be 'effective'. This suggests that a company deciding whether or not to offer a tool would in effect have to carry out a risk assessment, especially as, in assessing whether the user empowerment duties had been met, OFCOM would be likely to investigate what grounds the provider had for determining that particular tools were thought to be effective. We think that there perhaps should be consequential amendments (not yet made) to OFCOM's risk profiles to ensure that this aspect can be included.

The removal in this block of amendments of the adult risk assessment obligation (under the original Clause 12, companies had to assess the risk of harm to adults and, under the original Clause 13, report on it) will make it much harder for users and civil society to assess what problems arise on the platforms – and the role of product design in those problems. The consequential removal of the transparency obligation (service providers had been required to inform customers of the harms their risk assessment had detected) means that users (as consumers) will not have the information to assess the nature and level of risk on the platform.

It seems that the requirement for risk assessment has moved from being explicit and public to implicit and private between the companies and the regulator, insofar as it exists at all.

‘Categorisation’ of companies – NC7, amendments 48, 49, 76-93

The Government appears to be preparing to broaden the criteria for selecting which companies are likely to be in Category 1, adding the 'characteristics' of a company's service to the rather crude size and functionality metrics employed before. The Government also allows for a list to be drawn up of companies that are close to the margins of the categories – 'emerging Category 1' services. This will give more regulatory certainty. Whether it deals with all the concerns arising with regard to the drawing of boundaries (e.g. the Secretary of State's powers in this regard) is another question.

Child safety

We note the amendments on child safety (1-5) and await a considered view on these from civil society organisations expert in child protection.