The Bill has been significantly strengthened with regard to children, with the new duties in relation to pornography (part 5), and the new requirement on providers to report the existence of CSEA to the NCA (part 4, chapter 2). This is on top of the existing measures with regard to CSEA and the requirement for a children’s access assessment. We will leave others more expert in this area to comment on these measures for now and focus our analysis here on vulnerable groups of adults.

The definition of harm (clause 187) now includes a welcome reference[1] to the way in which an individual can do something which affects another individual due to their characteristics or membership of a particular group. The reference to a group means that consideration is not limited to the categories identified in the Equality Act, or those recognised for the purposes of hate crime, and presumably could be understood to include intersectional groups also. Note, however, that in assessing whether content is harmful there is a quantitative threshold of “an appreciable number” of adults[2] or children. Does this remove smaller groups from protection? On sophisticated user-to-user networks, or search services, with tens of millions of users, advertisers can address numerically very small audiences, and commercial claims are made about personalisation of the user experience, so “appreciable” could be a small number. We note that the Explanatory Notes say that “kinds of content that affect only one person or very few people” are not in scope,[3] but more guidance would be helpful. Moreover, to require action, the harm must be significant and the risk must be material.

In general, however, the Bill remains weak on harms to adults – all the more so because the types of harm to be prioritised are as yet unknown and will not be set out until after the Bill has received Royal Assent; and the duty applies only to Category 1 providers (discussed above). Non-designated content that is harmful does not require action on the part of service providers, even though by definition it is still harmful.

We know that many campaigners are concerned that protections for women and girls are not included in the draft Bill, a concern supported by the Petitions Committee in its report on online abuse.[4] While Schedule 7 does include a list of sexual offences and aggravated offences, to which the Government has referred in its response to the Petitions Committee,[5] the Bill makes no concessions here and the wider context of Violence Against Women and Girls (VAWG) is not addressed. We will be looking further at the types of harms that might need to be included to ensure the protections required are delivered.

Human trafficking offences (highlighted by Frances Haugen’s whistleblowing revelations) are a serious omission from Schedule 7 that should be rectified. Advertising is an important route for modern indentured servitude and should clearly be in scope for that offence.

We also have some concern that the overarching risk assessment for adults in clause 12 (which covers only Category 1 services), while it expects services to consider the fact that some groups are more likely to encounter harmful content and behaviour and are more likely to be harmed by it (cl 12(5)(d)), is constrained by the scope of harmful content. As noted above, there is a quantitative threshold for determining which content is harmful (and therefore to be taken into account in the risk assessment) which might not well serve smaller groups.


[1] Clause 187(4)(b) https://publications.parliament.uk/pa/bills/cbill/58-02/0285/210285.pdf

[2] Clause 54(3)(b) https://publications.parliament.uk/pa/bills/cbill/58-02/0285/210285.pdf

[3] Explanatory Notes, para 329, p 54 https://publications.parliament.uk/pa/bills/cbill/58-02/0285/210285.pdf

[4] “We recommend that the Online Safety Bill should include abuse based on the characteristics protected under the Equality Act and hate crime legislation as priority harmful content in the primary legislation. It should also list hate crime and Violence Against Women and Girls offences as specific relevant offences within the scope of the Bill’s illegal content safety duties and specify the particular offences covered under these headings, as the draft Bill already does for terrorism and Child Sexual Exploitation and Abuse offences.” (https://committees.parliament.uk/work/307/tackling-online-abuse/publications/)

[5] “We agree with the Committee’s suggestion that hate crime offences and offences associated with Violence against Women and Girls should be listed as priority illegal offences on the face of the Bill. Government announced this change on 5 February. This includes offences relating to sexual images (i.e. revenge and extreme pornography), and harassment and stalking offences, as well as acts intended to stir up racial hatred, religious hatred or hatred on the grounds of sexual orientation and racially or religiously aggravated harassment and public order offences. This means all services will need to take steps to remove and prevent users from being exposed to this content. This will result in women and girls being better protected online and proactive measures to tackle illegal abuse on the grounds of the listed characteristics. Beyond the priority offences, all services will need to ensure that they have proportionate systems and processes in place to quickly take down other illegal content directed at women and girls once it has been reported or they become aware of its presence”. (https://publications.parliament.uk/pa/cm5802/cmselect/cmpetitions/1224/report.html)