Harm reduction in social media – what can we learn from other models of regulation?

May 4, 2018



by William Perrin, trustee of Good Things Foundation, Indigo Trust and 360 Giving and a former senior civil servant in the UK government, and Professor Lorna Woods, University of Essex.

This blog is the third in a programme of work on a proposed new regulatory framework to reduce the harm occurring on and facilitated by social media services.  The authors William Perrin and Lorna Woods have vast experience in regulation and free speech issues.  William has worked on technology policy since the 1990s, was a driving force behind the creation of OFCOM and worked on regulatory regimes in many economic and social sectors while working in the UK government’s Cabinet Office.  Lorna is Professor of Internet Law at University of Essex, an EU national expert on regulation in the TMT sector, and was a solicitor in private practice specialising in telecoms, media and technology law.  The blog posts form part of a proposal to Carnegie UK Trust and will culminate in a report later in the Spring.

Assuming that some sort of regulation (or self- or co-regulation) is necessary to reduce harm, what form should it take?

The regulatory environment provides a number of models which could serve as a guide.  In adopting any such model we need, at least until Brexit, to be aware of the constraints of the e-Commerce Directive. We also need to be aware of the limitations on governmental action arising from human rights considerations, specifically (though not limited to) freedom of expression.  As we have discussed in previous blogs, limitations on rights must meet certain requirements, notably that they be proportionate. Neither of these regimes forecloses regulatory activity entirely.

In this blog we provide a short overview of a range of regulatory models currently being deployed and identify regulatory tools and approaches within them, such as risk assessments and enforcement notices, that may have some relevance or applicability for a harm-based regulation model for social media.  We discuss the regulatory frameworks frequently cited in relation to social media services, namely the electronic communications sector and data protection, as data processing is at the heart of social media services.  We argue in forthcoming blog posts that social networks have strong similarities to public spaces in the physical world, and have therefore also included some other regimes which relate to the safeguarding of public or semi-public spaces. From a microeconomic perspective, harm emanating from a company’s activities resembles an external cost such as pollution, so we also discuss environmental protection.

Telecommunications

Given that social media platforms are not directly content providers but rather channels or platforms through which content is transferred from one user to another, a sensible starting point is to look at the regulatory context for other intermediaries who also connect users to content: those providing the telecommunications infrastructure.  The relevant rules are found in the Communications Act 2003. The telecommunications regime is expressly excluded from the e-Commerce Directive provisions (Article 4) prohibiting the prior licensing of “information society service providers” (the immunities could of course apply).  There is in fact no prior licence required under the Communications Act to provide electronic communications services, but a person providing a relevant “electronic communications network” or “electronic communications service” must under section 33 give prior notification to OFCOM (the independent sector regulator). While any person may be entitled to provide a network or services, that entitlement is subject to conditions with which the provider must comply.  The conditions are “general conditions” and “special conditions”.  As the name implies, “general conditions” apply to all providers, or to all providers of a class set out in the condition. Special conditions apply only to the provider(s) listed in that special condition (see section 46 Communications Act). The conditions are set by the regulator in accordance with the Communications Act.

Section 51 sets down matters to which general conditions may relate.  They include

“conditions making such provision as OFCOM consider appropriate for protecting the interests of the end-users of public electronic communications services”

which are elaborated to cover matters including the blocking of phone numbers in the case of fraud or misuse, as well as, in section 52, the requirement to have a complaints mechanism. This latter point is found in General Condition 14 (GC14), which obliges communications providers to have and to comply with procedures that conform to the Ofcom Approved Code of Practice for Complaints Handling when handling complaints made by domestic and small business customers.  General conditions also cover public safety (in relation to electromagnetic devices).  The special conditions mainly relate to competition in the telecommunications market, specifically the rights of businesses to have access to networks on fair, reasonable and non-discriminatory terms.

In terms of enforcement, section 96 gives OFCOM the power to impose fines on providers for non-compliance with the conditions.  Ultimately, OFCOM has the power to suspend the entitlement to provide the service (see section 100).

Digital Economy Act 2017

The Digital Economy Act 2017 covers a range of topics – we will focus on just one aspect: the provisions in relation to age verification and pornography, which are found in Part 3 of the Act (we have considered the Social Media Code of Practice elsewhere).  This part of the Act is not yet fully in force.

The obligation is to ensure that pornographic material is not made available online to people under 18.  The operator has freedom in how to attain this goal, but the Age Verification Regulator may check these steps.  It may also issue guidance as to how age verification may be carried out.  It may issue enforcement notices and/or impose penalties if a person has failed in this duty (or refuses to give the regulator information requested).  The Act also empowers the regulator to issue notices to others who are dealing with the non-complying operator (section 21), such as credit card or other payment services. According to the Explanatory Memorandum, the purpose of these provisions is “to enable them to consider whether to withdraw services”. In relation to extreme pornography only, the regulator has the power to request that sites are blocked.

Data Protection

Another possible model in the sphere of information technology is that of data protection and specifically the General Data Protection Regulation (GDPR), which replaces (from 25 May 2018) the regime established by the Data Protection Directive, as implemented in the UK by the Data Protection Act 1998.  An independent regulatory authority is an essential part of the regime, and the minimum standards of independence are set down in the GDPR.

The current regime requires those processing personal data to register with the Information Commissioner’s Office (ICO), to renew the registration annually, and to pay a fee (which varies depending on the size and nature of the organisation).  Failure to do so is a criminal offence.  The GDPR removes the annual renewal obligation but data protection fees still apply, by virtue of the Digital Economy Act 2017.  Some information (e.g. name, contact details) will have to be submitted with this fee, but the current notification regime which required details about the data processing will cease to exist.

Central to the GDPR is the principle of accountability, which can require a data controller to show how it has complied with the rules. These rules essentially put obligations on controllers to process data in accordance with the ‘data processing principles’ set down in Article 5 GDPR. Another theme is a form of precautionary principle, in that data controllers must implement both privacy and security by design.  This could be described as a risk-based approach, as can be seen in the requirements regarding data security.  For example, data controllers are required to “ensure a level of data security appropriate to the risk” and in general they should implement risk-based measures for ensuring compliance with the GDPR’s general obligations. High-risk processing activities trigger the need for a data protection impact assessment (DPIA) to be carried out (Article 35 GDPR).  Article 36 specifies that where the DPIA suggests that there is a high risk, the controller must consult the supervisory authority before proceeding.

As regards enforcement, these themes feed into the factors that the supervisory authorities take into account when assessing the size of fines to impose on a controller (or processor) in breach of the GDPR, as the authority will have “regard to technical and organisational measures implemented” by the processor.  Note that individuals also have a right to bring actions for data protection failings.

Health and Safety

Another model comes from outside the technology sector: health and safety.  The Health and Safety at Work Act does not set down specific detailed rules with regard to what must be done in each workplace but rather sets out some general duties that employers have, both as regards their employees and the general public.  So section 2(1) specifies:

It shall be the duty of every employer to ensure, so far as is reasonably practicable, the health, safety and welfare at work of all his employees.

The next sub-section then elaborates on particular routes by which that duty of care might be achieved: e.g. the provision of machinery that is safe; the training of relevant individuals; and the maintenance of a safe working environment. The Act also imposes reciprocal duties on employees.

While the Health and Safety at Work Act sets goals, it leaves employers free to determine what measures to take based on risk assessment.  Exceptionally, where risks are very great, regulations set down what to do about them (e.g. Control of Major Accident Hazards Regulations 1999). In respect of hazardous industries the regime may operate a permissioning regime, in which activities involving significant hazard, risk or public concern require consent, or a licensing regime to permit activities, such as the storing of explosive materials, that would otherwise be illegal.

The area is subject to the oversight of the Health and Safety Executive (HSE), whose functions are set down in the Act.  It may carry out investigations into incidents; it has the power to approve codes of practice. It also has enforcement responsibilities and may serve “improvement notices” as well as “prohibition notices”.  As a last measure, the HSE may prosecute.  There are sentencing guidelines which identify factors that influence the severity of the penalty.  Points that tend towards high penalties include flagrant disregard of the law; failing to adopt measures that are recognised standards; failing to respond to concerns, or to change/review systems following a prior incident; as well as serious or systematic failure within the organisation to address risk.

Environmental Protection

Another regime which deals with spaces is the Environmental Protection Act 1990.  It imposes a duty of care on anyone who produces, imports, keeps, stores, transports, treats or disposes of waste, or who brokers or controls waste (“waste holders”) (section 34), as well as on householders (section 75(5)).  Waste holders must register, with the possibility of a fine for non-compliance; there is a prohibition on unauthorised disposal of waste backed up with a criminal penalty.

More detail on what the duty of care requires is set down in secondary legislation and codes of practice give practical guidance. As regards waste holders, they are under a duty to take all reasonable steps to:

  1. prevent unauthorised or harmful deposit, treatment or disposal of waste;
  2. prevent a breach (failure) by any other person to meet the requirement to have an environmental permit, or a breach of a permit condition;
  3. prevent the escape of waste from their control;
  4. ensure that any person they transfer the waste to has the correct authorisation; and
  5. provide an accurate description of the waste when it is transferred to another person.

The documentation demonstrating compliance with these requirements must be kept for two years.  Breach of the duty of care is a crime.

Householders’ duties are more limited: they have a duty to take all reasonable measures to ensure that any household waste produced on their property is only transferred to an authorised person – a householder could be prosecuted for fly-tipping of waste by a contractor (plumber, builder) employed by the householder.

As well as this duty of care, businesses are required under Reg 12 of the Waste (England and Wales) Regulations 2011 to take all such measures as are reasonable in the circumstances to:

  • prevent waste; and
  • apply the “waste hierarchy” when they transfer waste.  (The waste hierarchy, which derives from the EU Waste Framework Directive (2008/98/EC), is a five-step strategy for dealing with waste, ranging from prevention through recycling to disposal.)

In doing so, businesses must have regard to any guidance developed on the subject by the appropriate authorities.

The responsible regulators are the Environment Agency/Natural Resources Wales/Scottish Environment Protection Agency and local authorities. They may issue enforcement notices, and fines may be levied.  If criminal action is taken, there is a sliding scale based on culpability and harm factors identified in guidance.  The culpability assessment deals with the question of whether the organisation has deliberately breached the duty, done so recklessly or negligently – or to the contrary, not been particularly at fault in this regard.

Assessment

These sectors operate under general rules set by Parliament and refined by independent, evidence-based regulators and the courts in a transparent, open and democratic process.  Modern effective regulation of these sectors supports trillions of pounds of economic activity by enforcing the rights of individuals and companies. It also contributes to socially just outcomes as intended by Parliament through the internalisation of external costs and benefits.  The Government’s Internet Safety Strategy Green Paper detailed extensive harms, with costs to society and individuals, resulting from people’s consumption of social media services.  Social media companies’ early-stage growth models and service design decisions appear to have been predicated on such costs being external to their own production decisions. Effective regulation would internalise these costs for the largest operators and lead to more efficient outcomes for society.  There is a good case to make for market failure in social media services – at a basic level, people do not comprehend the price they are paying to use a social media service.  Recent research by doteveryone revealed that 70% of people ‘don’t realise free apps make money from data’, and 62% ‘don’t realise social media make money from data’.  Without basic awareness of price and value amongst consumers it will be hard for a market to operate efficiently, if at all.

There are many similarities between the regimes. One key element of many of the regulators’ approaches is that changes in policy take place in a transparent manner and after consultation with a range of stakeholders.  Further, all have some form of oversight and enforcement – including criminal penalties – and the regulators responsible are independent from both Parliament and industry. Breach of statutory duty may also lead to civil action.  These matters of standards and of redress are not left purely to the industry.

There are, however, differences between the regimes.  One point to note with regard to the telecommunications regime is that OFCOM may stop the provider from providing the service.  While the data protection regime may impose – post GDPR – hefty penalties, it may not stop a controller from being a controller.  Again, with regard to the HSE, particular activities may be the subject of a prohibition notice, but this does not disqualify the recipient from being an employer: the notice relates to a particular behaviour.  Another key difference between the telecommunications regime and the others is that in the telecommunications regime the standards to be met are specified in some detail by OFCOM. In the other regimes, although there are general obligations identified, the responsibility lies on the controller/employer to understand the risks involved and to take appropriate action, though high-risk activities in both regimes are subject to tighter control and even a permissioning regime.  While the telecommunications model may seem appropriate given the telecommunications sector’s closeness to social media, it may be that it is not the most suitable model, for four reasons:

  • the telecommunications regime has the possibility of stopping the service, and not just problematic elements of the service; we question whether this is appropriate in the light of freedom of speech concerns;
  • the telecommunications regime specifies the conditions – we feel that this is too ‘top-down’ for a fast moving sector and allowing operators to make their own assessment of how to tackle risks means that solutions may more easily keep up with change, as well as be appropriate;
  • a risk-based approach could also allow the platforms to differentiate between different types of audience – and perhaps to compete on that basis; and
  • the telecommunications regime is specific to telecommunications, whereas the data and workplace regimes are designed to cover the risks entailed by broader swathes of general activity.

Although the models have points of commonality, particularly in the approach of setting high-level goals and then relying on the operators to decide how best to achieve them (which allows flexibility and a certain amount of future-proofing), there are perhaps aspects of individual regimes that are worth highlighting:

  • the data protection and HSE regimes highlight that there may be differing risks, with two consequences: that measures should be proportionate to those risks, and that in areas of greater risk there may be greater oversight;
  • the telecoms regime emphasises the importance of transparent complaints mechanisms – here, complaints against the operator (and not just against other users);
  • the environmental regime introduces the ideas of prevention and prior mitigation, as well as the possibility for those under a duty to be liable for the activities of others (eg in the case of fly-tipping by a contractor); and
  • the Digital Economy Act has mechanisms in relation to effective sanctions when the operator may lie outside the UK’s jurisdiction.

The next blog post will discuss a duty of care and the key harms in a new regulatory regime.  We welcome comments to [email protected]