We need regulation as well as education to make the Internet safer for children
February 10, 2020
by Maeve Walsh, Carnegie Associate
As the world moves online, governments need to make sure that we have the right laws to keep us as safe as we would expect to be offline. Safer Internet Day is very much about the practical steps that children and young people can take to keep themselves safe online. But those steps can only go so far. It would be a great day for the UK government to show us how it will deliver its manifesto commitment to make Britain the safest place in the world to be online.
Children’s online lives are changing – and changing fast. According to Ofcom’s annual “Children and Parents: Media Use and Attitudes” report last week, the proportion of 12-15 year olds using Snapchat as their main social media messaging service had risen from 11% to 27% between 2015 and 2019, while those using Facebook had dropped from 58% to 23% in the same period. Eighty percent of 5-15 year olds now watch video on demand – nearly double the levels in 2015.
The report also shows that exposure to harm is rising: half of 5-15 year olds said that they had seen something hateful online about a particular group of people in the past year, up from a third in 2016. Almost half of parents are concerned about self-harm material (up from 39% in 2018), and fewer parents now feel that the benefits of being online outweigh the risks.
So, in this fast-changing environment, how can parents keep up? And how can children keep themselves safe? Much of the focus of Safer Internet Day is rightly on providing support: sharing resources, information and tools for parents and teachers to help young people and children make the most of the opportunities of the online world, while managing the risks. Much of this will be funded by the big social media companies themselves – and this investment in educational support and digital literacy is undoubtedly welcome and well-meant. In the four years my daughter has been at primary school, Safer Internet Day events have grown from being non-existent to a whole-school focal point for discussions and activities on internet safety, delivered by much more confident teachers and backed up with communications and resources for parents too. The kids get it, the parents feel reassured. Job done.
Except it’s not. As Ofcom’s research shows, the technology that children use, the social media sites they visit and the potentially harmful content they can access are all changing year by year. Educational material for one Safer Internet Day will be out of date by the next. And yet the tech companies funding the educational outreach are the same companies who could take much more significant steps – e.g. designing their systems with a view to reducing the risk of harm to users – to prevent those harms happening in the first place, regardless of how new or innovative the product or service is, or who it is designed for.
In the physical world, regulation ensures that if a system is not safe it cannot operate; if a product or service has not been thoroughly tested and risk-assessed, it cannot be launched. Companies are held to account by the regulator if they operate a system or service knowing that it might cause reasonably foreseeable harm to users. They certainly can’t get away with passing the responsibility for protection from harm onto the users themselves.
Online harms regulation is long overdue. The government, back in April last year, committed to bringing forward legislation to introduce a statutory duty of care for online harm reduction, enforced by a regulator. Last month, Lord McNally – supported by Carnegie UK Trust – introduced his own Private Member’s Bill to bring some much-needed urgency to this issue, by giving Ofcom new powers to prepare for the implementation of a duty of care system.
Let’s hope we don’t have to wait until Safer Internet Day 2021 for the government to act.
Read more about the Carnegie UK Trust work on a duty of care here: https://www.carnegieuktrust.org.uk/project/harm-reduction-in-social-media/