“Digital Safeguarding is just Safeguarding”

June 27, 2019


By Anna Grant, Senior Policy and Development Officer, Carnegie UK Trust

The #NotWithoutMe programme has been established to challenge digital exclusion for vulnerable young people. Since 2015, the programme has delivered a variety of practice, policy and research work across the UK. For this strand, we are interested in exploring the inception, implementation and ongoing development of digital policies within youth organisations.


Mobile devices, messaging apps and social media platforms can be powerful tools for communicating with, supporting and empowering young people in complex and challenging circumstances.

However, these disruptive technologies often outpace the organisational structures, policies and practices put in place to protect both young people and staff. Competing needs, limited resources and fast-changing culture may result in organisational policies and practice ultimately failing to support young people in the best way possible. The default organisational position can become restrictive, practitioners may lack the skills and confidence to interpret or implement digital policies effectively, and young people may require different approaches depending on their needs and the risks they face. These many factors combine to create a complex strategic, practical and participation challenge, ultimately impacting on outcomes for young people and on staff wellbeing.

The digital world is far from unique in facing these tensions and trade-offs: between children and young people’s rights and their safety, between organisational responsibilities and staff autonomy, and between technical solutions and personal skills. We have seen these debates mirrored in other areas of work undertaken by the Trust, most notably our work on Kindness. How do services create the flexibility for relationships and still ensure that service users and staff are adequately protected? How do we define what is ‘appropriate’ in a way that reflects individual needs? And how do we balance kindness with fairness?

There is no straightforward solution. Individual organisations need to explore these issues within their own context and, as with all elements of safeguarding, the task is less about creating robust organisational policies and more about developing good organisational practice.

There is, however, great value to be had in sharing with and learning from peers. So, to consider these ‘digital safeguarding’ issues, last month we brought together 12 youth organisations in our first #NotWithoutMe Lab: Digital and Social Media Use in Youth Support. The #NotWithoutMe Labs build on findings from across Carnegie UK Trust’s work and other relevant projects to explore the intersections between safeguarding, kindness and digital when working with vulnerable children and young people. The aim of the work is to contribute towards safer and more effective use of digital to achieve the best outcomes for young people.


Building a Working Hypothesis

To help us begin to unpick some of these thorny issues, we first created a Working Hypothesis:

  • Organisational policies on the safe and appropriate use of social media and connected devices, and actual day-to-day practice, can sometimes be in conflict, or are developed and managed in isolation. Organisational policies can also be in tension with both organisational values and the day-to-day practice needed to deliver the best outcomes for young people.
  • This can result in young people not benefitting from the best possible support, services not being responsive to technology changes and staff not feeling confident to manage risk.
  • Therefore, young people and staff at all levels will benefit from organisational policies that are more aligned and inclusive of practical needs alongside safeguarding risks and where policies are supported by skills development.

This hypothesis led us to our overall Challenge Statement:

  • What works well in creating and supporting digital and social media organisational policies when working with vulnerable children and young people, what are the barriers to good practice and what is needed to overcome them?


Grounding Activities in Real World Experiences 

To stress-test these elements we wanted to provide organisations the time and positive space for discussion and exploration of the issues around safeguarding in digital and social media use in youth support. The aim was to enable organisations and staff to feel increased confidence in addressing the issues both internally and with the people they work with, in turn working towards practical and incremental improvements within the youth support system.

The workshop mapped current and good practice, examined the barriers and challenges to implementing effective digital safeguarding polices and practice, and explored a number of scenarios focused on professional boundaries, duty of care and online peer-to-peer interaction. Each session focused on real-world scenarios to explore how we might respond practically, to describe what best practice might look like and identify what we need to do to get there.


A Note on Our Position:

To facilitate the most effective conversations, we felt it was important that our role remain clearly agnostic of the solution: we did not advocate for a technology-first approach, or suggest that ‘going digital’ is the only response. The #NotWithoutMe Lab was developed to support organisations to critically explore and assess what is suitable and appropriate for their young people and organisation in the context in which they operate, and to have a clear and open justification for their position.


Emerging Themes:

Through the activities a number of recurring themes arose, often generating more questions than answers. These themes included:

  • Time and Space – core to the discussions was the need for organisations to dedicate time and space to further consider their digital policy and practice to ensure robust thought went into the decision making process.
  • ‘Adding-On’ or ‘Embedding-Within’ – where should digital safeguarding sit within current safeguarding policies and practice? Two approaches emerged:
    1. Digital Safeguarding should be pulled out as a separate safeguarding policy, at least in the short term, to enable specific focus, consideration and attention from organisations and workforces.
    2. Or, ‘Digital Safeguarding is just safeguarding’ and as such should be embedded within existing general safeguarding policies and practice, given that it is essentially no different from traditional safeguarding. It is an extension rather than a new element.
  • Role of Funders – Despite not being involved in direct delivery work with young people, what is the potential of funders to act as ‘levers’ to enable more critical digital safeguarding engagement within organisations? Importantly, how can this be done without the process becoming a burden, being purely administrative or reinforcing overly restrictive mindsets? Key questions were summarised into three steps:
    1. Can funders ask better questions about digital safeguarding, and recognise good answers when they hear them?
    2. Are recipient organisations able to respond?
    3. Is there sufficient support in place from funders to help organisations?
  • Role of Platforms – Platforms shape the interactions we have on them. Therefore, the functional abilities of platforms can limit or modify how staff are able to behave online, creating new challenges and tensions between service provision and safeguarding:
    • Example 1: Organisations need to promote their services. Historically, organisations have used physical leaflets. However, many young people report that they would like to hear more from organisations through online channels, particularly Instagram. Organisations are highly reluctant to use this channel because, as a public page, they would not be able to hide their followers, creating a ready-made list of potentially vulnerable young people for anyone to access. Alternatively, if they keep their profile private, they potentially prevent young people from finding the service when they need it. Organisations have to balance promoting their service with protecting its users.
    • Example 2: The terms of service of, for example, Facebook do not allow individuals to create duplicate or pseudonymous accounts. While the reasoning for this seems straightforward and sensible – to try and avoid fake accounts – it poses a challenge for staff, particularly those supporting young people in a care-experienced setting, who want to help young people access information and the clubs and groups they are part of, much of which is distributed through Facebook groups. So does a staff member:
    1. Use the organisational profile? But does this risk signalling (to other peers) that a young person comes from a care setting, breaching confidentiality?
    2. Use their personal profile? But does this cross personal/professional boundaries?
    3. Create a duplicate or fake account purely for work-related purposes? But this breaches the platform’s terms of service, and the account can be taken down.

These questions arise even before the challenges of data privacy are explored…

  • Impact of ‘free’ technology – historically, if you wanted a platform to enable communication between young people, you would have to go through a procurement process to buy bespoke technology, which would include risk assessments and safeguarding considerations to enable you to sign off a business case. However, given the abundance of free tech, some of this process can be and is bypassed because there is no financial cost attached.
  • Role of devices – to enable sufficient boundaries between the professional and the personal, and to reduce overall organisational risk, should all staff have professional devices? Should this now simply be classed as the cost of doing business? Though some organisations may argue they do not have the financial resources to cover this cost, the counterargument affirmed that “You wouldn’t say you can’t have fire escapes because they’re too expensive”.
  • Impact of other organisational policies – no policy acts in isolation, so how do other organisational policies affect the safeguarding abilities or decision making of staff? For example, a key challenge raised was around personal/professional boundaries for staff. Could flexible working policies be used to reduce staff burden, so that staff are paid to be online and available when young people are available, such as later in the evenings, rather than being mandated to work office hours and having to do additional unpaid overtime to cover the times young people need them? Further to this, organisational values became a significant discussion point. Values need to be reflected, and genuine, across all policies and practices.
  • Risks of not using digital – while risk assessments traditionally focus on the potential negative impact of undertaking a particular activity, what does the flipside of the argument look like? Should more organisations be assessing the risks of not implementing digital practices within the work they deliver? YouthLink Scotland have a very useful template to explore this concept further.
  • ‘Laggards’ vs ‘Early Adopters’ – Often those slow to adopt technology are the ones most chastised for failing to keep up with ‘modern times’ and, within a service context, for the impact this may have on the service being delivered to and with users. However, there is also the reverse of this argument: what about those too quick to adopt? What is the impact of rushing to technology without considering or understanding the full scope of consequences, or how the new technology fits with the intended outcomes of the organisation?


A final thought on passwords – one of the most divisive examples, and one that exemplifies many of the tensions around policy versus practice, concerned passwords. Imagine the following scenario:

“Young person B brings in their personal laptop and asks Staff Member A to help them set up a new email address to apply for a job and to help them think of a strong password. Young person B asks the staff member to make a note of the password because they keep forgetting their school password.”

For most organisations the policy is clear – “do not store passwords”. But practice was far more mixed. Participants understood the reasoning behind the policy, but also appreciated that the risk to the young person of a staff member holding the password was far lower than the risk of the young person not having it and potentially losing access to vital services. Participants felt conflicted at the idea of not following organisational policy, but were willing to shoulder that burden for the overall wellbeing of the young people they were supporting.


Next Steps

These were some of the opening conversations and we welcome further insights, good practice or challenge around this topic. We will continue to refine the workshop format and materials, with the intention of hosting future #NotWithoutMe Labs with practitioners and young people.

If you have any examples of digital safeguarding best practice, or would be interested in supporting future workshops, please get in touch at [email protected]