4 Key Takeaways From The Online Safety Act - Child Protection.
In recent news, the Online Safety Bill, a set of new laws to protect children and adults online, received Royal Assent on 26 October 2023 and became the Online Safety Act. With the Act in place, the UK government and Ofcom aim to make the UK ‘the safest place in the world to be online’, and social media companies now face ‘world-first’ legal duties to keep their platforms safe for children. Teachers face a tricky battle when teaching online safety and advising children on how to stay safe from online harm. The Online Safety Act takes a ‘zero-tolerance’ approach to protecting children from online harm, so in this post we will provide 4 key takeaways from the Act that teachers should be aware of when considering the online safety of their students.
Why Is The Online Safety Act Important?
Without rigorous scrutiny, children are susceptible to accessing harmful material on the online platforms they use. For Safer Internet Day, the NSPCC shared their support for the Online Safety Bill, citing police-recorded data showing that more than 3,500 online child abuse crimes take place each month.¹ The Act places a legal responsibility on tech companies to prevent and remove illegal content that is harmful to children in order to reduce this rate of crime. Companies that fail to comply with the new laws face fines that could run into billions of pounds, and their bosses could even face prison. The Online Safety Act puts child protection online at the forefront of its new regulations and aims to shift responsibility back onto online companies to protect children from harmful content.
4 Key Takeaways from the Online Safety Act 2023.
The Online Safety Act defines both ‘primary priority content’ and ‘priority content’ that is harmful to children.² Primary priority content refers to content of any of the following kinds:
- Pornographic content.
- Content which encourages, promotes or provides instructions for:
- suicide.
- an act of deliberate self-injury.
- an eating disorder or behaviours associated with an eating disorder.
The Act also lists priority content that is harmful to children, which includes:
- Content which is abusive and which targets characteristics such as race, religion, sex, sexual orientation, disability or gender reassignment.
- Bullying content.
- Content which depicts real or realistic serious violence or injury.
- Content which encourages a person to ingest, inject, inhale or in any other way self-administer a physically harmful substance.
1. User-to-User Services - Safer For Children.
User-to-user services are any internet services that allow user-generated content to be uploaded, shared and encountered by other users. This includes social media platforms, messaging apps and online forums. User-to-user services such as YouTube, TikTok, Snapchat and Twitch are popular with children in the UK.
The Online Safety Act implements legal duties for user-to-user services to undertake risk assessments of the content accessed by children. Services will be required to assess their user base, including the number of child users in different age groups, and the level of risk of those children encountering harmful content on their platform. They will then have a duty to put in place measures to prevent children from encountering this content.
2. Child Safety, Pornography, And The Protection Of Women and Girls.
The Online Safety Act has strengthened provisions to address violence against women and girls. It is now easier to convict someone who shares intimate images without consent (otherwise known as ‘revenge porn’), and the non-consensual sharing of intimate ‘deepfakes’ is also criminalised. Abusers who share these intimate images face up to six months in prison, while those who share them with intent to cause distress or harm could face up to two years.
3. Search Services - A Duty To Prevent.
Search services, much like user-to-user services, will also face legal duties to carry out risk assessments of the potential for children to encounter harmful content that falls under the priority content guidelines. Search engines will have to consider how their design and operation may increase the risk of harmful encounters (e.g. predictive search functionality). Search services will also have a duty to provide systems that allow users to easily report harmful content.
4. Reporting Harmful Content.
Online services, including user-to-user and search services, will be obligated to provide systems and processes that enable users to easily report harmful content they have encountered. In combination with a duty to publish transparent risk assessments, social media platforms must provide clear and accessible ways for parents and children to report problems online. They must also rigorously enforce age limits and age-checking measures on their platforms.
Why Should Teachers Be Aware Of The Online Safety Act?
The new laws brought in by the Online Safety Act place the protection of children ‘at the heart’ of the legislation. This will come as a relief to teachers and parents, who can now expect online platforms that provide services in the UK to take responsibility for the content children encounter. With a legal obligation on platforms to prevent harmful content encounters, more children will be protected from avoidable online harm.
Teaching online safety to young children and teenagers can feel like a bit of a minefield. By accessing accurate and informative PSHE content, you can feel confident that you are working in line with UK laws such as the Online Safety Act to help protect children online.
Our Chameleon PDE partner schools can be assured that online safety is covered in detail in the resources available to them. From online bullying to ensuring that an online profile won’t harm future employment prospects, young people will be well informed about their rights and safety. If you want to find out more about becoming a partner, get in touch at info@chameleonpde.com.
We know that policies and legislation are continuously being updated and that it can be hard to keep up. Our flexible resource library does the work for you. The editable teaching resources and updated supporting guidance documents will help you feel confident that you are providing accurate information to your pupils.
Our ‘wrap-around’ PSHE support won the best secondary resource category at this year’s National Education Resources Awards, and our services aim to provide PSHE/PD leads with the tools to put them firmly in the driving seat. By working closely with our partner schools, listening to students, and collaborating with our teacher advisory board, we can confidently say ‘we’ve got you covered’.
To find out more, have a look at our webpage for secondary schools at https://www.chameleonpde.com/resources/about?resource_type=secondary or contact us for an informal chat at info@chameleonpde.com.
References
1. “There will be more than 3500 online child abuse crimes every month the Online Safety Bill is delayed.” NSPCC, 5 August 2022, https://www.nspcc.org.uk/about-us/news-opinion/2022/child-abuse-crimes-online-safety-bill-delay/. Accessed 22 November 2023.
2. “Online Safety Act 2023.” Legislation.gov.uk, 2023, https://www.legislation.gov.uk/ukpga/2023/50/enacted. Accessed 22 November 2023.