Legal Safeguards for Children in an Era of Deepfakes and AI: A Practical Guide for Therapists and Educators
- auberginelegal
- Jan 20
- 6 min read

The rise of AI, and of technologies that enable image alteration and even deepfake media, is a concern for many, especially parents, educators, and those tasked with safeguarding children. It seems we’re only just getting to grips with educating children on responsible social media use and its potential harms.
Now, technology is taking huge leaps forward, which, while useful in many ways, also poses new dangers to all of us, particularly children. Whilst we grapple with how to keep ourselves, our businesses, finances, and identities safe, we must also take measures to protect those less able to protect themselves.
So, what does the Online Safety Act say about our obligations to keep children safe online? How can educators and therapists implement robust online safety measures, and what do we do when an issue is flagged?
In this article, we’ll explore the above and focus particularly on the newer threats posed online, how the UK government may tackle these, and what that means for our ethical and legal obligations.
This guide will be particularly useful for schools, therapists who work with children, educational psychologists, and designated safeguarding leads, but it also aims to provide some legal clarity and general advice for anyone concerned about keeping children safe online.
The Online Safety Act 2023
This Act has received much attention in the media, both positive and negative. But what is actually in it and what are its intentions?
The Online Safety Act was drafted by the Conservative party during their most recent time in government and received Royal Assent on 26th October 2023. Although the illegal content duties and child safety duties are now in effect, the obligations on categorised services will not come into force until later in 2026.
Categorised services include large search engines, user-to-user online platforms that enable users to create and share content, and user-to-user services with 3 million or more users that offer specific functions, including direct messaging.
The aim of this legislation is to reduce the risk of online harm, particularly for children. It seeks to protect users from illegal, harmful, and age-inappropriate content.
What has been unwelcome to some and highly praised by others is that the Act does not place responsibility primarily on educators and designated care providers; instead, it makes service providers legally responsible for protecting their users. This includes social media platforms, search engines, content hosting platforms, and messaging services.
Certain voices, particularly from the USA, have criticised this move, arguing that the legislation restricts innovation and unfairly burdens businesses, and that responsibility for safe use of the technology should rest with users rather than creators. Others, however, believe the legislation does not go far enough in punishing technology providers and online platforms that enable the publication of harmful content.
As tempting as it is to dive into this debate, since the government has no plans to repeal the Act, let’s focus on what it includes and what this means for service providers and Designated Safeguarding Leads (DSLs).
Child Safety Duties Under The Online Safety Act
Online service providers are legally obliged to protect children from harmful content, particularly where a service is likely to be accessed by children. Even services not intended for use by children must prevent them from accessing age-inappropriate material through the use of Highly Effective Age Assurance (HEAA) measures.
This may include:
Age verification via IDs
Age estimation using facial scans or biometric data
Content filtering and moderation
Use of parental controls
Services deemed likely to be accessed by children must ensure harmful content (particularly content relating to sexual exploitation, self-harm, abuse, and age-inappropriate material) is either blocked from publication or promptly removed. This usually requires regular moderation, easy ways for users to report inappropriate content, and processes to deal with reports quickly and effectively.
It’s fair to say the legislation favours prevention over removal; however, removal remains an option for service providers. One concern about moderation and reporting processes (which should still be in place even where prevention is the key aim) is that some providers are opting for, or considering, AI technologies to perform this work.
This may save costs for online service providers in the long run, but some people believe that human intervention is crucial for such an essential responsibility.
Risk assessment, as always, is important and is emphasised in the Online Safety Act. Regular, thorough risk assessments help companies keep on top of advances in technology and harmful online activity, and keep their processes robust.
Illegal Content Duties
As an extension of the above, all service providers are legally tasked with the prevention and, when that fails, the removal of illegal content. This applies regardless of whether the service is designed for minors or likely to be accessed by them.
Enforcement
Although the legislation sets out the terms for online service providers to adhere to, enforcement in the UK has been granted to Ofcom - the UK’s regulator for communication services. If platforms are found to be in breach of their legal obligations, Ofcom has the power to require changes to operators’ systems and policies, the right to access information and conduct audits, and the ability to impose substantial fines.
This is not limited to corporate fines: where senior managers are found to be responsible, non-compliance can also lead to personal liability.
In serious cases, Ofcom may also apply to the courts to restrict or block a platform’s services in the UK. This is a particular risk for X (formerly Twitter), since its Grok AI chatbot has been found to enable users to generate inappropriate, and sometimes illegal, content. However, the UK government may instead create a new law that strips the platform of its right to self-regulate, as reported in a recent BBC article (published 13th January 2026). As this issue demonstrates, UK legislation can only do so much in the wild west of the web, where content is created and accessed worldwide, and rules and regulations vary widely.
How Educators, Therapists, and DSLs Can Support Online Safety For Children
Whilst the Online Safety Act has put the onus more on service providers, educators, therapists, safeguarding leads, and parents still have an important role to play in protecting children online, and often some legal responsibility too.
In terms of legal responsibility, professionals tasked with the safeguarding of children - including educators, therapists for children, and other professionals such as childminders and extra-curricular activity providers - will need to adhere to the Online Safety Act and the Children Acts 1989 and 2004.
Some professionals may also need to adhere to safeguarding advice as laid out by professional governing bodies, such as the HCPC and BACP.
Educators, child therapists, and other organisations tasked with safeguarding children should take the following actions to tackle online risk and ensure compliance:
Risk Assessments
Those allowing children to access online content while in their care should conduct risk assessments. This process can help expose risks and guide care providers in implementing measures to manage children’s exposure to harmful content, social media, and AI-generated media.
Policy Updates
As technology and online communication platforms evolve at an extraordinary pace, it’s essential to review risk assessments and policies regularly. You may need to update your safeguarding and child protection policies to include, for example, explicit mention of online harms, AI-generated content, and deepfakes.
Consent and Data Protection Of Children
When working with children, some professionals will be required to obtain consent to collect personal information. They may also need to seek consent to provide their services to a minor.
The issue of consent can be difficult, depending on the age of the child you are working with. Consent is only valid if what the child is consenting to is presented in a clear, understandable, and transparent way. This is why I have created a number of explainer videos designed for educational therapists and child psychologists working with children. These videos provide age-friendly explanations of what psychologists do, how assessments and therapy work, and how practices comply with legal, data privacy, and consent requirements. They’re ready-to-use, unbranded, and tailored for independent practices, helping you communicate complex issues in a simple, accessible way. Find them in my online shop.
Children have enhanced data protection rights, and organisations handling the data of minors will need to implement robust processes. Learn more in my guide to Children and Data Protection.
Implement Reporting Processes
It’s essential that you and your staff know what to do when online safeguarding issues arise. This may include the discovery of harmful material or reports from minors that they’ve come to harm online. Educators and therapists should establish clear protocols for escalating concerns, both internally (to safeguarding leads and senior staff) and externally (to social services or law enforcement).
Training
One of the most important actions child service providers can take to protect children online is to educate themselves: know the emerging risks, including AI-enabled harms, deepfakes, grooming, and online exploitation, and stay alert to Ofcom guidance and evolving case law.
Overall, it’s essential that those working with children practise proactive safeguarding and aim to keep themselves, their staff, and the children informed.
If you are an Educational Psychologist or Clinical Psychologist, please do refer to the dedicated psychology pages on my website, which offer a number of legal resources specific to your profession.



