DfE’s updated AI and education policy – what does this mean for the EdTech sector?
The “Generative artificial intelligence (AI) in education” DfE policy paper (10th June 2025) sets out DfE’s most definitive view yet on the adoption of AI tools in schools. Covering issues that include safety and effectiveness in use, together with cautionary advice about copyright, the paper should, if widely read and applied, influence how schools and colleges expect to work with suppliers of software (including learning platforms and management information systems) and apps, not forgetting equipment categorised within the “internet of things” that is adopted in the educational environment.
The paper adopts a positive approach from the outset, making clear that the policy is influenced by the wider government AI policy – the “AI Opportunities Action Plan”. DfE considers that “If used safely, effectively and with the right infrastructure in place, AI can support every child and young person, regardless of their background, to achieve at school and college and develop the knowledge and skills they need for life”.
Given the rapid advances AI has made in a short space of time, the paper is also cautionary, with an “early days” warning - DfE is committed to continuing to work with sector participants on safe and effective use.
Schools and colleges will be looking for assurances
For many engaged in AI development and, importantly, those adopting AI models within their education products, the credibility of offerings provided to educators is absolutely critical. We can certainly expect requests for more sophisticated licence terms and conditions, with warranties sought in areas such as:
- the origins of adopted AI models, the steps taken to ensure an absence of bias, and confirmation that the inputs feeding into outputs are not out of date (an early version of the free ChatGPT model, understandably, denied the death of Queen Elizabeth II due to an earlier data cut-off date)
- assurance that outputs will not be affected by bias (probably next to impossible to give as an absolute assurance, but a qualified warranty should be possible where ICO guidance on this issue is followed)
- assurance that where curriculum content is generated the training dataset was specific to the curriculum adopted in the classroom
- assurance that use of a given product will not compromise the statutory obligations of the school, such as keeping children safe
Student use presents a range of risks
If you have ever seen Ferris Bueller’s Day Off, you will know that the 1986 film was a scene-setter for the range of risks to our education system that can now follow in the path of AI! An example provided by DfE in the paper is the potential for pupils to produce AI-generated letters to parents.
Management of use by both staff and students becomes a serious responsibility for leadership teams. Key expectations of DfE are that:
- use of AI is evaluated to ensure that benefits outweigh the risks
- safety is not compromised
- parameters are drawn around use by staff and students where a use has not been explicitly approved or formally adopted in a setting, and
- plans are in place to mitigate the consequences of unauthorised use
Schools maintain discretion in the use of AI
The paper makes clear that it does not have the status of guidance – schools and colleges remain free to make their own choices as to use cases, what access is provided and by what means, with judgements on supervision also left to individual settings.
Refreshing homework policies and engagement with parents are also highlighted as actions that should, in appropriate cases, be considered.
Privacy should not be compromised
Data privacy implications feature in the paper as an issue to be understood and managed. The duty of transparency owed by all data controllers becomes particularly important when use of AI is embarked upon. Schools and colleges will expect to understand the data privacy implications of any product adopted within the classroom, and they have a responsibility to update privacy notices with appropriate information – particularly where profiling may form part of a product’s functionality. Fresh consents may need to be obtained when the use of personal data changes.
AI and copyright are also covered
The use of copyrighted materials in AI training models, and through inputs generated in the classroom or through homework, is also a subject covered in detail in the DfE paper.
Importantly, schools and colleges are advised not to permit students’ original work to be used within training models unless permission has been given (in the form required by copyright legislation) or the use falls within a permitted exception.
The government maintains a valuable guide to the circumstances in which copyright permission may not be required, available at this web page. Quite how this relatively straightforward permissions regime will fare once the battle between AI and author rights has been fought out and won (or settled by compromise) should be monitored closely within the EdTech sector.
So how should the EdTech sector respond to the AI policy?
There are clear must-dos that can be taken from the above:
- understand the implications, for users as well as your business, of the adoption of AI tools and the use of AI within your apps and services
- be able to demonstrate, through credible evidence, the benefits to be derived from your product’s adoption of AI
- have clear, well-presented supporting technical information to help schools adopting your product gain sufficient assurance about its use and understand what limitations should be taken into account when deploying it
- be able to demonstrate that the product is safe to use taking into account the legal duties of a school to protect the safety and well-being of students
- ensure that functionality provides for the protection of personal data at school, MAT or college level – in short, that personal data stays within the organisation
- take into account the importance of supporting schools in observing copyright law, and
- through training programmes or explanatory videos, support schools to understand how the incorporation of AI has enhanced your ability to deliver a product that can be trusted, is safe and will contribute to educational excellence
The full policy statement can be read here.
If you have questions regarding the DfE's updated AI and education policy, please get in touch with Frank Suttie in our Commercial Education team.
The content of this page is a summary of the law in force at the date of publication and is not exhaustive, nor does it contain definitive advice. Specialist legal advice should be sought in relation to any queries that may arise.