Generative AI Call for Evidence: Department for Education publishes its summary of responses
Following a period of evidence gathering relating to the use of generative AI tools in education settings in England, the Department for Education (DfE) has now published its summary of responses.
In June 2023, the DfE invited views and experiences from educational practitioners, artificial intelligence (AI) experts and those operating in the educational technology (EdTech) sector, on the use of generative AI tools in education (Call for Evidence).
The Call for Evidence, which ran for 10 weeks, followed not only a widespread increase in the capabilities of AI, but also growing public recognition of, and interest in, its various use cases (including within the education sector).
The DfE subsequently published a report summarising its findings from the Call for Evidence on 28 November 2023.
Who responded to the Call for Evidence?
During the consultation period, the DfE reported that 567 responses were received. Over 80% of responses were from institutions or organisations located in the UK, with the remainder coming from other countries, including the U.S., Hong Kong and India.
Of the 567 responses received, the DfE noted that the largest volume of responses came from Academies (124) and Higher Education Institutions (83). Of these, most respondents were reported to be teachers who were early adopters of generative AI.
The DfE also recognised a proportion of responses, categorised as “Other”, that included contributions from various think tanks, charities and non-profit organisations.
What were the key themes identified?
The report summarises the feedback received from respondents under four key themes relating to the adoption and use of generative AI in education. These are:
- Experiences: how generative AI is being used, the main challenges being faced and the key benefits from its use.
- Opportunities: how generative AI could be used further to improve education.
- Concerns and risks: the main concerns surrounding the use of generative AI in education.
- Enabling use and future predictions: the expectations surrounding future use of generative AI in education, and support that the sector would like to receive.
The report highlighted that the majority of respondents working in an education institution or organisation have either: (i) experimented with generative AI; or (ii) adopted generative AI tools in their workplace.
That said, typical use of generative AI by respondents (including teachers and students) involves tools that are widely available to the public, for example, OpenAI’s ChatGPT, Google Bard, Canva and DALL-E, rather than tools developed for a specific educational purpose.
The report summarises that the use of these tools has delivered a variety of positive experiences in relation to:
- Creating education resources
- Lesson and curriculum planning
- Assessment, marking and feedback.
The opportunities identified by respondents are reported by the DfE as closely mirroring the experiences that teachers and students have already gained from their current use of generative AI.
For teachers, the key perceived opportunities include:
- Freeing up teacher time
- Improving teaching and education materials
- Supporting teacher professional development.
For students, the report suggests that the use of generative AI can offer:
- Adaptive methods of teaching
- Real-time feedback and assessment
- Higher quality and more engaging learning materials.
Concerns and risks
Despite the optimism towards the use of generative AI tools, the report indicates a number of issues across all educational sectors and stages.
A widespread concern highlighted by the DfE was that teachers and students may come to over-rely on generative AI tools. This was compounded by an uneasiness that generative AI may:
- Produce factually inaccurate information
- Perpetuate biases.
Together, these risks could ultimately impede students’ ability to develop skills and knowledge, thereby reducing the ‘overall quality’ of teaching.
Other concerns, more commonly recognised across other sectors (and therefore not limited to education) include the risks associated with data protection, privacy, security and the safeguarding of users.
Enabling use and future predictions
The DfE reports widespread optimism towards the future of generative AI tools in education and their potential for a ‘transformational and profound’ impact on the sector, if adopted safely and effectively.
In support of the education sector’s growing implementation and adoption of generative AI tools, the report indicates that a number of respondents called for training on generative AI.
The most requested training topics, as set out in the summary, were:
- Basic digital literacy.
- AI literacy.
- Safe and ethical generative AI use.
- Alignment of generative AI with good pedagogical practice.
- The impact on the skills students will need as they enter an AI-enabled workforce.
To supplement the training, the report highlights a ‘desire’ for generative AI use in education to be supported not only by further regulation, but also by a set of clear boundaries and rules addressing academic malpractice, safe and ethical use, and data privacy.
The summary of responses reaffirms the positive welcome by education practitioners towards the application and use of generative AI in the sector.
Feedback from teachers across primary, secondary, and tertiary education, as summarised in the report, suggests that the use cases of generative AI present a variety of novel benefits and opportunities, which can be leveraged to free up time, provide additional educational support and offer adaptive approaches to teaching and learning.
Whilst the responses highlight a clearly optimistic view of, and strong appetite for, generative AI in education, almost all respondents reported concerns in relation to its use. A balancing act between benefit and risk will no doubt be required as the application of these models in education increases, to ensure that students and staff alike can use generative AI in a safe and secure manner.
If you have any queries or would like further information surrounding the development of AI in this area, please contact Joshua Day.
The content of this page is a summary of the law in force at the date of publication and is not exhaustive, nor does it contain definitive advice. Specialist legal advice should be sought in relation to any queries that may arise.