An AI in Education conference was hosted at Brannel School, Cornwall, on Saturday 17 May 2025. The conference was for teachers, leaders and support staff working in Cornwall and the South West who want to learn more about how AI can change the way we teach, lead, learn and work in education.
This session was designed to discuss the impact of generative AI on assessment in education, drawing on my experience of working in higher education (HE) and making connections with teaching in secondary education. We talked about what constitutes (in)appropriate use of AI and the potential for spotting such use in students' work. We also talked about how AI can help before the assessment (creating case studies, question banks and fostering creativity in assessment design), during the assessment (AI-based assessments, marking with AI, etc.) and after the assessment (from marking and feedback to data analysis and forward planning); a small illustrative sketch of the question-bank idea follows below.
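As a rough illustration of the question-bank idea mentioned above, the sketch below shows one way a large language model could be prompted to draft multiple-choice questions for later human review. This is a minimal, hypothetical example, not a recommended workflow: it assumes the openai Python package, an OPENAI_API_KEY environment variable, and a placeholder model name ("gpt-4o-mini") that you would swap for whatever your school or centre has approved. Any AI-drafted questions would still need checking and editing by a teacher, in line with the guidance discussed below.

```python
# Hypothetical sketch: drafting a question bank with an LLM for human review.
# Assumes: `pip install openai` and an OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def draft_question_bank(topic: str, n_questions: int = 5) -> str:
    """Ask the model to draft multiple-choice questions on a topic.

    The output is a starting point only; a teacher reviews, edits and
    validates every question before it is used with students.
    """
    prompt = (
        f"Draft {n_questions} GCSE-level multiple-choice questions on {topic}. "
        "For each question give four options, mark the correct answer, "
        "and add a one-sentence explanation of why it is correct."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder: substitute your approved model
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(draft_question_bank("photosynthesis"))
```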
There is comprehensive and evolving guidance for UK secondary schools on the use of artificial intelligence (AI) in assessments. This guidance is designed to protect the integrity of qualifications, ensure fairness, and clarify the responsibilities of schools, staff, and students.
The Joint Council for Qualifications (JCQ) has published detailed guidance, most recently updated in April 2025, outlining how AI should and should not be used in assessments. https://www.jcq.org.uk/exams-office/malpractice/artificial-intelligence/
Key points include:
Authenticity: All work submitted for assessment must be the student’s own. If AI is used to the extent that the work is not independently produced, it constitutes malpractice.
Acknowledgement: If AI generates any part of a student’s work, this must be clearly identified. However, simply acknowledging AI use does not mean the student will receive credit if they have not independently met the assessment criteria.
Malpractice: Misuse of AI, such as submitting AI-generated work as one’s own, will be treated as malpractice and can lead to severe sanctions.
Centre Policies: Schools must update their malpractice and plagiarism policies to cover AI use, including clear guidance on referencing and acknowledging AI-generated content.
Education and Communication: Students and staff must be made aware of the risks and rules regarding AI. Schools are encouraged to communicate these policies to parents as well.
Detection and Investigation: Teachers should be vigilant for signs of AI misuse and are responsible for investigating suspected cases.
Ofqual (the exams regulator) prohibits the use of AI as the sole marker of student work. Human judgement must always be involved in marking, as reliance solely on AI could introduce unfairness and undermine confidence in qualifications.
Ofsted supports the use of AI where it improves education but requires that its use aligns with principles of safety, transparency, fairness, and accountability.
Schools and colleges can set their own rules on AI use, provided they comply with legal requirements around data protection, child safety, and exam regulations.
The National Education Union (NEU) and other bodies provide checklists and resources to help schools negotiate and implement AI policies [NEU checklist].
Schools are advised to:
Update Policies: Revise malpractice/plagiarism policies to explicitly address AI, referencing JCQ and relevant awarding body guidance.
Educate Stakeholders: Ensure students, staff, and parents understand what constitutes appropriate and inappropriate AI use, the risks involved, and the consequences of misuse.
Acknowledge AI Use: Require students to declare any use of AI in their work and provide guidance on how to do so properly.
Monitor and Investigate: Train staff to detect potential AI misuse and follow established procedures for investigation and reporting [See JCQ guidance].
Maintain Human Oversight: Ensure that any use of AI in marking or assessment is always overseen and validated by a human assessor [See Ofqual policy].
JCQ’s "AI Use in Assessments" (April 2025 revision) - https://www.jcq.org.uk/exams-office/malpractice/artificial-intelligence/
Ofqual’s approach to regulating AI in qualifications - https://www.gov.uk/government/publications/ofquals-approach-to-regulating-the-use-of-artificial-intelligence-in-the-qualifications-sector/ofquals-approach-to-regulating-the-use-of-artificial-intelligence-in-the-qualifications-sector
DfE and Ofsted policy papers on AI in education (Jan 2025) - https://www.gov.uk/government/publications/generative-artificial-intelligence-in-education/generative-artificial-intelligence-ai-in-education
NEU AI policy checklist for schools - https://neu.org.uk/latest/library/artificial-intelligence-ai-schools-checklist
NAHT, Artificial intelligence (AI) in education - https://www.naht.org.uk/Advice-Support/Topics/Management/ArtMID/755/ArticleID/2425/Artificial-intelligence-AI-in-education
Gov.uk Education Hub blog (2025), "AI in schools: What you need to know" - https://educationhub.blog.gov.uk/2025/03/artificial-intelligence-in-schools-everything-you-need-to-know/
OCR (9 February 2024), Guidance and support for the use of AI in English - https://www.ocr.org.uk/blog/guidance-and-support-for-the-use-of-ai-in-english/
The Open University (August 2024), Developing robust assessment in the light of Generative AI developments - https://www.ncfe.org.uk/help-shape-the-future-of-learning-and-assessment/aif-pilots/the-open-university/
Source(s)
The text above was curated with the help of Perplexity.ai: https://www.perplexity.ai/search/what-guidance-exists-for-deali-_bA6BPK9QDWkvgKGupd7bg#0
I helped to write a chapter in the following book, released in May 2025:
Cotton, D.R.E., Wyness, L., Jane, B. and Cotton, P.A. (2025). Redefining assessments in the age of AI. In R. Corbeil and M.E. Corbeil (Eds.), Teaching and Learning in the Age of Generative AI: Evidence-Based Approaches to Pedagogy, Ethics, and Beyond. Routledge.
Find out more about me here: About This Site