Second-level teachers have raised significant concerns about the permitted use of artificial intelligence (AI) in project work for the Leaving Certificate. This follows the recent release of new guidelines by the National Council for Curriculum and Assessment (NCCA). The guidelines apply to subjects such as chemistry, biology, physics, and business, and address the implementation of the Additional Assessment Component (AAC), which is set to officially launch in September 2025.
The AAC introduces project work that will contribute at least 40% to students’ final Leaving Certificate grades in each subject. Designed by the State Examinations Commission, this initiative aims to ease the reliance on high-stakes final exams.
The guidelines acknowledge the use of AI tools like ChatGPT for sourcing references but emphasize that students must provide “appropriate details” of any AI-generated material included in their projects. The NCCA also reiterated that plagiarism, including the use of AI-generated content without proper attribution, remains a “serious offence.”
Despite these clarifications, many teachers have expressed apprehension about incorporating AI into the Leaving Cert framework. The Teachers’ Union of Ireland (TUI), representing educators at secondary level, highlighted “serious concerns” regarding the impact of AI on assessment methods.
Some science teachers have criticized the guidelines, arguing that they “ignore the reality of generative AI” and could inadvertently encourage unethical practices among students. Many have questioned how teachers could verify the authenticity of project work when no reliable mechanisms exist to detect the use of generative AI. They point out that the Leaving Cert is a high-stakes exam, and that allocating 40% of marks to project work is significant even without the challenges posed by AI.
Using a suggested prompt, the Irish Examiner recently asked free online AI tools such as ChatGPT and Perplexity to generate a hypothetical project for a Leaving Cert student, alongside a set of fictional results. In less than 30 seconds, the tools produced research questions and background knowledge in response to the prompt while simultaneously designing a hypothetical experiment.
As prompted, the tools also generated a believable but completely fictional scientific method followed by a student, alongside a realistic timeline for the experiment. The AI was also easily able to discuss the fictional yet realistic results, draw a logical conclusion, and produce a well-rounded reflection on the project.
The feeling among teachers is that the vast majority of students will complete genuine projects, supported by their teachers along the way. However, the burden of responsibility for authenticating each student’s work will fall on the teacher, and many believe it will not be possible to do so completely.
In response, an NCCA spokesperson emphasized that teachers and the Department of Education were involved in developing the guidelines, which neither encourage nor require students to use AI. However, the debate over the role of AI in education continues, highlighting the challenges of balancing technological advancement with academic integrity.