External Exam Style Question Generators
TL;DR
Built custom GPTs to generate external-exam-style practice questions so students could access more revision material without relying only on past papers.
Benjamin Hyde
Education Leader & AI Builder
In Digital Solutions, the demand for high-quality practice material is constant. My students regularly wanted more questions they could use at home in preparation for the external exam, but the available pool of past papers and sample material was limited. That left teachers carrying the load of continually creating fresh content.
That was the driver behind building the external-exam-style question generators. The aim was simple: give students more opportunities to practise with questions in the style of those that may appear in an external exam, while reducing some of the repeated workload involved in producing new revision material.
Important context
These generators were not endorsed by QCAA, and they were never intended to be presented as official QCAA materials. They were teacher-made tools designed to produce additional practice questions in a QCAA-like style only.
That distinction matters. The outputs give students additional practice questions in the style of what may appear on the external exam for their subjects. They are not predictions, not official papers, and not part of the QCAA endorsement process.
How I built them
To make the generators useful, I gathered the publicly available QCAA materials I could access and used them as the reference point for the custom GPTs. The goal was to help the model pick up the tone, structure, and level of rigour that students would recognise from the existing assessment materials.
The actual build process for a custom GPT is simple enough. The difficult part is writing a prompt that produces consistent results. Testing was often frustrating because one output could look excellent and the next could miss the mark entirely. It took repeated iterations to get something that was reliably useful rather than occasionally impressive.
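To give a sense of the kind of prompt iteration involved, here is a minimal sketch of an instruction template for one practice question. The subject defaults, command terms, and constraints are illustrative placeholders, not the actual prompt used in the generators.

```python
# Illustrative sketch only: the command terms and constraints below are
# assumptions, not the real generator prompt.

COMMAND_TERMS = ("explain", "analyse", "evaluate", "determine")

def build_instructions(subject: str, topic: str, marks: int) -> str:
    """Assemble a generator instruction block for one practice question."""
    if marks < 1:
        raise ValueError("marks must be a positive integer")
    return "\n".join([
        f"You write practice questions for {subject} in the style of "
        "publicly available QCAA assessment materials.",
        f"Topic: {topic}",
        f"Marks: {marks}",
        "Begin the question with one command term from: "
        + ", ".join(COMMAND_TERMS) + ".",
        "Include a marking guide with one point per mark.",
        "Label the output as unofficial teacher-made practice material.",
    ])
```

Iterating on a template like this, rather than rewriting free-form prompts each time, makes it easier to see which constraint changed between one test run and the next.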
Why I shared them
Once the generators were working well enough, I shared them because they solved a problem I knew other teachers were facing as well. If students are motivated and want more practice, that should be supported. A tool like this helps make that possible without demanding endless new question writing from teachers.
The response was strong. Students valued the steady stream of fresh practice, and teachers saw the benefit of having another resource to point students toward. It also reinforced for me that AI can be genuinely useful when it is aimed at a narrow, practical classroom problem rather than trying to do everything at once.
Build Notes
Approach
Use publicly available QCAA materials as reference points and iterate on prompt design until the GPT could produce useful external-exam-style practice questions.
Tools Used
OpenAI Custom GPTs, prompt engineering, QCAA public materials
What Worked
The generators gave students access to far more revision material and reduced the constant pressure to create every extra practice question by hand.
What Failed
Consistency was the hardest part. Some outputs were excellent, while others still missed the nuance or formatting expected in subject-specific contexts.
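One possible way to catch the formatting misses described above is a small automated spot-check on each generated question. This is a hypothetical sketch, not the actual checks used; the expected features (an opening command term, a bracketed mark allocation) are assumptions.

```python
import re

# Hypothetical spot-check: the expected command terms and mark-allocation
# format are illustrative assumptions, not the real validation rules.
COMMAND_TERMS = ("explain", "analyse", "evaluate", "determine")

def flag_issues(question: str) -> list[str]:
    """Return a list of formatting problems found in a generated question."""
    issues = []
    words = question.strip().split()
    first_word = words[0].lower().rstrip(":,") if words else ""
    if first_word not in COMMAND_TERMS:
        issues.append("does not open with an expected command term")
    if not re.search(r"\(\s*\d+\s*marks?\s*\)", question, re.IGNORECASE):
        issues.append("missing a mark allocation such as '(3 marks)'")
    return issues
```

A checklist like this will not judge whether a question is pedagogically sound, but it quickly separates outputs worth reviewing by hand from ones that clearly missed the brief.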
What's Next
Keep refining the generators, expand them into more subjects, and make the disclaimers and usage guidance as clear as possible.