External Exam Style Question Generators
TL;DR
I wanted to make study material a little easier for my Digital Solutions students. So I took what I learned from the Data Test CustomGPTs and made some external-exam-style ones.
Benjamin Hyde
Education Leader & AI Builder
In Digital Solutions my kids always wanted more and more questions to practice with. When teaching this in 2024 that was hard: QCAA had only released a few past papers, so we couldn't rely on those alone because we quickly ran out.
Some quick context
These GPTs were not endorsed by QCAA, and I never presented them as official QCAA resources. They were teacher-made support tools I built using public materials plus guidance from Science colleagues. The goal was simple: give students more practice opportunities.
That distinction matters. The tools were never meant to replace endorsed assessment design, represent QCAA, or act as official material. I did hear that some schools used generated data tests in their endorsement workflows, which was wild, but that was never the original intent.
What I did
This time around it was much easier: I already had a formula that worked from the Data Test CustomGPTs, so I basically followed the same process as last time.
I was able to use the Practice Tests from QCAA as the 'training data' because the Creative Commons licence on the test papers allowed use and modification with attribution. I tried to feed the model as much information as I could, so this meant not only the test papers but also the marking guides that are publicly available. I cited QCAA where I could on the CustomGPT itself, but that was a little limited.
What I learned
The amount of training data really changes the quality of the output you get. The tricky part was balancing how much material I could feed the model as 'training data' (for lack of a better term) against the limit on attached documents. At the time of creating these, there was a limit of 10 documents as attachments, so the quality of the prompt itself became really important.
Replicating this for different subjects was even easier than the data tests, because each subject had publicly available materials I could use to inform the model.
Closing thought
After the success of building the Data Test GPTs, I found it quick and easy to make these ones and get mostly successful results almost right away. Getting the word out was much easier too: I had friends forward the Science ones through the EQ Discussion Lists, along with a note that searching my name in the GPT Library on ChatGPT would surface the other subjects I had built.
Build Notes
Approach
Use public QCAA materials as references, then iterate prompts until outputs were consistently useful as external-exam-style practice questions.
Tools Used
OpenAI Custom GPTs, prompt engineering, QCAA public materials
What Worked
Students got access to much more revision material, and teachers had less pressure to handcraft every additional question.
What Failed
Consistency remained the hardest issue; some outputs were strong while others still missed subject nuance or expected formatting.
What's Next
Keep refining generator quality, keep subject lists current, and expand only where there is clear demand and strong teacher feedback.