Skip to content
HomeAboutWorkProjectsResourcesBlogServicesContact
Build LogCustom GPTsScienceEducation Tools
August 4, 2024·3 min read

Data Test Custom GPTs for Science Teachers

TL;DR

Built a set of Custom GPTs to help science teachers generate practice data tests with less prompt writing and less repeated workload.

BH

Benjamin Hyde

Education Leader & AI Builder

When OpenAI first released Custom GPTs, I saw them as a way to build fit-for-purpose versions of ChatGPT that I could actually use in teaching without rewriting a huge prompt every time. I made a couple for myself first, and they were useful enough that I started thinking more broadly about where they might help other staff.

Around that time, the science staff were preparing to run data tests across Chemistry, Physics, Biology, and Agricultural Science. The tasks themselves were already endorsed, but there was a clear appetite for more practice material so students could prepare properly. That felt like a strong use case: if the GPTs could generate useful practice tasks, they could reduce staff workload while also giving students more opportunities to practise.

Important context

These data-test GPTs were not endorsed by QCAA, and they were never presented as official QCAA materials. They were my own creations, built as practice tools using public materials and guidance provided by colleagues in the science faculty.

They should be understood as teacher-made support resources only. They are not a reflection of QCAA, their endorsement system, or any official assessment judgment. The aim was to generate extra practice opportunities, not to reproduce or represent endorsed assessment.

What I built

The goal was to build prompts that would reliably produce outputs aligned to syllabus expectations while still being easy for teachers and students to use. I wanted the GPTs to generate practice-style data test material in a format that could be taken straight into class preparation or student revision.

The biggest challenge was that I am not a science teacher. That meant I relied heavily on feedback from science staff to identify issues, judge usefulness, and refine the outputs. That is still the main maintenance challenge now, especially as syllabuses change and the tools need updating.

What I learned

The outputs were never meant to replace proper assessment development. The real value was in creating practice opportunities. Even if a generated table or scenario was not perfect in a purist sense, it was still often useful if it gave students something to analyse, interpret, and make judgments about.

What surprised me most was the uptake. I originally built these with my own school in mind, but they ended up being shared much more widely across Queensland. At the time of posting this entry on August 4, 2024, the Biology Data Test GPT alone had already recorded more than 250 uses, which made it clear there was a real need for this kind of tool.

Build Notes

Approach

Start with a clear science-teacher use case and build Custom GPTs that could generate practice-style data test material without staff needing to craft long prompts from scratch.

Tools Used

OpenAI Custom GPTs, prompt engineering, staff feedback loops

What Worked

The tools created extra practice opportunities for students and reduced some of the repeated workload around preparing revision material.

What Failed

Accuracy and subject-specific nuance were always the weak point, because I was building for disciplines I do not teach directly and the outputs were never official or endorsed materials.

What's Next

Refine the GPTs for newer syllabuses and continue expanding into related subjects where staff want similar support.