July 6, 2025·3 min read

QueryForge Dashboards and AI Feedback

TL;DR

About six weeks after the initial launch, QueryForge gained admin and student dashboards plus AI help so students could get feedback even when I wasn't available.


Benjamin Hyde

Education Leader & AI Builder

About six weeks after the initial QueryForge ship, the next major update added the student dashboard, the admin dashboard, and an AI help layer for instant feedback. This was the point where the project started to feel less like a SQL sandbox and more like a teaching tool.

The main reason for adding AI help was practical. I wanted students to be able to get feedback on their work when I was not available, especially when they were completing work at home or outside class time. Instead of being stuck until the next lesson, they could get immediate guidance and keep moving.

What changed

The student dashboard gave learners a clearer sense of progress and made the platform feel more personal. The admin dashboard gave me visibility into how students were working and where they were getting stuck.
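At its core, the admin view is an aggregation over submission records. A minimal sketch of that aggregation, with hypothetical field names (not QueryForge's actual schema), might look like this:

```typescript
// Hypothetical shape of a submission record. Field names are
// illustrative assumptions, not QueryForge's real data model.
type Submission = {
  studentId: string;
  challengeId: number;
  passed: boolean;
};

// Class-wide progress for the admin dashboard: for each student,
// count the distinct challenges with at least one passing submission.
function progressByStudent(subs: Submission[]): Map<string, number> {
  const passedSets = new Map<string, Set<number>>();
  for (const s of subs) {
    if (!s.passed) continue;
    if (!passedSets.has(s.studentId)) {
      passedSets.set(s.studentId, new Set());
    }
    passedSets.get(s.studentId)!.add(s.challengeId);
  }
  const progress = new Map<string, number>();
  for (const [studentId, challenges] of passedSets) {
    progress.set(studentId, challenges.size);
  }
  return progress;
}
```

In practice this kind of rollup would come from a Prisma query against SQLite rather than an in-memory pass, but the shape of the result (student to completed-challenge count) is the same either way.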

The AI help feature added a support layer on top of the challenges. The idea was not to replace teacher feedback, but to make sure students could still get timely guidance when they were working independently.
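That "support, not replace" framing largely lives in the prompt. A hedged sketch of how the feedback request might be assembled before being sent to the OpenAI chat API (names and message wording are illustrative, not QueryForge's actual implementation):

```typescript
// Hypothetical inputs for one feedback request. These names are
// assumptions for illustration, not QueryForge's real API.
type FeedbackRequest = {
  task: string;          // the challenge prompt shown to the student
  studentSql: string;    // the query the student submitted
  resultSummary: string; // e.g. "returned 0 rows, expected 5"
};

// Build the chat messages for the model. The system message frames
// the AI as a tutor that hints rather than hands over the answer.
function buildFeedbackMessages(req: FeedbackRequest) {
  return [
    {
      role: "system" as const,
      content:
        "You are a patient SQL tutor. Give one short hint that points " +
        "the student toward the fix. Do not write the corrected query.",
    },
    {
      role: "user" as const,
      content:
        `Task: ${req.task}\n` +
        `Student query:\n${req.studentSql}\n` +
        `Result: ${req.resultSummary}`,
    },
  ];
}
```

The returned array is the `messages` payload for a chat-completion call; keeping the guardrails in the system message means the "hint, don't solve" policy is enforced on every request rather than relying on the student's phrasing.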

Why it mattered

One of the biggest frustrations with student coding work is that momentum disappears when help is not immediately available. That is even more true when students are working from home. This update was about reducing that delay.

By combining dashboards with AI feedback, QueryForge became more useful both for students needing support and for me as a teacher trying to see progress across the class.

Screenshots

QueryForge student dashboard showing progress across challenge levels.
The student dashboard gave learners a clearer picture of where they were up to across the challenge sequence.
QueryForge admin progress dashboard showing class-wide completion data.
The admin dashboard made it much easier to see class progress, patterns, and where students were getting stuck.
QueryForge challenge screen showing AI advice after an incorrect SQL query submission.
AI feedback gave students immediate guidance when a query missed the task requirements, even when I was not available.
QueryForge challenge screen showing a successful SQL query result.
Students could test and refine queries in the same workflow, with the platform reinforcing correct outputs and progress.

Build Notes

Approach

Build the next layer around visibility and feedback: give students a dashboard, give the teacher oversight, and add AI support for faster response when students are working independently.

Tools Used

Next.js, TypeScript, Prisma, SQLite, OpenAI

What Worked

The platform became much more useful once students could see their own progress and get immediate feedback without waiting for the next lesson.

What Failed

AI feedback still needed careful framing so it supported learning without encouraging students to rely on it as a shortcut.

What's Next

Keep refining the quality of the feedback, improve teacher visibility, and make the challenge flow feel even more seamless.
