Responsible use of AI.
Using AI and Our cechat Platform Responsibly: Supporting Deep Learning Across All Year Levels.
At Carroll College, we are committed to helping students become independent, reflective learners across all year levels. One of the innovative tools we’ve introduced to support this goal is cechat, a suite of AI-powered chatbots designed specifically for our classrooms, with guardrails in place to align with Catholic education values and standards.
Unlike general-purpose AI tools such as ChatGPT, Claude, Gemini, or Perplexity, cechat agents are carefully constructed for each subject and class, aligned with syllabus outcomes and learning goals. These chatbots are not just answer engines; they are learning companions that encourage students to think critically, reflect deeply, and engage meaningfully with content.
Many cechat agents use the Socratic method, prompting students with questions that guide them toward understanding rather than simply providing answers. This approach helps build essential skills in analysis, reasoning, and problem-solving. However, not all cechat agents are Socratic; some are designed to provide structured support, revision scaffolds, or writing frameworks. Regardless of their format, all cechat agents are built with ethical guardrails to ensure safe, appropriate, and bias-aware interactions.
While these tools are available to all students, they are particularly relevant for Years 11 and 12, where deeper learning and independent thinking are critical. There has been some discussion among students about using AI to “cheat” or shortcut their work. We want to be clear: cechat is not designed to do the work for students. It is a tool for learning, not a replacement for thinking. Misusing chatbots to copy and paste answers bypasses the cognitive effort that genuine learning requires, undermining the learning process and limiting the development of deeper understanding. If a student is found to be misusing a chatbot in this way, they will receive an email with a transcript of their chat log to help them reflect on their approach.
We also recognise that this kind of misuse cannot be policed at home, especially as students increasingly access AI tools outside the classroom. In response, the assessment landscape is evolving. We are designing tasks that reward deep thinking, synthesis, and originality: skills that AI cannot replicate on a student’s behalf. These changes aim to ensure that assessments remain authentic and reflective of genuine student understanding.
We encourage families to talk with their children about how they’re using AI in their studies and to reinforce the importance of engaging with learning, not just completing tasks. If you’d like to learn more about cechat or how it’s being used in your child’s class, please don’t hesitate to get in touch.
Mr Jason Szkwarek
AI Lead
Classroom Teacher