The KWK AI Playbook

This guide equips you to help scholars use AI as a learning tool, not a learning replacement. You'll find clear descriptions of each habit in the AI Playbook along with facilitation strategies to help you bring them to life throughout camp.

Our Vision

At Kode With Klossy, we believe scholars should leave camp as critical consumers and empowered users of AI who understand how to leverage these tools to enhance their learning, amplify their capabilities, and build their confidence as coders. In our AI/ML camp, we go further: scholars also become responsible builders who design AI tools with ethics, transparency, and user wellbeing in mind.

Your Key Message

Throughout camp, your goal as an Instructional Leader is to set a positive and permissive tone when it comes to AI. You’ll introduce the 5 habits as practical guidelines on Day 1 and encourage scholars to revisit them throughout camp!
"AI is everywhere, and we're definitely going to use it at KWK to support your learning. But we want to make sure you're using it in a way that builds your skills and confidence! These five habits will help you decide when AI is your best tool, and when your brain needs to take the lead."

The 5 Habits: Your Facilitation Guide

  1. Question Your Shortcuts | Pause before using AI when learning and ask: “Is this helping me or am I skipping the hard part?” The struggle to understand a new concept is often where the real learning happens!
    1. Your role as a facilitator:
      • Create a culture that values the struggle that comes with learning challenging content
      • Help scholars self-assess their AI use
      Be on the lookout for scholars reaching for AI immediately when stuck. Caution scholars against avoiding challenging problems by letting AI solve them. Encourage scholars to embrace the challenge: “I know this is hard, but give it one more try on your own. If you’re still stuck in 5 minutes, let’s troubleshoot together.”
  2. Brain First, Bot Second | AI should refine and extend thinking, not replace it. Scholars should always do the cognitive work first, then use AI to enhance that work. Additionally, scholars should always cite AI use and maintain their own voice.
    1. Your role as a facilitator:
      • Model this habit by showing your own thinking process before using AI.
      • Ask probing questions: “Before we use AI, how would you explain this concept in your own words?”
      Look out for scholars copy-pasting AI generated code without understanding it or using AI for their very first attempt at a problem. Support scholars by focusing on understanding when it happens: “I notice you got this code from AI. Can you walk me through what each line does? Let’s make sure you really understand it.”
  3. Stay Skeptical | AI makes mistakes, so it's important to fact-check information. If something looks suspicious, it probably is! Scholars in the AI/ML camp will also need to audit their AI tools and test for bias with a diverse set of users.
    1. Your role as a facilitator:
      • Model skepticism when reviewing AI-generated content: “How could we verify this?”
      • Share real examples of AI failures or biases (age-appropriate and relevant examples)
      Watch out for scholars accepting AI outputs without verifying them. Encourage scholars to test AI-generated content: “AI gave you this answer, but let’s run it and see what happens. Can you predict what will happen?”
  4. Guard Your Data | Warn scholars against sharing personal information, secrets, or images of themselves or friends with public AI tools. Personal data is valuable and needs to be protected. In AI/ML camps, make sure scholars consider privacy concerns when building their capstone projects.
    1. Your role as a facilitator:
      • Set clear guidelines about what’s okay to share with AI
      • Normalize privacy awareness and make it cool to care about data privacy
      Discourage scholars from sharing personal information or photos of themselves or friends with AI tools. Suggest alternatives if scholars are sharing personal data: “I see you’re using your real information here. Let’s use a fake example instead to protect your privacy.”
  5. Keep It Real With Real People | AI can’t replace human connection, mentorship, or emotional support. Encourage scholars to turn to trusted people for advice, feelings, or relationship challenges. In the AI/ML camp, make sure scholars don’t design tools that manipulate emotions or pretend to be human.
    1. Your role as a facilitator:
      • Create community through pair programming, group discussions, and peer support.
      • Normalize asking for help and relying on relationships for personal support.
      Warn scholars against asking AI for personal advice or emotional support. In the AI/ML camp, watch out for projects that create “AI friends” or companions without discussing ethical implications. Encourage scholars to reach out if they are using AI for personal problems: “This sounds like something you might want to talk through with a person who knows you. Want to chat about it, or should we grab another IL?”
      A note on this habit: You may not see this issue arise during camp, but we include it because teens are increasingly turning to AI for emotional support and personal advice. As trusted adults working with a vulnerable population, we have a responsibility to name this concern. Be proactive in creating human connection at camp so AI never feels like the best option for personal support.