
Act 4: Scratch + AI Curriculum

This Act 4 Curriculum for middle school students was developed under the Computing & AI for All Project to introduce AI literacy through the CreatiCode platform. It incorporates project-based learning, accessible AI tools, ethical discussions, and real-world applications to help students critically explore and build with AI technologies.


Unit 1: Generative AI Tools

Lesson 1.1: Intro to AI & CreatiCode

Learning Objectives:

  • ​Understand the basic types of Artificial Intelligence (Generative and Predictive AI).

  • Recognize real-world applications and the significance of AI in modern society.

  • Interact with AI tools within the CreatiCode platform, including ChatGPT ("Chat with Einstein") and CreatiCode XO.

  • Develop an appreciation for the role of AI in changing the world and its ethical implications.​

​

Duration: 50-55 minutes

Lesson 1.2: Searching/Generating Backdrops with AI

Learning Objectives:

  • Understand how to search for existing AI-generated backdrops in the CreatiCode library.

  • Learn to create custom backdrops using detailed text prompts.

  • Develop skills in crafting effective prompts for AI image generation.

  • Recognize the importance of refining prompts and iterating to achieve desired results.

​

Duration: 50-55 minutes

Lesson 1.3: Searching/Generating Sprites with AI

Learning Objectives:

  • Learn how to search for existing AI-generated sprites using appropriate categories.

  • Develop skills in crafting detailed prompts for generating custom sprites.

  • Practice refining queries to achieve desired sprite characteristics.

 

Duration: 50-55 minutes​​

​

Lesson 1.4: Project: "Journey of a Waterdrop"

Learning Objectives:

  • Apply AI Tools: Utilize AI-generated sprites and backdrops to create simple animations; use the image editing tool to generate new variations of images or fix issues.

  • Storytelling: Develop a basic narrative through visual and textual elements.

  • Prompt Engineering: Enhance skills in crafting effective prompts for AI image generation.

  • Basic Coding: Implement fundamental coding blocks to control sprite actions and scene transitions.

  • Creativity: Foster creativity by designing and presenting a simple AI-driven story.

 

Duration: 100-110 minutes​

Lesson 1.5: Ethics Discussion: AI-generated Images

Learning Objectives:

  • Understand the ethical implications of using AI-generated images.​

  • Explore concepts of copyright, licensing, and ownership related to AI art.

  • Recognize the importance of image moderation in K–12 settings.

  • Develop critical thinking skills regarding the responsible use of AI in creative projects.

​

Duration: 50-55 minutes

Lesson 1.6: Learning with CreatiCode XO

Learning Objectives:

  • Understand how to effectively use CreatiCode XO to ask “Explain to me…” and how-to questions about coding blocks or concepts.

  • Develop the ability to ask follow-up questions to maintain context in conversations with XO.

  • Learn to decompose a large challenge into smaller steps.

  • Learn to control and customize XO’s response styles through prompt engineering.

  • Learn to handle syntax errors in XO’s responses.

  • Enhance problem-solving skills by leveraging XO as a coding assistant.​

 

Duration: 100-110 minutes

Lesson 1.7: Debugging with CreatiCode XO

Learning Objectives:

  • Understand debugging as identifying differences between expected and actual outcomes.

  • Master debugging techniques: Logical Deduction, Simplifying Programs, Logging, Step-by-Step Execution, Breakpoints, and Explaining Code to Others.

  • Apply systematic debugging processes to locate and fix bugs independently.

  • Leverage CreatiCode XO effectively to assist the debugging process when stuck.

  • Recognize XO’s limitations and maintain primary responsibility in debugging.

 

Duration: 100-110 minutes​

​

Lesson 1.8: Using XO to Generate Quizzes

Learning Objectives:

  • Understand how to use CreatiCode XO to generate quizzes and provide feedback on students’ answers.

  • Learn to customize quiz difficulty levels and formats (e.g., multiple-choice, short-answer, coding challenges).

  • Develop skills in crafting effective prompts to obtain meaningful and accurate quizzes.

  • Evaluate the limitations of AI-generated quizzes, particularly in assessing coding challenges.

  • Enhance problem-solving abilities by using XO to check and improve quiz answers.

 

Duration: 50-55 minutes​

Lesson 1.9: Using XO for Project Design

Learning Objectives:

  • Utilize CreatiCode XO to brainstorm and outline new project ideas.

  • Develop skills in crafting specific and manageable project plans with XO’s assistance.

  • Implement step-by-step project plans by engaging in iterative questioning with XO.

  • Recognize and understand the limitations of XO in project development.

  • Apply best practices in prompt engineering to effectively use XO as a project planning tool.​

​

Duration: 50-55 minutes

Lesson 1.10: Getting Feedback from XO

Learning Objectives:

  • Understand Importance of Evaluation: Grasp the significance of project evaluation and constructive feedback in the learning process.

  • Utilize Rubrics for Self-Assessment: Learn how to use predefined rubrics to assess their own projects.

  • Interact with CreatiCode XO: Develop skills in using CreatiCode XO to receive automated feedback based on rubrics.

  • Iterative Improvement: Apply feedback from XO to enhance projects through iterative development.

  • Develop Prompt Engineering Skills: Practice crafting effective prompts to obtain meaningful feedback from XO.

​

Duration: 50-55 minutes

Lesson 1.11: Generative AI Tools Assessment

Learning Objectives:

  • Recap AI Tools: Reinforce understanding of the core AI tools used in this unit.

  • Reflect on Learning: Encourage students to reflect on their experiences and share effective prompts and strategies used with CreatiCode XO.

  • Assess Prompt Engineering Skills: Evaluate students’ ability to craft effective prompts through an assessment.

​

Duration: 50-55 minutes​

​


Unit 2: Basic Generative AI Apps

Lesson 2.1: Introduction to the ChatGPT Block

Learning Objectives:

  • Identify each parameter of the ChatGPT Request block (prompt, result variable, temperature, length, mode, session).

  • Experiment with setting different parameter values (especially temperature, length, and session) to see how they affect ChatGPT’s responses.

  • Recognize that ChatGPT’s output can be influenced by the prompt (brief mention of prompt quality).

​

Duration: 50-55 minutes
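
For a concrete feel for one of these parameters, the short Python sketch below (not CreatiCode blocks, and not the block's actual implementation) shows how temperature changes sampling: the scores are a toy stand-in for a language model's preferences, and a higher temperature flattens them so the output varies more.

import math, random

def sample_with_temperature(options, scores, temperature):
    """Pick one option; higher temperature means more random choices."""
    scaled = [s / temperature for s in scores]           # temperature rescales the scores
    m = max(scaled)
    weights = [math.exp(s - m) for s in scaled]           # softmax-style weighting
    return random.choices(options, weights=weights, k=1)[0]

words = ["sunny", "cloudy", "rainy"]
scores = [3.0, 1.0, 0.2]                                  # toy scores for the next word
print(sample_with_temperature(words, scores, temperature=0.2))   # almost always "sunny"
print(sample_with_temperature(words, scores, temperature=1.5))   # noticeably more variety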

Lesson 2.2: A Simple "Ask-Me-Anything" App

Learning Objectives:

  • Understand Conversation Flow: Students will learn the basic text-based conversation flow: capturing user input, sending it to ChatGPT, and displaying the response via a sprite.

  • Familiarize with Key Blocks: Students will wire up the “ask and wait” block and the ChatGPT block, understanding each parameter (request, result, mode, length, temperature, and session).

  • Engage in Peer Collaboration: After completing the basic project, students will work in pairs to explore additional challenges that extend the functionality of the project.

​

Duration: 50-55 minutes

Lesson 2.3: A "Chat with Einstein" App

Learning Objectives:

  • Role-Playing with ChatGPT: Students will learn how to prompt ChatGPT to assume the persona of Albert Einstein, enabling the AI to role-play as the famous physicist.

  • Chat Interface Construction: Students will create a chat interface using the chat window widget, handle user input events, and display responses in a conversational format.

  • Understanding Prompting Techniques: Students will explore how the initial prompt influences ChatGPT’s responses and how to adjust parameters (such as response length and session type) to refine the conversation.

  • Collaborative Exploration: After building the basic project, students will work in pairs to implement additional challenges.

​

Duration: 50-55 minutes​​

​

​

Lesson 2.4: An Improved Chat App

Learning Objectives:

  • Enhance Chat Application Functionality: Students will refine a basic ChatGPT chat app by implementing streaming responses and improved role enforcement.

  • Master Advanced Prompt Engineering: Students will learn how to modify and enhance system requests to ensure ChatGPT consistently maintains its assigned persona.

  • Implement Streaming Responses: Students will configure their projects to display partial responses in real time by using streaming mode, thereby improving the user experience.

  • Develop Problem-Solving and Debugging Skills: Through guided troubleshooting, students will learn how to detect when ChatGPT’s response is complete and handle situations like chat limit exceedance.

  • Collaborative Exploration: After completing the basic improved version of the project, students will work in pairs to experiment with additional challenges and creative modifications.

​

Duration: 50-55 minutes

​

Lesson 2.5: "Guess a Historical Figure" Game

Learning Objectives:

  • Creative Application Development: Students will explore how changes in the prompt alone can create entirely new applications.

  • Collaborative Problem Solving: After constructing the basic version, students will work in pairs to tackle additional challenges and experiment with creative variations.

​

Duration: 50-55 minutes

​

Lesson 2.6: An "MBTI Personality Test" App

Learning Objectives:

  • Prompt Engineering Process: Students will learn how to adapt and refine a prompt in a step-by-step manner based on observed issues in ChatGPT’s output.

  • Application Adaptation: Students will transform a basic chat app into an MBTI Personality Test application by gradually modifying the prompt to better meet project goals.

  • Problem-Solving in Development: Students will simulate a real app development process by identifying issues in the initial output, brainstorming solutions, and applying prompt modifications iteratively.

  • Collaborative Improvement: After building the basic version, students will work in pairs to explore additional challenges and creative prompt enhancements.

​

Duration: 50-55 minutes

Lesson 2.7: AI-based Story Writer

Learning Objectives:

  • Creative Text Generation: Students will learn how to use ChatGPT to generate creative text — in this case, a short story — based on user-defined parameters. This is a different use case from chatting.

  • User Interface Design with Widgets: Students will build a user-friendly interface using widgets (label, textbox, button, layout row) instead of using a chat window, demonstrating real-world app design.

​

Duration: 50-55 minutes

Lesson 2.8: A Quiz Writer

Learning Objectives:

  • Quiz Generation with ChatGPT: Students will learn how to generate quiz questions on a user-specified topic and evaluate user answers using ChatGPT.

  • Advanced Widget Usage and Positioning: Students will gain hands-on experience with building a user-friendly interface by adding and positioning various widgets (labels, textboxes, buttons) using the "widget positioning" tool. Note that this is a more complex user interface compared to the story writer from the previous lesson.

  • Dynamic Request Composition: Students will learn to dynamically compose a ChatGPT request by combining fixed instructions with user input, and process streaming responses.

​

Duration: 50-55 minutes​​

​

​

Lesson 2.9: An AI Book Writer

Learning Objectives:

  • ​Comprehensive AI Text Generation: Students will learn how to harness AI to generate an entire book by dynamically composing requests, managing output length limits, and ensuring quality through iterative prompt refinement.

  • Strategic Prompt Engineering: Students will understand the importance of breaking a large writing task (i.e., writing a book) into manageable steps — first generating a table of contents, then writing individual chapters.

  • User Interface (UI) Design with Widgets: Students will build a sophisticated UI using various widgets (labels, textboxes, dropdown menus, buttons, and rich textboxes) and utilize the "widget positioning" tool for precise placement.

  • Iterative Development and Testing: Through a step-by-step process, students will learn to refine prompts based on testing results and incorporate additional details to improve output quality.

  • Collaborative Enhancement: After constructing the basic book writer, students will work in pairs to explore advanced challenges, simulating real-world iterative development.

​

Duration: 70-75 minutes

​

Lesson 2.10: A Cloze Game

Learning Objectives:

  • Complex Prompt Design: Students will learn to design and refine advanced prompts that instruct ChatGPT to generate fill-in-the-blank (cloze) texts.

  • Dynamic Request Composition: Students will practice combining fixed instructions with user input to create a dynamic AI request, ensuring the output is easy to parse and process in code.

  • UI Integration and Data Processing: Students will build an interactive user interface using widgets (e.g., textboxes, dropdowns, buttons) and learn techniques to manipulate the AI response (e.g., replacing headers, splitting text).

  • Iterative Problem Solving: Through guided analysis of ChatGPT’s output, students will identify issues (such as excessive length, variable question counts, and unwanted answer exposure) and iteratively refine the prompt to meet precise requirements.

  • Collaborative Enhancement: In pair work, students will explore additional challenges to further improve their cloze game, simulating a real-world iterative development process.

​

Duration: 70-80 minutes

​

Lesson 2.11: 2 Chatbots Debating Each Other

Learning Objectives:

  • Multiple Chatbots Usage: Students will learn how to instantiate and manage two AI chatbots (Pro and Con) in a single CreatiCode project.

  • Role Separation & Conversation Flow: Students will see how to assign distinct roles to each chatbot and control the conversation so that each bot processes the correct messages.

  • Prompt Engineering & Session Control: Students will understand how to use “continue” sessions to maintain chat history for each bot and how to incorporate prior messages.

  • Event-Driven Logic: Students will practice adding UI elements (textboxes, buttons) and structuring a debate flow with multiple phases (Opening, Cross-Examination).

  • Collaboration & Creativity: After building the base debate system, students will work in pairs to brainstorm or implement additional features (e.g., closing statements, AI judge).

​

Duration: 70-75 minutes

Lesson 2.12: Effective Prompting with the TIRE Method

Learning Objectives:

  • Understand the T.I.R.E. Method: Students will learn the four key components of effective prompting—Task, Instruction, Refinement, and Example—and understand how each contributes to better AI responses.

  • Develop Effective Prompting Skills: Students will practice composing prompts that clearly state the task, provide specific instructions, allow for iterative refinement, and use examples to guide output formatting.

  • Apply Prompting Techniques to Real-World Scenarios: Students will work through a detailed example (e.g., writing a how-to guide) to see the T.I.R.E. method in action, and then attempt similar tasks on their own.

  • Encourage Iterative Improvement: Students will learn to review and refine their prompts based on AI output, developing the mindset that effective prompting is an iterative process.

  • Collaborative Exploration: Through pair work, students will experiment with the T.I.R.E. method on various tasks, exchanging ideas and refining their prompts collaboratively.

  • Assess Prompt Engineering Skills: Evaluate students’ ability to leverage the T.I.R.E. method for effective prompts through an assessment.

​

Duration: 70-75 minutes
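
As a quick illustration of the method, the plain-Python snippet below assembles a T.I.R.E.-style prompt from its four parts; the wording of each part is invented for this example and simply echoes the how-to-guide scenario mentioned above.

# Compose one prompt from the four T.I.R.E. parts: Task, Instruction, Refinement, Example.
task        = "Task: Write a how-to guide for planting a small vegetable garden."
instruction = "Instruction: Use five numbered steps, each under 20 words, for a middle-school reader."
refinement  = "Refinement: Avoid brand names, and mention safety when tools are involved."
example     = "Example of one step: '1. Choose a sunny spot and clear away weeds and rocks.'"

prompt = "\n".join([task, instruction, refinement, example])
print(prompt)   # this combined text is what would be sent to the AI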

Lesson 2.13: Group Project A

Learning Objectives:

  • Integrate Learned Skills: Students will apply all the concepts learned so far—including prompt engineering (using the T.I.R.E. method), chat interface design, widget-based app development, and response processing—to design a complete generative AI application.

  • Collaborative Design & Development: In groups of two, students will collaborate to ideate, design, implement, test, and refine a functional Gen AI app. The project may be a game, tool, or interactive app using either a chat interface or a custom widget-based interface.

  • Iterative Problem Solving & Refinement: Students will employ iterative development practices, continuously refining both the app’s interface and the AI prompting until a high-quality product is achieved.

  • Real-World Publishing & Peer Feedback: Students will prepare their projects for publication on creaticode.com and participate in a final demo day where they will test and provide feedback on each other’s apps.

​

Duration: 200-250 minutes​​

​

​


Unit 3: Voice and Vision AI Apps

Lesson 3.1: Speech to Text Recognition

Learning Objectives:

  • Understand Speech Recognition Concepts: Students will grasp the core idea of converting speech to text without necessarily understanding meaning.

  • Explore Speech Recognition Modes: Learn the difference between basic and continuous speech recognition, and their respective process flows.

  • Design Decisions in Speech Recognition: Analyze critical design choices such as when to stop recording and how user interaction affects usability.

  • Accessibility Emphasis: Recognize speech recognition's role in enabling users with disabilities to interact with computers without using their hands.

  • Block Familiarization: Master key blocks for both modes and understand timing and flow of data.

​

Duration: 75-80 minutes

Lesson 3.2: Text to Speech

Learning Objectives:

  • Understand Text-to-Speech Functionality: Students will learn how to synthesize speech from text using the CreatiCode Text to Speech block, including how to adjust its various input parameters.

  • Familiarize with Input Parameters: Students will thoroughly examine the block’s inputs—Sentence, Speaker Language, Speaker Type, Talking Speed, Pitch, Volume, and Sound Clip Name—and understand how each affects the synthesized output.

  • Explore Sound Storage and Playback: Students will learn how to store synthesized speech as a sound clip and later replay it using other blocks.

  • Develop a Mini Application: In pairs, students will create a simple app that maps directional commands ("up", "down", "left", "right") to corresponding sound clips, with optional enhancements such as sprite direction changes or additional commands.

​

Duration: 50-55 minutes

Lesson 3.3: Talk to an AI Sprite

Learning Objectives:

  • Integrate Multiple AI Blocks: Students will combine speech-to-text (voice recognition), ChatGPT processing, and text-to-speech synthesis to create an interactive AI sprite.

  • Review and Apply Prior Knowledge: Students will review concepts learned in previous lessons (speech recognition, text-to-speech, ChatGPT) by building a project that integrates these functionalities.

  • Encourage Creative Extensions: Students will have the opportunity to extend the base project by modifying the AI character’s attributes (e.g., costume, gender, speaking speed).​

​

Duration: 50-55 minutes​

​

Lesson 3.4: AI Voice Translator

Learning Objectives:

  • Integrate Multiple Translation Technologies: Students will learn how to combine speech-to-text, translation, and text-to-speech functionalities to build an AI voice translator.

  • Familiarize with Translation Blocks: Students will be introduced to the translation blocks from the original MIT Scratch extension and understand their usage in converting text from one language to another.

  • Enhance Problem-Solving and Creative Extension: After building the base project, students will work in pairs to extend the functionality (e.g., allowing language selection or two-way translation).

​

Duration: 50-55 minutes

​

Lesson 3.5: Vision-based AI Assistant

Learning Objectives:

  • Integrate Multi-modal AI: Students will learn how to build an AI assistant that can "see" by capturing camera images and "talk" by interacting with AI.

  • Utilize New Capabilities: Students will understand how to take a camera image and store it as a costume, and how to attach that image to an AI chat session.

  • Comprehend Project Workflow: Students will gain proficiency in combining multiple AI blocks (camera, widgets, speech recognition, and AI) to build a cohesive interactive project.

  • Develop Extension Ideas: Students will collaborate in pairs to extend the base project with enhancements such as follow-up questions, dynamic UI adjustments, or further prompt customization.

​

Duration: 80-85 minutes​

Lesson 3.6: Introduction to AI Motion Sensor

Learning Objectives: 

  • Conceptual Understanding:

    • ​Explain that a camera video is a sequence of frames, and motion is detected by comparing differences in pixels between consecutive frames.

    • Understand how the Video Sensing Extension enables the program to detect object movement in a camera feed.

  • Technical Proficiency:

    • Learn how to add the Video Sensing Extension to a project.

    • Utilize key video sensing blocks such as "Turn video (On/Off/On Flipped)", "Set Video Transparency", and the reporter block "Video (motion/direction) on (sprite/stage)".

    • Use the "when video motion > threshold" block to trigger actions based on detected movement.

  • Application and Problem Solving:

    • Build a simple project that uses video sensing to detect movement (e.g., making a sprite move when sufficient motion is detected).

    • Experiment with different threshold values to observe how sensitivity to motion changes.

​

Duration: 50-55 minutes
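
The frame-differencing idea from this lesson can be modeled in a few lines of Python. The two tiny "frames" of brightness values and the thresholds below are made up for illustration; in CreatiCode the Video Sensing blocks read the camera and report motion for you.

# Two consecutive camera "frames": grids of pixel brightness (0-255).
frame1 = [[10, 10, 10],
          [10, 10, 10],
          [10, 10, 10]]
frame2 = [[10, 10, 10],
          [10, 200, 10],      # one pixel changed a lot: something moved here
          [10, 10, 10]]

def motion_amount(a, b, pixel_threshold=30):
    """Count pixels whose brightness changed by more than pixel_threshold."""
    changed = 0
    for row_a, row_b in zip(a, b):
        for pa, pb in zip(row_a, row_b):
            if abs(pa - pb) > pixel_threshold:
                changed += 1
    return changed

motion = motion_amount(frame1, frame2)
if motion > 0:                  # analogous to "when video motion > threshold"
    print("Motion detected:", motion, "pixel(s) changed")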

Lesson 3.7: Bouncing Ball with Motion Sensor

Learning Objectives:

  • Understand Motion Sensing Concepts:

    • Recognize that a camera video is a sequence of frames and that motion is detected by comparing pixel differences between frames.

    • Learn how the Video Sensing Extension converts movement into numerical data (motion and direction).

  • Utilize Video Sensing Blocks:

    • Add the Video Sensing Extension to a project.

    • Use key blocks such as "Turn video (On/Off/On Flipped)", "Set Video Transparency", the reporter block "Video (motion/direction) on (sprite/stage)", and the "when motion detected in camera video" block.

  • Apply Motion Data to Game Mechanics:

    • Program a bouncing ball game where a board is controlled by motion sensor data.

    • Implement variables to control board speed and constrain its position.

    • Integrate collision detection so that the ball bounces off the board and stops when it touches the bottom.

  • Encourage Creativity:

    • In pairs, students will choose one creative extension (e.g., speeding up the ball with each bounce, altering rebound angles, or adding obstacles) to personalize the game.

​

Duration: 50-55 minutes

Lesson 3.8: AI for Hand Tracking

Learning Objectives:

  • Learn Table Variable Concepts:

    • Understand the structure of table variables as compared to lists.

    • Learn to create a table and manipulate its data manually.

    • Learn to read or search for values from a table.

  • Understand Hand Tracking with AI:

    • Comprehend how AI hand detection works by identifying finger keypoints.

    • Recognize that hand pose data is stored in a table variable where each row represents an "item" with multiple properties (columns).

  • Implement a Finger-Counting Project:

    • Use the hand detection block to detect finger positions.

    • Analyze the "curl" values to determine which fingers are stretched out.

    • Update the project’s visual output (switching costumes) to reflect the number of fingers extended.

  • Collaborative Creativity:

    • Work in pairs to brainstorm and extend the project with creative ideas (e.g., detecting gestures beyond finger counting, recognizing two hands, or identifying specific hand signs).

​

Duration: 120-140 minutes​
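
A rough Python model of the finger-counting step above: the hand-detection block fills a table where each row is a finger, and one column describes how curled it is. The column names, values, and threshold below are illustrative stand-ins, not the block's real output format.

# Toy "table variable": one row per finger (0 = straight, 1 = fully bent).
hand_table = [
    {"finger": "thumb",  "curl": 0.15},
    {"finger": "index",  "curl": 0.10},
    {"finger": "middle", "curl": 0.85},
    {"finger": "ring",   "curl": 0.90},
    {"finger": "pinky",  "curl": 0.80},
]

CURL_THRESHOLD = 0.5   # below this, treat the finger as stretched out

extended = [row["finger"] for row in hand_table if row["curl"] < CURL_THRESHOLD]
print("Fingers extended:", len(extended), extended)   # this count would drive the costume switch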

​

Lesson 3.9: Fitness Game Using Body Pose

Learning Objectives:

  • Understand Body Pose Detection and Review Table Variables:

    • Explain how the AI for Body Part Recognition block detects body parts and stores data in a table variable.

    • Review the structure of a table variable (from the previous lesson).

  • Interpret and Use Table Data for Pose Analysis:

    • Learn to read specific values from the table using row numbers and column names.

    • Calculate key metrics (e.g., hip distance and knee distance) from the table data.

    • Use these metrics to determine a player’s pose (e.g., "Squat" versus "Stand").

  • Develop an Interactive Fitness Game:

    • Design a game that challenges the player to achieve a target pose.

    • Implement game mechanics such as target pose switching, visual and sound feedback, and score tracking.

  • Collaborative Problem-Solving:

    • Work in pairs to extend the game with creative enhancements (e.g., additional poses, refined detection using arms, timed challenges).
      ​

Duration: 90-100 minutes​
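
A simplified Python version of the pose check described above: read keypoint rows from a table, compare hip height with knee height, and classify the pose. The part names, coordinates, and threshold are invented for illustration; the real block reports its own rows and columns.

# Toy body-pose table (y grows downward, so a deeper squat brings the hips closer to the knees).
pose_table = [
    {"part": "left hip",   "x": 100, "y": 220},
    {"part": "right hip",  "x": 140, "y": 222},
    {"part": "left knee",  "x": 102, "y": 260},
    {"part": "right knee", "x": 138, "y": 258},
]

def y_of(part_name):
    return next(row["y"] for row in pose_table if row["part"] == part_name)

hip_y  = (y_of("left hip") + y_of("right hip")) / 2
knee_y = (y_of("left knee") + y_of("right knee")) / 2
hip_to_knee = knee_y - hip_y            # a small gap means the hips are near the knees

pose = "Squat" if hip_to_knee < 50 else "Stand"
print("Detected pose:", pose)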

Lesson 3.10: Group Project B

Learning Objectives:

  • Integrative Application of Voice & Vision AI: Students will synthesize and apply voice-based and vision-based AI techniques learned in Unit 3 — such as speech recognition, text-to-speech, camera usage, and video/hand sensing — in a novel, functional application.

  • LLM AI Integration: Students may also incorporate LLM AI into their app to enhance interactivity and provide dynamic responses, combining it with voice or vision inputs.

  • Collaborative Design & Iterative Development: In pairs, students will collaboratively plan, design, implement, test, and refine their app. They will practice iterative problem solving, prompt refinement (using techniques like the T.I.R.E. method), and effective UI design.

  • Real-World Publishing & Peer Feedback: Students will prepare their projects for publication on CreatiCode and participate in a final demo session where they test and provide constructive feedback on each other’s apps.

​

Duration: 200-250 minutes​


Unit 4: Advanced Generative AI Apps

Lesson 4.1: Who's the Spy?

Learning Objectives:

  • Integrate LLM as a Reasoning Engine: Students will understand how to use ChatGPT as a game master to drive an interactive detective game by processing natural language and generating context-sensitive responses.

  • Develop a Multi-Sprite Interactive Game: Students will build a project involving multiple sprites (Doctor, Dog, Reindeer, and Monkey) that communicate with the player and with each other through broadcast messages and event handling.

  • Apply Prompt Engineering: Students will learn to craft detailed system prompts to instruct ChatGPT, ensuring it maintains the game narrative and role consistently throughout gameplay.

  • Enhance Problem-Solving Skills: Students will troubleshoot multi-step code integration and debug broadcast and message-handling issues in a complex project.

  • Foster Creative Collaboration: Students will work in pairs to extend the base project with creative modifications, such as altering game context, enhancing character interactivity, or refining ChatGPT’s responses.

​

Duration: 80-85 minutes

Lesson 4.2: Guardian of History

Learning Objectives:

  • Integrate LLMs as a Reasoning Engine: Students will understand how to use Large Language Models (LLMs) as a game master to drive an interactive detective game by processing natural language and generating context-sensitive responses.

  • Apply Prompt Engineering: Students will learn to craft detailed system prompts to instruct the AI, ensuring it maintains the game narrative and role consistently throughout gameplay.

  • Enhance Problem-Solving Skills: Students will troubleshoot multi-step code integration and debug broadcast and message-handling issues in a complex project.

  • Foster Creative Collaboration: Students will work in pairs to extend the base project with creative modifications, such as altering game context, enhancing interactivity, or refining AI responses.

  • Strengthen Historical Knowledge and Logical Reasoning: Through gameplay, students will reinforce their understanding of historical events and practice logical deduction.

​

Duration: 110-120 minutes

Lesson 4.3: Text Summarization: Product Review

Learning Objectives:

  • Understand Incremental Summarization: Students will learn how to handle large amounts of text data (e.g., product reviews) when ChatGPT has a token limit, by breaking the text into smaller batches and incrementally updating a summary.

  • Practice Prompt Engineering: Students will explore techniques for refining ChatGPT prompts (e.g., bullet points, limited points, combined categories) to improve summary clarity.

  • Develop Coding Skills with Data Tables: Students will learn to retrieve data from a table of reviews, process them in batches, and integrate new reviews into an evolving summary.

  • Enhance Creativity and Collaboration: After completing the base tutorial, students will work in pairs to customize or expand the project with creative ideas.​

​

Duration: 70-75 minutes​
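
The batching idea behind this lesson, sketched in Python: reviews are fed to the summarizer a few at a time, and the running summary is updated after each batch so no single request exceeds the length limit. summarize_with_ai is only a placeholder that merges text; in the lesson this step is a ChatGPT request.

reviews = [
    "Battery lasts two days.", "Screen scratches easily.", "Great camera in daylight.",
    "Charger feels flimsy.", "Speaker is quiet.", "Love the lightweight design.",
]

def summarize_with_ai(current_summary, new_reviews):
    # Placeholder for a prompt such as:
    # "Here is the summary so far: ... Update it to include these new reviews: ..."
    return (current_summary + " | " if current_summary else "") + "; ".join(new_reviews)

BATCH_SIZE = 2
summary = ""
for start in range(0, len(reviews), BATCH_SIZE):
    batch = reviews[start:start + BATCH_SIZE]       # a small slice of the review table
    summary = summarize_with_ai(summary, batch)     # the summary is updated incrementally

print(summary)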

​

Lesson 4.4: Text Summarization: Web Search

Learning Objectives:

  • Revisit Text Summarization Concepts: Students will reinforce incremental text summarization techniques (from Lesson 4.3) by integrating a real-time web search step.

  • Integrate Web Search & Summarization: Students will learn to retrieve the top search results for a user’s query, display them, and let ChatGPT produce a concise summary.

  • UI Design with Widgets & Tables: Students will practice building a small user interface using textboxes, buttons, and data tables in the CreatiCode Playground.

  • Creative Problem-Solving & Collaboration: After the base tutorial, students will work in pairs to enhance the search-and-summarize flow with their own ideas.​

​

Duration: 60-65 minutes​

Lesson 4.5: Tool Use: Math Calculations

Learning Objectives:

  • Identify LLM Weaknesses: Students will learn how ChatGPT (and similar large language models) can fail at math by “guessing” rather than calculating.

  • Introduce “Tool Use” for LLMs: Students will discover how to augment ChatGPT’s abilities by providing it with a specialized “calculator tool” and instructing it to use that tool for arithmetic.

  • Prompt Engineering Best Practices: Students will practice writing prompts that enforce a fixed response format (e.g., “CALC: expression”) to ensure ChatGPT consistently asks the system for math help.

  • Creative Problem-Solving: Students will work in pairs to explore other ways of extending ChatGPT’s abilities, such as providing the current date or algorithmic solutions.​

​

Duration: 65-70 minutes​
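
A Python sketch of the "calculator tool" loop described above: the reply is checked for the agreed "CALC:" prefix, the expression is computed by code rather than by the model, and the result would be sent back. The simulated reply and the tiny expression evaluator are illustrative; the lesson builds this exchange with CreatiCode blocks and a live ChatGPT response.

import ast, operator

# Simulated ChatGPT reply that follows the agreed format instead of guessing the answer.
ai_reply = "CALC: 1234 * 5678"

OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def safe_eval(expression):
    """Evaluate +, -, *, / arithmetic only (no arbitrary code)."""
    def walk(node):
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("unsupported expression")
    return walk(ast.parse(expression, mode="eval").body)

if ai_reply.startswith("CALC:"):
    result = safe_eval(ai_reply[len("CALC:"):].strip())
    print("Tool result sent back to the AI:", result)
else:
    print("AI answered directly:", ai_reply)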

Lesson 4.6: Tool Use: Web Search

Learning Objectives:

  • Revisit Tool Use Concept: Reinforce the approach from Lesson 4.5 (Tool Use: Math Calculations) by extending ChatGPT’s capabilities to include web searches for new information.

  • Understand LLM Limitations & Cut-off Dates: Students will learn that ChatGPT’s training data is often limited to a specific cutoff date, making it unaware of recent events.

  • Prompt Engineering for Web Search: Students will practice crafting prompts that direct ChatGPT to produce “WEB:” queries whenever it needs new information.

  • Develop Multi-step Reasoning Skills: Students will code a workflow where ChatGPT identifies a need for external data, requests a web search, and the system feeds back relevant snippets.

  • Creativity & Collaboration: Students will work in pairs to devise or modify the project with new ideas (e.g., more robust search formatting or additional “tools”).

​

Duration: 80-85 minutes
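
The same tool-use pattern extended to search, sketched in Python: if the reply starts with "WEB:", the program runs a search and sends the snippets back in a second request. Both ask_ai and web_search are stand-in stubs for the CreatiCode blocks the lesson actually uses.

def ask_ai(prompt):
    # Stand-in for the ChatGPT block: pretend the model asks for fresh data the first time.
    if "Search results:" not in prompt:
        return "WEB: upcoming school science fair date"
    return "Based on the search results above, here is the answer."

def web_search(query):
    # Stand-in for a web-search block; returns short result snippets.
    return ["Snippet 1 about: " + query, "Snippet 2 about: " + query]

question = "When is the upcoming school science fair?"
reply = ask_ai(question)

if reply.startswith("WEB:"):
    query = reply[len("WEB:"):].strip()
    snippets = web_search(query)
    followup = question + "\nSearch results:\n" + "\n".join(snippets)
    reply = ask_ai(followup)          # second round, now with external information

print(reply)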

Lesson 4.7: An AI-Powered Calendar Assistant

Learning Objectives:
By the end of this lesson, students will be able to:

  • Understand why Large Language Models (LLMs) cannot remember information across sessions and how external storage like Google Sheets solves this problem.

  • Use Google Sheets to store and manage calendar data for an AI assistant.

  • Teach the AI to manage a calendar by adding, removing, searching, and finding events using custom commands.

  • Handle date and time information correctly since AI models lack built-in awareness of the current time.

  • Build problem-solving and debugging skills through hands-on coding.

​

Duration: 120-130 minutes
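
A stripped-down Python model of the command idea behind this lesson: the AI is instructed to answer with lines such as "ADD|date|event", and the program applies them to stored data. A plain list stands in for the Google Sheet, and the command names are invented for illustration; note how today's date has to be supplied to the model explicitly.

from datetime import date

calendar = []   # stands in for rows in a Google Sheet

def apply_command(command):
    parts = command.split("|")
    if parts[0] == "ADD":
        calendar.append({"date": parts[1], "event": parts[2]})
    elif parts[0] == "REMOVE":
        calendar[:] = [e for e in calendar if e["event"] != parts[1]]
    elif parts[0] == "SEARCH":
        return [e for e in calendar if parts[1].lower() in e["event"].lower()]
    return calendar

# The prompt would include the current date, since the model has no built-in clock:
system_note = "Today is " + date.today().isoformat()

apply_command("ADD|2025-05-01|Dentist appointment")
apply_command("ADD|2025-05-03|Soccer practice")
print(system_note)
print(apply_command("SEARCH|soccer"))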

Lesson 4.8: A QA Bot Using Semantic Search (RAG)

Learning Objectives:

  • Understand Semantic Search

    • Recognize how searching by meaning differs from searching by keywords.​

    • Appreciate the benefit of “embedding” text into vectors to compare similarities in meaning.

  • Implement a QA Bot with Semantic Search:

    • Create a semantic database from a data table of question-answer pairs.

    • Integrate the semantic search results with ChatGPT’s prompt to produce accurate, knowledge-based answers.

  • Practice Retrieval Augmented Generation (RAG):

    • Combine retrieved QA pairs with a user’s query in a single prompt.

    • Refine prompts to produce focused responses and avoid irrelevant details.

  • Apply Creative Problem-Solving:

    • Work in pairs to design a new QA bot on a chosen topic, ensuring ChatGPT accesses only the relevant knowledge base for answers.

​

Duration: 100-110 minutes​
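
A toy Python version of the retrieval step above: each stored question is compared with the user's query, and the closest matches are pasted into the prompt. Real semantic search compares embedding vectors; the word-overlap score here is only a stand-in so the example runs without any AI service, and the QA pairs are invented.

qa_pairs = [
    ("What time does the library open?", "The library opens at 8 AM on weekdays."),
    ("How do I reset my password?", "Ask a teacher to reset it from the admin page."),
    ("Where is the science fair held?", "The science fair is in the main gym."),
]

def similarity(a, b):
    """Stand-in for embedding similarity: fraction of shared words."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

query = "When does the library open?"
ranked = sorted(qa_pairs, key=lambda qa: similarity(query, qa[0]), reverse=True)
top = ranked[:2]                      # retrieve the two most relevant QA pairs

prompt = "Answer using only these facts:\n"
for q, a in top:
    prompt += "Q: " + q + " A: " + a + "\n"
prompt += "User question: " + query
print(prompt)                         # this augmented prompt is what would go to ChatGPT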

​

Lesson 4.9: Group Project C

Learning Objectives:

  • Integrate Advanced LLM Capabilities:

    • Students will utilize ChatGPT (or a similar LLM) to perform higher-level tasks such as text processing, summarization, reasoning, tool use (e.g., calculator, web search), and semantic search.

    • They will incorporate these features into a creative application that addresses real-world needs or classroom-related tasks.

  • Collaborative App Development:

    • Students work in small groups (2–3) to design, implement, and refine an application that leverages multiple advanced LLM features.

    • They will plan functionality, create user interfaces, handle AI interactions, and present a polished final product.

  • Iterative Problem-Solving & Refinement:

    • Groups will continually test and refine their AI prompts, user flow, and UI components.

    • They will employ prompt engineering techniques to ensure the AI output meets the intended purpose.

  • Real-World Publishing & Data Management:

    • Students may explore saving app data to an external sheet (e.g., Google Sheets via CreatiCode’s Cloud blocks), simulating real data storage and retrieval.

    • They will present their finished apps to classmates, offering peer demonstrations and feedback.

​

Duration: 200-220 minutes​


Unit 5: Advanced AI Topics

Lesson 5.1: Image Analysis: Number Recognition

Lesson Overview:
     This lesson introduces students to a basic image analysis technique: scanning a drawn shape (the digit “0” or “1”) and detecting its edges. It is the first lesson of the Predictive AI unit (Unit 5), where students shift from generative AI to more analytical, recognition-based AI methods.
     The project uses pen-drawing blocks, simple geometry, and color detection to classify whether the user-drawn digit is a “0” or a “1.” By scanning a specific row of pixels and counting the “edges” it crosses, the program infers which digit has been drawn.
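
The counting idea can be written out in a few lines of Python: scan one row of a black-and-white pixel grid and count the separate dark segments it crosses; a "0" has two strokes on that row, a "1" only one. The grids below are hand-made stand-ins for the pen-drawn image, which the lesson instead reads with color detection.

def dark_segments(row):
    """Count runs of dark pixels (1s) in one scanned row."""
    segments, inside = 0, False
    for pixel in row:
        if pixel == 1 and not inside:
            segments += 1            # entered a new dark stroke (an "edge")
            inside = True
        elif pixel == 0:
            inside = False
    return segments

row_through_zero = [0, 1, 1, 0, 0, 0, 1, 1, 0]   # crosses the left and right strokes of a "0"
row_through_one  = [0, 0, 0, 1, 1, 0, 0, 0, 0]   # crosses the single stroke of a "1"

for row in (row_through_zero, row_through_one):
    digit = "0" if dark_segments(row) == 2 else "1"
    print("Detected digit:", digit)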

​

Duration: 100-110 minutes

Lesson 5.2: KNN Classifier for Diabetes

Lesson Overview:
     In this lesson, students will explore the K-Nearest Neighbors (KNN) classification method using a real-world medical dataset (diabetes data). This lesson builds on the foundation of predictive AI introduced in Lesson 5.1, emphasizing how AI can make classifications based on similarity to existing data points. Students will import a dataset with key health metrics, build a KNN classifier, and evaluate its performance on unseen data.
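
A compact Python sketch of K-Nearest Neighbors on made-up health records (the feature values are illustrative, not real medical data): measure the distance from a new sample to every known example, take the k closest, and let them vote.

import math
from collections import Counter

# (glucose, BMI) paired with a label; all values invented for illustration.
training = [
    ((85, 22.0), "no diabetes"),
    ((90, 24.5), "no diabetes"),
    ((150, 33.0), "diabetes"),
    ((160, 35.5), "diabetes"),
    ((140, 30.0), "diabetes"),
]

def knn_predict(sample, k=3):
    distances = []
    for features, label in training:
        distances.append((math.dist(sample, features), label))   # Euclidean distance
    nearest = sorted(distances)[:k]                               # k most similar records
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

print(knn_predict((145, 31.0)))   # the closest examples are mostly "diabetes"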

​

Duration: 100-110 minutes

Lesson 5.3: Neural Network for Training and Prediction

Lesson Overview:
     This lesson introduces Neural Networks (NN) as a foundational concept in modern AI, demonstrating how a model can learn to predict numeric results from provided examples. Students will step through building a 3-layer neural network that learns a simple numeric pattern (z = 2x + 3y) from training data and then evaluates its predictions on new test data. They will also see how to save and load their trained models.
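
Because the target pattern z = 2x + 3y is linear, it can even be fit with a few lines of gradient descent in plain Python. This is a single linear unit rather than the 3-layer network the lesson builds, so treat it only as a picture of "learning weights from examples."

import random
random.seed(0)

# Training examples of z = 2x + 3y.
data = [(x, y, 2 * x + 3 * y) for x in range(5) for y in range(5)]

w1, w2, b = 0.0, 0.0, 0.0          # parameters the model will learn
rate = 0.01                        # learning rate

for _ in range(2000):
    x, y, z = random.choice(data)
    pred = w1 * x + w2 * y + b
    err = pred - z                 # positive when the prediction is too high
    w1 -= rate * err * x           # nudge each parameter against its share of the error
    w2 -= rate * err * y
    b  -= rate * err

print(round(w1, 2), round(w2, 2), round(b, 2))            # should approach 2, 3, 0
print("Prediction for x=10, y=4:", w1 * 10 + w2 * 4 + b)  # should be close to 32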

​

Duration: 110-120 minutes​

​

Lesson 5.4: Self-Driving Car AI: Starting and Stopping the Car

Learning Objectives:

  • Understand Basic Car Control: Students will learn how to send commands to start and stop a simulated car by controlling engine force and brake parameters.

  • Introduce AI Driver Logic: Students will understand the concept of an AI driver reading real-time data (car’s speed, position) and adjusting commands accordingly.

  • Familiarize with the Simulation Environment: Students will learn to navigate and use a 3D car simulation project in CreatiCode, focusing on the “AI” sprite for controlling the car.

  • Experiment with Conditional Logic: Students will practice using if-else conditions and timers to manage car speed limits and stop durations.

  • Practice Problem-Solving: Through additional challenges, students will apply and extend their knowledge by setting speed limits or using positional data to stop at a precise location.

​

Duration: 50-55 minutes​

​

Lesson 5.5: Self-Driving Car AI: Lane Centering

Learning Objectives:

  • Steering Command Basics: Students will learn how to send steering commands to a simulated car, turning the wheel left or right to maintain a desired lane position.

  • Safety and Lane Centering: Students will understand the importance of lane centering in real-world self-driving cars for passenger safety and preventing collisions.

  • Sensors & Virtual Objects: Students will explore how sensor data (in the form of a “lane marker” object) is used by the AI driver to position the car correctly in the lane.

  • Conditional Logic for Lane Alignment: Students will use the marker’s angle data to decide when and how much to steer, ensuring the car remains within the lane.

  • Hands-On Practice & Challenges: After replicating the main tutorial, students will work in pairs to tackle additional challenges, such as faster corrective steering or speed adjustments during lane centering.

​

Duration: 50-55 minutes​
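
A small Python sketch of the steering logic described above: the lane marker's angle tells the AI driver which way the car is drifting, and the steering command is proportional to that angle. The sign convention, angle values, and gain factor are invented for illustration; the simulation's own blocks report the real marker data.

def steering_command(marker_angle, gain=0.5):
    """Assume positive angle = drifting left of lane center, so steer right (and vice versa)."""
    return gain * marker_angle        # proportional correction: larger drift, stronger steering

for angle in [12, 5, 0, -4, -10]:     # sample marker angles over several frames
    print("Marker angle:", angle, "-> steer", round(steering_command(angle), 1))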

Lesson 5.6: Self-Driving Car AI: Making Right Turn

Learning Objectives:

  • Introduce Finite State Machine Concept: Students will understand how to use “driving modes” (e.g., normal, stopsign, turnright) to organize the AI’s logic.

  • Recognize & Handle Stop Signs: Students will learn to detect when the lane marker reaches a stop sign and make the car come to a complete stop.

  • Right-Turn Maneuver: Students will implement a “turnright” mode that reorients the car onto the new street after it has stopped.

  • Refine Steering & Direction Logic: Students will leverage the car’s direction (car dir) and a calculated target dir to determine when the turn is complete.

  • Collaborative Problem-Solving: Students will practice pair programming to tackle additional challenges, such as smoother turning or turning into a different lane.

​

Duration: 70-75 minutes
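
A bare-bones Python finite state machine using the three driving modes named above. The sensor readings are faked as a simple list of events and the car commands are print statements; the point is only how the mode variable routes the logic at each step.

mode = "normal"

# Faked sensor readings the AI driver would check each frame.
events = ["clear road", "clear road", "stop sign ahead", "stopped", "turn complete", "clear road"]

for reading in events:
    if mode == "normal":
        if reading == "stop sign ahead":
            mode = "stopsign"
            print("Braking for the stop sign")
        else:
            print("Driving straight, keeping the lane")
    elif mode == "stopsign":
        if reading == "stopped":
            mode = "turnright"
            print("Full stop reached; starting the right turn")
    elif mode == "turnright":
        if reading == "turn complete":     # e.g., car dir matches the target dir
            mode = "normal"
            print("Turn finished; back to normal driving")

print("Final mode:", mode)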

Lesson 5.7: Self-Driving Car AI: Following Driving

Learning Objectives:

  • Use a Driving Directions List: Students will understand how to store and retrieve instructions (left, right, straight) from a list, and advance through them as the car progresses.

  • Extend the Finite State Machine: Students will expand the AI driver’s state machine to include additional modes like “turnleft,” “gostraight,” and dynamic lane switching.

  • Implement Lane Switching: Students will learn how to detect the presence of a left or right lane (“left marker” or “right marker”) and instruct the car to move into the correct lane before a turn.

  • Develop Advanced Turning Logic: Students will code specialized blocks for handling left vs. right turns and learn to adjust steering angles to achieve smoother maneuvers.

  • Practice Problem-Solving & Collaboration: After following the tutorial, students will pair up to tackle additional challenges, such as handling “S” (go straight), switching to the right lane, and stopping after all directions are completed.

​

Duration: 100-110 minutes

